WorldWideScience

Sample records for computer adaptive testing

  1. Test Anxiety, Computer-Adaptive Testing and the Common Core

    Science.gov (United States)

    Colwell, Nicole Makas

    2013-01-01

    This paper highlights the current findings and issues regarding the role of computer-adaptive testing in test anxiety. The computer-adaptive test (CAT) proposed by one of the Common Core consortia brings these issues to the forefront. Research has long indicated that test anxiety impairs student performance. More recent research indicates that…

  2. Simple and Effective Algorithms: Computer-Adaptive Testing.

    Science.gov (United States)

    Linacre, John Michael

    Computer-adaptive testing (CAT) allows improved security, greater scoring accuracy, shorter testing periods, quicker availability of results, and reduced guessing and other undesirable test behavior. Simple approaches can be applied by the classroom teacher, or other content specialist, who possesses simple computer equipment and elementary…
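
    The "simple approaches" referred to above can be sketched in a few lines of code. The following is a minimal, hypothetical illustration (not Linacre's own algorithm) of a Rasch-based adaptive test in Python: pick the unused item whose difficulty is closest to the current ability estimate, score the response, and re-estimate ability from all responses so far. The item bank and step sizes are invented for the example.

        import math, random

        def p_correct(theta, b):
            # Rasch model: probability of a correct answer given ability theta
            # and item difficulty b
            return 1.0 / (1.0 + math.exp(b - theta))

        def estimate_theta(responses):
            # crude maximum-likelihood estimate by gradient ascent;
            # responses is a list of (difficulty, 0/1 score) pairs
            theta = 0.0
            for _ in range(50):
                theta += 0.1 * sum(u - p_correct(theta, b) for b, u in responses)
            return theta

        def adaptive_test(bank, true_theta, length=10):
            theta, used, responses = 0.0, set(), []
            for _ in range(length):
                # select the unused item closest in difficulty to the estimate
                item = min((i for i in range(len(bank)) if i not in used),
                           key=lambda i: abs(bank[i] - theta))
                used.add(item)
                u = 1 if random.random() < p_correct(true_theta, bank[item]) else 0
                responses.append((bank[item], u))
                theta = estimate_theta(responses)
            return theta

        bank = [-3 + 0.25 * i for i in range(25)]   # difficulties from -3 to +3
        print(adaptive_test(bank, true_theta=1.0))  # estimate typically near 1.0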

  3. Computer-Adaptive Testing: Implications for Students' Achievement, Motivation, Engagement, and Subjective Test Experience

    Science.gov (United States)

    Martin, Andrew J.; Lazendic, Goran

    2018-01-01

    The present study investigated the implications of computer-adaptive testing (operationalized by way of multistage adaptive testing; MAT) and "conventional" fixed order computer testing for various test-relevant outcomes in numeracy, including achievement, test-relevant motivation and engagement, and subjective test experience. It did so…

  4. Computer-Adaptive Testing in Second Language Contexts.

    Science.gov (United States)

    Chalhoub-Deville, Micheline; Deville, Craig

    1999-01-01

    Provides a broad overview of computerized testing issues with an emphasis on computer-adaptive testing (CAT). A survey of the potential benefits and drawbacks of CAT is given; the process of CAT development is described; and some L2 instruments developed to assess various language skills are summarized. (Author/VWL)

  5. Computer Adaptive Testing, Big Data and Algorithmic Approaches to Education

    Science.gov (United States)

    Thompson, Greg

    2017-01-01

    This article critically considers the promise of computer adaptive testing (CAT) and digital data to provide better and quicker data that will improve the quality, efficiency and effectiveness of schooling. In particular, it uses the case of the Australian NAPLAN test that will become an online, adaptive test from 2016. The article argues that…

  6. Computer Adaptive Multistage Testing: Practical Issues, Challenges and Principles

    Directory of Open Access Journals (Sweden)

    Halil Ibrahim SARI

    2016-12-01

    The purpose of many tests in educational and psychological measurement is to measure test takers' latent trait scores from their responses to a set of items. Over the years, this has been done by traditional methods (paper-and-pencil tests). However, compared to other test administration models (e.g., adaptive testing), traditional methods are extensively criticized for low measurement accuracy and long test lengths. Adaptive testing has been proposed to overcome these problems. There are two popular adaptive testing approaches: computerized adaptive testing (CAT) and computer adaptive multistage testing (ca-MST). The former is a well-known approach that has been predominantly used in this field, and we believe that researchers and practitioners are fairly familiar with many aspects of CAT because adaptive testing has more than a hundred years of history. The same is not true of ca-MST: since it is relatively new, many researchers are not familiar with its features. The purpose of this study is to closely examine the characteristics of ca-MST, including its working principle, the adaptation procedure called the routing method, test assembly, and scoring, and to provide an overview for researchers, with the aim of drawing their attention to ca-MST and encouraging them to contribute to research in this area. Books, software, and future work for ca-MST are also discussed.
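
    As a rough illustration of the routing principle described above, the sketch below implements a hypothetical two-stage ca-MST panel in Python: every examinee takes the same stage-1 routing module, and the number-correct score routes them to an easy, medium, or hard stage-2 module. The module difficulties and routing cutoffs are invented for the example and are not taken from the article.

        import math, random

        def p_correct(theta, b):
            # Rasch response model
            return 1.0 / (1.0 + math.exp(b - theta))

        def administer(module, theta):
            # simulate responses to every item in a preassembled module
            return sum(random.random() < p_correct(theta, b) for b in module)

        panel = {
            "routing": [-1.0, -0.5, 0.0, 0.5, 1.0],     # stage-1 module
            "easy":    [-2.0, -1.5, -1.0, -0.5, 0.0],   # stage-2 modules
            "medium":  [-0.5, 0.0, 0.0, 0.5, 1.0],
            "hard":    [0.5, 1.0, 1.5, 2.0, 2.5],
        }

        def run_mst(theta):
            score = administer(panel["routing"], theta)
            # number-correct routing: low scores branch to the easy module
            branch = "easy" if score <= 2 else ("medium" if score <= 3 else "hard")
            return branch, score + administer(panel[branch], theta)

        print(run_mst(theta=0.8))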

  7. Improving personality facet scores with multidimensional computer adaptive testing

    DEFF Research Database (Denmark)

    Makransky, Guido; Mortensen, Erik Lykke; Glas, Cees A W

    2013-01-01

    …personality tests contain many highly correlated facets. This article investigates the possibility of increasing the precision of the NEO PI-R facet scores by scoring items with multidimensional item response theory and by efficiently administering and scoring items with multidimensional computer adaptive testing…

  8. The impacts of computer adaptive testing from a variety of perspectives

    Directory of Open Access Journals (Sweden)

    Tetsuo Kimura

    2017-05-01

    Computer adaptive testing (CAT is a kind of tailored testing, in that it is a form of computer-based testing that is adaptive to each test-taker’s ability level. In this review, the impacts of CAT are discussed from different perspectives in order to illustrate crucial points to keep in mind during the development and implementation of CAT. Test developers and psychometricians often emphasize the efficiency and accuracy of CAT in comparison to traditional linear tests. However, many test-takers report feeling discouraged after taking CATs, and this feeling can reduce learning self-efficacy and motivation. A trade-off must be made between the psychological experiences of test-takers and measurement efficiency. From the perspective of educators and subject matter experts, nonstatistical specifications, such as content coverage, content balance, and form length are major concerns. Thus, accreditation bodies may be faced with a discrepancy between the perspectives of psychometricians and those of subject matter experts. In order to improve test-takers’ impressions of CAT, the author proposes increasing the target probability of answering correctly in the item selection algorithm even if doing so consequently decreases measurement efficiency. Two different methods, CAT with a shadow test approach and computerized multistage testing, have been developed in order to ensure the satisfaction of subject matter experts. In the shadow test approach, a full-length test is assembled that meets the constraints and provides maximum information at the current ability estimate, while computerized multistage testing gives subject matter experts an opportunity to review all test forms prior to administration.

  9. Test Information Targeting Strategies for Adaptive Multistage Testing Designs.

    Science.gov (United States)

    Luecht, Richard M.; Burgin, William

    Adaptive multistage testlet (MST) designs appear to be gaining popularity for many large-scale computer-based testing programs. These adaptive MST designs use a modularized configuration of preconstructed testlets and embedded score-routing schemes to prepackage different forms of an adaptive test. The conditional information targeting (CIT)…

  10. Computerized adaptive testing in computer assisted learning?

    NARCIS (Netherlands)

    Veldkamp, Bernard P.; Matteucci, Mariagiulia; Eggen, Theodorus Johannes Hendrikus Maria; De Wannemacker, Stefan; Clarebout, Geraldine; De Causmaecker, Patrick

    2011-01-01

    A major goal in computerized learning systems is to optimize learning, while in computerized adaptive tests (CAT) efficient measurement of the proficiency of students is the main focus. There seems to be a common interest in integrating computerized adaptive item selection into learning systems and

  11. Applying computer adaptive testing to optimize online assessment of suicidal behavior: a simulation study.

    NARCIS (Netherlands)

    de Beurs, D.P.; de Vries, A.L.M.; de Groot, M.H.; de Keijser, J.; Kerkhof, A.J.F.M.

    2014-01-01

    Background: The Internet is used increasingly for both suicide research and prevention. To optimize online assessment of suicidal patients, there is a need for short, good-quality tools to assess elevated risk of future suicidal behavior. Computer adaptive testing (CAT) can be used to reduce

  12. simulate_CAT: A Computer Program for Post-Hoc Simulation for Computerized Adaptive Testing

    Directory of Open Access Journals (Sweden)

    İlker Kalender

    2015-06-01

    This paper presents computer software developed by the author. The software conducts post-hoc simulations for computerized adaptive testing based on real responses of examinees to paper-and-pencil tests, under parameters that can be defined by the user. Brief background on post-hoc simulations is given first. The working principle of the software is then described, and a sample simulation with the required input files is shown. Finally, the output files are described.
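
    The general idea of a post-hoc (real-data) simulation can be sketched as follows: an examinee's recorded responses to the full paper-and-pencil form are replayed through a CAT item-selection rule, so the adaptive algorithm "administers" items whose answers are already known. The Python sketch below is a hypothetical illustration of that idea, not the actual simulate_CAT program; the item parameters, selection rule, and stopping rule are invented.

        import math

        def p_correct(theta, b):
            return 1.0 / (1.0 + math.exp(b - theta))   # Rasch model

        def estimate_theta(pairs):
            theta = 0.0
            for _ in range(50):   # crude ML estimate by gradient ascent
                theta += 0.1 * sum(u - p_correct(theta, b) for b, u in pairs)
            return theta

        def post_hoc_cat(difficulties, recorded, max_items=8):
            # replay recorded 0/1 responses through an adaptive selection rule
            theta, taken, pairs = 0.0, set(), []
            while len(taken) < max_items:
                item = min((i for i in range(len(difficulties)) if i not in taken),
                           key=lambda i: abs(difficulties[i] - theta))
                taken.add(item)
                pairs.append((difficulties[item], recorded[item]))  # real answer
                theta = estimate_theta(pairs)
            return theta

        # one examinee's responses to a 12-item paper-and-pencil form (illustrative)
        bs = [-2.0, -1.5, -1.0, -0.5, 0.0, 0.2, 0.5, 1.0, 1.2, 1.5, 2.0, 2.5]
        resp = [1, 1, 1, 1, 1, 0, 1, 0, 0, 1, 0, 0]
        print(post_hoc_cat(bs, resp))   # CAT estimate from 8 of the 12 items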

  13. Implementation of an Improved Adaptive Testing Theory

    Science.gov (United States)

    Al-A'ali, Mansoor

    2007-01-01

    Computer adaptive testing is the study of scoring tests and questions based on assumptions concerning the mathematical relationship between examinees' ability and their responses. Adaptive student tests, which are based on item response theory (IRT), have many advantages over conventional tests. We use the least square method, a…

  14. Lessons Learned in Designing and Implementing a Computer-Adaptive Test for English

    Directory of Open Access Journals (Sweden)

    Jack Burston

    2014-09-01

    This paper describes the lessons learned in designing and implementing a computer-adaptive test (CAT) for English. The early identification of students with weak L2 English proficiency is of critical importance in university settings that have compulsory English language course graduation requirements. The most efficient means of diagnosing the L2 English ability of incoming students is a computer-based test, since such evaluation can be administered quickly, corrected automatically, and the outcome known as soon as the test is completed. While the option of using a commercial CAT is available to institutions with the ability to pay substantial annual fees, or the means of passing these expenses on to their students, language instructors without these resources can only avail themselves of the advantages of CAT evaluation by creating their own tests. As is demonstrated by the E-CAT project described in this paper, this is a viable alternative even for those lacking any computer programming expertise. However, language teaching experience and testing expertise are critical to such an undertaking, which requires considerable effort and, above all, collaborative teamwork to succeed. A number of practical skills are also required. Firstly, the operation of a CAT authoring programme must be learned. Once this is done, test makers must master the art of creating a question database and assigning difficulty levels to test items. Lastly, if multimedia resources are to be exploited in a CAT, test creators need to be able to locate suitable copyright-free resources and re-edit them as needed.

  15. Computer adaptive test performance in children with and without disabilities: prospective field study of the PEDI-CAT

    NARCIS (Netherlands)

    Dumas, H.M.; Fragala-Pinkham, M.A.; Haley, S.M.; Ni, P.; Coster, W.; Kramer, J.M.; Kao, Y.C.; Moed, R.; Ludlow, L.H.

    2012-01-01

    PURPOSE: To examine the discriminant validity, test-retest reliability, administration time and acceptability of the pediatric evaluation of disability inventory computer adaptive test (PEDI-CAT). METHODS: A sample of 102 parents of children 3 through 20 years of age with (n = 50) and without (n =

  16. A Gaussian mixture model based adaptive classifier for fNIRS brain-computer interfaces and its testing via simulation

    Science.gov (United States)

    Li, Zheng; Jiang, Yi-han; Duan, Lian; Zhu, Chao-zhe

    2017-08-01

    Objective. Functional near infra-red spectroscopy (fNIRS) is a promising brain imaging technology for brain-computer interfaces (BCI). Future clinical uses of fNIRS will likely require operation over long time spans, during which neural activation patterns may change. However, current decoders for fNIRS signals are not designed to handle changing activation patterns. The objective of this study is to test via simulations a new adaptive decoder for fNIRS signals, the Gaussian mixture model adaptive classifier (GMMAC). Approach. GMMAC can simultaneously classify and track activation pattern changes without the need for ground-truth labels. This adaptive classifier uses computationally efficient variational Bayesian inference to label new data points and update mixture model parameters, using the previous model parameters as priors. We test GMMAC in simulations in which neural activation patterns change over time and compare to static decoders and unsupervised adaptive linear discriminant analysis classifiers. Main results. Our simulation experiments show GMMAC can accurately decode under time-varying activation patterns: shifts of activation region, expansions of activation region, and combined contractions and shifts of activation region. Furthermore, the experiments show the proposed method can track the changing shape of the activation region. Compared to prior work, GMMAC performed significantly better than the other unsupervised adaptive classifiers on a difficult activation pattern change simulation (99% versus …). Significance. These results support the use of GMMAC in fNIRS brain-computer interfaces, including neurofeedback training systems, where operation over long time spans is required.

  17. Adjusting for cross-cultural differences in computer-adaptive tests of quality of life.

    Science.gov (United States)

    Gibbons, C J; Skevington, S M

    2018-04-01

    Previous studies using the WHOQOL measures have demonstrated that the relationship between individual items and the underlying quality of life (QoL) construct may differ between cultures. If unaccounted for, these differing relationships can lead to measurement bias which, in turn, can undermine the reliability of results. We used item response theory (IRT) to assess differential item functioning (DIF) in WHOQOL data from diverse language versions collected in UK, Zimbabwe, Russia, and India (total N = 1332). Data were fitted to the partial credit 'Rasch' model. We used four item banks previously derived from the WHOQOL-100 measure, which provided excellent measurement for physical, psychological, social, and environmental quality of life domains (40 items overall). Cross-cultural differential item functioning was assessed using analysis of variance for item residuals and post hoc Tukey tests. Simulated computer-adaptive tests (CATs) were conducted to assess the efficiency and precision of the four items banks. Splitting item parameters by DIF results in four linked item banks without DIF or other breaches of IRT model assumptions. Simulated CATs were more precise and efficient than longer paper-based alternatives. Assessing differential item functioning using item response theory can identify measurement invariance between cultures which, if uncontrolled, may undermine accurate comparisons in computer-adaptive testing assessments of QoL. We demonstrate how compensating for DIF using item anchoring allowed data from all four countries to be compared on a common metric, thus facilitating assessments which were both sensitive to cultural nuance and comparable between countries.

  18. Computer controlled testing of batteries

    NARCIS (Netherlands)

    Kuiper, A.C.J.; Einerhand, R.E.F.; Visscher, W.

    1989-01-01

    A computerized testing device for batteries consists of a power supply, a multiplexer circuit connected to the batteries, a protection circuit, and an IBM Data Acquisition and Control Adapter card connected to a personal computer. The software is written in Turbo Pascal and can be easily adapted to

  19. Validity of Cognitive ability tests – comparison of computerized adaptive testing with paper and pencil and computer-based forms of administrations

    Czech Academy of Sciences Publication Activity Database

    Žitný, P.; Halama, P.; Jelínek, Martin; Květon, Petr

    2012-01-01

    Vol. 54, No. 3 (2012), pp. 181-194. ISSN 0039-3320. R&D Projects: GA ČR GP406/09/P284. Institutional support: RVO:68081740. Keywords: item response theory; computerized adaptive testing; paper and pencil; computer-based; criterion and construct validity; efficiency. Subject RIV: AN - Psychology. Impact factor: 0.215, year: 2012

  20. The EORTC computer-adaptive tests measuring physical functioning and fatigue exhibited high levels of measurement precision and efficiency

    DEFF Research Database (Denmark)

    Petersen, Morten Aa; Aaronson, Neil K; Arraras, Juan I

    2013-01-01

    The European Organisation for Research and Treatment of Cancer (EORTC) Quality of Life Group is developing a computer-adaptive test (CAT) version of the EORTC Quality of Life Questionnaire (QLQ-C30). We evaluated the measurement properties of the CAT versions of physical functioning (PF...

  1. The EORTC computer-adaptive tests measuring physical functioning and fatigue exhibited high levels of measurement precision and efficiency

    NARCIS (Netherlands)

    Petersen, M.A.; Aaronson, N.K.; Arraras, J.I.; Chie, W.C.; Conroy, T.; Constantini, A.; Giesinger, J.M.; Holzner, B.; King, M.T.; Singer, S.; Velikova, G.; Verdonck-de Leeuw, I.M.; Young, T.; Groenvold, M.

    2013-01-01

    Objectives The European Organisation for Research and Treatment of Cancer (EORTC) Quality of Life Group is developing a computer-adaptive test (CAT) version of the EORTC Quality of Life Questionnaire (QLQ-C30). We evaluated the measurement properties of the CAT versions of physical functioning (PF)

  2. The EORTC computer-adaptive tests measuring physical functioning and fatigue exhibited high levels of measurement precision and efficiency

    NARCIS (Netherlands)

    Petersen, M.A.; Aaronson, N.K.; Arraras, J.I.; Chie, W.C.; Conroy, T.; Costantini, A.; Giesinger, J.M.; Holzner, B.; King, M.T.; Singer, S.; Velikova, G.; de Leeuw, I.M.; Young, T.; Groenvold, M.

    2013-01-01

    Objectives: The European Organisation for Research and Treatment of Cancer (EORTC) Quality of Life Group is developing a computer-adaptive test (CAT) version of the EORTC Quality of Life Questionnaire (QLQ-C30). We evaluated the measurement properties of the CAT versions of physical functioning (PF)

  3. Study maps as a tool for the adaptive tests construction

    Directory of Open Access Journals (Sweden)

    Dita Dlabolová

    2013-01-01

    Measurement of students’ knowledge is an essential part of the educational process. University teachers often use computer-based tests to test large numbers of students in a short time. The question is what kind of information these tests provide, and whether it is possible to classify students on this basis. Practice shows that scalar test results in the form of a single number cannot be plainly interpreted as a level of knowledge; moreover, it is not easy to build tests that capture the necessary information. In the first part of the article we present the results of a pedagogical experiment focused on the difference between the information obtained through a computer-based test and a teacher’s interview with the same students. A possible starting point for improving the information from computer-based tests, in non-scalar form, is the construction of an adaptive test that adapts test items to identify knowledge in a way similar to a conversation with a teacher. As a tool for the design of such adaptive tests we use so-called study maps, which are described in the second part of the article.

  4. Development of a Postacute Hospital Item Bank for the New Pediatric Evaluation of Disability Inventory-Computer Adaptive Test

    Science.gov (United States)

    Dumas, Helene M.

    2010-01-01

    The PEDI-CAT is a new computer adaptive test (CAT) version of the Pediatric Evaluation of Disability Inventory (PEDI). Additional PEDI-CAT items specific to postacute pediatric hospital care were recently developed using expert reviews and cognitive interviewing techniques. Expert reviews established face and construct validity, providing positive…

  5. Computers in Language Testing: Present Research and Some Future Directions.

    Science.gov (United States)

    Brown, James Dean

    1997-01-01

    Explores recent developments in the use of computers in language testing in four areas: (1) item banking; (2) computer-assisted language testing; (3) computerized-adaptive language testing; and (4) research on the effectiveness of computers in language testing. Examines educational measurement literature in an attempt to forecast the directions…

  6. Adaptation of HAMMER computer code to CYBER 170/750 computer

    International Nuclear Information System (INIS)

    Pinheiro, A.M.B.S.; Nair, R.P.K.

    1982-01-01

    The adaptation of the HAMMER computer code to the CYBER 170/750 computer is presented. The HAMMER code calculates cell parameters by multigroup transport theory and reactor parameters by few-group diffusion theory. The auxiliary programs, the modifications carried out, and the use of the HAMMER system as adapted to the CYBER 170/750 computer are described. (M.C.K.)

  7. Bayesian item selection criteria for adaptive testing

    NARCIS (Netherlands)

    van der Linden, Willem J.

    1996-01-01

    R.J. Owen (1975) proposed an approximate empirical Bayes procedure for item selection in adaptive testing. The procedure replaces the true posterior by a normal approximation with closed-form expressions for its first two moments. This approximation was necessary to minimize the computational

  8. Evaluating the Appropriateness of a New Computer-Administered Measure of Adaptive Function for Children and Youth with Autism Spectrum Disorders

    Science.gov (United States)

    Coster, Wendy J.; Kramer, Jessica M.; Tian, Feng; Dooley, Meghan; Liljenquist, Kendra; Kao, Ying-Chia; Ni, Pengsheng

    2016-01-01

    The Pediatric Evaluation of Disability Inventory-Computer Adaptive Test is an alternative method for describing the adaptive function of children and youth with disabilities using a computer-administered assessment. This study evaluated the performance of the Pediatric Evaluation of Disability Inventory-Computer Adaptive Test with a national…

  9. Adaptability of supercomputers to nuclear computations

    International Nuclear Information System (INIS)

    Asai, Kiyoshi; Ishiguro, Misako; Matsuura, Toshihiko.

    1983-01-01

    Recently, in the field of scientific and technical calculation, the usefulness of supercomputers represented by the CRAY-1 has been recognized, and they are utilized in various countries. The rapid computation of supercomputers is based on vector processing. The authors investigated the adaptability to vector computation of about 40 typical atomic energy codes over the past six years. Based on the results of this investigation, the adaptability of atomic energy codes to the vector computation capabilities of supercomputers, problems regarding their utilization, and future prospects are explained. The adaptability of individual calculation codes to vector computation depends largely on the algorithms and program structures used in the codes. The speedup achieved by pipeline vector systems, the investigation at the Japan Atomic Energy Research Institute and its results, and examples of vectorizing codes for atomic energy, environmental safety, and nuclear fusion are reported. The speedup factors for the 40 examples ranged from 1.5 to 9.0. It can be said that the adaptability of supercomputers to atomic energy codes is fairly good. (Kako, I.)

  10. Hybrid GPU-CPU adaptive precision ray-triangle intersection tests for robust high-performance GPU dosimetry computations

    International Nuclear Information System (INIS)

    Perrotte, Lancelot; Bodin, Bruno; Chodorge, Laurent

    2011-01-01

    Before an intervention on a nuclear site, it is essential to study different scenarios to identify the least dangerous one for the operator. It is therefore mandatory to have an efficient dosimetry simulation code that produces accurate results. One classical method in radiation protection is the straight-line attenuation method with build-up factors. In the case of 3D industrial scenes composed of meshes, the computational cost lies in quickly computing all of the intersections between the rays and the triangles of the scene. Efficient GPU algorithms have already been proposed that enable dosimetry calculation for a huge scene (800,000 rays, 800,000 triangles) in a fraction of a second. But these algorithms are not robust: because of the rounding caused by floating-point arithmetic, the numerical results of the ray-triangle intersection tests can differ from the expected mathematical results. In the worst case, this can lead to a computed dose rate dramatically lower than the real dose rate to which the operator is exposed. In this paper, we present a hybrid GPU-CPU algorithm to manage adaptive-precision floating-point arithmetic. This algorithm allows robust ray-triangle intersection tests, with a very small loss of performance (less than 5% overhead), and without any need for scene-dependent tuning. (author)
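
    The robustness issue described above is easiest to see in the standard Möller-Trumbore ray-triangle test, sketched below in Python. The epsilon threshold on the determinant is exactly the kind of floating-point tolerance that can misclassify near-degenerate hits, which is what the paper's adaptive-precision hybrid is designed to avoid. This is a generic textbook version, not the authors' GPU-CPU algorithm.

        EPS = 1e-9   # determinant tolerance: the source of the robustness problem

        def sub(a, b):
            return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

        def cross(a, b):
            return (a[1]*b[2] - a[2]*b[1],
                    a[2]*b[0] - a[0]*b[2],
                    a[0]*b[1] - a[1]*b[0])

        def dot(a, b):
            return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

        def ray_hits_triangle(orig, d, v0, v1, v2):
            # Moller-Trumbore test; returns the ray parameter t, or None on a miss
            e1, e2 = sub(v1, v0), sub(v2, v0)
            pvec = cross(d, e2)
            det = dot(e1, pvec)
            if abs(det) < EPS:      # ray (nearly) parallel to the triangle plane;
                return None         # near-degenerate cases may be misclassified
            inv = 1.0 / det
            tvec = sub(orig, v0)
            u = dot(tvec, pvec) * inv
            if u < 0.0 or u > 1.0:
                return None
            qvec = cross(tvec, e1)
            v = dot(d, qvec) * inv
            if v < 0.0 or u + v > 1.0:
                return None
            t = dot(e2, qvec) * inv
            return t if t > EPS else None

        print(ray_hits_triangle((0, 0, -1), (0, 0, 1),
                                (-1, -1, 0), (1, -1, 0), (0, 1, 0)))   # -> 1.0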

  11. An Adaptive Test Sheet Generation Mechanism Using Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Huan-Yu Lin

    2012-01-01

    For test-sheet composition systems, it is important to adaptively compose test sheets with diverse conceptual scopes, discrimination, and difficulty degrees to meet various assessment requirements in real learning situations. Computation time and item exposure rate also influence performance and item bank security. Therefore, this study proposes an Adaptive Test Sheet Generation (ATSG) mechanism, in which a Candidate Item Selection Strategy adaptively determines candidate test items and conceptual granularities according to the desired conceptual scopes, and an Aggregate Objective Function applies a Genetic Algorithm (GA) to find an approximate solution of the mixed integer programming problem for test-sheet composition. Experimental results show that the ATSG mechanism can generate test sheets that meet various assessment requirements more efficiently and precisely than existing approaches. Furthermore, according to the experimental findings, a Fractal Time Series approach could be applied in the near future to analyze the self-similarity characteristics of the GA’s fitness scores for improving the quality of test-sheet composition.
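
    A toy version of the GA step in such a test-sheet composer might look like the Python sketch below: a binary chromosome marks which bank items appear on the sheet, and the fitness penalizes deviation from a target length and mean difficulty. The operators and weights are invented for illustration and are far simpler than the ATSG aggregate objective function.

        import random

        random.seed(1)
        BANK = [random.random() for _ in range(40)]   # item difficulties in [0, 1]
        TARGET_LEN, TARGET_DIFF = 10, 0.55

        def fitness(chrom):
            chosen = [BANK[i] for i, bit in enumerate(chrom) if bit]
            if not chosen:
                return float("-inf")
            mean_diff = sum(chosen) / len(chosen)
            # penalize deviation from the desired length and mean difficulty
            return -abs(len(chosen) - TARGET_LEN) - 10 * abs(mean_diff - TARGET_DIFF)

        def evolve(pop_size=30, gens=100, p_mut=0.02):
            pop = [[random.randint(0, 1) for _ in BANK] for _ in range(pop_size)]
            for _ in range(gens):
                pop.sort(key=fitness, reverse=True)
                parents = pop[:pop_size // 2]             # truncation selection
                children = []
                while len(parents) + len(children) < pop_size:
                    a, b = random.sample(parents, 2)
                    cut = random.randrange(1, len(BANK))  # one-point crossover
                    child = [bit ^ (random.random() < p_mut)  # bit-flip mutation
                             for bit in a[:cut] + b[cut:]]
                    children.append(child)
                pop = parents + children
            return max(pop, key=fitness)

        best = evolve()
        print([i for i, bit in enumerate(best) if bit], round(fitness(best), 3))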

  12. Towards psychologically adaptive brain-computer interfaces

    Science.gov (United States)

    Myrden, A.; Chau, T.

    2016-12-01

    Objective. Brain-computer interface (BCI) performance is sensitive to short-term changes in psychological states such as fatigue, frustration, and attention. This paper explores the design of a BCI that can adapt to these short-term changes. Approach. Eleven able-bodied individuals participated in a study during which they used a mental task-based EEG-BCI to play a simple maze navigation game while self-reporting their perceived levels of fatigue, frustration, and attention. In an offline analysis, a regression algorithm was trained to predict changes in these states, yielding Pearson correlation coefficients in excess of 0.45 between the self-reported and predicted states. Two means of fusing the resultant mental state predictions with mental task classification were investigated. First, single-trial mental state predictions were used to predict correct classification by the BCI during each trial. Second, an adaptive BCI was designed that retrained a new classifier for each testing sample using only those training samples for which predicted mental state was similar to that predicted for the current testing sample. Main results. Mental state-based prediction of BCI reliability exceeded chance levels. The adaptive BCI exhibited significant, but practically modest, increases in classification accuracy for five of 11 participants and no significant difference for the remaining six despite a smaller average training set size. Significance. Collectively, these findings indicate that adaptation to psychological state may allow the design of more accurate BCIs.

  13. Comparing computer adaptive and curriculum-based measures of math in progress monitoring.

    Science.gov (United States)

    Shapiro, Edward S; Dennis, Minyi Shih; Fu, Qiong

    2015-12-01

    The purpose of the study was to compare the use of a Computer Adaptive Test and Curriculum-Based Measurement in the assessment of mathematics. This study also investigated the degree to which slope or rate of change predicted student outcomes on the annual state assessment of mathematics above and beyond scores of single point screening assessments (i.e., the computer adaptive test or the CBM assessment just before the administration of the state assessment). Repeated measurement of mathematics once per month across a 7-month period using a Computer Adaptive Test (STAR-Math) and Curriculum-Based Measurement (CBM, AIMSweb Math Computation, AIMSweb Math Concepts/Applications) was collected for a maximum total of 250 third, fourth, and fifth grade students. Results showed STAR-Math in all 3 grades and AIMSweb Math Concepts/Applications in the third and fifth grades had primarily linear growth patterns in mathematics. AIMSweb Math Computation in all grades and AIMSweb Math Concepts/Applications in Grade 4 had decelerating positive trends. Predictive validity evidence showed the strongest relationships were between STAR-Math and outcomes for third and fourth grade students. The blockwise multiple regression by grade revealed that slopes accounted for only a very small proportion of additional variance above and beyond what was explained by the scores obtained on a single point of assessment just prior to the administration of the state assessment.

  14. DEFACTO: A Design Environment for Adaptive Computing Technology

    National Research Council Canada - National Science Library

    Hall, Mary

    2003-01-01

    This report describes the activities of the DEFACTO project, a Design Environment for Adaptive Computing Technology funded under the DARPA Adaptive Computing Systems and Just-In-Time-Hardware programs...

  15. On the issue of item selection in computerized adaptive testing with response times

    NARCIS (Netherlands)

    Veldkamp, Bernard P.

    2016-01-01

    Many standardized tests are now administered via computer rather than paper-and-pencil format. The computer-based delivery mode brings with it certain advantages. One advantage is the ability to adapt the difficulty level of the test to the ability level of the test taker in what has been termed

  16. Extended shadow test approach for constrained adaptive testing

    NARCIS (Netherlands)

    Veldkamp, Bernard P.; Ariel, A.

    2002-01-01

    Several methods have been developed for use on constrained adaptive testing. Item pool partitioning, multistage testing, and testlet-based adaptive testing are methods that perform well for specific cases of adaptive testing. The weighted deviation model and the Shadow Test approach can be more

  17. Development of an item bank for the EORTC Role Functioning Computer Adaptive Test (EORTC RF-CAT)

    DEFF Research Database (Denmark)

    Gamper, Eva-Maria; Petersen, Morten Aa.; Aaronson, Neil

    2016-01-01

    …a computer-adaptive test (CAT) for RF. This was part of a larger project whose objective is to develop a CAT version of the EORTC QLQ-C30, which is one of the most widely used HRQOL instruments in oncology. METHODS: In accordance with EORTC guidelines, the development of the RF-CAT comprised four phases… with good psychometric properties. The resulting item bank exhibits excellent reliability (mean reliability = 0.85, median = 0.95). Using the RF-CAT may allow sample size savings from 11% up to 50% compared to using the QLQ-C30 RF scale. CONCLUSIONS: The RF-CAT item bank improves the precision…

  18. Adaptive test

    DEFF Research Database (Denmark)

    Kjeldsen, Lars Peter; Eriksen, Mette Rose

    2010-01-01

    The article is an evaluation of the adaptive tests that were introduced in the Danish public school system (folkeskolen). It focuses in particular on assessment in the public school, and contributes guidance on evaluation, evaluation tools, and subject-specific assessment materials.

  19. Statistical Indexes for Monitoring Item Behavior under Computer Adaptive Testing Environment.

    Science.gov (United States)

    Zhu, Renbang; Yu, Feng; Liu, Su

    A computerized adaptive test (CAT) administration usually requires a large supply of items with accurately estimated psychometric properties, such as item response theory (IRT) parameter estimates, to ensure the precision of examinee ability estimation. However, an estimated IRT model of a given item in any given pool does not always correctly…

  20. Redefining diagnostic symptoms of depression using Rasch analysis: testing an item bank suitable for DSM-V and computer adaptive testing.

    Science.gov (United States)

    Mitchell, Alex J; Smith, Adam B; Al-salihy, Zerak; Rahim, Twana A; Mahmud, Mahmud Q; Muhyaldin, Asma S

    2011-10-01

    We aimed to redefine the optimal self-report symptoms of depression suitable for creation of an item bank that could be used in computer adaptive testing or to develop a simplified screening tool for DSM-V. Four hundred subjects (200 patients with primary depression and 200 non-depressed subjects) living in Iraqi Kurdistan were interviewed. The Mini International Neuropsychiatric Interview (MINI) was used to define the presence of major depression (DSM-IV criteria). We examined symptoms of depression using four well-known scales delivered in Kurdish. The Partial Credit Model was applied to each instrument. Common-item equating was subsequently used to create an item bank, and differential item functioning (DIF) was explored for known subgroups. A symptom-level Rasch analysis reduced the original 45 items to 24 items after the exclusion of 21 misfitting items. A further six items (CESD13 and CESD17, HADS-D4, HADS-D5 and HADS-D7, and CDSS3 and CDSS4) were removed due to misfit as the items were added together to form the item bank, and two items were subsequently removed following the DIF analysis by diagnosis (CESD20 and CDSS9, both of which were harder to endorse for women). The remaining optimal item bank therefore consisted of 17 items and produced an area under the curve (AUC) of 0.987. Using a bank restricted to the optimal nine items revealed only minor loss of accuracy (AUC = 0.989, sensitivity 96%, specificity 95%). Finally, when restricted to only four items, accuracy remained high (AUC = 0.976; sensitivity 93%, specificity 96%). An item bank of 17 items may be useful in computer adaptive testing, and nine or even four items may be used to develop a simplified screening tool for DSM-V major depressive disorder (MDD). Further examination of this item bank should be conducted in different cultural settings.

  1. Adapting the Freiburg monosyllabic word test for Slovenian

    Directory of Open Access Journals (Sweden)

    Tatjana Marvin

    2017-12-01

    Speech audiometry is one of the standard methods used to diagnose the type of hearing loss and to assess the communication function of the patient, by determining the patient’s ability to understand and repeat words presented in a hearing test. For this purpose, Slovenian adaptations of the German tests developed by Hahlbrock (1953, 1960) – the Freiburg Monosyllabic Word Test and the Freiburg Number Test – are used in Slovenia (adapted in 1968 by Pompe). In this paper we focus on the Freiburg Monosyllabic Word Test for Slovenian, which has been criticized by patients as well as in the literature for the unequal difficulty and frequency of its words, many of which are extremely rare or even obsolete. Because part of the patient’s communication function is retrieving the meaning of individual words by guessing, the less frequent and consequently less familiar words do not contribute to reliable testing results. We therefore adapt the test by identifying and removing such words and replacing them with phonetically similar words, preserving the phonetic balance of the list. The replacement words are extracted from the written corpus of Slovenian, Gigafida, and the spoken corpus of Slovenian, GOS, while the optimal combinations of words are established using computational algorithms.

  2. Adaptively detecting changes in Autonomic Grid Computing

    KAUST Repository

    Zhang, Xiangliang

    2010-10-01

    Detecting changes is a common issue in many application fields due to the non-stationary distribution of the application data, e.g., sensor network signals, web logs, and grid-running logs. Toward autonomic grid computing, adaptively detecting changes in a grid system can help to flag anomalies, clean noise, and report new patterns. In this paper, we propose an approach for self-adaptive change detection based on the Page-Hinkley statistical test. It handles non-stationary distributions without assumptions about the data distribution and without empirical parameter settings. We validate the approach on EGEE streaming jobs, and report better performance in achieving higher accuracy compared to other change detection methods. This change detection process also helped to discover a device fault that was not reported in the system logs.
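
    The Page-Hinkley test underlying this approach is compact enough to sketch directly. The Python version below monitors a stream for an upward shift in its mean: it accumulates deviations from the running mean (minus a small tolerance delta) and raises an alarm when the accumulator climbs more than a threshold lambda above its historical minimum. The parameter values are illustrative; the paper's contribution is precisely the self-adaptive handling of such settings.

        class PageHinkley:
            # detect an upward shift in the mean of a data stream
            def __init__(self, delta=0.05, lam=5.0):
                self.delta, self.lam = delta, lam   # tolerance, alarm threshold
                self.n, self.mean = 0, 0.0
                self.cum, self.cum_min = 0.0, 0.0

            def update(self, x):
                self.n += 1
                self.mean += (x - self.mean) / self.n      # running mean
                self.cum += x - self.mean - self.delta     # cumulative deviation
                self.cum_min = min(self.cum_min, self.cum)
                return self.cum - self.cum_min > self.lam  # True -> change alarm

        detector = PageHinkley()
        stream = [0.0] * 50 + [3.0] * 20   # the mean jumps at t = 50
        for t, x in enumerate(stream):
            if detector.update(x):
                print("change detected at t =", t)
                break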

  3. Validation of a computer-adaptive test to evaluate generic health-related quality of life

    Directory of Open Access Journals (Sweden)

    Zardaín Pilar C

    2010-12-01

    Background: Health-Related Quality of Life (HRQoL) is a relevant variable in the evaluation of health outcomes. Questionnaires based on Classical Test Theory typically require a large number of items to evaluate HRQoL. Computer Adaptive Testing (CAT) can be used to reduce test length while maintaining and, in some cases, improving accuracy. This study aimed at validating a CAT based on Item Response Theory (IRT) for the evaluation of generic HRQoL: the CAT-Health instrument. Methods: Cross-sectional study of subjects aged over 18 attending Primary Care Centres for any reason. CAT-Health was administered along with the SF-12 Health Survey. Age, gender and a checklist of chronic conditions were also collected. CAT-Health was evaluated considering: 1) feasibility: completion time and test length; 2) content range coverage, Item Exposure Rate (IER) and test precision; and 3) construct validity: differences in the CAT-Health scores according to clinical variables and correlations between both questionnaires. Results: 396 subjects answered CAT-Health and SF-12, 67.2% females, mean age (SD) 48.6 (17.7) years. 36.9% did not report any chronic condition. Median completion time for CAT-Health was 81 seconds (IQ range = 59-118) and it increased with age (p …). Conclusions: Although domain-specific CATs exist for various areas of HRQoL, CAT-Health is one of the first IRT-based CATs designed to evaluate generic HRQoL, and it has proven feasible, valid and efficient when administered to a broad sample of individuals attending primary care settings.

  4. Quinoa - Adaptive Computational Fluid Dynamics, 0.2

    Energy Technology Data Exchange (ETDEWEB)

    2017-09-22

    Quinoa is a set of computational tools that enables research and numerical analysis in fluid dynamics. At this time it remains a test-bed to experiment with various algorithms using fully asynchronous runtime systems. Currently, Quinoa consists of the following tools: (1) Walker, a numerical integrator for systems of stochastic differential equations in time. It is a mathematical tool to analyze and design the behavior of stochastic differential equations. It allows the estimation of arbitrary coupled statistics and probability density functions and is currently used for the design of statistical moment approximations for multiple mixing materials in variable-density turbulence. (2) Inciter, an overdecomposition-aware finite element field solver for partial differential equations using 3D unstructured grids. Inciter is used to research asynchronous mesh-based algorithms and to experiment with coupling asynchronous to bulk-synchronous parallel code. Two planned new features of Inciter, compared to the previous release (LA-CC-16-015), to be implemented in 2017, are (a) a simple Navier-Stokes solver for ideal single-material compressible gases, and (b) solution-adaptive mesh refinement (AMR), which enables dynamically concentrating compute resources to regions with interesting physics. Using the NS-AMR problem we plan to explore how to scale such high-load-imbalance simulations, representative of large production multiphysics codes, to very large problems on very large computers using an asynchronous runtime system. (3) RNGTest, a test harness to subject random number generators to stringent statistical tests enabling quantitative ranking with respect to their quality and computational cost. (4) UnitTest, a unit test harness, running hundreds of tests per second, capable of testing serial, synchronous, and asynchronous functions. (5) MeshConv, a mesh file converter that can be used to convert 3D tetrahedron meshes from and to either of the following formats: Gmsh

  5. Intricacies of Feedback in Computer-based Prism Adaptation Therapy

    DEFF Research Database (Denmark)

    Wilms, Inge Linda; Rytter, Hana Malá

    Prism Adaptation Therapy (PAT) is an intervention method for the treatment of attentional disorders, such as neglect (e.g. [1,2]). The method involves repeated pointing at specified targets, with or without prism glasses, using a specifically designed wooden box. The aim of this study was to ascertain whether the PAT method can be executed with similar effect using a computer with a touch screen. 62 healthy subjects were subjected to two experimental conditions: 1) pointing at targets using the original box, 2) pointing at targets on a touch screen attached to a computer. In both conditions, the subjects performed a pre-test consisting of 30 targets without feedback, then an exposure test of 90 targets with prism glasses and feedback, and finally a post-test of 60 targets with no glasses and no feedback. Two experiments were carried out: 1) the feedback was provided by showing a cross…

  6. Simple adaptive sparse representation based classification schemes for EEG based brain-computer interface applications.

    Science.gov (United States)

    Shin, Younghak; Lee, Seungchan; Ahn, Minkyu; Cho, Hohyun; Jun, Sung Chan; Lee, Heung-No

    2015-11-01

    One of the main problems related to electroencephalogram (EEG) based brain-computer interface (BCI) systems is the non-stationarity of the underlying EEG signals. This results in the deterioration of the classification performance during experimental sessions. Therefore, adaptive classification techniques are required for EEG based BCI applications. In this paper, we propose simple adaptive sparse representation based classification (SRC) schemes. Supervised and unsupervised dictionary update techniques for new test data and a dictionary modification method by using the incoherence measure of the training data are investigated. The proposed methods are very simple and additional computation for the re-training of the classifier is not needed. The proposed adaptive SRC schemes are evaluated using two BCI experimental datasets. The proposed methods are assessed by comparing classification results with the conventional SRC and other adaptive classification methods. On the basis of the results, we find that the proposed adaptive schemes show relatively improved classification accuracy as compared to conventional methods without requiring additional computation.

  7. Human Adaptation to the Computer.

    Science.gov (United States)

    1986-09-01

    …adaptation to the computer has not developed. Instead, what has developed is a "modern disease of adaptation" called "technostress," a phrase coined by Brod. … Managers (according to Brod) have been implementing computers in ways that contribute directly to this stress [Ref. 3: p. 38]: 1. They…

  8. Web-based computer adaptive assessment of individual perceptions of job satisfaction for hospital workplace employees.

    Science.gov (United States)

    Chien, Tsair-Wei; Lai, Wen-Pin; Lu, Chih-Wei; Wang, Weng-Chung; Chen, Shih-Chung; Wang, Hsien-Yi; Su, Shih-Bin

    2011-04-17

    To develop a web-based computer adaptive testing (CAT) application for efficiently collecting data regarding workers' perceptions of job satisfaction, we examined whether a 37-item Job Content Questionnaire (JCQ-37) could evaluate the job satisfaction of individual employees as a single construct. The JCQ-37 makes data collection via CAT on the internet easy, viable and fast. A Rasch rating scale model was applied to analyze data from 300 randomly selected hospital employees who participated in job-satisfaction surveys in 2008 and 2009 via non-adaptive and computer-adaptive testing, respectively. Of the 37 items on the questionnaire, 24 fit the model fairly well. Person-separation reliability for the 2008 surveys was 0.88. Measures from both years and item-8 job satisfaction for groups were successfully evaluated through item-by-item analyses using t-tests. Workers aged 26-35 felt that job satisfaction was significantly worse in 2009 than in 2008. The Web-CAT developed in the present paper was shown to be more efficient than traditional computer-based or pen-and-paper assessments at collecting data regarding workers' perceptions of job content.

  9. Web-based computer adaptive assessment of individual perceptions of job satisfaction for hospital workplace employees

    Directory of Open Access Journals (Sweden)

    Chen Shih-Chung

    2011-04-01

    Background: To develop a web-based computer adaptive testing (CAT) application for efficiently collecting data regarding workers' perceptions of job satisfaction, we examined whether a 37-item Job Content Questionnaire (JCQ-37) could evaluate the job satisfaction of individual employees as a single construct. Methods: The JCQ-37 makes data collection via CAT on the internet easy, viable and fast. A Rasch rating scale model was applied to analyze data from 300 randomly selected hospital employees who participated in job-satisfaction surveys in 2008 and 2009 via non-adaptive and computer-adaptive testing, respectively. Results: Of the 37 items on the questionnaire, 24 fit the model fairly well. Person-separation reliability for the 2008 surveys was 0.88. Measures from both years and item-8 job satisfaction for groups were successfully evaluated through item-by-item analyses using t-tests. Workers aged 26-35 felt that job satisfaction was significantly worse in 2009 than in 2008. Conclusions: The Web-CAT developed in the present paper was shown to be more efficient than traditional computer-based or pen-and-paper assessments at collecting data regarding workers' perceptions of job content.

  10. WEB-BASED ADAPTIVE TESTING SYSTEM (WATS) FOR CLASSIFYING STUDENTS' ACADEMIC ABILITY

    Directory of Open Access Journals (Sweden)

    Jaemu Lee

    2012-08-01

    Computer Adaptive Testing (CAT) has been highlighted as a promising assessment method for fulfilling two testing purposes: estimating students' academic ability and classifying students' academic level. In this paper, we introduce the Web-based Adaptive Testing System (WATS), developed to support cost-effective assessment for classifying students’ ability into different academic levels. Instead of a traditional paper-and-pencil test, the WATS is expected to serve as an alternative method to promptly diagnose and identify underachieving students through web-based testing. The WATS can also help provide students with appropriate learning content and necessary academic support in time. In this paper, the theoretical background and structure of the WATS, the item construction process based upon item response theory, and the user interfaces of the WATS are discussed.

  11. Hybrid Computerized Adaptive Testing: From Group Sequential Design to Fully Sequential Design

    Science.gov (United States)

    Wang, Shiyu; Lin, Haiyan; Chang, Hua-Hua; Douglas, Jeff

    2016-01-01

    Computerized adaptive testing (CAT) and multistage testing (MST) have become two of the most popular modes in large-scale computer-based sequential testing. Though most designs of CAT and MST exhibit strength and weakness in recent large-scale implementations, there is no simple answer to the question of which design is better because different…

  12. Discrete linear canonical transform computation by adaptive method.

    Science.gov (United States)

    Zhang, Feng; Tao, Ran; Wang, Yue

    2013-07-29

    The linear canonical transform (LCT) describes the effect of quadratic phase systems on a wavefield and generalizes many optical transforms. In this paper, the computation method for the discrete LCT using the adaptive least-mean-square (LMS) algorithm is presented. The computation approaches of the block-based discrete LCT and the stream-based discrete LCT using the LMS algorithm are derived, and the implementation structures of these approaches by the adaptive filter system are considered. The proposed computation approaches have the inherent parallel structures which make them suitable for efficient VLSI implementations, and are robust to the propagation of possible errors in the computation process.
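
    For readers unfamiliar with the LMS building block the paper relies on, the Python sketch below shows a generic adaptive FIR filter driven by the least-mean-square update. This is the textbook algorithm applied to a toy system-identification task, not the paper's block-based or stream-based discrete LCT structures.

        import numpy as np

        def lms(x, d, n_taps=4, mu=0.02):
            # adapt FIR weights w so that y[k] = w . [x[k], x[k-1], ...] tracks d[k]
            w = np.zeros(n_taps)
            y = np.zeros(len(x))
            for k in range(n_taps - 1, len(x)):
                window = x[k - n_taps + 1:k + 1][::-1]   # newest sample first
                y[k] = w @ window
                err = d[k] - y[k]                        # instantaneous error
                w += 2 * mu * err * window               # LMS gradient step
            return y, w

        # identify an unknown 4-tap FIR system from noisy input/output data
        rng = np.random.default_rng(0)
        h = np.array([0.5, -0.3, 0.2, 0.1])
        x = rng.standard_normal(5000)
        d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))
        _, w = lms(x, d)
        print(np.round(w, 3))   # should approximate h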

  13. Wavefront measurement using computational adaptive optics.

    Science.gov (United States)

    South, Fredrick A; Liu, Yuan-Zhi; Bower, Andrew J; Xu, Yang; Carney, P Scott; Boppart, Stephen A

    2018-03-01

    In many optical imaging applications, it is necessary to correct for aberrations to obtain high quality images. Optical coherence tomography (OCT) provides access to the amplitude and phase of the backscattered optical field for three-dimensional (3D) imaging samples. Computational adaptive optics (CAO) modifies the phase of the OCT data in the spatial frequency domain to correct optical aberrations without using a deformable mirror, as is commonly done in hardware-based adaptive optics (AO). This provides improvement of image quality throughout the 3D volume, enabling imaging across greater depth ranges and in highly aberrated samples. However, the CAO aberration correction has a complicated relation to the imaging pupil and is not a direct measurement of the pupil aberrations. Here we present new methods for recovering the wavefront aberrations directly from the OCT data without the use of hardware adaptive optics. This enables both computational measurement and correction of optical aberrations.

  14. Unauthorised adaptation of computer programmes - is criminalisation a solution?

    Directory of Open Access Journals (Sweden)

    L Muswaka

    2011-12-01

    In Haupt t/a Softcopy v Brewers Marketing Intelligence (Pty) Ltd 2006 4 SA 458 (SCA), Haupt sought to enforce a copyright claim in the Data Explorer computer programme against Brewers Marketing Intelligence (Pty) Ltd. His claim was dismissed in the High Court and he appealed to the Supreme Court of Appeal. The Court held that copyright in the Data Explorer programme vested in Haupt. Haupt acquired copyright in the Data Explorer programme regardless of the fact that the programme was the result of an unauthorised adaptation of the Project AMPS programme, which belonged to Brewers Marketing Intelligence (Pty) Ltd. This case note inter alia analyses the possibility of an author being sued for infringement even though he has acquired copyright in a work that he created by making unauthorised adaptations to another's copyright material. Furthermore, it examines whether or not the law adequately protects copyright owners in situations where infringement takes the form of unauthorised adaptations of computer programmes. It is argued that the protection afforded by the Copyright Act 98 of 1978 (Copyright Act) in terms of section 27(1) to copyright owners of computer programmes is narrowly defined: it excludes from its ambit of criminal liability the act of making an unauthorised adaptation of a computer programme. The issue considered is therefore whether or not the unauthorised adaptation of computer programmes should attract a criminal sanction. In addressing this issue, and with the aim of making recommendations, the legal position in the United Kingdom (UK) is analysed. From the analysis it is recommended that the Copyright Act be amended by the insertion of a new section, section 27(1)(A), which would make the act of making an unauthorised adaptation of a computer programme an offence. This recommended section would close the gap that currently exists in our law with regard to unauthorised adaptations of computer programmes.

  15. Adaptation and hybridization in computational intelligence

    CERN Document Server

    Fister, Iztok, Jr.

    2015-01-01

    This carefully edited book surveys recent advances in adaptation and hybridization in the Computational Intelligence (CI) domain. It consists of ten chapters that are divided into three parts. The first part provides background information and some theoretical foundations of the CI domain; the second part deals with adaptation in CI algorithms; and the third part focuses on hybridization in CI. This book can serve as a reference for researchers and students of computer science, electrical and civil engineering, economics, and the natural sciences who are confronted with solving optimization, modeling and simulation problems. It covers recent advances in CI that encompass nature-inspired algorithms, such as artificial neural networks, evolutionary algorithms and swarm intelligence-based algorithms.

  16. Modifications of the branch-and-bound algorithm for application in constrained adaptive testing

    NARCIS (Netherlands)

    Veldkamp, Bernard P.

    2000-01-01

    A mathematical programming approach is presented for computer adaptive testing (CAT) with many constraints on the item and test attributes. Because mathematical programming problems have to be solved while the examinee waits for the next item, a fast implementation of the Branch-and-Bound algorithm

  17. Synthetic Computation: Chaos Computing, Logical Stochastic Resonance, and Adaptive Computing

    Science.gov (United States)

    Kia, Behnam; Murali, K.; Jahed Motlagh, Mohammad-Reza; Sinha, Sudeshna; Ditto, William L.

    Nonlinearity and chaos can exhibit numerous behaviors and patterns, and one can select different patterns from this rich library. In this paper we focus on synthetic computing, a field that engineers and synthesizes nonlinear systems to obtain computation. We explain the importance of nonlinearity and describe how nonlinear systems can be engineered to perform computation. More specifically, we provide an overview of chaos computing, a field that manually programs chaotic systems to build different types of digital functions. We also briefly describe logical stochastic resonance (LSR), and then extend the approach of LSR to realize combinational digital logic systems via suitable concatenation of existing LSR blocks. Finally, we demonstrate how a chaotic system can be engineered and mated with different machine learning techniques, such as artificial neural networks, random search, and genetic algorithms, to design autonomous systems that can adapt and respond to environmental conditions.

  18. An authoring tool for building both mobile adaptable tests and web-based adaptive or classic tests

    NARCIS (Netherlands)

    Romero, C.; Ventura, S.; Hervás, C.; De Bra, P.M.E.; Wade, V.; Ashman, H.; Smyth, B.

    2006-01-01

    This paper describes Test Editor, an authoring tool for building both mobile adaptable tests and web-based adaptive or classic tests. This tool facilitates the development and maintenance of different types of XML-based multiple-choice tests for use in web-based education systems and wireless devices.

  19. Numerical thermal mathematical model correlation to thermal balance test using adaptive particle swarm optimization (APSO)

    International Nuclear Information System (INIS)

    Beck, T.; Bieler, A.; Thomas, N.

    2012-01-01

    We present structural and thermal model (STM) tests of the BepiColombo laser altimeter (BELA) receiver baffle with emphasis on the correlation of the data with a thermal mathematical model. The test unit is a part of the thermal and optical protection of the BELA instrument being tested under infrared and solar irradiation at the University of Bern. An iterative optimization method known as particle swarm optimization has been adapted to adjust the model parameters, mainly the linear conductivity, in such a way that model and test results match. The thermal model reproduces the thermal tests to an accuracy of 4.2 °C ± 3.2 °C in a temperature range of 200 °C after using only 600 iteration steps of the correlation algorithm. The use of this method brings major benefits to the accuracy of the results as well as to the computational time required for the correlation.
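    As a rough sketch of the correlation step described above (not the BELA model itself), the following fits the conductances of a toy two-node thermal network to noisy "measured" temperatures with a basic particle swarm; all constants, bounds, and swarm coefficients are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def model_temps(k):
    """Toy steady-state two-node model: conductances k[0], k[1] couple two
    nodes to a 300 K sink with fixed heat loads; returns node temperatures.
    Purely illustrative, not the BELA thermal network."""
    q = np.array([5.0, 2.0])          # heat loads, W
    return 300.0 + q / k              # T = T_sink + Q / G

measured = model_temps(np.array([0.8, 0.25])) + rng.normal(0, 0.2, 2)

def cost(k):
    return np.sum((model_temps(k) - measured) ** 2)

# Basic PSO over the 2-D parameter space (bounds and coefficients illustrative).
n, iters = 20, 100
lo, hi = 0.05, 5.0
pos = rng.uniform(lo, hi, (n, 2))
vel = np.zeros((n, 2))
pbest = pos.copy()
pbest_val = np.array([cost(p) for p in pos])
g = pbest[np.argmin(pbest_val)].copy()

for _ in range(iters):
    r1, r2 = rng.random((n, 2)), rng.random((n, 2))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (g - pos)
    pos = np.clip(pos + vel, lo, hi)
    vals = np.array([cost(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    g = pbest[np.argmin(pbest_val)].copy()

print("fitted conductances:", g, "residual cost:", cost(g))
```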

  20. On-Line Testing and Reconfiguration of Field Programmable Gate Arrays (FPGAs) for Fault-Tolerant (FT) Applications in Adaptive Computing Systems (ACS)

    National Research Council Canada - National Science Library

    Abramovici, Miron

    2002-01-01

    Adaptive computing systems (ACS) rely on reconfigurable hardware to adapt the system operation to changes in the external environment, and to extend mission capability by implementing new functions on the same hardware platform...

  1. Features of the adaptive control and measuring the effectiveness of distant teaching to computer science

    Directory of Open Access Journals (Sweden)

    Евгений Игоревич Горюшкин

    2009-06-01

    Full Text Available The article describes approaches to building effective systems for monitoring the productivity of computer science instruction in higher education. It proposes basing such systems on adaptive testing, in which artificial neural networks are applied in the development of the tests.

  2. Building fast, reliable, and adaptive software for computational science

    International Nuclear Information System (INIS)

    Rendell, A P; Antony, J; Armstrong, W; Janes, P; Yang, R

    2008-01-01

    Building fast, reliable, and adaptive software is a constant challenge for computational science, especially given recent developments in computer architecture. This paper outlines some of our efforts to address these three issues in the context of computational chemistry. First, a simple linear performance model that can be used to model and predict the performance of Hartree-Fock calculations is discussed. Second, the use of interval arithmetic to assess the numerical reliability of the sort of integrals used in electronic structure methods is presented. Third, use of dynamic code modification as part of a framework to support adaptive software is outlined.

  3. Efficient computation of smoothing splines via adaptive basis sampling

    KAUST Repository

    Ma, Ping

    2015-06-24

    Smoothing splines provide flexible nonparametric regression estimators. However, the high computational cost of smoothing splines for large datasets has hindered their wide application. In this article, we develop a new method, named adaptive basis sampling, for efficient computation of smoothing splines in super-large samples. Except for the univariate case where the Reinsch algorithm is applicable, a smoothing spline for a regression problem with sample size n can be expressed as a linear combination of n basis functions and its computational complexity is generally O(n³). We achieve a more scalable computation in the multivariate case by evaluating the smoothing spline using a smaller set of basis functions, obtained by an adaptive sampling scheme that uses values of the response variable. Our asymptotic analysis shows that smoothing splines computed via adaptive basis sampling converge to the true function at the same rate as full basis smoothing splines. Using simulation studies and a large-scale deep earth core-mantle boundary imaging study, we show that the proposed method outperforms a sampling method that does not use the values of response variables.
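    A heavily simplified sketch of the idea, under stated assumptions: pick a reduced set of basis centres by stratifying on the response values (the flavour of adaptive basis sampling), then fit by penalized least squares. The paper's estimator uses actual spline basis functions and a roughness penalty; here a Gaussian RBF basis and a ridge penalty stand in.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data: a wiggly function observed with noise.
n = 5000
x = np.sort(rng.uniform(0, 1, n))
truth = np.sin(8 * np.pi * x) * np.exp(-2 * x)
y = truth + rng.normal(0, 0.1, n)

# Adaptive basis sampling (simplified): stratify observations by response
# value and draw basis centres from every stratum, so regions where the
# response varies are represented; the paper's scheme is more elaborate.
n_basis, n_strata = 40, 8
strata = np.array_split(np.argsort(y), n_strata)
centres = np.sort(np.concatenate(
    [rng.choice(x[s], n_basis // n_strata, replace=False) for s in strata]))

# Gaussian RBF design matrix on the reduced basis: O(n * n_basis) to build,
# and the solve below is O(n_basis^3) instead of O(n^3).
h = 0.05
B = np.exp(-((x[:, None] - centres[None, :]) / h) ** 2)

# Ridge penalty as a simple stand-in for the spline roughness penalty.
lam = 1e-3
coef = np.linalg.solve(B.T @ B + lam * np.eye(n_basis), B.T @ y)
fitted = B @ coef
print("RMSE vs truth:", np.sqrt(np.mean((fitted - truth) ** 2)))
```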

  4. Efficient computation of smoothing splines via adaptive basis sampling

    KAUST Repository

    Ma, Ping; Huang, Jianhua Z.; Zhang, Nan

    2015-01-01

    Smoothing splines provide flexible nonparametric regression estimators. However, the high computational cost of smoothing splines for large datasets has hindered their wide application. In this article, we develop a new method, named adaptive basis sampling, for efficient computation of smoothing splines in super-large samples. Except for the univariate case where the Reinsch algorithm is applicable, a smoothing spline for a regression problem with sample size n can be expressed as a linear combination of n basis functions and its computational complexity is generally O(n³). We achieve a more scalable computation in the multivariate case by evaluating the smoothing spline using a smaller set of basis functions, obtained by an adaptive sampling scheme that uses values of the response variable. Our asymptotic analysis shows that smoothing splines computed via adaptive basis sampling converge to the true function at the same rate as full basis smoothing splines. Using simulation studies and a large-scale deep earth core-mantle boundary imaging study, we show that the proposed method outperforms a sampling method that does not use the values of response variables.

  5. Overview of adaptive finite element analysis in computational geodynamics

    Science.gov (United States)

    May, D. A.; Schellart, W. P.; Moresi, L.

    2013-10-01

    The use of numerical models to develop insight and intuition into the dynamics of the Earth over geological time scales is a firmly established practice in the geodynamics community. As our depth of understanding grows, and hand-in-hand with improvements in analytical techniques and higher resolution remote sensing of the physical structure and state of the Earth, there is a continual need to develop more efficient, accurate and reliable numerical techniques. This is necessary to ensure that we can meet the challenge of generating robust conclusions, interpretations and predictions from improved observations. In adaptive numerical methods, the desire is generally to maximise the quality of the numerical solution for a given amount of computational effort. Neither of these terms has a unique, universal definition, but typically there is a trade off between the number of unknowns we can calculate to obtain a more accurate representation of the Earth, and the resources (time and computational memory) required to compute them. In the engineering community, this topic has been extensively examined using the adaptive finite element (AFE) method. Recently, the applicability of this technique to geodynamic processes has started to be explored. In this review we report on the current status and usage of spatially adaptive finite element analysis in the field of geodynamics. The objective of this review is to provide a brief introduction to the area of spatially adaptive finite analysis, including a summary of different techniques to define spatial adaptation and of different approaches to guide the adaptive process in order to control the discretisation error inherent within the numerical solution. An overview of the current state of the art in adaptive modelling in geodynamics is provided, together with a discussion pertaining to the issues related to using adaptive analysis techniques and perspectives for future research in this area. Additionally, we also provide a
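    The central trade-off the review describes (spend unknowns only where the error indicator demands it) can be shown in a deliberately tiny setting. The sketch below adaptively bisects the elements of a 1D mesh wherever a midpoint interpolation-error indicator exceeds a tolerance; it is an interpolation toy, not a PDE solver, and the indicator, function, and tolerance are illustrative.

```python
import numpy as np

def f(x):
    # A function with a sharp feature, standing in for a boundary layer.
    return np.tanh(50 * (x - 0.5))

def refine(nodes, tol=1e-3, max_nodes=200):
    """Adaptively bisect the elements whose midpoint interpolation error
    (a simple a-posteriori indicator) exceeds tol."""
    nodes = list(nodes)
    while len(nodes) < max_nodes:
        nodes.sort()
        a = np.array(nodes)
        mid = 0.5 * (a[:-1] + a[1:])
        # Indicator: |f(mid) - linear interpolant at mid| per element.
        err = np.abs(f(mid) - 0.5 * (f(a[:-1]) + f(a[1:])))
        bad = np.where(err > tol)[0]
        if len(bad) == 0:
            break
        nodes.extend(mid[bad])          # bisect only the flagged elements
    return np.sort(np.array(nodes))

mesh = refine([0.0, 0.25, 0.5, 0.75, 1.0])
print(len(mesh), "nodes; smallest element:", np.min(np.diff(mesh)))
```

    The mesh ends up densely packed around the steep region and coarse elsewhere, which is exactly the "accuracy per unknown" economy the review discusses.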

  6. Scalable space-time adaptive simulation tools for computational electrocardiology

    OpenAIRE

    Krause, Dorian; Krause, Rolf

    2013-01-01

    This work is concerned with the development of computational tools for the solution of reaction-diffusion equations from the field of computational electrocardiology. We designed lightweight spatially and space-time adaptive schemes for large-scale parallel simulations. We propose two different adaptive schemes based on locally structured meshes, managed either via a conforming coarse tessellation or a forest of shallow trees. A crucial ingredient of our approach is a non-conforming mortar ...

  7. Parallel Adaptive Mesh Refinement for High-Order Finite-Volume Schemes in Computational Fluid Dynamics

    Science.gov (United States)

    Schwing, Alan Michael

    For computational fluid dynamics, the governing equations are solved on a discretized domain of nodes, faces, and cells. The quality of the grid or mesh can be a driving source for error in the results. While refinement studies can help guide the creation of a mesh, grid quality is largely determined by user expertise and understanding of the flow physics. Adaptive mesh refinement is a technique for enriching the mesh during a simulation based on metrics for error, impact on important parameters, or location of important flow features. This can offload from the user some of the difficult and ambiguous decisions necessary when discretizing the domain. This work explores the implementation of adaptive mesh refinement in an implicit, unstructured, finite-volume solver. Consideration is made for applying modern computational techniques in the presence of hanging nodes and refined cells. The approach is developed to be independent of the flow solver in order to provide a path for augmenting existing codes. It is designed to be applicable for unsteady simulations, and refinement and coarsening of the grid does not impact the conservatism of the underlying numerics. The effect on high-order numerical fluxes of fourth- and sixth-order is explored. Provided the criteria for refinement are appropriately selected, solutions obtained using adapted meshes have no additional error when compared to results obtained on traditional, unadapted meshes. In order to leverage large-scale computational resources common today, the methods are parallelized using MPI. Parallel performance is considered for several test problems in order to assess scalability of both adapted and unadapted grids. Dynamic repartitioning of the mesh during refinement is crucial for load balancing an evolving grid. Development of the methods outlined here depends on a dual-memory approach that is described in detail. Validation of the solver developed here against a number of motivating problems shows favorable results.

  8. Testlet-Based Multidimensional Adaptive Testing.

    Science.gov (United States)

    Frey, Andreas; Seitz, Nicki-Nils; Brandt, Steffen

    2016-01-01

    Multidimensional adaptive testing (MAT) is a highly efficient method for the simultaneous measurement of several latent traits. Currently, no psychometrically sound approach is available for the use of MAT in testlet-based tests. Testlets are sets of items sharing a common stimulus such as a graph or a text. They are frequently used in large operational testing programs like TOEFL, PISA, PIRLS, or NAEP. To make MAT accessible for such testing programs, we present a novel combination of MAT with a multidimensional generalization of the random effects testlet model (MAT-MTIRT). MAT-MTIRT compared to non-adaptive testing is examined for several combinations of testlet effect variances (0.0, 0.5, 1.0, and 1.5) and testlet sizes (3, 6, and 9 items) with a simulation study considering three ability dimensions with simple loading structure. MAT-MTIRT outperformed non-adaptive testing regarding the measurement precision of the ability estimates. Further, the measurement precision decreased when testlet effect variances and testlet sizes increased. The suggested combination of the MTIRT model therefore provides a solution to the substantial problems of testlet-based tests while keeping the length of the test within an acceptable range.

  9. Testlet-based Multidimensional Adaptive Testing

    Directory of Open Access Journals (Sweden)

    Andreas Frey

    2016-11-01

    Full Text Available Multidimensional adaptive testing (MAT) is a highly efficient method for the simultaneous measurement of several latent traits. Currently, no psychometrically sound approach is available for the use of MAT in testlet-based tests. Testlets are sets of items sharing a common stimulus such as a graph or a text. They are frequently used in large operational testing programs like TOEFL, PISA, PIRLS, or NAEP. To make MAT accessible for such testing programs, we present a novel combination of MAT with a multidimensional generalization of the random effects testlet model (MAT-MTIRT). MAT-MTIRT compared to non-adaptive testing is examined for several combinations of testlet effect variances (0.0, 0.5, 1.0, 1.5) and testlet sizes (3 items, 6 items, 9 items) with a simulation study considering three ability dimensions with simple loading structure. MAT-MTIRT outperformed non-adaptive testing regarding the measurement precision of the ability estimates. Further, the measurement precision decreased when testlet effect variances and testlet sizes increased. The suggested combination of the MTIRT model therefore provides a solution to the substantial problems of testlet-based tests while keeping the length of the test within an acceptable range.

  10. A New Adaptive Checkpointing Strategy for Mobile Computing

    Institute of Scientific and Technical Information of China (English)

    MEN Chaoguang; ZUO Decheng; YANG Xiaozong

    2005-01-01

    Adaptive checkpointing strategy is an efficient recovery scheme, which is suitable for mobile computing systems. However, existing adaptive checkpointing schemes cannot correctly recover the system when a failure occurs in certain special periods. In this paper, the issues that can lead to system inconsistency are first discussed and then a new adaptive strategy that can recover the system to a correct consistent state is proposed. Our algorithm improves system recovery performance because only the failed process needs to roll back, by means of logging.

  11. Processing Optimization of Typed Resources with Synchronized Storage and Computation Adaptation in Fog Computing

    Directory of Open Access Journals (Sweden)

    Zhengyang Song

    2018-01-01

    Full Text Available Wide application of the Internet of Things (IoT) system has been increasingly demanding more hardware facilities for processing various resources including data, information, and knowledge. With the rapid growth of generated resource quantity, it is difficult to adapt to this situation by using traditional cloud computing models. Fog computing enables storage and computing services to perform at the edge of the network to extend cloud computing. However, there are some problems such as restricted computation, limited storage, and expensive network bandwidth in Fog computing applications. It is a challenge to balance the distribution of network resources. We propose a processing optimization mechanism of typed resources with synchronized storage and computation adaptation in Fog computing. In this mechanism, we process typed resources in a wireless-network-based three-tier architecture consisting of Data Graph, Information Graph, and Knowledge Graph. The proposed mechanism aims to minimize processing cost over network, computation, and storage while maximizing the performance of processing in a business value driven manner. Simulation results show that the proposed approach improves the ratio of performance over user investment. Meanwhile, conversions between resource types deliver support for dynamically allocating network resources.

  12. Are We Measuring Teachers’ Attitudes towards Computers in Detail?: Adaptation of a Questionnaire into Turkish Culture

    Directory of Open Access Journals (Sweden)

    Nilgün Günbaş

    2017-04-01

    Full Text Available Teachers’ perceptions of computers play an important role in integrating computers into education. The related literature includes studies developing or adapting a survey instrument in Turkish culture measuring teachers’ attitudes toward computers. These instruments have three to four factors (e.g., computer importance, computer enjoyment, computer confidence) and 18 to 26 items under these factors. The purpose of the present study is to adapt a more detailed and stronger survey questionnaire measuring more dimensions related to teachers’ attitudes. The source instrument was developed by Christensen and Knezek (2009) and called Teachers’ Attitudes toward Computers (TAC). It has nine factors with 51 items. Before testing the instrument, the interaction (e-mail) factor was taken out because of cultural differences. The reliability and validity testing of the translated instrument was completed with 273 teacher candidates in a Faculty of Education in Turkey. The results showed that the translated instrument (Cronbach’s alpha: .94) included eight factors and consisted of 42 items under these factors, which were consistent with the original instrument. These factors were: Interest (α: .83), Comfort (α: .90), Accommodation (α: .87), Concern (α: .79), Utility (α: .90), Perception (α: .89), Absorption (α: .84), and Significance (α: .83). Additionally, the confirmatory factor analysis result for the model with eight factors was: RMSEA=0.050, χ²/df=1.69, RMR=0.075, SRMR=0.057, GFI=0.81, AGFI=0.78, NFI=0.94, NNFI=0.97, CFI=0.97, IFI=0.97. Accordingly, as a reliable, valid and stronger instrument, the adapted survey instrument can be suggested for use in Turkish academic studies.
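    For readers unfamiliar with the reliability coefficients reported above, this is how a per-factor Cronbach's alpha is computed; the Likert-type responses here are simulated, not the study's data.

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) matrix of scores for one factor.
    alpha = k/(k-1) * (1 - sum of item variances / variance of the total)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses of 273 respondents to a 6-item factor.
rng = np.random.default_rng(2)
trait = rng.normal(0, 1, 273)
responses = np.clip(np.round(3 + trait[:, None] + rng.normal(0, 0.8, (273, 6))), 1, 5)
print("alpha:", round(cronbach_alpha(responses), 2))
```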

  13. Adaptive screening for depression--recalibration of an item bank for the assessment of depression in persons with mental and somatic diseases and evaluation in a simulated computer-adaptive test environment.

    Science.gov (United States)

    Forkmann, Thomas; Kroehne, Ulf; Wirtz, Markus; Norra, Christine; Baumeister, Harald; Gauggel, Siegfried; Elhan, Atilla Halil; Tennant, Alan; Boecker, Maren

    2013-11-01

    This study conducted a simulation study for computer-adaptive testing based on the Aachen Depression Item Bank (ADIB), which was developed for the assessment of depression in persons with somatic diseases. Prior to computer-adaptive test simulation, the ADIB was newly calibrated. Recalibration was performed in a sample of 161 patients treated for a depressive syndrome, 103 patients from cardiology, and 103 patients from otorhinolaryngology (mean age 44.1, SD=14.0; 44.7% female) and was cross-validated in a sample of 117 patients undergoing rehabilitation for cardiac diseases (mean age 58.4, SD=10.5; 24.8% women). Unidimensionality of the item bank was checked and a Rasch analysis was performed that evaluated local dependency (LD), differential item functioning (DIF), item fit and reliability. CAT simulation was conducted with the total sample and additional simulated data. Recalibration resulted in a strictly unidimensional item bank with 36 items, showing good Rasch model fit (item fit residuals within limits) and no LD. CAT simulation revealed that 13 items on average were necessary to estimate depression in the range of -2 and +2 logits when terminating at SE≤0.32 and 4 items if using SE≤0.50. Receiver Operating Characteristics analysis showed that θ estimates based on the CAT algorithm have good criterion validity with regard to depression diagnoses (Area Under the Curve≥.78 for all cut-off criteria). The recalibration of the ADIB succeeded and the simulation studies conducted suggest that it has good screening performance in the samples investigated and that it may reasonably add to the improvement of depression assessment.
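    A generic sketch of the simulated CAT loop described above, assuming a Rasch item bank and EAP scoring on a quadrature grid, with the SE≤0.32 stopping rule; the 36 difficulties and the examinee are simulated, not the ADIB items.

```python
import numpy as np

rng = np.random.default_rng(3)
bank = rng.uniform(-2.5, 2.5, 36)       # hypothetical Rasch difficulties
grid = np.linspace(-4, 4, 121)          # quadrature grid for EAP scoring
prior = np.exp(-0.5 * grid**2)          # standard-normal prior weights

def rasch_p(theta, b):
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def eap(posterior):
    """Posterior mean and standard deviation on the grid."""
    w = posterior / posterior.sum()
    mean = np.sum(grid * w)
    return mean, np.sqrt(np.sum((grid - mean)**2 * w))

def simulate_cat(true_theta, se_stop=0.32, max_items=36):
    posterior = prior.copy()
    available = list(range(len(bank)))
    used = 0
    theta_hat, se = eap(posterior)
    while used < max_items and se > se_stop:
        # Rasch information p(1-p) is maximal for b closest to theta_hat.
        infos = [rasch_p(theta_hat, bank[i]) * (1.0 - rasch_p(theta_hat, bank[i]))
                 for i in available]
        item = available.pop(int(np.argmax(infos)))
        correct = rng.random() < rasch_p(true_theta, bank[item])
        like = rasch_p(grid, bank[item])
        posterior *= like if correct else (1.0 - like)   # Bayes update
        used += 1
        theta_hat, se = eap(posterior)
    return theta_hat, se, used

print(simulate_cat(true_theta=0.5))   # (theta estimate, SE, items used)
```

    How many items the rule consumes depends entirely on how informative the bank is around the examinee's level, which is why the record reports averages rather than a fixed test length.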

  14. An Adaptive Sensor Mining Framework for Pervasive Computing Applications

    Science.gov (United States)

    Rashidi, Parisa; Cook, Diane J.

    Analyzing sensor data in pervasive computing applications brings unique challenges to the KDD community. The challenge is heightened when the underlying data source is dynamic and the patterns change. We introduce a new adaptive mining framework that detects patterns in sensor data, and more importantly, adapts to the changes in the underlying model. In our framework, the frequent and periodic patterns of data are first discovered by the Frequent and Periodic Pattern Miner (FPPM) algorithm; and then any changes in the discovered patterns over the lifetime of the system are discovered by the Pattern Adaptation Miner (PAM) algorithm, in order to adapt to the changing environment. This framework also captures vital context information present in pervasive computing applications, such as the startup triggers and temporal information. In this paper, we present a description of our mining framework and validate the approach using data collected in the CASAS smart home testbed.

  15. The Role of Item Feedback in Self-Adapted Testing.

    Science.gov (United States)

    Roos, Linda L.; And Others

    1997-01-01

    The importance of item feedback in self-adapted testing was studied by comparing feedback and no feedback conditions for computerized adaptive tests and self-adapted tests taken by 363 college students. Results indicate that item feedback is not necessary to realize score differences between self-adapted and computerized adaptive testing. (SLD)

  16. The adaptation method in the Monte Carlo simulation for computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hyoung Gun; Yoon, Chang Yeon; Lee, Won Ho [Dept. of Bio-convergence Engineering, Korea University, Seoul (Korea, Republic of); Cho, Seung Ryong [Dept. of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Park, Sung Ho [Dept. of Neurosurgery, Ulsan University Hospital, Ulsan (Korea, Republic of)

    2015-06-15

    The patient dose incurred from diagnostic procedures during advanced radiotherapy has become an important issue. Many researchers in medical physics are using computational simulations to calculate complex parameters in experiments. However, extended computation times make it difficult for personal computers to run the conventional Monte Carlo method to simulate radiological images with high-flux photons such as images produced by computed tomography (CT). To minimize the computation time without degrading imaging quality, we applied a deterministic adaptation to the Monte Carlo calculation and verified its effectiveness by simulating CT image reconstruction for an image evaluation phantom (Catphan; Phantom Laboratory, New York NY, USA) and a human-like voxel phantom (KTMAN-2) (Los Alamos National Laboratory, Los Alamos, NM, USA). For the deterministic adaptation, the relationship between iteration numbers and the simulations was estimated and the option to simulate scattered radiation was evaluated. The processing times of simulations using the adaptive method were at least 500 times faster than those using a conventional statistical process. In addition, compared with the conventional statistical method, the adaptive method provided images that were more similar to the experimental images, which proved that the adaptive method is highly effective for simulations that require a large number of iterations; assuming no radiation scattering in the vicinity of the detectors minimized artifacts in the reconstructed image.

  17. Adaptively detecting changes in Autonomic Grid Computing

    KAUST Repository

    Zhang, Xiangliang; Germain, Cécile; Sebag, Michèle

    2010-01-01

    Detecting the changes is the common issue in many application fields due to the non-stationary distribution of the applicative data, e.g., sensor network signals, web logs and gridrunning logs. Toward Autonomic Grid Computing, adaptively detecting

  18. Sequential decision making in computational sustainability via adaptive submodularity

    Science.gov (United States)

    Krause, Andreas; Golovin, Daniel; Converse, Sarah J.

    2015-01-01

    Many problems in computational sustainability require making a sequence of decisions in complex, uncertain environments. Such problems are generally notoriously difficult. In this article, we review the recently discovered notion of adaptive submodularity, an intuitive diminishing returns condition that generalizes the classical notion of submodular set functions to sequential decision problems. Problems exhibiting the adaptive submodularity property can be efficiently and provably near-optimally solved using simple myopic policies. We illustrate this concept in several case studies of interest in computational sustainability: First, we demonstrate how it can be used to efficiently plan for resolving uncertainty in adaptive management scenarios. Secondly, we show how it applies to dynamic conservation planning for protecting endangered species, a case study carried out in collaboration with the US Geological Survey and the US Fish and Wildlife Service.
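    A minimal sketch of the myopic policy the article describes, for a toy adaptive stochastic-coverage problem: sensors cover known habitat cells but work only with some probability, and each deployment's outcome is observed before the next choice. The sites, coverage sets, and probabilities are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(4)

# 6 candidate monitoring sites; each covers a set of habitat cells and,
# once deployed, works with probability p (its state is then observed).
covers = [set(rng.choice(20, 6, replace=False)) for _ in range(6)]
p_work = rng.uniform(0.4, 0.9, 6)
state = rng.random(6) < p_work          # hidden ground truth

def adaptive_greedy(budget=3):
    covered, chosen = set(), []
    remaining = set(range(6))
    for _ in range(budget):
        # Expected marginal coverage gain given what we've observed so far.
        gains = {i: p_work[i] * len(covers[i] - covered) for i in remaining}
        best = max(gains, key=gains.get)
        chosen.append(best)
        remaining.discard(best)
        if state[best]:                  # observe the realized state, adapt
            covered |= covers[best]
    return chosen, covered

chosen, covered = adaptive_greedy()
print("deployed:", chosen, "cells covered:", len(covered))
```

    The point of adaptive submodularity is that this simple observe-then-greedy policy carries a provable near-optimality guarantee for objectives of this diminishing-returns type.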

  19. Implementing content constraints in alpha-stratified adaptive testing using a shadow test approach

    NARCIS (Netherlands)

    van der Linden, Willem J.; Chang, Hua-Hua

    2001-01-01

    The methods of alpha-stratified adaptive testing and constrained adaptive testing with shadow tests are combined in this study. The advantages are twofold. First, application of the shadow test allows the researcher to implement any type of constraint on item selection in alpha-stratified adaptive testing.

  20. An Adaptive Middleware for Improved Computational Performance

    DEFF Research Database (Denmark)

    Bonnichsen, Lars Frydendal

    The performance improvements in computer systems over the past 60 years have been fueled by an exponential increase in energy efficiency. In recent years, the phenomenon known as the end of Dennard’s scaling has slowed energy efficiency improvements — but improving computer energy efficiency is more important now than ever. Traditionally, most improvements in computer energy efficiency have come from improvements in lithography — the ability to produce smaller transistors — and computer architecture — the ability to apply those transistors efficiently. Since the end of scaling, we have seen … We are improving computational performance by exploiting modern hardware features, such as dynamic voltage-frequency scaling and transactional memory. Adapting software is an iterative process, requiring that we continually revisit it to meet new requirements or realities; a time-consuming process …

  1. Adaptive testing with equated number-correct scoring

    NARCIS (Netherlands)

    van der Linden, Willem J.

    1999-01-01

    A constrained CAT algorithm is presented that automatically equates the number-correct scores on adaptive tests. The algorithm can be used to equate number-correct scores across different administrations of the same adaptive test as well as to an external reference test. The constraints are derived

  2. Improvement of defect characterization in ultrasonic testing by adaptative learning network

    International Nuclear Information System (INIS)

    Bieth, M.; Adamonis, D.C.; Jusino, A.

    1982-01-01

    Numerous methods now exist for signal analysis in ultrasonic testing. These methods give more or less accurate information for defect characterization. This paper presents the development of a particular system based on computer signal processing, the Adaptive Learning Network (ALN), allowing the discrimination of defects as a function of their nature. The ultrasonic signal is sampled and characterized by amplitude-time and amplitude-frequency parameters. The method was tested on stainless steel tube welds showing fatigue cracks. The ALN model developed allows, under certain conditions, the discrimination of cracks from other defects. [fr]

  3. Computerized adaptive testing item selection in computerized adaptive learning systems

    NARCIS (Netherlands)

    Eggen, Theodorus Johannes Hendrikus Maria; Eggen, T.J.H.M.; Veldkamp, B.P.

    2012-01-01

    Item selection methods traditionally developed for computerized adaptive testing (CAT) are explored for their usefulness in item-based computerized adaptive learning (CAL) systems. While in CAT Fisher information-based selection is optimal, for recovering learning populations in CAL systems item

  4. Applications of decision theory to computer-based adaptive instructional systems

    NARCIS (Netherlands)

    Vos, Hendrik J.

    1988-01-01

    This paper considers applications of decision theory to the problem of instructional decision-making in computer-based adaptive instructional systems, using the Minnesota Adaptive Instructional System (MAIS) as an example. The first section indicates how the problem of selecting the appropriate

  5. Concentrator optical characterization using computer mathematical modelling and point source testing

    Science.gov (United States)

    Dennison, E. W.; John, S. L.; Trentelman, G. F.

    1984-01-01

    The optical characteristics of a paraboloidal solar concentrator are analyzed using the intercept factor curve (a format for image data) to describe the results of a mathematical model and to represent reduced data from experimental testing. This procedure makes it possible not only to test an assembled concentrator, but also to evaluate single optical panels or to conduct non-solar tests of an assembled concentrator. The use of three-dimensional ray tracing computer programs to calculate the mathematical model is described. These ray tracing programs can include any type of optical configuration from simple paraboloids to array of spherical facets and can be adapted to microcomputers or larger computers, which can graphically display real-time comparison of calculated and measured data.
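    A simplified, meridional-plane (2D) version of such a ray-tracing calculation, under stated assumptions: rays parallel to the axis hit a parabola whose local slope is perturbed by a Gaussian error, and the intercept factor is the fraction of reflected rays landing within a given receiver radius at the focal plane. The focal length, aperture, and slope error are illustrative, not taken from the record.

```python
import numpy as np

rng = np.random.default_rng(5)

f = 1.0            # focal length, m
R = 0.5            # aperture half-width, m
sigma = 2e-3       # Gaussian slope error, rad (illustrative)
n = 100_000

# Sample ray hit points on the parabola y = x^2 / (4 f); rays travel (0, -1).
x0 = rng.uniform(-R, R, n)
y0 = x0**2 / (4 * f)
slope = np.arctan(x0 / (2 * f)) + rng.normal(0, sigma, n)  # perturbed tangent angle
nx, ny = -np.sin(slope), np.cos(slope)                     # unit surface normal

# Specular reflection of d = (0, -1): d' = d - 2 (d.n) n.
dn = -ny
dx = -2 * dn * nx
dy = -1 - 2 * dn * ny

# Propagate to the focal plane y = f and record the landing radius.
t = (f - y0) / dy
r_land = np.abs(x0 + t * dx)

# Intercept factor curve: fraction of rays within receiver radius r.
for r in (0.002, 0.005, 0.01, 0.02):
    print(f"gamma(r={r:5.3f} m) = {np.mean(r_land <= r):.3f}")
```

    With sigma set to zero every ray lands exactly on the focus, which is a convenient self-check of the geometry before adding surface errors.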

  6. Flight Test Approach to Adaptive Control Research

    Science.gov (United States)

    Pavlock, Kate Maureen; Less, James L.; Larson, David Nils

    2011-01-01

    The National Aeronautics and Space Administration's Dryden Flight Research Center completed flight testing of adaptive controls research on a full-scale F-18 testbed. The validation of adaptive controls has the potential to enhance safety in the presence of adverse conditions such as structural damage or control surface failures. This paper describes the research interface architecture, risk mitigations, flight test approach and lessons learned of adaptive controls research.

  7. Genre-adaptive Semantic Computing and Audio-based Modelling for Music Mood Annotation

    DEFF Research Database (Denmark)

    Saari, Pasi; Fazekas, György; Eerola, Tuomas

    2016-01-01

    This study investigates whether taking genre into account is beneficial for automatic music mood annotation in terms of core affects valence, arousal, and tension, as well as several other mood scales. Novel techniques employing genre-adaptive semantic computing and audio-based modelling are proposed … related to a set of 600 popular music tracks spanning multiple genres. The results show that ACTwg outperforms a semantic computing technique that does not exploit genre information, and ACTwg-SLPwg outperforms conventional techniques and other genre-adaptive alternatives. In particular, improvements … -based genre representation for genre-adaptive music mood analysis.

  8. Distributed Problem Solving: Adaptive Networks with a Computer Intermediary Resource. Intelligent Executive Computer Communication

    Science.gov (United States)

    1991-06-01

    Proceedings of the National Conference on Artificial Intelligence, pages 181-184, American Association for Artificial Intelligence, Pittsburgh. Interim report: Distributed Problem Solving: Adaptive Networks with a Computer Intermediary Resource: Intelligent Executive Computer Communication. John Lyman and Carla J. Conaway, University of California at Los Angeles.

  9. Microcomputer Network for Computerized Adaptive Testing (CAT)

    Science.gov (United States)

    1984-03-01

    NPRDC TR 84-33. Microcomputer Network for Computerized Adaptive Testing (CAT). Baldwin Quan, Thomas A. Park, Gary Sandahl, John H. ... Keywords: computerized adaptive testing (CAT); Bayesian sequential testing.

  10. Autonomic intrusion detection: Adaptively detecting anomalies over unlabeled audit data streams in computer networks

    KAUST Repository

    Wang, Wei; Guyet, Thomas; Quiniou, René ; Cordier, Marie-Odile; Masseglia, Florent; Zhang, Xiangliang

    2014-01-01

    In this work, we propose a novel framework of autonomic intrusion detection that fulfills online and adaptive intrusion detection over unlabeled HTTP traffic streams in computer networks. The framework holds potential for self-managing: self-labeling, self-updating and self-adapting. Our framework employs the Affinity Propagation (AP) algorithm to learn a subject’s behaviors through dynamical clustering of the streaming data. It automatically labels the data and adapts to normal behavior changes while identifying anomalies. Two large real HTTP traffic streams collected in our institute as well as a set of benchmark KDD’99 data are used to validate the framework and the method. The test results show that the autonomic model achieves better results in terms of effectiveness and efficiency compared to adaptive Sequential Karhunen–Loeve method and static AP as well as three other static anomaly detection methods, namely, k-NN, PCA and SVM.
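    A static, single-pass sketch of the clustering idea, under stated assumptions: Affinity Propagation groups hypothetical per-request feature vectors, and members of unusually small clusters are flagged as anomalies. The paper's framework adds the streaming, self-labeling, and model-updating machinery on top of this step.

```python
import numpy as np
from sklearn.cluster import AffinityPropagation

rng = np.random.default_rng(6)

# Hypothetical per-request features: [length, n_params, frac_special_chars].
normal = rng.normal([200, 3, 0.02], [40, 1, 0.01], (300, 3))
attacks = rng.normal([900, 12, 0.30], [80, 2, 0.05], (6, 3))
X = np.vstack([normal, attacks])

ap = AffinityPropagation(damping=0.9, random_state=0).fit(X)
labels = ap.labels_

# Flag members of unusually small clusters as anomalous requests.
sizes = np.bincount(labels)
threshold = 0.02 * len(X)
anomalous = np.where(sizes[labels] < threshold)[0]
print("clusters found:", len(sizes), "flagged points:", anomalous)
```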

  11. Autonomic intrusion detection: Adaptively detecting anomalies over unlabeled audit data streams in computer networks

    KAUST Repository

    Wang, Wei

    2014-06-22

    In this work, we propose a novel framework of autonomic intrusion detection that fulfills online and adaptive intrusion detection over unlabeled HTTP traffic streams in computer networks. The framework holds potential for self-managing: self-labeling, self-updating and self-adapting. Our framework employs the Affinity Propagation (AP) algorithm to learn a subject’s behaviors through dynamical clustering of the streaming data. It automatically labels the data and adapts to normal behavior changes while identifying anomalies. Two large real HTTP traffic streams collected in our institute as well as a set of benchmark KDD’99 data are used to validate the framework and the method. The test results show that the autonomic model achieves better results in terms of effectiveness and efficiency compared to adaptive Sequential Karhunen–Loeve method and static AP as well as three other static anomaly detection methods, namely, k-NN, PCA and SVM.

  12. An Adaptive Approach to Locating Mobile HIV Testing Services.

    Science.gov (United States)

    Gonsalves, Gregg S; Crawford, Forrest W; Cleary, Paul D; Kaplan, Edward H; Paltiel, A David

    2018-02-01

    Public health agencies suggest targeting "hotspots" to identify individuals with undetected HIV infection. However, definitions of hotspots vary. Little is known about how best to target mobile HIV testing resources. We conducted a computer-based tournament to compare the yield of 4 algorithms for mobile HIV testing. Over 180 rounds of play, the algorithms selected 1 of 3 hypothetical zones, each with unknown prevalence of undiagnosed HIV, in which to conduct a fixed number of HIV tests. The algorithms were: 1) Thompson Sampling, an adaptive Bayesian search strategy; 2) Explore-then-Exploit, a strategy that initially draws comparable samples from all zones and then devotes all remaining rounds of play to HIV testing in whichever zone produced the highest observed yield; 3) Retrospection, a strategy using only base prevalence information; and 4) Clairvoyance, a benchmarking strategy that employs perfect information about HIV prevalence in each zone. Over 250 tournament runs, Thompson Sampling outperformed Explore-then-Exploit 66% of the time, identifying 15% more cases. Thompson Sampling's superiority persisted in a variety of circumstances examined in the sensitivity analysis. Case detection rates using Thompson Sampling were, on average, within 90% of the benchmark established by Clairvoyance. Retrospection was consistently the poorest performer. We did not consider either selection bias (i.e., the correlation between infection status and the decision to obtain an HIV test) or the costs of relocation to another zone from one round of play to the next. Adaptive methods like Thompson Sampling for mobile HIV testing are practical and effective, and may have advantages over other commonly used strategies.
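    A minimal Beta-Bernoulli version of the Thompson Sampling strategy described above: each round, sample a prevalence draw per zone from its Beta posterior, test in the zone with the highest draw, and update the posterior with the observed results. The zone prevalences, round count, and tests per round are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(7)

prevalence = [0.012, 0.006, 0.020]   # hidden per-zone probabilities (hypothetical)
tests_per_round, rounds = 50, 180
alpha = np.ones(3)                   # Beta posterior parameters per zone
beta = np.ones(3)
cases = 0

for _ in range(rounds):
    # Sample a prevalence estimate for each zone; test in the apparent best.
    zone = int(np.argmax(rng.beta(alpha, beta)))
    positives = rng.binomial(tests_per_round, prevalence[zone])
    cases += positives
    alpha[zone] += positives
    beta[zone] += tests_per_round - positives

print("cases found:", cases, "posterior means:", np.round(alpha / (alpha + beta), 4))
```

    The random posterior draw is what balances exploration and exploitation: zones with little data still get sampled occasionally, while the apparent hotspot receives most of the tests.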

  13. Using response-time constraints in item selection to control for differential speededness in computerized adaptive testing

    NARCIS (Netherlands)

    van der Linden, Willem J.; Scrams, David J.; Schnipke, Deborah L.

    2003-01-01

    This paper proposes an item selection algorithm that can be used to neutralize the effect of time limits in computer adaptive testing. The method is based on a statistical model for the response-time distributions of the test takers on the items in the pool that is updated each time a new item has been administered.
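    A simplified greedy stand-in for such an algorithm (not the authors' exact method): among items whose predicted response time fits the examinee's remaining per-item time budget, pick the most informative one. The lognormal time model, 2PL items, and all numbers are assumptions for illustration.

```python
import math

# Hypothetical pool: (discrimination a, difficulty b, median seconds m).
pool = [(1.2, 0.0, 45), (0.9, -0.8, 30), (1.4, 0.5, 90),
        (1.0, 0.2, 60), (1.1, -0.3, 25)]

def info(theta, a, b):
    """Fisher information of a 2PL item at ability theta."""
    p = 1 / (1 + math.exp(-a * (theta - b)))
    return a * a * p * (1 - p)

def pick_item(theta, time_left, items_left, sigma=0.4):
    """Choose the most informative item whose predicted time (95th percentile
    of a lognormal with the given log-scale sd) fits the per-item budget."""
    budget = time_left / items_left
    best, best_info = None, -1.0
    for idx, (a, b, m) in enumerate(pool):
        predicted = m * math.exp(1.645 * sigma)   # lognormal 95th percentile
        if predicted <= budget and info(theta, a, b) > best_info:
            best, best_info = idx, info(theta, a, b)
    return best

print(pick_item(theta=0.1, time_left=300, items_left=4))
```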

  14. Identifying Students at Risk: An Examination of Computer-Adaptive Measures and Latent Class Growth Analysis

    Science.gov (United States)

    Keller-Margulis, Milena; McQuillin, Samuel D.; Castañeda, Juan Javier; Ochs, Sarah; Jones, John H.

    2018-01-01

    Multitiered systems of support depend on screening technology to identify students at risk. The purpose of this study was to examine the use of a computer-adaptive test and latent class growth analysis (LCGA) to identify students at risk in reading with focus on the use of this methodology to characterize student performance in screening.…

  15. Non-adaptive measurement-based quantum computation and multi-party Bell inequalities

    International Nuclear Information System (INIS)

    Hoban, Matty J; Campbell, Earl T; Browne, Dan E; Loukopoulos, Klearchos

    2011-01-01

    Quantum correlations exhibit behaviour that cannot be resolved with a local hidden variable picture of the world. In quantum information, they are also used as resources for information processing tasks, such as measurement-based quantum computation (MQC). In MQC, universal quantum computation can be achieved via adaptive measurements on a suitable entangled resource state. In this paper, we look at a version of MQC in which we remove the adaptivity of measurements and aim to understand what computational abilities remain in the resource. We show that there are explicit connections between this model of computation and the question of non-classicality in quantum correlations. We demonstrate this by focusing on deterministic computation of Boolean functions, in which natural generalizations of the Greenberger-Horne-Zeilinger paradox emerge; we then explore probabilistic computation, via which multipartite Bell inequalities can be defined. We use this correspondence to define families of multi-party Bell inequalities, which we show to have a number of interesting contrasting properties.

  16. Non-adaptive measurement-based quantum computation and multi-party Bell inequalities

    Energy Technology Data Exchange (ETDEWEB)

    Hoban, Matty J; Campbell, Earl T; Browne, Dan E [Department of Physics and Astronomy, University College London, Gower Street, London WC1E 6BT (United Kingdom); Loukopoulos, Klearchos, E-mail: m.hoban@ucl.ac.uk [Department of Materials, Oxford University, Parks Road, Oxford OX1 4PH (United Kingdom)

    2011-02-15

    Quantum correlations exhibit behaviour that cannot be resolved with a local hidden variable picture of the world. In quantum information, they are also used as resources for information processing tasks, such as measurement-based quantum computation (MQC). In MQC, universal quantum computation can be achieved via adaptive measurements on a suitable entangled resource state. In this paper, we look at a version of MQC in which we remove the adaptivity of measurements and aim to understand what computational abilities remain in the resource. We show that there are explicit connections between this model of computation and the question of non-classicality in quantum correlations. We demonstrate this by focusing on deterministic computation of Boolean functions, in which natural generalizations of the Greenberger-Horne-Zeilinger paradox emerge; we then explore probabilistic computation, via which multipartite Bell inequalities can be defined. We use this correspondence to define families of multi-party Bell inequalities, which we show to have a number of interesting contrasting properties.

  17. Adaptation, testing and application of the two-dimensional FE computer program system for steam generator tube testing

    International Nuclear Information System (INIS)

    Betzold, K.

    1987-01-01

    The 2d-FE computing program system, taken over by EPRI, is used for the improvement of the eddy current test of steam generator heating tubes. The investigations focus on test tasks in the area of the tube plate and the scrap mark; among them: accumulation of mud in the cracking area and above the tube plate; circulating slots with and without accumulation of mud. The interaction of the factors of influence given by the test object and the parameters selectable by the tester as for example coil length and base space for absolute coils and differential coils as well as test frequencies are calculated and the form of the signal locus curves and the dynamic curves are listed in a sample catalogue. It is demonstrated with selected examples that the sample catalogue contributes to the test-specific design of the coil and to the choice of the test frequencies; interpretation of measured signals; deepening of the knowledge of the physical processe in eddy current tests. (orig./HP) [de

  18. An Adaptive Reordered Method for Computing PageRank

    Directory of Open Access Journals (Sweden)

    Yi-Ming Bu

    2013-01-01

    Full Text Available We propose an adaptive reordered method to deal with the PageRank problem. It has been shown that one can reorder the hyperlink matrix of the PageRank problem to calculate a reduced system and get the full PageRank vector through forward substitutions. This method can provide a speedup for calculating the PageRank vector. We observe that in the existing reordered method, the cost of the recursive reordering procedure could offset the computational reduction brought by minimizing the dimension of the linear system. With this observation, we introduce an adaptive reordered method to accelerate the total calculation, in which we terminate the reordering procedure appropriately instead of reordering to the end. Numerical experiments show the effectiveness of this adaptive reordered method.
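    For orientation, here is a baseline power-iteration PageRank on a toy graph; the comment marks where the reordered approach differs. This is the standard algorithm, not the adaptive reordered one from the record.

```python
import numpy as np

# Toy web graph as an adjacency list; node 3 is dangling (no out-links).
links = {0: [1, 2], 1: [2], 2: [0], 3: []}
n, alpha = 4, 0.85

# Column-stochastic transition matrix; dangling columns spread uniformly.
H = np.zeros((n, n))
for src, outs in links.items():
    if outs:
        H[outs, src] = 1.0 / len(outs)
    else:
        H[:, src] = 1.0 / n

# Baseline power iteration. The reordered method instead permutes the matrix
# so zero-out-degree rows come last, solves the smaller system for the
# remaining block, and recovers the rest by forward substitution; the
# adaptive variant stops that reordering recursion early when it no longer pays.
x = np.full(n, 1.0 / n)
for _ in range(100):
    x_new = alpha * H @ x + (1 - alpha) / n
    if np.abs(x_new - x).sum() < 1e-12:
        break
    x = x_new

print("PageRank:", np.round(x / x.sum(), 4))
```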

  19. Psychometrics behind Computerized Adaptive Testing.

    Science.gov (United States)

    Chang, Hua-Hua

    2015-03-01

    The paper provides a survey of 18 years' progress that my colleagues, students (both former and current) and I made in a prominent research area in Psychometrics: Computerized Adaptive Testing (CAT). We start with a historical review of the establishment of a large-sample foundation for CAT. It is worth noting that the asymptotic results were derived under the framework of Martingale Theory, a very theoretical perspective of Probability Theory, which may seem unrelated to educational and psychological testing. In addition, we address a number of issues that emerged from large-scale implementation and show how theoretical work can be helpful in solving the problems. Finally, we propose that CAT technology can be very useful to support individualized instruction on a mass scale. We show that even paper-and-pencil-based tests can be made adaptive to support classroom teaching.

  20. An Overview of Recent Developments in Cognitive Diagnostic Computer Adaptive Assessments

    Directory of Open Access Journals (Sweden)

    Alan Huebner

    2010-01-01

    Full Text Available Cognitive diagnostic modeling has become an exciting new field of psychometric research. These models aim to diagnose examinees' mastery status of a group of discretely defined skills, or attributes, thereby providing them with detailed information regarding their specific strengths and weaknesses. Combining cognitive diagnosis with computer adaptive assessments has emerged as an important part of this new field. This article aims to provide practitioners and researchers with an introduction to and overview of recent developments in cognitive diagnostic computer adaptive assessments.

  1. Exposure Control Using Adaptive Multi-Stage Item Bundles.

    Science.gov (United States)

    Luecht, Richard M.

    This paper presents a multistage adaptive testing test development paradigm that promises to handle content balancing and other test development needs, psychometric reliability concerns, and item exposure. The bundled multistage adaptive testing (BMAT) framework is a modification of the computer-adaptive sequential testing framework introduced by…

  2. Indirect versus direct feedback in computer-based Prism Adaptation Therapy

    DEFF Research Database (Denmark)

    Wilms, Inge Linda; Rytter, Hana Malá

    2010-01-01

    Prism Adaptation Therapy (PAT) is an intervention method in the treatment of the attention disorder neglect (Frassinetti, Angeli, Meneghello, Avanzi, & Ladavas, 2002; Rossetti, et al., 1998). The aim of this study was to investigate whether one session of PAT using a computer-attached touchscreen … in the aftereffect. The findings have direct implications for future implementations of computer-based methods of treatment of visuospatial disorders and computer-assisted rehabilitation in general.

  3. Computational Strategies for Dissecting the High-Dimensional Complexity of Adaptive Immune Repertoires

    Directory of Open Access Journals (Sweden)

    Enkelejda Miho

    2018-02-01

    Full Text Available The adaptive immune system recognizes antigens via an immense array of antigen-binding antibodies and T-cell receptors, the immune repertoire. The interrogation of immune repertoires is of high relevance for understanding the adaptive immune response in disease and infection (e.g., autoimmunity, cancer, HIV). Adaptive immune receptor repertoire sequencing (AIRR-seq) has driven the quantitative and molecular-level profiling of immune repertoires, thereby revealing the high-dimensional complexity of the immune receptor sequence landscape. Several methods for the computational and statistical analysis of large-scale AIRR-seq data have been developed to resolve immune repertoire complexity and to understand the dynamics of adaptive immunity. Here, we review the current research on (i) diversity, (ii) clustering and network, (iii) phylogenetic, and (iv) machine learning methods applied to dissect, quantify, and compare the architecture, evolution, and specificity of immune repertoires. We summarize outstanding questions in computational immunology and propose future directions for systems immunology toward coupling AIRR-seq with the computational discovery of immunotherapeutics, vaccines, and immunodiagnostics.
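    As a tiny example of the first class of methods (diversity), the snippet below computes the Shannon entropy and a derived clonality index from a clonotype count vector; the counts are hypothetical.

```python
import numpy as np

# Hypothetical clone counts from an AIRR-seq sample (one entry per clonotype).
counts = np.array([1200, 800, 350, 90, 40, 15, 5, 3, 2, 1])
freqs = counts / counts.sum()

shannon = -np.sum(freqs * np.log(freqs))          # repertoire diversity
clonality = 1 - shannon / np.log(len(freqs))      # 0 = even, 1 = monoclonal
print(f"Shannon entropy: {shannon:.3f}, clonality: {clonality:.3f}")
```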

  4. Adapting a computer-delivered brief alcohol intervention for veterans with Hepatitis C.

    Science.gov (United States)

    Cucciare, Michael A; Jamison, Andrea L; Combs, Ann S; Joshi, Gauri; Cheung, Ramsey C; Rongey, Catherine; Huggins, Joe; Humphreys, Keith

    2017-12-01

    This study adapted an existing computer-delivered brief alcohol intervention (cBAI) for use in Veterans with the hepatitis C virus (HCV) and examined its acceptability and feasibility in this patient population. A four-stage model consisting of initial pilot testing, qualitative interviews with key stakeholders, development of a beta version of the cBAI, and usability testing was used to achieve the study objectives. In-depth interviews gathered feedback for modifying the cBAI, including adding HCV-related content such as the health effects of alcohol on liver functioning, immune system functioning, and management of HCV, a preference for concepts to be displayed through "newer looking" graphics, and limiting the use of text to convey key concepts. Results from usability testing indicated that the modified cBAI was acceptable and feasible for use in this patient population. The development model used in this study is effective for gathering actionable feedback that can inform the development of a cBAI and can result in the development of an acceptable and feasible intervention for use in this population. Findings also have implications for developing computer-delivered interventions targeting behavior change more broadly.

  5. Computerized adaptive testing--ready for ambulatory monitoring?

    DEFF Research Database (Denmark)

    Rose, Matthias; Bjørner, Jakob; Fischer, Felix

    2012-01-01

    Computerized adaptive tests (CATs) have abundant theoretical advantages over established static instruments, which could improve ambulatory monitoring of patient-reported outcomes (PROs). However, an empirical demonstration of their practical benefits is warranted.

  6. Measurement precision and efficiency of multidimensional computer adaptive testing of physical functioning using the pediatric evaluation of disability inventory.

    Science.gov (United States)

    Haley, Stephen M; Ni, Pengsheng; Ludlow, Larry H; Fragala-Pinkham, Maria A

    2006-09-01

    Objective: To compare the measurement efficiency and precision of a multidimensional computer adaptive testing (M-CAT) application with a unidimensional CAT (U-CAT) application using item bank data from 2 of the functional skills scales of the Pediatric Evaluation of Disability Inventory (PEDI). Design: Using existing PEDI mobility and self-care item banks, we compared the stability of item calibrations and model fit between unidimensional and multidimensional Rasch models and compared the efficiency and precision of the U-CAT- and M-CAT-simulated assessments to a random draw of items. Setting: Pediatric rehabilitation hospital and clinics. Participants: Clinical and normative samples. Interventions: Not applicable. Main outcome measures: Not applicable. Results: The M-CAT had greater levels of precision and efficiency than the separate mobility and self-care U-CAT versions when using a similar number of items for each PEDI subdomain. Equivalent estimation of mobility and self-care scores can be achieved with a 25% to 40% item reduction with the M-CAT compared with the U-CAT. Conclusions: M-CAT applications appear to have both precision and efficiency advantages compared with separate U-CAT assessments when content subdomains have a high correlation. Practitioners may also realize interpretive advantages of reporting test score information for each subdomain when separate clinical inferences are desired.

  7. Computer-Based Testing: Test Site Security.

    Science.gov (United States)

    Rosen, Gerald A.

    Computer-based testing places great burdens on all involved parties to ensure test security. A task analysis of test site security might identify the areas of protecting the test, protecting the data, and protecting the environment as essential issues in test security. Protecting the test involves transmission of the examinations, identifying the…

  8. Cross-cultural development of an item list for computer-adaptive testing of fatigue in oncological patients

    DEFF Research Database (Denmark)

    Giesinger, Johannes M.; Petersen, Morten Aa.; Grønvold, Mogens

    2011-01-01

    Within an ongoing project of the EORTC Quality of Life Group, we are developing computerized adaptive test (CAT) measures for the QLQ-C30 scales. These new CAT measures are conceptualised to reflect the same constructs as the QLQ-C30 scales. Accordingly, the Fatigue-CAT is intended to capture physical and general fatigue.

  9. Computer intervention impact on psychosocial adaptation of rural women with chronic conditions.

    Science.gov (United States)

    Weinert, Clarann; Cudney, Shirley; Comstock, Bryan; Bansal, Aasthaa

    2011-01-01

    Adapting to living with chronic conditions is a life-long psychosocial challenge. The purpose of this study was to report the effect of a computer intervention on the psychosocial adaptation of rural women with chronic conditions. A two-group study design was used with 309 middle-aged, rural women who had chronic conditions, randomized into either a computer-based intervention or a control group. Data were collected at baseline, at the end of the intervention, and 6 months later on the psychosocial indicators of social support, self-esteem, acceptance of illness, stress, depression, and loneliness. The impact of the computer-based intervention was statistically significant for five of six of the psychosocial outcomes measured, with a modest impact on social support. The largest benefits were seen in depression, stress, and acceptance. The women-to-women intervention resulted in positive psychosocial responses that have the potential to contribute to successful management of illness and adaptation. Other components of adaptation to be examined are the impact of the intervention on illness management and quality of life and the interrelationships among environmental stimuli, psychosocial response, and illness management.

  10. Indirect versus direct feedback in computer-based Prism Adaptation Therapy

    DEFF Research Database (Denmark)

    Wilms, Inge Linda; Rytter, Hana Malá

    2010-01-01

      Prism Adaptation Therapy (PAT) is an intervention method in the treatment of the attention disorder neglect (Frassinetti, Angeli, Meneghello, Avanzi, & Ladavas, 2002; Rossetti, et al., 1998). The aim of this study was to investigate whether one session of PAT using a computer-attached touchscreen...... have direct implications for future implementations of computer-based methods of treatment of visuospatial disorders and computer-assisted rehabilitation in general....

  11. Techniques for grid manipulation and adaptation. [computational fluid dynamics

    Science.gov (United States)

    Choo, Yung K.; Eisemann, Peter R.; Lee, Ki D.

    1992-01-01

    Two approaches have been taken to provide systematic grid manipulation for improved grid quality. One is the control point form (CPF) of algebraic grid generation. It provides explicit control of the physical grid shape and grid spacing through the movement of the control points. It works well in the interactive computer graphics environment and hence can be a good candidate for integration with other emerging technologies. The other approach is grid adaptation using a numerical mapping between the physical space and a parametric space. Grid adaptation is achieved by modifying the mapping functions through the effects of grid control sources. The adaptation process can be repeated in a cyclic manner if satisfactory results are not achieved after a single application.

  12. Adaptive linear rank tests for eQTL studies.

    Science.gov (United States)

    Szymczak, Silke; Scheinhardt, Markus O; Zeller, Tanja; Wild, Philipp S; Blankenberg, Stefan; Ziegler, Andreas

    2013-02-10

    Expression quantitative trait loci (eQTL) studies are performed to identify single-nucleotide polymorphisms that modify average expression values of genes, proteins, or metabolites, depending on the genotype. As expression values are often not normally distributed, statistical methods for eQTL studies should be valid and powerful in these situations. Adaptive tests are promising alternatives to standard approaches, such as the analysis of variance or the Kruskal-Wallis test. In a two-stage procedure, skewness and tail length of the distributions are estimated and used to select one of several linear rank tests. In this study, we compare two adaptive tests that were proposed in the literature using extensive Monte Carlo simulations of a wide range of different symmetric and skewed distributions. We derive a new adaptive test that combines the advantages of both literature-based approaches. The new test does not require the user to specify a distribution. It is slightly less powerful than the locally most powerful rank test for the correct distribution and at least as powerful as the maximin efficiency robust rank test. We illustrate the application of all tests using two examples from different eQTL studies. Copyright © 2012 John Wiley & Sons, Ltd.
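
    The two-stage idea above can be sketched compactly. In the hedged Python sketch below, the selector statistics are Hogg-type measures of skewness and tail weight, and the cutoffs and the reduction to a two-way choice (rank test vs. ANOVA) are illustrative assumptions, not the authors' published selection rules among several linear rank tests.

    ```python
    # A minimal sketch of a two-stage adaptive test: estimate skewness and
    # tail weight first, then select the test. The selector statistics,
    # cutoffs, and two-way choice are illustrative assumptions.
    import numpy as np
    from scipy import stats

    def selector_statistics(values):
        """Estimate skewness and tail weight from the pooled sample."""
        q5, q25, q50, q75, q95 = np.percentile(values, [5, 25, 50, 75, 95])
        skewness = (q95 - q50) / (q50 - q5)     # right vs. left spread
        tail_weight = (q95 - q5) / (q75 - q25)  # outer vs. inner range
        return skewness, tail_weight

    def adaptive_test(groups):
        """Stage 1: classify the distribution; stage 2: run the chosen test."""
        skew, tails = selector_statistics(np.concatenate(groups))
        if skew > 2.0 or tails > 7.0:           # skewed or heavy-tailed
            return "kruskal-wallis", stats.kruskal(*groups).pvalue
        return "anova", stats.f_oneway(*groups).pvalue

    rng = np.random.default_rng(1)
    genotypes = [rng.lognormal(0.0, 1.0, 50) for _ in range(3)]  # 3 groups
    print(adaptive_test(genotypes))
    ```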

  13. Design and Build an Adapter for Hearing Protector Test

    Directory of Open Access Journals (Sweden)

    Rostam Golmohammadi

    2016-06-01

    Full Text Available Introduction: Determining the effectiveness of hearing protective devices that lack technical information is one of the major challenges occupational health experts face when judging the impact of such devices on reducing occupational noise exposure. The aim of this study was to design and build a hearing protector test adapter and to evaluate it by determining the noise reduction rate of earmuffs and earplugs. Methods: Hearing protectors were tested with the adapter in real work environments (glass industries in Hamadan), and the measured results were compared with computational methods. Results: Testing of personal hearing protectors, compared with measurements in real industrial environments and with the octave-band method, showed good agreement in average transmission losses. The average noise reduction obtained by measurement and by calculation was 9.3 and 8.8 dB for earmuffs, and 9.3 and 11.2 dB for earplugs, respectively. Comparison of the tests showed no significant differences between the results of the two methods (P>0.05). Conclusion: The results indicate that the designed adapter is a valid tool for testing the noise reduction rate of earmuffs and earplugs.

  14. Procedures for Computing Transonic Flows for Control of Adaptive Wind Tunnels. Ph.D. Thesis - Technische Univ., Berlin, Mar. 1986

    Science.gov (United States)

    Rebstock, Rainer

    1987-01-01

    Numerical methods are developed for control of three dimensional adaptive test sections. The physical properties of the design problem occurring in the external field computation are analyzed, and a design procedure suited for solution of the problem is worked out. To do this, the desired wall shape is determined by stepwise modification of an initial contour. The necessary changes in geometry are determined with the aid of a panel procedure, or, with incident flow near the sonic range, with a transonic small perturbation (TSP) procedure. The designed wall shape, together with the wall deflections set during the tunnel run, are the input to a newly derived one-step formula which immediately yields the adapted wall contour. This is particularly important since the classical iterative adaptation scheme is shown to converge poorly for 3D flows. Experimental results obtained in the adaptive test section with eight flexible walls are presented to demonstrate the potential of the procedure. Finally, a method is described to minimize wall interference in 3D flows by adapting only the top and bottom wind tunnel walls.

  15. A stereotactic adapter compatible with computed tomography

    International Nuclear Information System (INIS)

    Cacak, R.K.; Law, J.D.

    1982-01-01

    One application of computed-tomographic (CT) scanners is the localization of intracranial targets for stereotactic surgery. Unfortunately, conventional stereotactic devices affixed to the patient cause artifacts which obscure anatomic features in CT images. The authors describe the initial phase of a project to eliminate this problem by using an adapter that is free of metallic objects. Localization of the target point relative to the coordinate system of a Leksell stereotactic frame is achieved from CT image measurements

  16. Considerations about expected a posteriori estimation in adaptive testing: adaptive a priori, adaptive correction for bias, and adaptive integration interval.

    Science.gov (United States)

    Raiche, Gilles; Blais, Jean-Guy

    2009-01-01

    In a computerized adaptive test, we would like to obtain an acceptable precision of the proficiency level estimate using an optimal number of items. Unfortunately, decreasing the number of items is accompanied by a certain degree of bias when the true proficiency level differs significantly from the a priori estimate. The authors suggest that it is possible to reduce the bias, and even the standard error of the estimate, by applying to each provisional estimate one or a combination of the following strategies: adaptive correction for bias proposed by Bock and Mislevy (1982), adaptive a priori estimate, and adaptive integration interval.
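
    Two of the three strategies above lend themselves to a short sketch. In the following Python sketch, both the prior mean and the quadrature grid are re-centred on the provisional estimate after each item ("adaptive a priori" and "adaptive integration interval"); the 2PL model, prior SD, grid width, and item parameters are illustrative assumptions, and the Bock-Mislevy bias correction is omitted for brevity.

    ```python
    # A minimal sketch of EAP proficiency estimation with an adaptive prior
    # mean and an adaptive integration interval. All parameters illustrative.
    import numpy as np

    def p_2pl(theta, a, b):
        """2PL probability of a correct response."""
        return 1.0 / (1.0 + np.exp(-a * (theta - b)))

    def eap(responses, a, b, prior_mean=0.0, prior_sd=1.0, half_width=4.0):
        """EAP estimate with the quadrature grid centred on the prior mean."""
        theta = np.linspace(prior_mean - half_width, prior_mean + half_width, 81)
        prior = np.exp(-0.5 * ((theta - prior_mean) / prior_sd) ** 2)
        p = p_2pl(theta[:, None], a, b)                  # grid points x items
        likelihood = np.prod(np.where(responses, p, 1.0 - p), axis=1)
        posterior = prior * likelihood
        return np.sum(theta * posterior) / np.sum(posterior)

    a = np.array([1.2, 0.8, 1.5])          # discriminations
    b = np.array([-0.5, 0.3, 1.0])         # difficulties
    u = np.array([1, 1, 0], dtype=bool)    # responses to items seen so far
    est = 0.0
    for k in range(1, len(u) + 1):         # re-centre after every item
        est = eap(u[:k], a[:k], b[:k], prior_mean=est)
        print(f"after item {k}: theta = {est:.3f}")
    ```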

  17. Computerized Adaptive Personality Testing: A Review and Illustration With the MMPI-2 Computerized Adaptive Version.

    Science.gov (United States)

    Forbey, Johnathan D.; Ben-Porath, Yossef S.

    2007-01-01

    Computerized adaptive testing in personality assessment can improve efficiency by significantly reducing the number of items administered to answer an assessment question. Two approaches have been explored for adaptive testing in computerized personality assessment: item response theory and the countdown method. In this article, the authors…

  18. Individual Differences in Computerized Adaptive Testing.

    Science.gov (United States)

    Kim, JinGyu

    Research on the major computerized adaptive testing (CAT) strategies is reviewed, and some findings are reported that examine effects of examinee demographic and psychological characteristics on CAT strategies. In fixed branching strategies, all examinees respond to a common routing test, the score of which is used to assign examinees to a…

  19. Water System Adaptation To Hydrological Changes: Module 11, Methods and Tools: Computational Models

    Science.gov (United States)

    This course will introduce students to the fundamental principles of water system adaptation to hydrological changes, with emphasis on data analysis and interpretation, technical planning, and computational modeling. Starting with real-world scenarios and adaptation needs, the co...

  20. A model for optimal constrained adaptive testing

    NARCIS (Netherlands)

    van der Linden, Willem J.; Reese, Lynda M.

    2001-01-01

    A model for constrained computerized adaptive testing is proposed in which the information on the test at the ability estimate is maximized subject to a large variety of possible constraints on the contents of the test. At each item-selection step, a full test is first assembled to have maximum

  1. A model for optimal constrained adaptive testing

    NARCIS (Netherlands)

    van der Linden, Willem J.; Reese, Lynda M.

    1997-01-01

    A model for constrained computerized adaptive testing is proposed in which the information in the test at the ability estimate is maximized subject to a large variety of possible constraints on the contents of the test. At each item-selection step, a full test is first assembled to have maximum

  2. Molecular determinants of enzyme cold adaptation: comparative structural and computational studies of cold- and warm-adapted enzymes.

    Science.gov (United States)

    Papaleo, Elena; Tiberti, Matteo; Invernizzi, Gaetano; Pasi, Marco; Ranzani, Valeria

    2011-11-01

    The identification of molecular mechanisms underlying enzyme cold adaptation is a hot topic for both fundamental research and industrial applications. In the present contribution, we review the last decades of structural and computational investigations on cold-adapted enzymes in comparison to their warm-adapted counterparts. Comparative sequence and structural studies allow the definition of a multitude of adaptation strategies. Different enzymes have evolved diverse mechanisms to adapt to low temperatures, so that a general theory for enzyme cold adaptation cannot be formulated. However, some common features can be traced in the dynamic and flexibility properties of these enzymes, as well as in their intra- and inter-molecular interaction networks. Interestingly, the current data suggest that a family-centered point of view is necessary in the comparative analyses of cold- and warm-adapted enzymes. In fact, enzymes belonging to the same family or superfamily, thus sharing at least the three-dimensional fold and common features of the functional sites, have evolved similar structural and dynamic patterns to overcome the detrimental effects of low temperatures.

  3. Moving finite elements: A continuously adaptive method for computational fluid dynamics

    International Nuclear Information System (INIS)

    Glasser, A.H.; Miller, K.; Carlson, N.

    1991-01-01

    Moving Finite Elements (MFE), a recently developed method for computational fluid dynamics, promises major advances in the ability of computers to model the complex behavior of liquids, gases, and plasmas. Applications of computational fluid dynamics occur in a wide range of scientifically and technologically important fields. Examples include meteorology, oceanography, global climate modeling, magnetic and inertial fusion energy research, semiconductor fabrication, biophysics, automobile and aircraft design, industrial fluid processing, chemical engineering, and combustion research. The improvements made possible by the new method could thus have substantial economic impact. Moving Finite Elements is a moving node adaptive grid method which has a tendency to pack the grid finely in regions where it is most needed at each time and to leave it coarse elsewhere. It does so in a manner which is simple and automatic, and does not require a large amount of human ingenuity to apply it to each particular problem. At the same time, it often allows the time step to be large enough to advance a moving shock by many shock thicknesses in a single time step, moving the grid smoothly with the solution and minimizing the number of time steps required for the whole problem. For 2D problems (two spatial variables) the grid is composed of irregularly shaped and irregularly connected triangles which are very flexible in their ability to adapt to the evolving solution. While other adaptive grid methods have been developed which share some of these desirable properties, this is the only method which combines them all. In many cases, the method can save orders of magnitude of computing time, equivalent to several generations of advancing computer hardware

  4. Hard Real-Time Task Scheduling in Cloud Computing Using an Adaptive Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Amjad Mahmood

    2017-04-01

    Full Text Available In the Infrastructure-as-a-Service cloud computing model, virtualized computing resources in the form of virtual machines are provided over the Internet. A user can rent an arbitrary number of computing resources to meet their requirements, making cloud computing an attractive choice for executing real-time tasks. Economical task allocation and scheduling on a set of leased virtual machines is an important problem in the cloud computing environment. This paper proposes a greedy algorithm and a genetic algorithm with adaptive selection of suitable crossover and mutation operations (named AGA) to allocate and schedule real-time tasks with precedence constraints on heterogeneous virtual machines. A comprehensive simulation study has been done to evaluate the performance of the proposed algorithms in terms of their solution quality and efficiency. The simulation results show that AGA outperforms the greedy algorithm and the non-adaptive genetic algorithm in terms of solution quality.
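
    The core mechanism of such an adaptive genetic algorithm, crediting operators that produce improving offspring and selecting operators in proportion to that credit, can be sketched as below. The toy task-to-VM fitness, the operator set, and the credit rule are illustrative assumptions, not the AGA as published.

    ```python
    # A minimal sketch of adaptive operator selection in a GA for task-to-VM
    # scheduling. Fitness, operators, and credit assignment are illustrative.
    import random

    N_TASKS, N_VMS = 20, 4
    COST = [[random.uniform(1, 10) for _ in range(N_VMS)] for _ in range(N_TASKS)]

    def fitness(ind):                    # lower makespan is better
        load = [0.0] * N_VMS
        for task, vm in enumerate(ind):
            load[vm] += COST[task][vm]
        return -max(load)

    def one_point(p1, p2):
        cut = random.randrange(1, N_TASKS)
        return p1[:cut] + p2[cut:]

    def uniform(p1, p2):
        return [random.choice(g) for g in zip(p1, p2)]

    OPERATORS = [one_point, uniform]
    score = [1.0, 1.0]                   # adaptive credit per operator

    def evolve(generations=100, pop_size=30):
        pop = [[random.randrange(N_VMS) for _ in range(N_TASKS)]
               for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)
            parents = pop[:pop_size // 2]
            children = []
            while len(children) < pop_size - len(parents):
                i = random.choices(range(len(OPERATORS)), weights=score)[0]
                p1, p2 = random.sample(parents, 2)
                child = OPERATORS[i](p1, p2)
                if random.random() < 0.1:          # simple mutation
                    child[random.randrange(N_TASKS)] = random.randrange(N_VMS)
                # credit the operator when the child beats both parents
                if fitness(child) > max(fitness(p1), fitness(p2)):
                    score[i] += 1.0
                children.append(child)
            pop = parents + children
        return max(pop, key=fitness)

    best = evolve()
    print("best makespan:", -fitness(best))
    ```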

  5. Adaptive scapula bone remodeling computational simulation: Relevance to regenerative medicine

    International Nuclear Information System (INIS)

    Sharma, Gulshan B.; Robertson, Douglas D.

    2013-01-01

    Shoulder arthroplasty success has been attributed to many factors including, bone quality, soft tissue balancing, surgeon experience, and implant design. Improved long-term success is primarily limited by glenoid implant loosening. Prosthesis design examines materials and shape and determines whether the design should withstand a lifetime of use. Finite element (FE) analyses have been extensively used to study stresses and strains produced in implants and bone. However, these static analyses only measure a moment in time and not the adaptive response to the altered environment produced by the therapeutic intervention. Computational analyses that integrate remodeling rules predict how bone will respond over time. Recent work has shown that subject-specific two- and three dimensional adaptive bone remodeling models are feasible and valid. Feasibility and validation were achieved computationally, simulating bone remodeling using an intact human scapula, initially resetting the scapular bone material properties to be uniform, numerically simulating sequential loading, and comparing the bone remodeling simulation results to the actual scapula’s material properties. Three-dimensional scapula FE bone model was created using volumetric computed tomography images. Muscle and joint load and boundary conditions were applied based on values reported in the literature. Internal bone remodeling was based on element strain-energy density. Initially, all bone elements were assigned a homogeneous density. All loads were applied for 10 iterations. After every iteration, each bone element’s remodeling stimulus was compared to its corresponding reference stimulus and its material properties modified. The simulation achieved convergence. At the end of the simulation the predicted and actual specimen bone apparent density were plotted and compared. Location of high and low predicted bone density was comparable to the actual specimen. High predicted bone density was greater than
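
    The remodeling loop described above, compare each element's strain-energy-density stimulus with a reference and adjust its apparent density until convergence, can be sketched as follows. The finite-element solve is replaced by a toy stand-in, and the rate constant, "lazy zone", and density bounds are illustrative assumptions rather than the study's values.

    ```python
    # A minimal sketch of a strain-energy-density remodeling iteration.
    # The FE solution is a placeholder; all constants are illustrative.
    import numpy as np

    def remodel(density, reference, compute_sed, rate=0.5, lazy=0.05,
                rho_min=0.01, rho_max=1.8, iterations=25):
        for _ in range(iterations):
            sed = compute_sed(density)        # stand-in for the FE solution
            stimulus = sed / density          # strain energy per unit mass
            error = (stimulus - reference) / reference
            step = np.where(np.abs(error) > lazy, rate * error, 0.0)  # dead zone
            density = np.clip(density * (1.0 + step), rho_min, rho_max)
        return density

    rng = np.random.default_rng(0)
    loading = rng.uniform(0.5, 1.5, size=100)  # element strain energy density
    rho = remodel(np.full(100, 0.8), reference=1.0,
                  compute_sed=lambda rho: loading)
    print(rho.min(), rho.max())                # density tracks local loading
    ```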

  6. Computational adaptive optics for broadband interferometric tomography of tissues and cells

    Science.gov (United States)

    Adie, Steven G.; Mulligan, Jeffrey A.

    2016-03-01

    Adaptive optics (AO) can shape aberrated optical wavefronts to physically restore the constructive interference needed for high-resolution imaging. With access to the complex optical field, however, many functions of optical hardware can be achieved computationally, including focusing and the compensation of optical aberrations to restore the constructive interference required for diffraction-limited imaging performance. Holography, which employs interferometric detection of the complex optical field, was developed based on this connection between hardware and computational image formation, although this link has only recently been exploited for 3D tomographic imaging in scattering biological tissues. This talk will present the underlying imaging science behind computational image formation with optical coherence tomography (OCT) -- a beam-scanned version of broadband digital holography. Analogous to hardware AO (HAO), we demonstrate computational adaptive optics (CAO) and optimization of the computed pupil correction in 'sensorless mode' (Zernike polynomial corrections with feedback from image metrics) or with the use of 'guide-stars' in the sample. We discuss the concept of an 'isotomic volume' as the volumetric extension of the 'isoplanatic patch' introduced in astronomical AO. Recent CAO results and ongoing work is highlighted to point to the potential biomedical impact of computed broadband interferometric tomography. We also discuss the advantages and disadvantages of HAO vs. CAO for the effective shaping of optical wavefronts, and highlight opportunities for hybrid approaches that synergistically combine the unique advantages of hardware and computational methods for rapid volumetric tomography with cellular resolution.

  7. Translation, Validation, and Reliability of the Dutch Late-Life Function and Disability Instrument Computer Adaptive Test.

    Science.gov (United States)

    Arensman, Remco M; Pisters, Martijn F; de Man-van Ginkel, Janneke M; Schuurmans, Marieke J; Jette, Alan M; de Bie, Rob A

    2016-09-01

    Adequate and user-friendly instruments for assessing physical function and disability in older adults are vital for estimating and predicting health care needs in clinical practice. The Late-Life Function and Disability Instrument Computer Adaptive Test (LLFDI-CAT) is a promising instrument for assessing physical function and disability in gerontology research and clinical practice. The aims of this study were: (1) to translate the LLFDI-CAT to the Dutch language and (2) to investigate its validity and reliability in a sample of older adults who spoke Dutch and dwelled in the community. For the assessment of validity of the LLFDI-CAT, a cross-sectional design was used. To assess reliability, measurement of the LLFDI-CAT was repeated in the same sample. The item bank of the LLFDI-CAT was translated with a forward-backward procedure. A sample of 54 older adults completed the LLFDI-CAT, World Health Organization Disability Assessment Schedule 2.0, RAND 36-Item Short-Form Health Survey physical functioning scale (10 items), and 10-Meter Walk Test. The LLFDI-CAT was repeated in 2 to 8 days (mean=4.5 days). Pearson's r and the intraclass correlation coefficient (ICC) (2,1) were calculated to assess validity, group-level reliability, and participant-level reliability. A correlation of .74 for the LLFDI-CAT function scale and the RAND 36-Item Short-Form Health Survey physical functioning scale (10 items) was found. The correlations of the LLFDI-CAT disability scale with the World Health Organization Disability Assessment Schedule 2.0 and the 10-Meter Walk Test were -.57 and -.53, respectively. The ICC (2,1) of the LLFDI-CAT function scale was .84, with a group-level reliability score of .85. The ICC (2,1) of the LLFDI-CAT disability scale was .76, with a group-level reliability score of .81. The high percentage of women in the study and the exclusion of older adults with recent joint replacement or hospitalization limit the generalizability of the results. The Dutch LLFDI

  8. Computer prediction of subsurface radionuclide transport: an adaptive numerical method

    International Nuclear Information System (INIS)

    Neuman, S.P.

    1983-01-01

    Radionuclide transport in the subsurface is often modeled with the aid of the advection-dispersion equation. A review of existing computer methods for the solution of this equation shows that there is need for improvement. To answer this need, a new adaptive numerical method is proposed based on an Eulerian-Lagrangian formulation. The method is based on a decomposition of the concentration field into two parts, one advective and one dispersive, in a rigorous manner that does not leave room for ambiguity. The advective component of steep concentration fronts is tracked forward with the aid of moving particles clustered around each front. Away from such fronts the advection problem is handled by an efficient modified method of characteristics called single-step reverse particle tracking. When a front dissipates with time, its forward tracking stops automatically and the corresponding cloud of particles is eliminated. The dispersion problem is solved by an unconventional Lagrangian finite element formulation on a fixed grid which involves only symmetric and diagonal matrices. Preliminary tests against analytical solutions of one- and two-dimensional dispersion in a uniform steady-state velocity field suggest that the proposed adaptive method can handle the entire range of Peclet numbers from 0 to infinity, with Courant numbers well in excess of 1
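
    The single-step reverse particle tracking component of this split can be illustrated in one dimension: trace each grid node backwards along the velocity field, interpolate the old concentration there, then apply a dispersion step on the fixed grid. This hedged sketch shows only that component (not the forward-tracked front particles), with an illustrative grid, velocity, and dispersion coefficient chosen so that the Courant number exceeds 1.

    ```python
    # A minimal 1D sketch of the Eulerian-Lagrangian split: advection by
    # single-step reverse particle tracking, then explicit dispersion.
    import numpy as np

    nx, L = 201, 100.0
    x = np.linspace(0.0, L, nx)
    dx = x[1] - x[0]
    v, D, dt = 1.0, 0.05, 2.0            # Courant number v*dt/dx = 4 > 1

    c = np.exp(-0.5 * ((x - 20.0) / 2.0) ** 2)   # initial concentration front

    for _ in range(20):
        # Advective part: trace nodes back and interpolate.
        c = np.interp(x - v * dt, x, c, left=0.0, right=0.0)
        # Dispersive part: explicit finite differences on the fixed grid.
        c[1:-1] += D * dt / dx**2 * (c[2:] - 2.0 * c[1:-1] + c[:-2])

    print("peak at x =", x[np.argmax(c)])        # front advected ~40 units
    ```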

  9. The use of computer adaptive tests in outcome assessments following upper limb trauma.

    Science.gov (United States)

    Jayakumar, P; Overbeek, C; Vranceanu, A-M; Williams, M; Lamb, S; Ring, D; Gwilym, S

    2018-06-01

    Aims Outcome measures quantifying aspects of health in a precise, efficient, and user-friendly manner are in demand. Computer adaptive tests (CATs) may overcome the limitations of established fixed scales and be more adept at measuring outcomes in trauma. The primary objective of this review was to gain a comprehensive understanding of the psychometric properties of CATs compared with fixed-length scales in the assessment of outcome in patients who have suffered trauma of the upper limb. Study designs, outcome measures and methodological quality are defined, along with trends in investigation. Materials and Methods A search of multiple electronic databases was undertaken on 1 January 2017 with terms related to "CATs", "orthopaedics", "trauma", and "anatomical regions". Studies involving adults suffering trauma to the upper limb, and undergoing any intervention, were eligible. Those involving the measurement of outcome with any CATs were included. Identification, screening, and eligibility were undertaken, followed by the extraction of data and quality assessment using the Consensus-Based Standards for the Selection of Health Measurement Instruments (COSMIN) criteria. The review is reported according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) criteria and registered (PROSPERO: CRD42016053886). Results A total of 31 studies reported trauma conditions alone, or in combination with non-traumatic conditions, using CATs. Most were cross-sectional with varying levels of evidence, numbers of patients, types of study, ranges of conditions and methodological quality. CATs correlated well with fixed scales and had minimal or no floor-ceiling effects. They required significantly fewer questions and/or less time for completion. Patient-Reported Outcomes Measurement Information System (PROMIS) CATs were the most frequently used, and the use of CATs is increasing. Conclusion Early studies show valid and reliable outcome measurement with CATs

  10. Adaptive Remodeling of Achilles Tendon: A Multi-scale Computational Model.

    Directory of Open Access Journals (Sweden)

    Stuart R Young

    2016-09-01

    Full Text Available While it is known that musculotendon units adapt to their load environments, there is only a limited understanding of tendon adaptation in vivo. Here we develop a computational model of tendon remodeling based on the premise that mechanical damage and tenocyte-mediated tendon damage and repair processes modify the distribution of its collagen fiber lengths. We explain how these processes enable the tendon to geometrically adapt to its load conditions. Based on known biological processes, mechanical and strain-dependent proteolytic fiber damage are incorporated into our tendon model. Using a stochastic model of fiber repair, it is assumed that mechanically damaged fibers are repaired longer, whereas proteolytically damaged fibers are repaired shorter, relative to their pre-damage length. To study adaptation of tendon properties to applied load, our model musculotendon unit is a simplified three-component Hill-type model of the human Achilles-soleus unit. Our model results demonstrate that the geometric equilibrium state of the Achilles tendon can coincide with minimization of the total metabolic cost of muscle activation. The proposed tendon model independently predicts rates of collagen fiber turnover that are in general agreement with in vivo experimental measurements. While the computational model here only represents a first step in a new approach to understanding the complex process of tendon remodeling in vivo, given these findings, it appears likely that the proposed framework may itself provide a useful theoretical foundation for developing valuable qualitative and quantitative insights into tendon physiology and pathology.

  11. Processing-Efficient Distributed Adaptive RLS Filtering for Computationally Constrained Platforms

    Directory of Open Access Journals (Sweden)

    Noor M. Khan

    2017-01-01

    Full Text Available In this paper, a novel processing-efficient architecture of a group of inexpensive and computationally limited small platforms is proposed for a parallelly distributed adaptive signal processing (PDASP) operation. The proposed architecture runs computationally expensive procedures, like the complex adaptive recursive least squares (RLS) algorithm, cooperatively. The proposed PDASP architecture operates properly even if perfect time alignment among the participating platforms is not available. An RLS algorithm with the application of MIMO channel estimation is deployed on the proposed architecture. The complexity and processing time of the PDASP scheme with the MIMO RLS algorithm are compared with those of the sequentially operated MIMO RLS algorithm and the linear Kalman filter. It is observed that the PDASP scheme exhibits much lower computational complexity than the sequential MIMO RLS algorithm as well as the Kalman filter. Moreover, the proposed architecture reduces processing time by 95.83% and 82.29% compared to the sequentially operated Kalman filter and MIMO RLS algorithm, respectively, for a low Doppler rate. Likewise, for a high Doppler rate, the proposed architecture reduces processing time by 94.12% and 77.28% compared to the Kalman and RLS algorithms, respectively.
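
    For reference, the sequential RLS update that the PDASP architecture distributes across platforms is the textbook recursion below; it is shown here on a simple system-identification task with illustrative filter order, forgetting factor, and initialization, not the MIMO channel-estimation deployment of the paper.

    ```python
    # A minimal sketch of the standard sequential RLS recursion.
    import numpy as np

    def rls(x, d, order=4, lam=0.99, delta=100.0):
        """Sequential RLS: returns the filter weights after the last sample."""
        w = np.zeros(order)
        P = delta * np.eye(order)          # inverse correlation matrix
        buf = np.zeros(order)
        for n in range(len(x)):
            buf = np.concatenate(([x[n]], buf[:-1]))  # tap-delay line
            k = P @ buf / (lam + buf @ P @ buf)       # gain vector
            e = d[n] - w @ buf                        # a priori error
            w = w + k * e
            P = (P - np.outer(k, buf @ P)) / lam
        return w

    rng = np.random.default_rng(0)
    true_w = np.array([0.5, -0.3, 0.2, 0.1])
    x = rng.standard_normal(2000)
    d = np.convolve(x, true_w)[:len(x)] + 0.01 * rng.standard_normal(len(x))
    print(rls(x, d))                       # should approach true_w
    ```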

  12. Rapid Computation of Thermodynamic Properties over Multidimensional Nonbonded Parameter Spaces Using Adaptive Multistate Reweighting.

    Science.gov (United States)

    Naden, Levi N; Shirts, Michael R

    2016-04-12

    We show how thermodynamic properties of molecular models can be computed over a large, multidimensional parameter space by combining multistate reweighting analysis with a linear basis function approach. This approach reduces the computational cost to estimate thermodynamic properties from molecular simulations for over 130,000 tested parameter combinations from over 1000 CPU years to tens of CPU days. This speed increase is achieved primarily by computing the potential energy as a linear combination of basis functions, computed from either modified simulation code or as the difference of energy between two reference states, which can be done without any simulation code modification. The thermodynamic properties are then estimated with the Multistate Bennett Acceptance Ratio (MBAR) as a function of multiple model parameters without the need to define a priori how the states are connected by a pathway. Instead, we adaptively sample a set of points in parameter space to create mutual configuration space overlap. The existence of regions of poor configuration space overlap is detected by analyzing the eigenvalues of the sampled states' overlap matrix. The configuration space overlap to sampled states is monitored alongside the mean and maximum uncertainty to determine convergence, as neither the uncertainty nor the configuration space overlap alone is a sufficient metric of convergence. This adaptive sampling scheme is demonstrated by estimating with high precision the solvation free energies of charged particles of Lennard-Jones plus Coulomb functional form with charges between -2 and +2 and generally physical values of σij and ϵij in TIP3P water. We also compute entropy, enthalpy, and radial distribution functions of arbitrary unsampled parameter combinations using only the data from these sampled states and use the estimates of free energies over the entire space to examine the deviation of atomistic simulations from the Born approximation to the solvation free
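
    The linear basis-function trick is the key enabler: if the potential can be written U(x; λ) = Σ_k λ_k h_k(x), then storing the basis energies h_k(x) for each sampled configuration lets the energy be re-evaluated at any parameter combination without rerunning the simulation. The hedged sketch below demonstrates this with simple single-state exponential reweighting (Zwanzig) in place of MBAR, on synthetic "basis energies"; it illustrates the re-evaluation idea, not the paper's full adaptive MBAR workflow.

    ```python
    # A minimal sketch of linear basis-function reweighting to unsampled
    # parameters. Single-state Zwanzig reweighting stands in for MBAR.
    import numpy as np

    rng = np.random.default_rng(0)
    n_samples, n_basis = 5000, 2
    beta = 1.0

    h = rng.standard_normal((n_samples, n_basis))  # h_k(x) per sampled config
    lam_ref = np.array([1.0, 0.5])                 # parameters actually sampled
    A = rng.standard_normal(n_samples)             # observable A(x)

    def reweighted_average(lam_new):
        """<A> at unsampled parameters lam_new from the reference ensemble."""
        dU = h @ (lam_new - lam_ref)               # U_new - U_ref, per sample
        w = np.exp(-beta * (dU - dU.min()))        # shift for numerical safety
        return np.sum(w * A) / np.sum(w)

    for lam in ([1.0, 0.5], [1.1, 0.6], [1.5, 1.0]):
        print(lam, reweighted_average(np.array(lam)))
    ```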

  13. Quality Assurance Challenges for Motion-Adaptive Radiation Therapy: Gating, Breath Holding, and Four-Dimensional Computed Tomography

    International Nuclear Information System (INIS)

    Jiang, Steve B.; Wolfgang, John; Mageras, Gig S.

    2008-01-01

    Compared with conventional three-dimensional (3D) conformal radiation therapy and intensity-modulated radiation therapy treatments, quality assurance (QA) for motion-adaptive radiation therapy involves various challenges because of the added temporal dimension. Here we discuss those challenges for three specific techniques related to motion-adaptive therapy: namely respiratory gating, breath holding, and four-dimensional computed tomography. Similar to the introduction of any other new technologies in clinical practice, typical QA measures should be taken for these techniques also, including initial testing of equipment and clinical procedures, as well as frequent QA examinations during the early stage of implementation. Here, rather than covering every QA aspect in depth, we focus on some major QA challenges. The biggest QA challenge for gating and breath holding is how to ensure treatment accuracy when internal target position is predicted using external surrogates. Recommended QA measures for each component of treatment, including simulation, planning, patient positioning, and treatment delivery and verification, are discussed. For four-dimensional computed tomography, some major QA challenges have also been discussed

  14. Statistical tests for person misfit in computerized adaptive testing

    NARCIS (Netherlands)

    Glas, Cornelis A.W.; Meijer, R.R.; van Krimpen-Stoop, Edith

    1998-01-01

    Recently, several person-fit statistics have been proposed to detect nonfitting response patterns. This study is designed to generalize an approach followed by Klauer (1995) to an adaptive testing system using the two-parameter logistic model (2PL) as a null model. The approach developed by Klauer

  15. Adaptive Tests of Significance Using Permutations of Residuals with R and SAS

    CERN Document Server

    O'Gorman, Thomas W

    2012-01-01

    Provides the tools needed to successfully perform adaptive tests across a broad range of datasets. Adaptive Tests of Significance Using Permutations of Residuals with R and SAS illustrates the power of adaptive tests and showcases their ability to adjust the testing method to suit a particular set of data. The book utilizes state-of-the-art software to demonstrate the practicality and benefits for data analysis in various fields of study. Beginning with an introduction, the book moves on to explore the underlying concepts of adaptive tests, including: smoothing methods and normalizing transforma...

  16. A Secure, Scalable and Elastic Autonomic Computing Systems Paradigm: Supporting Dynamic Adaptation of Self-* Services from an Autonomic Cloud

    Directory of Open Access Journals (Sweden)

    Abdul Jaleel

    2018-05-01

    Full Text Available Autonomic computing embeds self-management features in software systems using external feedback control loops, i.e., autonomic managers. In existing models of autonomic computing, adaptive behaviors are defined at the design time, autonomic managers are statically configured, and the running system has a fixed set of self-* capabilities. An autonomic computing design should accommodate autonomic capability growth by allowing the dynamic configuration of self-* services, but this causes security and integrity issues. A secure, scalable and elastic autonomic computing system (SSE-ACS) paradigm is proposed to address the runtime inclusion of autonomic managers, ensuring secure communication between autonomic managers and managed resources. Applying the SSE-ACS concept, a layered approach for the dynamic adaptation of self-* services is presented, with an online ‘Autonomic_Cloud’ working as the middleware between Autonomic Managers (offering the self-* services) and the Autonomic Computing System (requiring the self-* services). A stock trading and forecasting system is used for simulation purposes. The security impact of the SSE-ACS paradigm is verified by testing possible attack cases over the autonomic computing system with single and multiple autonomic managers running on the same and different machines. The common vulnerability scoring system (CVSS) metric shows a decrease in the vulnerability severity score from high (8.8) for the existing ACS to low (3.9) for SSE-ACS. Autonomic managers are introduced into the system at runtime from the Autonomic_Cloud to test the scalability and elasticity. With elastic AMs, the system optimizes the Central Processing Unit (CPU) share, resulting in an improved execution time for business logic. For computing systems requiring the continuous support of self-management services, the proposed system achieves a significant improvement in security, scalability, elasticity, autonomic efficiency, and issue resolving time

  17. Modifications to Langley 0.3-m TCT adaptive wall software for heavy gas test medium, phase 1 studies

    Science.gov (United States)

    Murthy, A. V.

    1992-01-01

    The scheme for two-dimensional wall adaptation with sulfur hexafluoride (SF6) as test gas in the NASA Langley Research Center 0.3-m Transonic Cryogenic Tunnel (0.3-m TCT) is presented. A unified version of the wall adaptation software has been developed to function in a dual gas operation mode (nitrogen or SF6). The feature of ideal gas calculations for nitrogen operation is retained. For SF6 operation, real gas properties have been computed using the departure function technique. Installation of the software on the 0.3-m TCT ModComp-A computer and preliminary validation with nitrogen operation were found to be satisfactory. Further validation and improvements to the software will be undertaken when the 0.3-m TCT is ready for operation with SF6 gas.

  18. A Computer Adaptive Testing Version of the Addiction Severity Index-Multimedia Version (ASI-MV): The Addiction Severity CAT

    Science.gov (United States)

    Butler, Stephen F.; Black, Ryan A.; McCaffrey, Stacey A.; Ainscough, Jessica; Doucette, Ann M.

    2017-01-01

    The purpose of this study was to develop and validate a computer adaptive testing (CAT) version of the Addiction Severity Index-Multimedia Version (ASI-MV®), the Addiction Severity CAT. This goal was accomplished in four steps. First, new candidate items for Addiction Severity CAT domains were evaluated after brainstorming sessions with experts in substance abuse treatment. Next, this new item bank was psychometrically evaluated on a large non-clinical (n =4419) and substance abuse treatment sample (n =845). Based on these results, final items were selected and calibrated for the creation of the Addiction Severity CAT algorithms. Once the algorithms were developed for the entire assessment, a fully functioning prototype of an Addiction Severity CAT was created. CAT simulations were conducted and optimal termination criteria were selected for the Addiction Severity CAT algorithms. Finally, construct validity of the CAT algorithms was evaluated by examining convergent/discriminant validity and sensitivity to change. The Addiction Severity CAT was determined to be valid, sensitive to change, and reliable. Further, the Addiction Severity CAT’s time of administration was found to be significantly less than the average time of administration for the ASI-MV composite scores. This study represents the initial validation of an IRT-based Addiction Severity CAT, and further exploration of the Addiction Severity CAT is needed. PMID:28230387

  19. A computer adaptive testing version of the Addiction Severity Index-Multimedia Version (ASI-MV): The Addiction Severity CAT.

    Science.gov (United States)

    Butler, Stephen F; Black, Ryan A; McCaffrey, Stacey A; Ainscough, Jessica; Doucette, Ann M

    2017-05-01

    The purpose of this study was to develop and validate a computer adaptive testing (CAT) version of the Addiction Severity Index-Multimedia Version (ASI-MV), the Addiction Severity CAT. This goal was accomplished in 4 steps. First, new candidate items for Addiction Severity CAT domains were evaluated after brainstorming sessions with experts in substance abuse treatment. Next, this new item bank was psychometrically evaluated on a large nonclinical (n = 4,419) and substance abuse treatment (n = 845) sample. Based on these results, final items were selected and calibrated for the creation of the Addiction Severity CAT algorithms. Once the algorithms were developed for the entire assessment, a fully functioning prototype of an Addiction Severity CAT was created. CAT simulations were conducted, and optimal termination criteria were selected for the Addiction Severity CAT algorithms. Finally, construct validity of the CAT algorithms was evaluated by examining convergent and discriminant validity and sensitivity to change. The Addiction Severity CAT was determined to be valid, sensitive to change, and reliable. Further, the Addiction Severity CAT's time of completion was found to be significantly less than the average time of completion for the ASI-MV composite scores. This study represents the initial validation of an Addiction Severity CAT based on item response theory, and further exploration of the Addiction Severity CAT is needed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  20. Online adaptation of a c-VEP Brain-Computer Interface (BCI) based on error-related potentials and unsupervised learning.

    Science.gov (United States)

    Spüler, Martin; Rosenstiel, Wolfgang; Bogdan, Martin

    2012-01-01

    The goal of a Brain-Computer Interface (BCI) is to control a computer by pure brain activity. Recently, BCIs based on code-modulated visual evoked potentials (c-VEPs) have shown great potential to establish high-performance communication. In this paper we present a c-VEP BCI that uses online adaptation of the classifier to reduce calibration time and increase performance. We compare two different approaches for online adaptation of the system: an unsupervised method and a method that uses the detection of error-related potentials. Both approaches were tested in an online study, in which an average accuracy of 96% was achieved with adaptation based on error-related potentials. This accuracy corresponds to an average information transfer rate of 144 bit/min, which is the highest bitrate reported so far for a non-invasive BCI. In a free-spelling mode, the subjects were able to write with an average of 21.3 error-free letters per minute, which shows the feasibility of the BCI system in a normal-use scenario. In addition we show that a calibration of the BCI system solely based on the detection of error-related potentials is possible, without knowing the true class labels.

  1. Analysing Test-Takers’ Views on a Computer-Based Speaking Test

    Directory of Open Access Journals (Sweden)

    Marian Amengual-Pizarro

    2017-11-01

    Full Text Available This study examines test-takers’ views on a computer-delivered speaking test in order to investigate the aspects they consider most relevant in technology-based oral assessment, and to explore the main advantages and disadvantages computer-based tests may offer as compared to face-to-face speaking tests. A small-scale open questionnaire was administered to 80 test-takers who took the APTIS speaking test at the Universidad de Alcalá in April 2016. Results reveal that examinees believe computer-based tests provide a valid measure of oral competence in English and are considered to be an adequate method for the assessment of speaking. Interestingly, the data suggest that personal characteristics of test-takers seem to play a key role in deciding upon the most suitable and reliable delivery mode.

  2. Glove-Enabled Computer Operations (GECO): Design and Testing of an Extravehicular Activity Glove Adapted for Human-Computer Interface

    Science.gov (United States)

    Adams, Richard J.; Olowin, Aaron; Krepkovich, Eileen; Hannaford, Blake; Lindsay, Jack I. C.; Homer, Peter; Patrie, James T.; Sands, O. Scott

    2013-01-01

    The Glove-Enabled Computer Operations (GECO) system enables an extravehicular activity (EVA) glove to be dual-purposed as a human-computer interface device. This paper describes the design and human participant testing of a right-handed GECO glove in a pressurized glove box. As part of an investigation into the usability of the GECO system for EVA data entry, twenty participants were asked to complete activities including (1) a Simon Says game in which they attempted to duplicate random sequences of targeted finger strikes and (2) a Text Entry activity in which they used the GECO glove to enter target phrases in two different virtual keyboard modes. In a within-subjects design, both activities were performed both with and without vibrotactile feedback. Participants' mean accuracies in correctly generating finger strikes with the pressurized glove were surprisingly high, both with and without the benefit of tactile feedback. Five of the subjects achieved mean accuracies exceeding 99% in both conditions. In Text Entry, tactile feedback provided a statistically significant performance benefit, quantified by characters entered per minute, as well as a reduction in error rate. Secondary analyses of responses to a NASA Task Load Index (TLX) subjective workload assessment reveal a benefit for tactile feedback in GECO glove use for data entry. This first-ever investigation of employment of a pressurized EVA glove for human-computer interface opens up a wide range of future applications, including text chat communications, manipulation of procedures/checklists, cataloguing/annotating images, scientific note taking, human-robot interaction, and control of suit and/or other EVA systems.

  3. Development and Evaluation of a Confidence-Weighting Computerized Adaptive Testing

    Science.gov (United States)

    Yen, Yung-Chin; Ho, Rong-Guey; Chen, Li-Ju; Chou, Kun-Yi; Chen, Yan-Lin

    2010-01-01

    The purpose of this study was to examine whether the efficiency, precision, and validity of computerized adaptive testing (CAT) could be improved by assessing confidence differences in knowledge that examinees possessed. We proposed a novel polytomous CAT model called the confidence-weighting computerized adaptive testing (CWCAT), which combined a…

  4. A computer simulation of an adaptive noise canceler with a single input

    Science.gov (United States)

    Albert, Stuart D.

    1991-06-01

    A description of an adaptive noise canceler using Widrow's LMS algorithm is presented. A computer simulation of canceler performance (adaptive convergence time and frequency transfer function) was written for use as a design tool. The simulations, assumptions, and input parameters are described in detail. The simulation is used in a design example to predict the performance of an adaptive noise canceler in the simultaneous presence of both strong and weak narrow-band signals (a cosited frequency-hopping radio scenario). On the basis of the simulation results, it is concluded that the simulation is suitable for use as an adaptive noise canceler design tool; i.e., it can be used to evaluate the effect of design parameter changes on canceler performance.
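
    A single-input canceler of this kind is commonly realized as an adaptive line enhancer: a delayed copy of the input serves as the LMS reference, so the filter converges to the narrow-band (correlated) component while the error output retains the broadband part. The Python sketch below illustrates that structure under stated assumptions; the filter length, delay, and step size are illustrative choices, not those of the report.

    ```python
    # A minimal sketch of a single-input LMS noise canceler (adaptive line
    # enhancer). All parameters are illustrative.
    import numpy as np

    def lms_canceler(x, order=32, delay=5, mu=0.002):
        w = np.zeros(order)
        y = np.zeros(len(x))                 # narrow-band estimate
        e = np.zeros(len(x))                 # canceler (error) output
        for n in range(order + delay, len(x)):
            ref = x[n - delay - order:n - delay][::-1]  # delayed tap vector
            y[n] = w @ ref
            e[n] = x[n] - y[n]
            w += mu * e[n] * ref                        # Widrow-Hoff update
        return y, e

    fs = 8000
    t = np.arange(2 * fs) / fs
    rng = np.random.default_rng(0)
    x = np.sin(2 * np.pi * 1000.0 * t) + 0.5 * rng.standard_normal(len(t))
    y, e = lms_canceler(x)
    print(np.std(y[-fs:]), np.std(e[-fs:]))  # tone estimate vs. residual noise
    ```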

  5. New challenges in grid generation and adaptivity for scientific computing

    CERN Document Server

    Formaggia, Luca

    2015-01-01

    This volume collects selected contributions from the “Fourth Tetrahedron Workshop on Grid Generation for Numerical Computations”, which was held in Verbania, Italy in July 2013. The previous editions of this Workshop were hosted by the Weierstrass Institute in Berlin (2005), by INRIA Rocquencourt in Paris (2007), and by Swansea University (2010). This book covers different, though related, aspects of the field: the generation of quality grids for complex three-dimensional geometries; parallel mesh generation algorithms; mesh adaptation, including both theoretical and implementation aspects; grid generation and adaptation on surfaces – all with an interesting mix of numerical analysis, computer science and strongly application-oriented problems.

  6. Towards Static Analysis of Policy-Based Self-adaptive Computing Systems

    DEFF Research Database (Denmark)

    Margheri, Andrea; Nielson, Hanne Riis; Nielson, Flemming

    2016-01-01

    For supporting the design of self-adaptive computing systems, the PSCEL language offers a principled approach that relies on declarative definitions of adaptation and authorisation policies enforced at runtime. Policies permit managing system components by regulating their interactions and by dynamically introducing new actions to accomplish task-oriented goals. However, the runtime evaluation of policies and their effects on system components make the prediction of system behaviour challenging. In this paper, we introduce the construction of a flow graph that statically points out the policy evaluations that can take place at runtime and exploit it to analyse the effects of policy evaluations on the progress of system components...

  7. A computer-controlled automated test system for fatigue and fracture testing

    International Nuclear Information System (INIS)

    Nanstad, R.K.; Alexander, D.J.; Swain, R.L.; Hutton, J.T.; Thomas, D.L.

    1989-01-01

    A computer-controlled system consisting of a servohydraulic test machine, an in-house designed test controller, and a desktop computer has been developed for performing automated fracture toughness and fatigue crack growth testing both in the laboratory and in hot cells for remote testing of irradiated specimens. Both unloading compliance and dc-potential drop can be used to monitor crack growth. The test controller includes a dc-current supply programmer, a function generator for driving the servohydraulic test machine to required test outputs, five measurement channels (each consisting of low-pass filter, track/hold amplifier, and 16-bit analog-to-digital converter), and digital logic for various control and data multiplexing functions. The test controller connects to the computer via a 16-bit wide photo-isolated bidirectional bus. The computer, a Hewlett-Packard series 200/300, inputs specimen and test parameters from the operator, configures the test controller, stores test data from the test controller in memory, does preliminary analysis during the test, and records sensor calibrations, specimen and test parameters, and test data on flexible diskette for later recall and analysis with measured initial and final crack length information. During the test, the operator can change test parameters as necessary. 24 refs., 6 figs

  8. Design of an Adaptive Power Regulation Mechanism and a Nozzle for a Hydroelectric Power Plant Turbine Test Rig

    Science.gov (United States)

    Mert, Burak; Aytac, Zeynep; Tascioglu, Yigit; Celebioglu, Kutay; Aradag, Selin; ETU Hydro Research Center Team

    2014-11-01

    This study deals with the design of a power regulation mechanism for a Hydroelectric Power Plant (HEPP) model turbine test system which is designed to test Francis-type hydroturbines up to 2 MW power with varying head and flow (discharge) values. Unlike the tailor-made regulation mechanisms of full-sized, functional HEPPs, the design for the test system must be easily adapted to the various turbines that are to be tested. In order to achieve this adaptability, a dynamic simulation model is constructed in MATLAB/Simulink SimMechanics. This model acquires geometric data and hydraulic loading data of the regulation system from Autodesk Inventor CAD models and Computational Fluid Dynamics (CFD) analysis, respectively. The dynamic model is explained and case studies of two different HEPPs are performed for validation. CFD-aided design of the turbine guide vanes, which is used as input for the dynamic model, is also presented. This research is financially supported by the Turkish Ministry of Development.

  9. Adaptive Dynamic Process Scheduling on Distributed Memory Parallel Computers

    Directory of Open Access Journals (Sweden)

    Wei Shu

    1994-01-01

    Full Text Available One of the challenges in programming distributed memory parallel machines is deciding how to allocate work to processors. This problem is particularly important for computations with unpredictable dynamic behaviors or irregular structures. We present a scheme for dynamic scheduling of medium-grained processes that is useful in this context. The adaptive contracting within neighborhood (ACWN) is a dynamic, distributed, load-dependent, and scalable scheme. It deals with dynamic and unpredictable creation of processes and adapts to different systems. The scheme is described and contrasted with two other schemes that have been proposed in this context, namely the randomized allocation and the gradient model. The performance of the three schemes on an Intel iPSC/2 hypercube is presented and analyzed. The experimental results show that even though the ACWN algorithm incurs somewhat larger overhead than the randomized allocation, it achieves better performance in most cases due to its adaptiveness. Its feature of quickly spreading the work helps it outperform the gradient model in performance and scalability.

  10. Wireless Adaptive Therapeutic TeleGaming in a Pervasive Computing Environment

    Science.gov (United States)

    Peters, James F.; Szturm, Tony; Borkowski, Maciej; Lockery, Dan; Ramanna, Sheela; Shay, Barbara

    This chapter introduces a wireless, pervasive computing approach to adaptive therapeutic telegaming considered in the context of near set theory. Near set theory provides a formal basis for observation, comparison and classification of perceptual granules. A perceptual granule is defined by a collection of objects that are graspable by the senses or by the mind. In the proposed pervasive computing approach to telegaming, a handicapped person (e.g., stroke patient with limited hand, finger, arm function) plays a video game by interacting with familiar instrumented objects such as cups, cutlery, soccer balls, nozzles, screw top-lids, spoons, so that the technology that makes therapeutic exercise game-playing possible is largely invisible (Archives of Physical Medicine and Rehabilitation 89:2213-2217, 2008). The basic approach to adaptive learning (AL) in the proposed telegaming environment is ethology-inspired and is quite different from the traditional approach to reinforcement learning. In biologically-inspired learning, organisms learn to achieve some goal by durable modification of behaviours in response to signals from the environment resulting from specific experiences (Animal Behavior, 1995). The term adaptive is used here in an ethological sense, where learning by an organism results from modifying behaviour in response to perceived changes in the environment. To instill adaptivity in a video game, it is assumed that learning by a video game is episodic. During an episode, the behaviour of a player is measured indirectly by tracking the occurrence of gaming events such as a hit or a miss of a target (e.g., hitting a moving ball with a game paddle). An ethogram provides a record of behaviour feature values that provides a basis for a functional registry of handicapped players for gaming adaptivity. An important practical application of adaptive gaming is therapeutic rehabilitation exercise carried out in parallel with playing action video games. Enjoyable and

  11. Introducing Enabling Computational Tools to the Climate Sciences: Multi-Resolution Climate Modeling with Adaptive Cubed-Sphere Grids

    Energy Technology Data Exchange (ETDEWEB)

    Jablonowski, Christiane [Univ. of Michigan, Ann Arbor, MI (United States)

    2015-07-14

    The research investigates and advances strategies how to bridge the scale discrepancies between local, regional and global phenomena in climate models without the prohibitive computational costs of global cloud-resolving simulations. In particular, the research explores new frontiers in computational geoscience by introducing high-order Adaptive Mesh Refinement (AMR) techniques into climate research. AMR and statically-adapted variable-resolution approaches represent an emerging trend for atmospheric models and are likely to become the new norm in future-generation weather and climate models. The research advances the understanding of multi-scale interactions in the climate system and showcases a pathway how to model these interactions effectively with advanced computational tools, like the Chombo AMR library developed at the Lawrence Berkeley National Laboratory. The research is interdisciplinary and combines applied mathematics, scientific computing and the atmospheric sciences. In this research project, a hierarchy of high-order atmospheric models on cubed-sphere computational grids has been developed that serves as an algorithmic prototype for the finite-volume solution-adaptive Chombo-AMR approach. The investigations have focused on the characteristics of both static mesh adaptations and dynamically-adaptive grids that can capture flow fields of interest like tropical cyclones. Six research themes have been chosen. These are (1) the introduction of adaptive mesh refinement techniques into the climate sciences, (2) advanced algorithms for nonhydrostatic atmospheric dynamical cores, (3) an assessment of the interplay between resolved-scale dynamical motions and subgrid-scale physical parameterizations, (4) evaluation techniques for atmospheric model hierarchies, (5) the comparison of AMR refinement strategies and (6) tropical cyclone studies with a focus on multi-scale interactions and variable-resolution modeling. The results of this research project

  12. Modeling the Internet of Things, Self-Organizing and Other Complex Adaptive Communication Networks: A Cognitive Agent-Based Computing Approach.

    Directory of Open Access Journals (Sweden)

    Samreen Laghari

    Full Text Available Computer Networks have a tendency to grow at an unprecedented scale. Modern networks involve not only computers but also a wide variety of other interconnected devices ranging from mobile phones to other household items fitted with sensors. This vision of the "Internet of Things" (IoT) implies an inherent difficulty in modeling problems. It is practically impossible to implement and test all scenarios for large-scale and complex adaptive communication networks as part of Complex Adaptive Communication Networks and Environments (CACOONS). The goal of this study is to explore the use of Agent-based Modeling as part of the Cognitive Agent-based Computing (CABC) framework to model a complex communication network problem. We use Exploratory Agent-based Modeling (EABM), as part of the CABC framework, to develop an autonomous multi-agent architecture for managing carbon footprint in a corporate network. To evaluate the application of complexity in practical scenarios, we have also introduced a company-defined computer usage policy. The conducted experiments demonstrated two important results: primarily, a CABC-based modeling approach such as Agent-based Modeling can be an effective approach to modeling complex problems in the domain of IoT; secondly, the specific problem of managing the carbon footprint can be solved using a multi-agent system approach.

  13. Modeling the Internet of Things, Self-Organizing and Other Complex Adaptive Communication Networks: A Cognitive Agent-Based Computing Approach.

    Science.gov (United States)

    Laghari, Samreen; Niazi, Muaz A

    2016-01-01

    Computer Networks have a tendency to grow at an unprecedented scale. Modern networks involve not only computers but also a wide variety of other interconnected devices ranging from mobile phones to other household items fitted with sensors. This vision of the "Internet of Things" (IoT) implies an inherent difficulty in modeling problems. It is practically impossible to implement and test all scenarios for large-scale and complex adaptive communication networks as part of Complex Adaptive Communication Networks and Environments (CACOONS). The goal of this study is to explore the use of Agent-based Modeling as part of the Cognitive Agent-based Computing (CABC) framework to model a Complex communication network problem. We use Exploratory Agent-based Modeling (EABM), as part of the CABC framework, to develop an autonomous multi-agent architecture for managing carbon footprint in a corporate network. To evaluate the application of complexity in practical scenarios, we have also introduced a company-defined computer usage policy. The conducted experiments demonstrated two important results: Primarily CABC-based modeling approach such as using Agent-based Modeling can be an effective approach to modeling complex problems in the domain of IoT. Secondly, the specific problem of managing the Carbon footprint can be solved using a multiagent system approach.

  14. Multidimensional adaptive testing with a minimum error-variance criterion

    NARCIS (Netherlands)

    van der Linden, Willem J.

    1997-01-01

    The case of adaptive testing under a multidimensional logistic response model is addressed. An adaptive algorithm is proposed that minimizes the (asymptotic) variance of the maximum-likelihood (ML) estimator of a linear combination of abilities of interest. The item selection criterion is a simple
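
    Although this record is truncated, the criterion it names is well defined: the asymptotic variance of the ML estimator of a linear combination lam of abilities is lam' I(theta)^(-1) lam, so the algorithm selects the item whose Fisher information most reduces that variance. A minimal numpy sketch under a multidimensional 2PL model follows; the item pool, ability estimate, and weights are invented for illustration.

```python
# Minimal sketch: pick the next item that minimizes the asymptotic
# variance of lam' theta_hat under a multidimensional 2PL model.
# All parameters are illustrative, not from van der Linden (1997).
import numpy as np

def prob_2pl(a, d, theta):
    """Response probability for a multidimensional 2PL item."""
    return 1.0 / (1.0 + np.exp(-(a @ theta + d)))

def item_information(a, d, theta):
    """Fisher information matrix contributed by one item at theta."""
    p = prob_2pl(a, d, theta)
    return p * (1.0 - p) * np.outer(a, a)

def select_item(pool, administered, theta, lam):
    """Pick the unadministered item minimizing var(lam' theta_hat)."""
    info = sum(item_information(pool[i][0], pool[i][1], theta)
               for i in administered)
    best, best_var = None, np.inf
    for i, (a, d) in enumerate(pool):
        if i in administered:
            continue
        var = lam @ np.linalg.inv(info + item_information(a, d, theta)) @ lam
        if var < best_var:
            best, best_var = i, var
    return best, best_var

rng = np.random.default_rng(0)
pool = [(rng.uniform(0.5, 2.0, size=2), rng.normal()) for _ in range(50)]
theta_hat = np.zeros(2)         # current ability estimate
lam = np.array([0.7, 0.3])      # linear combination of interest
print(select_item(pool, {0, 1, 2}, theta_hat, lam))
```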

  15. Validation of Patient Reported Outcomes Measurement Information System (PROMIS) Computer Adaptive Tests (CATs) in the Surgical Treatment of Lumbar Spinal Stenosis.

    Science.gov (United States)

    Patel, Alpesh A; Dodwad, Shah-Nawaz M; Boody, Barrett S; Bhatt, Surabhi; Savage, Jason W; Hsu, Wellington K; Rothrock, Nan E

    2018-03-19

    Prospective, cohort study. The objective was to demonstrate the validity of the PROMIS physical function, pain interference, and pain behavior computer adaptive tests (CATs) in surgically treated lumbar stenosis patients. There has been increasing attention given to patient-reported outcomes associated with spinal interventions. Historical patient outcome measures have inadequate validation, demonstrate floor/ceiling effects, and are infrequently used due to time constraints. PROMIS is an adaptive, responsive NIH assessment tool that measures patient-reported health status. 98 consecutive patients were surgically treated for lumbar spinal stenosis and were assessed using PROMIS CATs, ODI, ZCQ and SF-12. Patients with prior lumbar surgery or a history of scoliosis, cancer, trauma, or infection were excluded. Completion times and preoperative, 6-week, and 3-month postoperative scores were collected. At baseline, 49%, 79%, and 81% of patients had PROMIS PB, PI, and PF scores greater than 1 SD worse than the general population. 50.6% were categorized as severely disabled, crippled, or bed bound by ODI. PROMIS CATs demonstrated convergent validity through moderate to high correlations with legacy measures (r = 0.35-0.73). PROMIS CATs demonstrated known-groups validity when stratified by ODI levels of disability. Patients with ODI improvements of at least 10 points on average had changes in PROMIS scores in the expected direction (PI = -12.98, PB = -9.74, PF = 7.53). PROMIS CATs demonstrated comparable responsiveness to change when evaluated against legacy measures. PROMIS PB and PI decreased 6.66 and 9.62 points, respectively, and PROMIS PF increased 6.8 points between baseline and 3 months post-op. PROMIS CATs demonstrated convergent validity, known-groups validity, and responsiveness for surgically treated patients with lumbar stenosis to detect change over time, and are more efficient than legacy instruments. Level of Evidence: 2.

  16. Estimation of an Examinee's Ability in the Web-Based Computerized Adaptive Testing Program IRT-CAT

    Directory of Open Access Journals (Sweden)

    Yoon-Hwan Lee

    2006-11-01

    Full Text Available We developed a program to estimate an examinee's ability in order to provide freely available access to a web-based computerized adaptive testing (CAT) program. We used PHP and JavaScript as the programming languages, PostgreSQL as the database management system on an Apache web server, and Linux as the operating system. A system that allows users to input and search items and to create tests was constructed. We performed an ability estimation on each test based on a Rasch model and two- or three-parameter logistic models. Our system provides an algorithm for a web-based CAT, replacing previous personal computer-based ones, and makes it possible to estimate an examinee's ability immediately at the end of the test.
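
    The ability-estimation step such a program performs can be sketched in a few lines: a Newton-Raphson maximum-likelihood estimate of theta under the 2PL model (the Rasch model being the special case a = 1). The item parameters and responses below are invented; this is a sketch of the technique, not the IRT-CAT source code.

```python
# Minimal sketch of ML ability estimation under a 2PL model via
# Newton-Raphson. Item parameters and responses are illustrative.
import numpy as np

def theta_ml(a, b, u, theta0=0.0, iters=20):
    """ML ability estimate for responses u to 2PL items (a, b)."""
    theta = theta0
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
        grad = np.sum(a * (u - p))             # d log L / d theta
        hess = -np.sum(a**2 * p * (1.0 - p))   # d2 log L / d theta2
        step = grad / hess
        theta -= step
        if abs(step) < 1e-6:
            break
    return theta

a = np.array([1.2, 0.8, 1.5, 1.0])   # discriminations
b = np.array([-0.5, 0.0, 0.7, 1.2])  # difficulties
u = np.array([1, 1, 0, 0])           # observed responses
print(round(theta_ml(a, b, u), 3))
```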

  17. Turning the Page on Pen-and-Paper Questionnaires: Combining Ecological Momentary Assessment and Computer Adaptive Testing to Transform Psychological Assessment in the 21st Century.

    Science.gov (United States)

    Gibbons, Chris J

    2016-01-01

    The current paper describes new opportunities for patient-centred assessment methods which have come about by the increased adoption of affordable smart technologies in biopsychosocial research and medical care. In this commentary, we review modern assessment methods including item response theory (IRT), computer adaptive testing (CAT), and ecological momentary assessment (EMA) and explain how these methods may be combined to improve psychological assessment. We demonstrate both how a 'naïve' selection of a small group of items in an EMA can lead to unacceptably unreliable assessments and how IRT can provide detailed information on the individual information that each item gives, thus allowing short-form assessments to be selected with acceptable reliability. The combination of CAT and IRT can ensure assessments are precise, efficient, and well targeted to the individual, allowing EMAs to be both brief and accurate.

  18. Adaptation of electrical conductivity test for Moringa oleifera seeds

    Directory of Open Access Journals (Sweden)

    Maria Luiza de Souza Medeiros

    2017-09-01

    Full Text Available This study aimed to adapt the electrical conductivity test and evaluate its efficiency for assessing the quality of Moringa oleifera Lam. seeds. For physiological characterization, four seed lots were evaluated by tests of germination, seedling emergence, emergence speed index, first count of emergence, seedling length and dry mass, and a cold test. The electrical conductivity test was carried out at 25 °C for 4, 8, 12, 16 and 24 h of immersion in 75 or 125 mL of distilled water using 25 or 50 seeds. A completely randomized design was used. The best results were obtained when using 50 seeds immersed in 75 mL or 125 mL of distilled water for 4 h. The electrical conductivity test adapted to moringa seeds was efficient in ranking seed lots of different vigor levels. The test may be efficiently used for physiological quality evaluation of moringa seeds.

  19. Procedures for Selecting Items for Computerized Adaptive Tests.

    Science.gov (United States)

    Kingsbury, G. Gage; Zara, Anthony R.

    1989-01-01

    Several classical approaches and alternative approaches to item selection for computerized adaptive testing (CAT) are reviewed and compared. The study also describes procedures for constrained CAT that may be added to classical item selection approaches to allow them to be used for applied testing. (TJH)

  20. CUSUM-based person-fit statistics for adaptive testing

    NARCIS (Netherlands)

    van Krimpen-Stoop, Edith; Meijer, R.R.

    2001-01-01

    Item scores that do not fit an assumed item response theory model may cause the latent trait value to be inaccurately estimated. Several person-fit statistics for detecting nonfitting score patterns for paper-and-pencil tests have been proposed. In the context of computerized adaptive tests (CAT),

  1. CUSUM-based person-fit statistics for adaptive testing

    NARCIS (Netherlands)

    van Krimpen-Stoop, Edith; Meijer, R.R.

    1999-01-01

    Item scores that do not fit an assumed item response theory model may cause the latent trait value to be estimated inaccurately. Several person-fit statistics for detecting nonfitting score patterns for paper-and-pencil tests have been proposed. In the context of computerized adaptive tests (CAT),
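
    The CUSUM idea in these two records can be illustrated compactly: accumulate upper and lower sums of standardized item-score residuals as the CAT proceeds, and flag the response pattern when either sum drifts past a bound. The 2PL parameters, responses, and bound h below are invented; the cited papers derive appropriate bounds formally.

```python
# Minimal sketch of a CUSUM-style person-fit statistic for a CAT.
import numpy as np

def cusum_person_fit(a, b, u, theta, h=3.0):
    """Flag a response pattern if an upper/lower CUSUM exceeds h."""
    p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
    resid = (u - p) / np.sqrt(p * (1.0 - p))   # standardized residuals
    c_plus = c_minus = 0.0
    for r in resid:
        c_plus = max(0.0, c_plus + r)
        c_minus = min(0.0, c_minus + r)
        if c_plus > h or c_minus < -h:
            return True, c_plus, c_minus       # misfit flagged
    return False, c_plus, c_minus

a = np.array([1.0, 1.3, 0.9, 1.1, 1.4])
b = np.array([-1.0, -0.3, 0.2, 0.8, 1.5])
u = np.array([0, 0, 1, 1, 1])   # an unexpected pattern for theta = 0
print(cusum_person_fit(a, b, u, theta=0.0))
```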

  2. Method and system for environmentally adaptive fault tolerant computing

    Science.gov (United States)

    Copenhaver, Jason L. (Inventor); Ramos, Jeremy (Inventor); Wolfe, Jeffrey M. (Inventor); Brenner, Dean (Inventor)

    2010-01-01

    A method and system for adapting fault tolerant computing. The method includes the steps of measuring an environmental condition representative of an environment. An on-board processing system's sensitivity to the measured environmental condition is measured. It is determined whether to reconfigure a fault tolerance of the on-board processing system based in part on the measured environmental condition. The fault tolerance of the on-board processing system may be reconfigured based in part on the measured environmental condition.

  3. An adaptive Mantel-Haenszel test for sensitivity analysis in observational studies.

    Science.gov (United States)

    Rosenbaum, Paul R; Small, Dylan S

    2017-06-01

    In a sensitivity analysis in an observational study with a binary outcome, is it better to use all of the data or to focus on subgroups that are expected to experience the largest treatment effects? The answer depends on features of the data that may be difficult to anticipate, a trade-off between unknown effect sizes and known sample sizes. We propose a sensitivity analysis for an adaptive test similar to the Mantel-Haenszel test. The adaptive test performs two highly correlated analyses, one focused analysis using a subgroup and one combined analysis using all of the data, correcting for multiple testing using the joint distribution of the two test statistics. Because the two component tests are highly correlated, this correction for multiple testing is small compared with, for instance, the Bonferroni inequality. The test has the maximum design sensitivity of the two component tests. A simulation evaluates the power of a sensitivity analysis using the adaptive test. Two examples are presented. An R package, sensitivity2x2xk, implements the procedure. © 2016, The International Biometric Society.
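
    The multiplicity correction described here can be sketched with a toy calculation: refer the larger of the two correlated standardized statistics to their joint null distribution (approximated below as bivariate normal) instead of applying a Bonferroni factor. The statistics and correlation are invented; the actual procedure is implemented in the sensitivity2x2xk R package.

```python
# Minimal sketch of correcting for two highly correlated tests by
# using their joint null distribution rather than Bonferroni.
import numpy as np
from scipy.stats import multivariate_normal

def adaptive_pvalue(z_obs, rho):
    """One-sided p-value for max of two N(0,1) stats with correlation rho."""
    cov = np.array([[1.0, rho], [rho, 1.0]])
    # P(Z1 < z and Z2 < z) under the joint null
    both_below = multivariate_normal(mean=[0.0, 0.0], cov=cov).cdf(
        [z_obs, z_obs])
    return 1.0 - both_below

z_subgroup, z_all = 2.4, 2.1   # illustrative test statistics
rho = 0.8                      # high correlation between the two analyses
print(adaptive_pvalue(max(z_subgroup, z_all), rho))
```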

  4. Adaptive Statistical Iterative Reconstruction-V Versus Adaptive Statistical Iterative Reconstruction: Impact on Dose Reduction and Image Quality in Body Computed Tomography.

    Science.gov (United States)

    Gatti, Marco; Marchisio, Filippo; Fronda, Marco; Rampado, Osvaldo; Faletti, Riccardo; Bergamasco, Laura; Ropolo, Roberto; Fonio, Paolo

    The aim of this study was to evaluate the impact on dose reduction and image quality of the new iterative reconstruction technique: adaptive statistical iterative reconstruction (ASIR-V). Fifty consecutive oncologic patients acted as case controls, undergoing during their follow-up a computed tomography scan both with ASIR and with ASIR-V. Each study was analyzed in a double-blinded fashion by 2 radiologists. Both quantitative and qualitative analyses of image quality were conducted. Computed tomography scanner radiation output was 38% (29%-45%) lower for the ASIR-V examinations than for the ASIR ones. The quantitative image noise was significantly lower with ASIR-V. Adaptive statistical iterative reconstruction-V had a higher performance for the subjective image noise (P = 0.01 for 5 mm and P = 0.009 for 1.25 mm), the other parameters (image sharpness, diagnostic acceptability, and overall image quality) being similar (P > 0.05). Adaptive statistical iterative reconstruction-V is a new iterative reconstruction technique that has the potential to provide image quality equal to or greater than ASIR, with a dose reduction of around 40%.

  5. PEAC: A Power-Efficient Adaptive Computing Technology for Enabling Swarm of Small Spacecraft and Deployable Mini-Payloads

    Data.gov (United States)

    National Aeronautics and Space Administration — This task is to develop and demonstrate a path-to-flight and power-adaptive avionics technology PEAC (Power Efficient Adaptive Computing). PEAC will enable emerging...

  6. Computational adaptive optics for broadband optical interferometric tomography of biological tissue.

    Science.gov (United States)

    Adie, Steven G; Graf, Benedikt W; Ahmad, Adeel; Carney, P Scott; Boppart, Stephen A

    2012-05-08

    Aberrations in optical microscopy reduce image resolution and contrast, and can limit imaging depth when focusing into biological samples. Static correction of aberrations may be achieved through appropriate lens design, but this approach does not offer the flexibility of simultaneously correcting aberrations for all imaging depths, nor the adaptability to correct for sample-specific aberrations for high-quality tomographic optical imaging. Incorporation of adaptive optics (AO) methods has demonstrated considerable improvement in optical image contrast and resolution in noninterferometric microscopy techniques, as well as in optical coherence tomography. Here we present a method to correct aberrations in a tomogram rather than the beam of a broadband optical interferometry system. Based on Fourier optics principles, we correct aberrations of a virtual pupil using Zernike polynomials. When used in conjunction with the computed imaging method interferometric synthetic aperture microscopy, this computational AO enables object reconstruction (within the single scattering limit) with ideal focal-plane resolution at all depths. Tomographic reconstructions of tissue phantoms containing subresolution titanium-dioxide particles and of ex vivo rat lung tissue demonstrate aberration correction in datasets acquired with a highly astigmatic illumination beam. These results also demonstrate that imaging with an aberrated astigmatic beam provides the advantage of a more uniform depth-dependent signal compared to imaging with a standard Gaussian beam. With further work, computational AO could enable the replacement of complicated and expensive optical hardware components with algorithms implemented on a standard desktop computer, making high-resolution 3D interferometric tomography accessible to a wider group of users and nonspecialists.
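
    The central computational step, correcting a virtual pupil with Zernike polynomials, can be sketched with a toy example: Fourier-transform the complex tomographic field, remove an estimated Zernike phase (here primary astigmatism, echoing the paper's astigmatic beam), and transform back. The synthetic field and coefficient are illustrative assumptions, not the authors' ISAM pipeline.

```python
# Minimal sketch of computational aberration correction of a virtual pupil.
import numpy as np

def zernike_astigmatism(n):
    """Zernike Z2^2 (0-deg astigmatism), rho^2 cos(2 phi), on an n x n pupil."""
    y, x = np.mgrid[-1:1:n*1j, -1:1:n*1j]
    rho, phi = np.hypot(x, y), np.arctan2(y, x)
    z = rho**2 * np.cos(2 * phi)
    z[rho > 1] = 0.0                       # zero outside the unit pupil
    return z

def correct_field(field, coeff):
    pupil = np.fft.fftshift(np.fft.fft2(field))
    phase = coeff * zernike_astigmatism(field.shape[0])
    pupil *= np.exp(-1j * phase)           # undo the estimated aberration
    return np.fft.ifft2(np.fft.ifftshift(pupil))

rng = np.random.default_rng(1)
field = rng.normal(size=(64, 64)) + 1j * rng.normal(size=(64, 64))
corrected = correct_field(field, coeff=2.5)
print(corrected.shape, corrected.dtype)
```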

  7. The students' ability in the mathematical literacy for uncertainty problems on the PISA adaptation test

    Science.gov (United States)

    Julie, Hongki; Sanjaya, Febi; Anggoro, Ant. Yudhi

    2017-08-01

    One of the purposes of this study was to describe the solution profiles of junior high school students on the PISA adaptation test. The procedure conducted by the researchers to achieve this objective was (1) adapting the PISA test, (2) validating the adapted PISA test, (3) asking junior high school students to do the adapted PISA test, and (4) constructing the students' solution profiles. The PISA problems for mathematics can be classified into four areas, namely quantity, space and shape, change and relationship, and uncertainty. The results presented in this paper are those for the uncertainty problems. In the adapted PISA test, there were fifteen questions. Subjects in this study were 18 students from 11 junior high schools in Yogyakarta, Central Java, and Banten. This study used a qualitative research design. For the first uncertainty problem in the adapted test, 66.67% of students reached level 3. For the second uncertainty problem, 44.44% of students achieved level 4, and 33.33% of students reached level 3. For the third uncertainty problem, 38.89% of students achieved level 5, 11.11% of students reached level 4, and 5.56% of students achieved level 3. For part a of the fourth uncertainty problem, 72.22% of students reached level 4, and for part b of the fourth uncertainty problem, 83.33% of students achieved level 4.

  8. A shape and mesh adaptive computational methodology for gamma ray dose from volumetric sources

    International Nuclear Information System (INIS)

    Mirza, N.M.; Ali, B.; Mirza, S.M.; Tufail, M.; Ahmad, N.

    1991-01-01

    Indoor external exposure of the population is dominated by gamma rays emitted from the walls and the floor of a room. A shape and mesh size adaptive flux calculational approach has been developed for a typical wall source. Parametric studies of the effect of mesh size on flux calculations have been carried out. The optimum value of the mesh size is found to depend strongly on the distance from the source, the permissible limits on uncertainty in flux predictions, and the computer Central Processing Unit time. To test the computations, a typical wall source was reduced to a point, a line and an infinite volume source of finite thickness, and the computed flux values were compared with values from the corresponding analytical expressions for these sources. Results indicate that the errors under optimum conditions remain less than 6% for the fluxes calculated from this approach when compared with the analytical values for the point and the line source approximations. Also, when the wall is simulated as an infinite volume source of finite thickness, the errors in the computed-to-analytical flux ratios remain large for smaller wall dimensions. However, the errors become less than 10% when the wall dimensions are greater than ten mean free paths for 3 MeV gamma rays. Also, specific dose rates from this methodology remain within 15% of the values obtained by the Monte Carlo method. (author)
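
    The underlying computation can be illustrated with a small point-kernel integration: discretize the wall into volume cells, sum the attenuated uncollided flux contributed by each cell, and refine the mesh until the result stabilizes. The geometry, source strength, and attenuation coefficient below are invented for illustration.

```python
# Minimal sketch: uncollided point-kernel flux from a wall source,
# evaluated at several mesh sizes to show convergence with refinement.
import numpy as np

MU = 0.03   # assumed linear attenuation coefficient, 1/cm
SV = 1.0    # uniform volumetric source strength (arbitrary units)

def wall_flux(detector, mesh, size=(300.0, 250.0), thick=20.0):
    """Flux at `detector` from a wall meshed into cells of ~mesh cm."""
    nx, ny, nz = (max(1, round(s / mesh)) for s in (size[0], size[1], thick))
    dx, dy, dz = size[0] / nx, size[1] / ny, thick / nz
    xs = (np.arange(nx) + 0.5) * dx        # cell-center coordinates
    ys = (np.arange(ny) + 0.5) * dy
    zs = (np.arange(nz) + 0.5) * dz
    x, y, z = np.meshgrid(xs, ys, zs, indexing="ij")
    r2 = ((x - detector[0])**2 + (y - detector[1])**2
          + (z - detector[2])**2)
    r = np.sqrt(r2)
    kernel = SV * np.exp(-MU * r) / (4.0 * np.pi * r2)
    return float(np.sum(kernel) * dx * dy * dz)

detector = (150.0, 125.0, 150.0)   # 150 cm in front of the wall's center
for mesh in (50.0, 25.0, 10.0, 5.0):
    print(f"mesh {mesh:4.0f} cm -> flux {wall_flux(detector, mesh):.5f}")
```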

  9. Adaptive Management of Computing and Network Resources for Spacecraft Systems

    Science.gov (United States)

    Pfarr, Barbara; Welch, Lonnie R.; Detter, Ryan; Tjaden, Brett; Huh, Eui-Nam; Szczur, Martha R. (Technical Monitor)

    2000-01-01

    It is likely that NASA's future spacecraft systems will consist of distributed processes which will handle dynamically varying workloads in response to perceived scientific events, the spacecraft environment, spacecraft anomalies and user commands. Since all situations and possible uses of sensors cannot be anticipated during pre-deployment phases, an approach for dynamically adapting the allocation of distributed computational and communication resources is needed. To address this, we are evolving the DeSiDeRaTa adaptive resource management approach to enable reconfigurable ground and space information systems. The DeSiDeRaTa approach embodies a set of middleware mechanisms for adapting resource allocations, and a framework for reasoning about the real-time performance of distributed application systems. The framework and middleware will be extended to accommodate (1) the dynamic aspects of intra-constellation network topologies, and (2) the complete real-time path from the instrument to the user. We are developing a ground-based testbed that will enable NASA to perform early evaluation of adaptive resource management techniques without the expense of first deploying them in space. The benefits of the proposed effort are numerous, including the ability to use sensors in new ways not anticipated at design time; the production of information technology that ties the sensor web together; the accommodation of greater numbers of missions with fewer resources; and the opportunity to leverage the DeSiDeRaTa project's expertise, infrastructure and models for adaptive resource management for distributed real-time systems.

  10. 3D-SoftChip: A Novel Architecture for Next-Generation Adaptive Computing Systems

    Directory of Open Access Journals (Sweden)

    Lee Mike Myung-Ok

    2006-01-01

    Full Text Available This paper introduces a novel architecture for next-generation adaptive computing systems, which we term 3D-SoftChip. The 3D-SoftChip is a 3-dimensional (3D) vertically integrated adaptive computing system combining state-of-the-art processing and 3D interconnection technology. It comprises the vertical integration of two chips (a configurable array processor and an intelligent configurable switch) through an indium bump interconnection array (IBIA). The configurable array processor (CAP) is an array of heterogeneous processing elements (PEs), while the intelligent configurable switch (ICS) comprises a switch block, a 32-bit dedicated RISC processor for control, on-chip program/data memory, a data frame buffer, along with a direct memory access (DMA) controller. This paper introduces the novel 3D-SoftChip architecture for real-time communication and multimedia signal processing as a next-generation computing system. The paper further describes the advanced HW/SW codesign and verification methodology, including high-level system modeling of the 3D-SoftChip using SystemC, being used to determine the optimum hardware specification in the early design stage.

  11. Configurable multiplier modules for an adaptive computing system

    Directory of Open Access Journals (Sweden)

    O. A. Pfänder

    2006-01-01

    Full Text Available The importance of reconfigurable hardware is increasing steadily. For example, the primary approach of using adaptive systems based on programmable gate arrays and configurable routing resources has gone mainstream, and high-performance programmable logic devices are rivaling traditional application-specific hardwired integrated circuits. Also, the idea of moving from the 2-D domain into a 3-D design which stacks several active layers above each other is gaining momentum in research and industry, to cope with the demand for smaller devices with a higher scale of integration. However, optimized arithmetic blocks in coarse-grain reconfigurable arrays as well as field-programmable architectures still play an important role. In countless digital systems and signal processing applications, multiplication is one of the critical challenges, where in many cases a trade-off between area usage and data throughput has to be made. But the a priori choice of word-length and number representation can also be replaced by a dynamic choice at run-time, in order to improve flexibility, area efficiency and the level of parallelism in computation. In this contribution, we look at an adaptive computing system called 3-D-SoftChip to point out what parameters are crucial to implement flexible multiplier blocks into optimized elements for accelerated processing. The 3-D-SoftChip architecture uses a novel approach to 3-dimensional integration based on flip-chip bonding with indium bumps. The modular construction, the introduction of interfaces to realize the exchange of intermediate data, and the reconfigurable sign handling approach will be explained, as well as a beneficial way to handle and distribute the numerous required control signals.

  12. Functional reach and lateral reach tests adapted for aquatic physical therapy

    Directory of Open Access Journals (Sweden)

    Ana Angélica Ribeiro de Lima

    Full Text Available Introduction: Functional reach (FR) and lateral reach (LR) tests are widely used in scientific research and clinical practice. Assessment tools are useful in assessing subjects with greater accuracy and are usually adapted according to the limitations of each condition. Objective: To adapt FR and LR tests for use in an aquatic environment and assess the performance of healthy young adults. Methods: We collected anthropometric data and information on whether the participant exercised regularly or not. The FR and LR tests were adapted for use in an aquatic environment and administered to 47 healthy subjects aged 20-30 years. Each test was repeated three times. Results: Forty-one females and six males were assessed. The mean FR test score for men was 24.06 cm, whereas the mean value for right lateral reach (RLR) was 10.94 cm and for left lateral reach (LLR) was 9.78 cm. For females, the mean FR score was 17.57 cm, while the mean values for RLR and LLR were 8.84 cm and 7.76 cm, respectively. Men performed better in the FR (p < 0.001) and RLR (p = 0.037) tests than women. Individuals who exercised regularly showed no differences in performance level when compared with their counterparts. Conclusion: The FR and LR tests were adapted for use in an aquatic environment. Males performed better on the FR and RLR tests when compared to females. There was no correlation between the FR and LR tests and weight, height, Body Mass Index (BMI), foot length or length of the dominant upper limb.

  13. Development of the adaptive music perception test.

    Science.gov (United States)

    Kirchberger, Martin J; Russo, Frank A

    2015-01-01

    Despite vast amounts of research examining the influence of hearing loss on speech perception, comparatively little is known about its influence on music perception. No standardized test exists to quantify music perception of hearing-impaired (HI) persons in a clinically practical manner. This study presents the Adaptive Music Perception (AMP) test as a tool to assess important aspects of music perception with hearing loss. A computer-driven test was developed to determine the discrimination thresholds of 10 low-level physical dimensions (e.g., duration, level) in the context of perceptual judgments about musical dimensions: meter, harmony, melody, and timbre. In the meter test, the listener is asked to judge whether a tone sequence is duple or triple in meter. The harmony test requires that the listener make judgments about the stability of the chord sequences. In the melody test, the listener must judge whether a comparison melody is the same as a standard melody when presented in transposition and in the context of a chordal accompaniment that serves as a mask. The timbre test requires that the listener determines which of two comparison tones is different in timbre from a standard tone (ABX design). Twenty-one HI participants and 19 normal-hearing (NH) participants were recruited to carry out the music tests. Participants were tested twice on separate occasions to evaluate test-retest reliability. The HI group had significantly higher discrimination thresholds than the NH group in 7 of the 10 low-level physical dimensions: frequency discrimination in the meter test, dissonance and intonation perception in the harmony test, melody-to-chord ratio for both melody types in the melody test, and the perception of brightness and spectral irregularity in the timbre test. Small but significant improvement between test and retest was observed in three dimensions: frequency discrimination (meter test), dissonance (harmony test), and attack length (timbre test). All other
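
    A computer-driven procedure for the discrimination thresholds described here is typically an adaptive staircase. Below is a minimal sketch of a 2-down/1-up rule (which converges near 70.7% correct) run against a crude simulated listener; the step size, stopping rule, and listener model are assumptions, not the AMP test's actual algorithm.

```python
# Minimal sketch of an adaptive 2-down/1-up staircase for estimating
# a discrimination threshold, with a toy simulated listener.
import random

def staircase(true_threshold, start=20.0, step=2.0, reversals_needed=8):
    level, direction = start, -1
    correct_streak, reversals = 0, []
    while len(reversals) < reversals_needed:
        # Toy listener: always correct above threshold, guesses below.
        correct = level > true_threshold or random.random() < 0.5
        if correct:
            correct_streak += 1
            if correct_streak == 2:          # two correct -> harder
                correct_streak = 0
                if direction == +1:
                    reversals.append(level)  # was going up: reversal
                direction = -1
                level = max(0.0, level - step)
        else:
            correct_streak = 0
            if direction == -1:
                reversals.append(level)      # was going down: reversal
            direction = +1
            level += step                    # wrong -> easier
    return sum(reversals) / len(reversals)   # threshold estimate

random.seed(3)
print(round(staircase(true_threshold=6.0), 2))
```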

  14. Supporting Student Learning in Computer Science Education via the Adaptive Learning Environment ALMA

    Directory of Open Access Journals (Sweden)

    Alexandra Gasparinatou

    2015-10-01

    Full Text Available This study presents the ALMA environment (Adaptive Learning Models from texts and Activities). ALMA supports the processes of learning and assessment via: (1) texts differing in local and global cohesion for students with low, medium, and high background knowledge; (2) activities corresponding to different levels of comprehension which prompt the student to practically implement different text-reading strategies, with the recommended activity sequence adapted to the student's learning style; (3) an overall framework for informing, guiding, and supporting students in performing the activities; and (4) individualized support and guidance according to student-specific characteristics. ALMA also supports students in distance learning or in blended learning, in which face-to-face learning is supported by computer technology. The adaptive techniques provided via ALMA are: (a) adaptive presentation and (b) adaptive navigation. Digital learning material, in accordance with the text comprehension model described by Kintsch, was introduced into the ALMA environment. This material can be exploited in either distance or blended learning.

  15. A procedure for empirical initialization of adaptive testing algorithms

    NARCIS (Netherlands)

    van der Linden, Willem J.

    1997-01-01

    In constrained adaptive testing, the numbers of constraints needed to control the content of the tests can easily run into the hundreds. Proper initialization of the algorithm becomes a requirement because the presence of large numbers of constraints slows down the convergence of the ability

  16. Construction of a 2- by 2-foot transonic adaptive-wall test section at the NASA Ames Research Center

    Science.gov (United States)

    Morgan, Daniel G.; Lee, George

    1986-01-01

    The development of a new production-size, two-dimensional, adaptive-wall test section with ventilated walls at the NASA Ames Research Center is described. The new facility incorporates rapid closed-loop operation, computer/sensor integration, and on-line interference assessment and wall corrections. Air flow through the test section is controlled by a series of plenum compartments and three-way slide valves. A fast-scan laser velocimeter was built to measure velocity boundary conditions for the interference assessment scheme. A 15.2-cm- (6.0-in.-) chord NACA 0012 airfoil model will be used in the first experiments during calibration of the facility.

  17. Self-Testing Computer Memory

    Science.gov (United States)

    Chau, Savio, N.; Rennels, David A.

    1988-01-01

    A memory system for a computer repeatedly tests itself during brief, regular interruptions of normal data processing. It detects and corrects transient faults such as single-event upsets (changes in bits due to ionizing radiation) within milliseconds of their occurrence. The self-testing concept surpasses conventional approaches by actively flushing latent defects out of memory and attempting to correct them before they accumulate beyond the capacity for self-correction or detection. The cost of the improvement is a modest increase in the complexity of the circuitry and in operating time.

  18. Adaptive testing for making unidimensional and multidimensional classification decisions

    NARCIS (Netherlands)

    van Groen, M.M.

    2014-01-01

    Computerized adaptive tests (CATs) were originally developed to obtain an efficient estimate of the examinee’s ability, but they can also be used to classify the examinee into one of two or more levels (e.g. master/non-master). These computerized classification tests have the advantage that they can

  19. Spatial co-adaptation of cortical control columns in a micro-ECoG brain-computer interface

    Science.gov (United States)

    Rouse, A. G.; Williams, J. J.; Wheeler, J. J.; Moran, D. W.

    2016-10-01

    Objective. Electrocorticography (ECoG) has been used for a range of applications including electrophysiological mapping, epilepsy monitoring, and more recently as a recording modality for brain-computer interfaces (BCIs). Studies that examine ECoG electrodes designed and implanted chronically solely for BCI applications remain limited. The present study explored how two key factors influence chronic, closed-loop ECoG BCI: (i) the effect of inter-electrode distance on BCI performance and (ii) the differences in neural adaptation and performance when fixed versus adaptive BCI decoding weights are used. Approach. The amplitudes of epidural micro-ECoG signals between 75 and 105 Hz with 300 μm diameter electrodes were used for one-dimensional and two-dimensional BCI tasks. The effect of inter-electrode distance on BCI control was tested between 3 and 15 mm. Additionally, the performance and cortical modulation differences between constant, fixed decoding using a small subset of channels versus adaptive decoding weights using the entire array were explored. Main results. Successful BCI control was possible with two electrodes separated by 9 and 15 mm. Performance decreased and the signals became more correlated when the electrodes were only 3 mm apart. BCI performance in a 2D BCI task improved significantly when using adaptive decoding weights (80%-90%) compared to using constant, fixed weights (50%-60%). Additionally, modulation increased for channels previously unavailable for BCI control under the fixed decoding scheme upon switching to the adaptive, all-channel scheme. Significance. Our results clearly show that neural activity under a BCI recording electrode (which we define as a ‘cortical control column’) readily adapts to generate an appropriate control signal. These results show that the practical minimal spatial resolution of these control columns with micro-ECoG BCI is likely on the order of 3 mm. Additionally, they show that the combination and

  20. Adaptive versus Non-Adaptive Security of Multi-Party Protocols

    DEFF Research Database (Denmark)

    Canetti, Ran; Damgård, Ivan Bjerre; Dziembowski, Stefan

    2004-01-01

    Security analysis of multi-party cryptographic protocols distinguishes between two types of adversarial settings: In the non-adaptive setting the set of corrupted parties is chosen in advance, before the interaction begins. In the adaptive setting the adversary chooses who to corrupt during the course of the computation. We study the relations between adaptive security (i.e., security in the adaptive setting) and non-adaptive security, according to two definitions and in several models of computation.

  1. An efficient Adaptive Mesh Refinement (AMR) algorithm for the Discontinuous Galerkin method: Applications for the computation of compressible two-phase flows

    Science.gov (United States)

    Papoutsakis, Andreas; Sazhin, Sergei S.; Begg, Steven; Danaila, Ionut; Luddens, Francky

    2018-06-01

    We present an Adaptive Mesh Refinement (AMR) method suitable for hybrid unstructured meshes that allows for local refinement and de-refinement of the computational grid during the evolution of the flow. The adaptive implementation of the Discontinuous Galerkin (DG) method introduced in this work (ForestDG) is based on a topological representation of the computational mesh by a hierarchical structure consisting of oct-, quad- and binary trees. Adaptive mesh refinement (h-refinement) enables us to increase the spatial resolution of the computational mesh in the vicinity of the points of interest, such as interfaces, geometrical features, or flow discontinuities. The local increase in the expansion order (p-refinement) at areas of high strain rates or vorticity magnitude results in an increase of the order of accuracy in the region of shear layers and vortices. A graph of unitarian-trees, representing hexahedral, prismatic and tetrahedral elements, is used for the representation of the initial domain. The ancestral elements of the mesh can be split into self-similar elements, allowing each tree to grow branches to an arbitrary level of refinement. The connectivity of the elements, their genealogy and their partitioning are described by linked lists of pointers. An explicit calculation of these relations, presented in this paper, facilitates the on-the-fly splitting, merging and repartitioning of the computational mesh by rearranging the links of each node of the tree with a minimal computational overhead. The modal basis used in the DG implementation facilitates the mapping of the fluxes across the non-conformal faces. The AMR methodology is presented and assessed using a series of inviscid and viscous test cases. Also, the AMR methodology is used for the modelling of the interaction between droplets and the carrier phase in a two-phase flow. This approach is applied to the analysis of a spray injected into a chamber of quiescent air, using the Eulerian

  2. Pepsi-SAXS: an adaptive method for rapid and accurate computation of small-angle X-ray scattering profiles.

    Science.gov (United States)

    Grudinin, Sergei; Garkavenko, Maria; Kazennov, Andrei

    2017-05-01

    A new method called Pepsi-SAXS is presented that calculates small-angle X-ray scattering profiles from atomistic models. The method is based on the multipole expansion scheme and is significantly faster compared with other tested methods. In particular, using the Nyquist-Shannon-Kotelnikov sampling theorem, the multipole expansion order is adapted to the size of the model and the resolution of the experimental data. It is argued that by using the adaptive expansion order, this method has the same quadratic dependence on the number of atoms in the model as the Debye-based approach, but with a much smaller prefactor in the computational complexity. The method has been systematically validated on a large set of over 50 models collected from the BioIsis and SASBDB databases. Using a laptop, it was demonstrated that Pepsi-SAXS is about seven, 29 and 36 times faster compared with CRYSOL, FoXS and the three-dimensional Zernike method in SAStbx, respectively, when tested on data from the BioIsis database, and is about five, 21 and 25 times faster compared with CRYSOL, FoXS and SAStbx, respectively, when tested on data from SASBDB. On average, Pepsi-SAXS demonstrates comparable accuracy in terms of χ² to CRYSOL and FoXS when tested on BioIsis and SASBDB profiles. Together with a small allowed variation of adjustable parameters, this demonstrates the effectiveness of the method. Pepsi-SAXS is available at http://team.inria.fr/nano-d/software/pepsi-saxs.
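
    For contrast with the adaptive multipole scheme, the Debye baseline mentioned in the abstract is easy to sketch: I(q) = sum_ij f_i f_j sin(q r_ij)/(q r_ij) over all atom pairs, which is what makes it quadratic in the number of atoms. The coordinates and form factors below are synthetic; Pepsi-SAXS replaces this double sum with an adaptive-order multipole expansion.

```python
# Minimal sketch of the quadratic-cost Debye formula for a SAXS profile.
import numpy as np

def debye_profile(coords, f, q_values):
    """I(q) via the Debye formula for atoms at `coords` with factors f."""
    diff = coords[:, None, :] - coords[None, :, :]
    r = np.linalg.norm(diff, axis=-1)          # pairwise distances
    ff = np.outer(f, f)
    profile = []
    for q in q_values:
        qr = q * r
        # sinc with the qr = 0 diagonal handled explicitly
        sinc = np.where(qr > 0, np.sin(qr) / np.where(qr > 0, qr, 1.0), 1.0)
        profile.append(np.sum(ff * sinc))
    return np.array(profile)

rng = np.random.default_rng(2)
coords = rng.normal(scale=10.0, size=(200, 3))   # fake atom positions, A
f = np.ones(200)                                 # flat form factors
q = np.linspace(0.01, 0.5, 50)                   # scattering vector, 1/A
print(debye_profile(coords, f, q)[:3])
```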

  3. Computing Adaptive Feature Weights with PSO to Improve Android Malware Detection

    Directory of Open Access Journals (Sweden)

    Yanping Xu

    2017-01-01

    Full Text Available Android malware detection is a complex and crucial issue. In this paper, we propose a malware detection model using a support vector machine (SVM) method based on feature weights that are computed by information gain (IG) and particle swarm optimization (PSO) algorithms. The IG weights are evaluated based on the relevance between features and class labels, and the PSO weights are adaptively calculated to result in the best fitness (the performance of the SVM classification model). Moreover, to overcome the defects of basic PSO, we propose a new adaptive inertia weight method called fitness-based and chaotic adaptive inertia weight-PSO (FCAIW-PSO) that improves on basic PSO and is based on the fitness and a chaotic term. The goal is to assign suitable weights to the features to ensure the best Android malware detection performance. The results of experiments indicate that the IG weights and PSO weights both improve the performance of SVM and that the performance of the PSO weights is better than that of the IG weights.
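
    The flavor of a fitness-based, chaotic adaptive inertia weight can be sketched with a toy PSO run: better particles get lower inertia (exploitation), worse particles get higher inertia (exploration), and a logistic-map term adds chaotic jitter. The blend and all constants below are illustrative guesses, not the authors' exact FCAIW formula.

```python
# Toy PSO with a fitness-based plus chaotic inertia weight, minimizing
# a sphere function as a stand-in for the SVM-fitness evaluation.
import numpy as np

def adaptive_inertia(fitness, best, worst, z, w_min=0.4, w_max=0.9):
    """Better particles get lower inertia; z adds chaotic jitter."""
    rank = (fitness - best) / (worst - best + 1e-12)   # 0 = best, 1 = worst
    return w_min + (w_max - w_min) * rank + 0.1 * z

def logistic_map(z):
    return 4.0 * z * (1.0 - z)      # chaotic sequence in (0, 1)

def sphere(x):                      # toy fitness: minimize sum of squares
    return np.sum(x**2, axis=1)

rng = np.random.default_rng(4)
n, dim = 20, 5
x = rng.uniform(-1, 1, (n, dim))
v = np.zeros((n, dim))
z = 0.7
pbest, pbest_f = x.copy(), sphere(x)
for _ in range(100):
    f = sphere(x)
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = x[improved], f[improved]
    gbest = pbest[np.argmin(pbest_f)]
    z = logistic_map(z)
    w = adaptive_inertia(f, f.min(), f.max(), z)[:, None]
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    v = w * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x = x + v
print(round(pbest_f.min(), 6))
```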

  4. The EORTC emotional functioning computerized adaptive test

    DEFF Research Database (Denmark)

    Gamper, Eva-Maria; Grønvold, Mogens; Petersen, Morten Aa

    2014-01-01

    The European Organisation for Research and Treatment of Cancer (EORTC) Quality of Life Group is currently developing computerized adaptive testing measures for the Quality of Life Questionnaire Core-30 (QLQ-C30) scales. The work presented here describes the development of an EORTC item bank for emotional functioning (EF), which is one of the core domains of the QLQ-C30.

  5. Verifiable Measurement-Only Blind Quantum Computing with Stabilizer Testing.

    Science.gov (United States)

    Hayashi, Masahito; Morimae, Tomoyuki

    2015-11-27

    We introduce a simple protocol for verifiable measurement-only blind quantum computing. Alice, a client, can perform only single-qubit measurements, whereas Bob, a server, can generate and store entangled many-qubit states. Bob generates copies of a graph state, which is a universal resource state for measurement-based quantum computing, and sends Alice each qubit of them one by one. Alice adaptively measures each qubit according to her program. If Bob is honest, he generates the correct graph state, and, therefore, Alice can obtain the correct computation result. Regarding security, whatever Bob does, Bob cannot get any information about Alice's computation because of the no-signaling principle. Furthermore, a malicious Bob does not necessarily send copies of the correct graph state, but Alice can check the correctness of Bob's state by directly verifying the stabilizers of some copies.

  6. CAT -- computer aided testing for resonant inspection

    International Nuclear Information System (INIS)

    Foley, David K.

    1998-01-01

    The application of computer technology to inspection and quality control is discussed. Computer aided testing (CAT) can be used with various NDT technologies, such as eddy current, ultrasonics, and resonant inspection

  7. The self-adaptation to dynamic failures for efficient virtual organization formations in grid computing context

    International Nuclear Information System (INIS)

    Han Liangxiu

    2009-01-01

    Grid computing aims to enable 'resource sharing and coordinated problem solving in dynamic, multi-institutional virtual organizations (VOs)'. However, owing to the heterogeneous and dynamic nature of the resources, dynamic failures occur more often in a distributed grid environment than in traditional computing platforms, causing VO formations to fail. In this paper, we develop a novel self-adaptive mechanism for handling dynamic failures during VO formations. This self-adaptive scheme allows an individual member of a VO to automatically find an available replacement once a failure happens, and therefore lets the system recover automatically from dynamic failures. We define the dynamic failure situations of a system using two standard indicators: mean time between failures (MTBF) and mean time to recover (MTTR). We model both MTBF and MTTR as Poisson distributions. We investigate and analyze the efficiency of the proposed self-adaptation mechanism by comparing the success probability of VO formations before and after adopting it in three different cases: (1) different failure situations; (2) different organizational structures and scales; and (3) different task complexities. The experimental results show that the proposed scheme can automatically adapt to dynamic failures and effectively improve VO formation performance in the event of node failures, providing a valuable addition to the field.

  8. Computerized Adaptive Test (CAT) Applications and Item Response Theory Models for Polytomous Items

    Science.gov (United States)

    Aybek, Eren Can; Demirtasli, R. Nukhet

    2017-01-01

    This article aims to provide a theoretical framework for computerized adaptive tests (CAT) and item response theory models for polytomous items. It also aims to introduce the simulation and live CAT software to interested researchers. The computerized adaptive test algorithm, the assumptions of item response theory models, nominal response…

  9. Slice image pretreatment for cone-beam computed tomography based on adaptive filter

    International Nuclear Information System (INIS)

    Huang Kuidong; Zhang Dinghua; Jin Yanfang

    2009-01-01

    According to the noise properties and the serial slice image characteristics of the Cone-Beam Computed Tomography (CBCT) system, a slice image pretreatment for CBCT based on adaptive filtering was proposed. A criterion for judging the noise type is established first. All pixels are then classified into two classes: an adaptive center-weighted modified trimmed mean (ACWMTM) filter is used for pixels corrupted by Gaussian noise, and an adaptive median (AM) filter is used for pixels corrupted by impulse noise. In the ACWMTM filtering algorithm, the Gaussian noise standard deviation estimated in the current slice image with an offset window is replaced by the standard deviation estimated in the adjacent slice image with the corresponding window, so the filtering accuracy for serial images is improved. A pretreatment experiment on CBCT slice images of a wax model of a hollow turbine blade shows that the method performs well both in eliminating noise and in protecting details. (authors)
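
    The two-branch idea can be sketched directly: pixels whose value is an extreme of their local window are treated as impulse-corrupted and median-filtered, while the rest receive a center-weighted trimmed mean. The window size, center weight, and trim amount below are illustrative, not the paper's tuned values, and the sketch omits the cross-slice variance estimation.

```python
# Minimal sketch of a two-branch adaptive filter: median for impulse
# noise, center-weighted trimmed mean for Gaussian noise.
import numpy as np

def denoise(img, win=3, center_weight=3, trim=1):
    pad = win // 2
    padded = np.pad(img, pad, mode="reflect")
    out = img.astype(float).copy()
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            w = padded[i:i + win, j:j + win].ravel()
            v = img[i, j]
            if v == w.min() or v == w.max():     # likely impulse noise
                out[i, j] = np.median(w)
            else:                                 # likely Gaussian noise
                w = np.sort(np.concatenate([w, [v] * (center_weight - 1)]))
                out[i, j] = w[trim:-trim].mean()  # trimmed mean
    return out

rng = np.random.default_rng(5)
img = rng.normal(100, 5, (32, 32))
img[rng.random(img.shape) < 0.05] = 255           # inject impulse noise
print(denoise(img).round(1)[:2, :2])
```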

  10. A Danish adaptation of the Boston Naming Test

    DEFF Research Database (Denmark)

    Jørgensen, Kasper; Johannsen, Peter; Vogel, Asmus

    2017-01-01

    Objective: The purpose of the present study was to develop a Danish adaptation of the Boston Naming Test (BNT) including a shortened 30-item version of the BNT for routine clinical use and two parallel 15-item versions for screening purposes. Method: The Danish adaptation of the BNT was based on ranking of items according to difficulty in a sample of older non-patients (n = 99). By selecting those items with the largest discrepancy in difficulty for non-patients compared to a mild Alzheimer's disease (AD) sample (n = 53), the shortened versions of the BNT were developed. Using an overlapping

  11. Practical Considerations about Expected A Posteriori Estimation in Adaptive Testing: Adaptive A Priori, Adaptive Correction for Bias, and Adaptive Integration Interval.

    Science.gov (United States)

    Raiche, Gilles; Blais, Jean-Guy

    In a computerized adaptive test (CAT), it would be desirable to obtain an acceptable precision of the proficiency level estimate using an optimal number of items. Decreasing the number of items is accompanied, however, by a certain degree of bias when the true proficiency level differs significantly from the a priori estimate. G. Raiche (2000) has…
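
    The quantity these adaptations act on, the expected a posteriori (EAP) estimate, is a short quadrature computation: theta_hat = integral of theta L(theta) g(theta) d theta divided by the integral of L(theta) g(theta) d theta. A minimal sketch under a 2PL model follows; the prior, grid, and item parameters are invented, and the last lines only gesture at what an adaptive integration interval might look like.

```python
# Minimal sketch of EAP ability estimation by quadrature under a 2PL model.
import numpy as np

def eap(a, b, u, mu=0.0, sigma=1.0, lo=-4.0, hi=4.0, nq=61):
    """Expected a posteriori theta over a quadrature grid [lo, hi]."""
    theta = np.linspace(lo, hi, nq)
    prior = np.exp(-0.5 * ((theta - mu) / sigma) ** 2)   # normal prior
    p = 1.0 / (1.0 + np.exp(-a[:, None] * (theta[None, :] - b[:, None])))
    like = np.prod(np.where(u[:, None] == 1, p, 1.0 - p), axis=0)
    post = like * prior
    return np.sum(theta * post) / np.sum(post)

a = np.array([1.2, 0.9, 1.5])
b = np.array([-0.4, 0.3, 1.0])
u = np.array([1, 0, 1])
# An "adaptive integration interval" recenters [lo, hi] on the current
# estimate instead of using one fixed wide grid:
theta0 = eap(a, b, u)
print(round(eap(a, b, u, lo=theta0 - 2, hi=theta0 + 2), 3))
```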

  12. Adaptation of spatial navigation tests to virtual reality.

    OpenAIRE

    Šupalová, Ivana

    2009-01-01

    Tests of human spatial navigation have recently been performed at the Department of Neurophysiology of Memory of the Academy of Sciences of the Czech Republic in an experimental real environment called the Blue Velvet Arena. The introduction of this thesis describes the importance of these tests for medical purposes and the current solution. The main aim is to adapt this real environment to virtual reality, to allow its configuration, and to enable the collection of data retrieved during an experiment's execution. Resulting sys...

  13. Using An Adapter To Perform The Chalfant-Style Containment Vessel Periodic Maintenance Leak Rate Test

    International Nuclear Information System (INIS)

    Loftin, B.; Abramczyk, G.; Trapp, D.

    2011-01-01

    Recently the Packaging Technology and Pressurized Systems (PT and PS) organization at the Savannah River National Laboratory was asked to develop an adapter for performing the leak-rate test of a Chalfant-style containment vessel. The PT and PS organization collaborated with designers at the Department of Energy's Pantex Plant to develop the adapter currently in use for performing the leak-rate testing on the containment vessels. This paper will give the history of leak-rate testing of the Chalfant-style containment vessels, discuss the design concept for the adapter, give an overview of the design, and will present results of the testing done using the adapter.

  14. High-Resolution Adaptive Optics Test-Bed for Vision Science

    International Nuclear Information System (INIS)

    Wilks, S.C.; Thomspon, C.A.; Olivier, S.S.; Bauman, B.J.; Barnes, T.; Werner, J.S.

    2001-01-01

    We discuss the design and implementation of a low-cost, high-resolution adaptive optics test-bed for vision research. It is well known that high-order aberrations in the human eye reduce optical resolution and limit visual acuity. However, the effects of aberration-free eyesight on vision are only now beginning to be studied using adaptive optics to sense and correct the aberrations in the eye. We are developing a high-resolution adaptive optics system for this purpose using a Hamamatsu Parallel Aligned Nematic Liquid Crystal Spatial Light Modulator. Phase-wrapping is used to extend the effective stroke of the device, and the wavefront sensing and wavefront correction are done at different wavelengths. Issues associated with these techniques will be discussed.

  15. Reconstruction of sparse-view X-ray computed tomography using adaptive iterative algorithms.

    Science.gov (United States)

    Liu, Li; Lin, Weikai; Jin, Mingwu

    2015-01-01

    In this paper, we propose two reconstruction algorithms for sparse-view X-ray computed tomography (CT). Treating the reconstruction problems as data-fidelity-constrained total variation (TV) minimization, both algorithms adopt an alternating two-stage strategy: projection onto convex sets (POCS) for the data fidelity and non-negativity constraints, and steepest descent for TV minimization. The novelty of this work is to determine the iterative parameters automatically from the data, thus avoiding tedious manual parameter tuning. In TV minimization, the step sizes of steepest descent are adaptively adjusted according to the difference from the POCS update in either the projection domain or the image domain, while the step size of the algebraic reconstruction technique (ART) in POCS is determined based on the data noise level. In addition, projection errors are compared with the error bound to decide whether to perform ART so as to reduce computational costs. The performance of the proposed methods is studied and evaluated using both simulated and physical phantom data. Our methods with automatic parameter tuning achieve similar, if not better, reconstruction performance compared to a representative two-stage algorithm. Copyright © 2014 Elsevier Ltd. All rights reserved.
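
    One outer iteration of such a two-stage scheme can be sketched as follows: an ART sweep with a non-negativity projection (the POCS stage), then a few TV steepest-descent steps whose size is tied to how much the POCS stage changed the image. The system matrix, data, and constants are tiny illustrations, not the paper's adaptive rules in full.

```python
# Minimal sketch of alternating POCS (ART + non-negativity) with
# TV steepest descent whose step adapts to the POCS change.
import numpy as np

def tv_gradient(x):
    gx = np.diff(x, axis=0, append=x[-1:, :])
    gy = np.diff(x, axis=1, append=x[:, -1:])
    mag = np.sqrt(gx**2 + gy**2) + 1e-8
    div_x = np.diff(gx / mag, axis=0, prepend=np.zeros((1, x.shape[1])))
    div_y = np.diff(gy / mag, axis=1, prepend=np.zeros((x.shape[0], 1)))
    return -(div_x + div_y)                 # gradient of TV(x)

def iterate(x, A, b, lam=0.2, n_tv=5):
    x_old = x.copy()
    for i in range(A.shape[0]):             # ART sweep (POCS stage)
        ai = A[i].reshape(x.shape)
        x = x + (b[i] - np.sum(ai * x)) / np.sum(ai * ai) * ai
    x = np.clip(x, 0, None)                 # non-negativity constraint
    dp = np.linalg.norm(x - x_old)          # adaptive step from POCS change
    for _ in range(n_tv):                   # TV minimization stage
        g = tv_gradient(x)
        x = x - lam * dp * g / (np.linalg.norm(g) + 1e-8)
    return x

rng = np.random.default_rng(6)
truth = np.zeros((16, 16)); truth[4:12, 4:12] = 1.0
A = rng.random((40, 256))                   # 40 sparse "projection" rays
b = A @ truth.ravel()
x = np.zeros((16, 16))
for _ in range(20):
    x = iterate(x, A, b)
print(round(float(np.abs(x - truth).mean()), 3))
```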

  16. Cultural adaptation of the Test of Narrative Language (TNL) into Brazilian Portuguese.

    Science.gov (United States)

    Rossi, Natalia Freitas; Lindau, Tâmara de Andrade; Gillam, Ronald Bradley; Giacheti, Célia Maria

    To accomplish the translation and cultural adaptation of the Test of Narrative Language (TNL) into Brazilian Portuguese. The TNL is a formal instrument which assesses narrative comprehension and oral narration of children between the ages of 5-0 and 11-11 (years-months). The TNL translation and adaptation process had the following steps: (1) translation into the target language; (2) summary of the translated versions; (3) back-translation; (4) checking of the conceptual, semantic and cultural equivalence process; and (5) pilot study (56 children within the test age range and from both genders). The adapted version maintained the same structure as the original version: number of tasks (three comprehension and three oral narration), narrative formats (no picture, sequenced pictures and single picture) and scoring system. There were no adjustments to the pictures. The "McDonald's Story" was replaced by the "Snack Bar History" to meet the semantic and experiential equivalence of the target population. The other stories had semantic and grammatical adjustments. A statistically significant difference was found when comparing the raw scores (comprehension, narration and total) of age groups from the adapted version. Adjustments were required to meet the equivalence between the original and the translated versions. The adapted version showed it has the potential to identify differences in the oral narratives of children in the age range covered by the test. Measurement equivalence studies for validation and test standardization are in progress and will supplement the study outcomes.

  17. ICAN Computer Code Adapted for Building Materials

    Science.gov (United States)

    Murthy, Pappu L. N.

    1997-01-01

    The NASA Lewis Research Center has been involved in developing composite micromechanics and macromechanics theories over the last three decades. These activities have resulted in several composite mechanics theories and structural analysis codes whose applications range from material behavior design and analysis to structural component response. One of these computer codes, the Integrated Composite Analyzer (ICAN), is designed primarily to address issues related to designing polymer matrix composites and predicting their properties - including hygral, thermal, and mechanical load effects. Recently, under a cost-sharing cooperative agreement with a Fortune 500 corporation, Master Builders Inc., ICAN was adapted to analyze building materials. The high costs and technical difficulties involved with the fabrication of continuous-fiber-reinforced composites sometimes limit their use. Particulate-reinforced composites can be thought of as a viable alternative. They are as easily processed to near-net shape as monolithic materials, yet have the improved stiffness, strength, and fracture toughness that is characteristic of continuous-fiber-reinforced composites. For example, particlereinforced metal-matrix composites show great potential for a variety of automotive applications, such as disk brake rotors, connecting rods, cylinder liners, and other hightemperature applications. Building materials, such as concrete, can be thought of as one of the oldest materials in this category of multiphase, particle-reinforced materials. The adaptation of ICAN to analyze particle-reinforced composite materials involved the development of new micromechanics-based theories. A derivative of the ICAN code, ICAN/PART, was developed and delivered to Master Builders Inc. as a part of the cooperative activity.

  18. Multidimensional Computerized Adaptive Testing for Indonesia Junior High School Biology

    Science.gov (United States)

    Kuo, Bor-Chen; Daud, Muslem; Yang, Chih-Wei

    2015-01-01

    This paper describes a curriculum-based multidimensional computerized adaptive test that was developed for Indonesian junior high school Biology. In adherence to the Indonesian curriculum of different Biology dimensions, 300 items were constructed and then administered to 2238 students. A multidimensional random coefficients multinomial logit model was…

  19. Qualitative interviews with healthcare staff in four European countries to inform adaptation of an intervention to increase chlamydia testing.

    Science.gov (United States)

    McNulty, Cliodna; Ricketts, Ellie J; Fredlund, Hans; Uusküla, Anneli; Town, Katy; Rugman, Claire; Tisler-Sala, Anna; Mani, Alix; Dunais, Brigitte; Folkard, Kate; Allison, Rosalie; Touboul, Pia

    2017-09-25

    To determine the needs of primary healthcare general practice (GP) staff, stakeholders and trainers to inform the adaptation of a locally successful complex intervention (Chlamydia Intervention Randomised Trial (CIRT)) aimed at increasing chlamydia testing within primary healthcare within South West England to three EU countries (Estonia, France and Sweden) and throughout England. Qualitative interviews. European primary healthcare in England, France, Sweden and Estonia with a range of chlamydia screening provision in 2013. 45 GP staff, 13 trainers and 18 stakeholders. The iterative interview schedule explored participants' personal attitudes, subjective norms and perceived behavioural controls around provision of chlamydia testing, sexual health services and training in general practice. Researchers used a common thematic analysis. Findings were similar across all countries. Most participants agreed that chlamydia testing and sexual health services should be offered in general practice. There was no culture of GP staff routinely offering opportunistic chlamydia testing or sexual health advice, and due to other priorities, participants reported this would be challenging. All participants indicated that the CIRT workshop covering chlamydia testing and sexual health would be useful if practice based, included all practice staff and action planning, and was adequately resourced. Participants suggested minor adaptations to CIRT to suit their country's health services. A common complex intervention can be adapted for use across Europe, despite varied sexual health provision. The intervention (ChlamydiA Testing Training in Europe (CATTE)) should comprise: a staff workshop covering sexual health and chlamydia testing rates and procedures, action planning and patient materials and staff reminders via computer prompts, emails or newsletters, with testing feedback through practice champions. CATTE materials are available at: www.STItraining.eu. © Article author(s) (or their

  20. Adaptive control of Parkinson's state based on a nonlinear computational model with unknown parameters.

    Science.gov (United States)

    Su, Fei; Wang, Jiang; Deng, Bin; Wei, Xi-Le; Chen, Ying-Yuan; Liu, Chen; Li, Hui-Yan

    2015-02-01

    The objective here is to explore the use of an adaptive input-output feedback linearization method to achieve an improved deep brain stimulation (DBS) algorithm for closed-loop control of Parkinson's state. The control law is based on a highly nonlinear computational model of Parkinson's disease (PD) with unknown parameters. The restoration of thalamic relay reliability is formulated as the desired outcome of the adaptive control methodology, and the DBS waveform is the control input. The control input is adjusted in real time according to estimates of the unknown parameters as well as the feedback signal. Simulation results show that the proposed adaptive control algorithm succeeds in restoring the relay reliability of the thalamus, and at the same time achieves accurate estimation of the unknown parameters. Our findings point to the potential value of an adaptive control approach that could be used to regulate the DBS waveform for more effective treatment of PD.

  1. On Adaptive vs. Non-adaptive Security of Multiparty Protocols

    DEFF Research Database (Denmark)

    Canetti, Ran; Damgård, Ivan Bjerre; Dziembowski, Stefan

    2001-01-01

    Security analysis of multiparty cryptographic protocols distinguishes between two types of adversarial settings: In the non-adaptive setting, the set of corrupted parties is chosen in advance, before the interaction begins. In the adaptive setting, the adversary chooses who to corrupt during the course of the computation. We study the relations between adaptive security (i.e., security in the adaptive setting) and non-adaptive security, according to two definitions and in several models of computation. While affirming some prevailing beliefs, we also obtain some unexpected results. Some highlights of our results are: – According to the definition of Dodis-Micali-Rogaway (which is set in the information-theoretic model), adaptive and non-adaptive security are equivalent. This holds for both honest-but-curious and Byzantine adversaries, and for any number of parties. – According

  2. Highly adaptive tests for group differences in brain functional connectivity

    Directory of Open Access Journals (Sweden)

    Junghi Kim

    2015-01-01

    The proposed tests combine statistical evidence against a null hypothesis from multiple sources across a range of plausible tuning parameter values reflecting uncertainty about the unknown truth. These highly adaptive tests are not only easy to use, but also robustly high-powered across various scenarios. The usage and advantages of these novel tests are demonstrated on an Alzheimer's disease dataset and simulated data.
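
    The combination strategy can be sketched as follows: compute a family of statistics indexed by a tuning parameter, obtain a permutation p-value for each, and calibrate the minimum p-value by the same permutations. The powered-difference statistics below are generic placeholders, not the authors' connectivity statistics:

        import numpy as np

        rng = np.random.default_rng(0)

        def powered_stats(x, y, gammas):
            # family of test statistics indexed by tuning parameter gamma
            d = x.mean(axis=0) - y.mean(axis=0)
            return np.array([np.sum(np.abs(d) ** g) for g in gammas])

        def adaptive_test(x, y, gammas=(1, 2, 4, 8), n_perm=2000):
            """Min-p combination over tuning parameters, calibrated by permutation."""
            obs = powered_stats(x, y, gammas)
            z = np.vstack([x, y])
            n_x = len(x)
            perm = np.empty((n_perm, len(gammas)))
            for b in range(n_perm):
                idx = rng.permutation(len(z))
                perm[b] = powered_stats(z[idx[:n_x]], z[idx[n_x:]], gammas)
            # per-gamma permutation p-values for the observed sample ...
            p_obs = (1 + (perm >= obs).sum(axis=0)) / (n_perm + 1)
            # ... and (approximately) for each permuted sample, via ranks
            ranks = perm.argsort(axis=0).argsort(axis=0)   # 0 = smallest stat
            p_perm = (n_perm - ranks) / n_perm
            # adaptive statistic: the smallest p-value across gammas
            return (1 + (p_perm.min(axis=1) <= p_obs.min()).sum()) / (n_perm + 1)

        x = rng.normal(0.3, 1, size=(20, 50))   # group 1 connectivity features
        y = rng.normal(0.0, 1, size=(25, 50))   # group 2
        print(f"adaptive p-value: {adaptive_test(x, y):.4f}")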

  3. Testing for local adaptation in brown trout using reciprocal transplants

    Directory of Open Access Journals (Sweden)

    Stelkens Rike B

    2012-12-01

    Full Text Available Abstract Background Local adaptation can drive the divergence of populations but identification of the traits under selection remains a major challenge in evolutionary biology. Reciprocal transplant experiments are ideal tests of local adaptation, yet rarely used for higher vertebrates because of the mobility and potential invasiveness of non-native organisms. Here, we reciprocally transplanted 2500 brown trout (Salmo trutta) embryos from five populations to investigate local adaptation in early life history traits. Embryos were bred in a full-factorial design and raised in natural riverbeds until emergence. Customized egg capsules were used to simulate the natural redd environment and allowed tracking the fate of every individual until retrieval. We predicted that (1) within sites, native populations would outperform non-natives, and (2) across sites, populations would show higher performance at ‘home’ compared to ‘away’ sites. Results There was no evidence for local adaptation but we found large differences in survival and hatching rates between sites, indicative of considerable variation in habitat quality. Survival was generally high across all populations (55% ± 3%), but ranged from 4% to 89% between sites. Average hatching rate was 25% ± 3% across populations, ranging from 0% to 62% between sites. Conclusion This study provides rare empirical data on variation in early life history traits in a population network of a salmonid, and large-scale breeding and transplantation experiments like ours provide powerful tests for local adaptation. Despite the recently reported genetic and morphological differences between the populations in our study area, local adaptation at the embryo level is small, non-existent, or confined to ecological conditions that our experiment could not capture.

  4. Adapting and Pilot Testing a Parenting Intervention for Homeless Families in Transitional Housing.

    Science.gov (United States)

    Holtrop, Kendal; Holcomb, Jamila E

    2018-01-24

    Intervention adaptation is a promising approach for extending the reach of evidence-based interventions to underserved families. One highly relevant population in need of services is homeless families. In particular, homeless families with children constitute more than one third of the total homeless population in the United States and face several unique challenges to parenting. The purpose of this study was to adapt and pilot test a parenting intervention for homeless families in transitional housing. An established adaptation model was used to guide this process. The systematic adaptation efforts included: (a) examining the theory of change in the original intervention, (b) identifying population differences relevant to homeless families in transitional housing, (c) adapting the content of the intervention, and (d) adapting the evaluation strategy. Next, a pilot test of the adapted intervention was conducted to examine implementation feasibility and acceptability. Feasibility data indicate an intervention spanning several weeks may be difficult to implement in the context of transitional housing. Yet, acceptability of the adapted intervention among participants was consistently high. The findings of this pilot work suggest several implications for informing continued parenting intervention research and practice with homeless families in transitional housing. © 2018 Family Process Institute.

  5. TESTING THE APODIZED PUPIL LYOT CORONAGRAPH ON THE LABORATORY FOR ADAPTIVE OPTICS EXTREME ADAPTIVE OPTICS TESTBED

    International Nuclear Information System (INIS)

    Thomas, Sandrine J.; Dillon, Daren; Gavel, Donald; Soummer, Remi; Macintosh, Bruce; Sivaramakrishnan, Anand

    2011-01-01

    We present testbed results of the Apodized Pupil Lyot Coronagraph (APLC) at the Laboratory for Adaptive Optics (LAO). These results are part of the validation and tests of the coronagraph and of the Extreme Adaptive Optics (ExAO) for the Gemini Planet Imager (GPI). The apodizer component is manufactured with a halftone technique using black chrome microdots on glass. Testing this APLC (like any other coronagraph) requires extremely good wavefront correction, which is obtained to the 1 nm rms level using microelectromechanical systems (MEMS) technology, on the ExAO visible testbed of the LAO at the University of Santa Cruz. We used an APLC coronagraph without central obstruction, both with a reference super-polished flat mirror and with the MEMS, to obtain one of the first images of a dark zone in a coronagraphic image with classical adaptive optics using a MEMS deformable mirror (without involving dark hole algorithms). This was done as a complementary test to the GPI coronagraph testbed at the American Museum of Natural History, which studied the coronagraph itself without wavefront correction. Because we needed a full aperture, the coronagraph design is very different from the GPI design. We also tested a coronagraph with a central obstruction similar to that of GPI. We investigated the performance of the APLC coronagraph and more particularly the effect of the apodizer profile accuracy on the contrast. Finally, we compared the resulting contrast to predictions made with a wavefront propagation model of the testbed to understand the effects of phase and amplitude errors on the final contrast.

  6. Adaptive control in multi-threaded iterated integration

    International Nuclear Information System (INIS)

    Doncker, Elise de; Yuasa, Fukuko

    2013-01-01

    In recent years we have developed a technique for the direct computation of Feynman loop-integrals, which are notorious for the occurrence of integrand singularities. Especially for handling singularities in the interior of the domain, we approximate the iterated integral using an adaptive algorithm in the coordinate directions. We present a novel multi-core parallelization scheme for adaptive multivariate integration, by assigning threads to the rule evaluations in the outer dimensions of the iterated integral. The method ensures a large parallel granularity as each function evaluation by itself comprises an integral over the lower dimensions, while the application of the threads is governed by the adaptive control in the outer level. We give computational results for a test set of 3- to 6-dimensional integrals, where several problems exhibit a loop integral behavior.
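
    A minimal illustration of the parallel structure (in Python, with a process pool standing in for the paper's threads; the integrand, the fixed outer rule, and the use of SciPy's adaptive quad for the inner integral are all assumptions of this sketch):

        from concurrent.futures import ProcessPoolExecutor

        import numpy as np
        from scipy.integrate import quad

        def f(x, y):
            return np.exp(-x * y) * np.sin(x + y)   # smooth 2-D test integrand

        def outer_integrand(x):
            # each outer evaluation is itself an adaptive 1-D integral over y,
            # so parallel workers get coarse-grained units of work
            val, _ = quad(lambda y: f(x, y), 0.0, 1.0)
            return val

        def iterated_integral(n_outer=64):
            # fixed Gauss-Legendre rule in the outer dimension; the paper's
            # algorithm is adaptive in all coordinate directions
            nodes, weights = np.polynomial.legendre.leggauss(n_outer)
            nodes = 0.5 * (nodes + 1.0)      # map [-1, 1] onto [0, 1]
            weights = 0.5 * weights
            with ProcessPoolExecutor() as pool:
                values = list(pool.map(outer_integrand, nodes))
            return float(np.dot(weights, values))

        if __name__ == "__main__":
            print(f"integral over [0,1]^2 ~= {iterated_integral():.8f}")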

  7. A Conceptual Architecture for Adaptive Human-Computer Interface of a PT Operation Platform Based on Context-Awareness

    Directory of Open Access Journals (Sweden)

    Qing Xue

    2014-01-01

    Full Text Available We present a conceptual architecture for an adaptive human-computer interface of a PT operation platform based on context-awareness. This architecture will form the basis of design for such an interface. This paper describes the components, key technologies, and working principles of the architecture. The critical contents cover context information modeling and processing, the establishment of relationships between contexts and interface design knowledge through adaptive knowledge reasoning, and the implementation of adaptive interface visualization with the aid of interface tools technology.

  8. Benchmark tests and spin adaptation for the particle-particle random phase approximation

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Yang; Steinmann, Stephan N.; Peng, Degao [Department of Chemistry, Duke University, Durham, North Carolina 27708 (United States); Aggelen, Helen van, E-mail: Helen.VanAggelen@UGent.be [Department of Chemistry, Duke University, Durham, North Carolina 27708 (United States); Department of Inorganic and Physical Chemistry, Ghent University, 9000 Ghent (Belgium); Yang, Weitao, E-mail: Weitao.Yang@duke.edu [Department of Chemistry and Department of Physics, Duke University, Durham, North Carolina 27708 (United States)

    2013-11-07

    The particle-particle random phase approximation (pp-RPA) provides an approximation to the correlation energy in density functional theory via the adiabatic connection [H. van Aggelen, Y. Yang, and W. Yang, Phys. Rev. A 88, 030501 (2013)]. It has virtually neither delocalization error nor static correlation error for single-bond systems. However, with its formal O(N^6) scaling, the pp-RPA is computationally expensive. In this paper, we implement a spin-separated and spin-adapted pp-RPA algorithm, which reduces the computational cost by a substantial factor. We then perform benchmark tests on the G2/97 enthalpies of formation database, the DBH24 reaction barrier database, and four test sets for non-bonded interactions (HB6/04, CT7/04, DI6/04, and WI9/04). For the G2/97 database, the pp-RPA gives a significantly smaller mean absolute error (8.3 kcal/mol) than the direct particle-hole RPA (ph-RPA) (22.7 kcal/mol). Furthermore, the error in the pp-RPA is nearly constant with the number of atoms in a molecule, while the error in the ph-RPA increases. For chemical reactions involving typical organic closed-shell molecules, pp- and ph-RPA both give accurate reaction energies. Similarly, both RPAs perform well for reaction barriers and nonbonded interactions. These results suggest that the pp-RPA gives reliable energies in chemical applications. The adiabatic connection formalism based on pairing matrix fluctuation is therefore expected to lead to widely applicable and accurate density functionals.

  9. Modeling UV Radiation Feedback from Massive Stars. I. Implementation of Adaptive Ray-tracing Method and Tests

    Science.gov (United States)

    Kim, Jeong-Gyu; Kim, Woong-Tae; Ostriker, Eve C.; Skinner, M. Aaron

    2017-12-01

    We present an implementation of an adaptive ray-tracing (ART) module in the Athena hydrodynamics code that accurately and efficiently handles the radiative transfer involving multiple point sources on a three-dimensional Cartesian grid. We adopt a recently proposed parallel algorithm that uses nonblocking, asynchronous MPI communications to accelerate transport of rays across the computational domain. We validate our implementation through several standard test problems, including the propagation of radiation in vacuum and the expansions of various types of H II regions. Additionally, scaling tests show that the cost of a full ray trace per source remains comparable to that of the hydrodynamics update on up to ~10^3 processors. To demonstrate application of our ART implementation, we perform a simulation of star cluster formation in a marginally bound, turbulent cloud, finding that its star formation efficiency is 12% when both radiation pressure forces and photoionization by UV radiation are treated. We directly compare the radiation forces computed from the ART scheme with those from the M1 closure relation. Although the ART and M1 schemes yield similar results on large scales, the latter is unable to resolve the radiation field accurately near individual point sources.

  10. Cone Beam Computed Tomography-Derived Adaptive Radiotherapy for Radical Treatment of Esophageal Cancer

    International Nuclear Information System (INIS)

    Hawkins, Maria A.; Brooks, Corrinne; Hansen, Vibeke N.; Aitken, Alexandra; Tait, Diana M.

    2010-01-01

    Purpose: To investigate the potential for reduction in normal tissue irradiation by creating a patient-specific planning target volume (PTV) using cone beam computed tomography (CBCT) imaging acquired in the first week of radiotherapy for patients receiving radical radiotherapy. Methods and Materials: Patients receiving radical RT for carcinoma of the esophagus were investigated. The PTV is defined as CTV(tumor, nodes) plus esophagus outlined 3 to 5 cm cranio-caudally, with a 1.5-cm circumferential margin added (clinical plan). Pre-fraction CBCT images are acquired on Days 1 to 4, then weekly. No correction for setup error is made. The images are imported into the planning system. The tumor and esophagus for the length of the PTV are contoured on each CBCT and a 5-mm margin is added. A composite volume (PTV1) is created using the Week 1 composite CBCT volumes. The same process is repeated using the CBCT from Weeks 2 to 6 (PTV2). A new plan is created using PTV1 (adaptive plan). The coverage of PTV2 by the 95% isodose of the PTV1 plan is evaluated. Dose-volume histograms (DVH) for lungs, heart, and cord for the two plans are compared. Results: A total of 139 CBCT images for 14 cases were analyzed. For the adaptive plan, the coverage of the 95% prescription isodose was 95.6% ± 4% for PTV1 and 96.8% ± 4.1% for PTV2 (t test, p = 0.19). Lung V20 (15.6 Gy vs. 10.2 Gy) and heart mean dose (26.9 Gy vs. 20.7 Gy) were significantly smaller for the adaptive plan. Conclusions: A reduced planning volume can be constructed within the first week of treatment using CBCT. A single plan modification can be performed within the second week of treatment with considerable reduction in organ-at-risk dose.

  11. Construct validity of the pediatric evaluation of disability inventory computer adaptive test (PEDI-CAT) in children with medical complexity.

    Science.gov (United States)

    Dumas, Helene M; Fragala-Pinkham, Maria A; Rosen, Elaine L; O'Brien, Jane E

    2017-11-01

    To assess construct (convergent and divergent) validity of the Pediatric Evaluation of Disability Inventory Computer Adaptive Test (PEDI-CAT) in a sample of children with complex medical conditions. Demographics, clinical information, PEDI-CAT normative scores, and the Post-Acute Acuity Rating for Children (PAARC) level were collected for all post-acute hospital admissions (n = 110) from 1 April 2015 to 1 March 2016. Correlations between the PEDI-CAT Daily Activities, Mobility, and Social/Cognitive domain scores for the total sample and across three age groups (infant, preschool, and school-age) were calculated. Differences in mean PEDI-CAT scores for each domain across two groups, children with "Less Complexity" or "More Complexity" based on PAARC level, were examined. All correlations for the total sample and age subgroups were statistically significant, and trends across age groups were evident, with stronger associations between domains for the infant group. Significant differences were found between mean PEDI-CAT Daily Activities, Mobility, and Social/Cognitive normative scores across the two complexity groups, with children in the "Less Complex" group having higher PEDI-CAT scores for all domains. This study provides evidence indicating that the PEDI-CAT can be used with confidence in capturing and differentiating children's level of function in a post-acute care setting. Implications for Rehabilitation: The PEDI-CAT is a measure of function for children with a variety of conditions and can be used in any clinical setting. Convergent validity of the PEDI-CAT's Daily Activities, Mobility, and Social/Cognitive domains was significant and particularly strong for infants and young children with medical complexity. The PEDI-CAT was able to discriminate groups of children with differing levels of medical complexity admitted to a pediatric post-acute care hospital.

  12. Computer versus paper--does it make any difference in test performance?

    Science.gov (United States)

    Karay, Yassin; Schauber, Stefan K; Stosch, Christoph; Schüttpelz-Brauns, Katrin

    2015-01-01

    CONSTRUCT: In this study, we examine the differences in test performance between the paper-based and the computer-based version of the Berlin formative Progress Test. In this context, it is the first study that allows controlling for students' prior performance. Computer-based tests make possible a more efficient examination procedure for test administration and review. Although university staff will benefit largely from computer-based tests, the question arises whether computer-based tests influence students' test performance. A total of 266 German students from the 9th and 10th semester of medicine (comparable with the 4th-year North American medical school schedule) participated in the study (paper = 132, computer = 134). The allocation of the test format was conducted as a randomized matched-pair design in which students were first sorted according to their prior test results. The organizational procedure, the examination conditions, the room and seating arrangements, as well as the order of questions and answers, were identical in both groups. The sociodemographic variables and pretest scores of both groups were comparable. The test results from the paper and computer versions did not differ. The groups remained within the allotted time, but students using the computer version (particularly the high performers) needed significantly less time to complete the test. In addition, we found significant differences in guessing behavior: low performers using the computer version guessed significantly more than low-performing students using the paper-pencil version. Participants in computer-based tests are not at a disadvantage in terms of their test results. The computer-based test required less processing time. The longer processing time for the paper-pencil version might be due to the time needed to write the answer down and to transfer it correctly. It is still not known why students using the computer version (particularly low performers) guessed more often.

  13. Multiple Maximum Exposure Rates in Computerized Adaptive Testing

    Science.gov (United States)

    Ramon Barrada, Juan; Veldkamp, Bernard P.; Olea, Julio

    2009-01-01

    Computerized adaptive testing is subject to security problems, as the item bank content remains operative over long periods and administration time is flexible for examinees. Spreading the content of a part of the item bank could lead to an overestimation of the examinees' trait level. The most common way of reducing this risk is to impose a maximum exposure rate for each item.
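
    A standard mechanism for enforcing a maximum exposure rate is Sympson-Hetter-style probabilistic item selection, sketched below (the 2PL information function and the single shared exposure parameter are illustrative assumptions, not the specific method studied in the article):

        import numpy as np

        rng = np.random.default_rng(1)

        def fisher_info(a, b, theta):
            # 2PL item information at ability estimate theta
            p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
            return a * a * p * (1.0 - p)

        def select_item(theta, a, b, k, administered):
            """Sympson-Hetter-style selection under a maximum exposure rate.

            k[i] is the exposure-control parameter of item i (k = 1 means no
            restriction). Items are tried in decreasing order of information;
            each is actually administered only with probability k[i].
            """
            info = fisher_info(a, b, theta)
            info[list(administered)] = -np.inf       # never repeat an item
            for item in np.argsort(info)[::-1]:
                if info[item] == -np.inf:
                    break
                if rng.random() < k[item]:           # exposure lottery
                    return int(item)
            raise RuntimeError("item pool exhausted")

        a = rng.uniform(0.8, 2.0, 100)               # discriminations
        b = rng.normal(0.0, 1.0, 100)                # difficulties
        k = np.full(100, 0.3)                        # target max exposure rate
        print("first item administered:",
              select_item(theta=0.0, a=a, b=b, k=k, administered=[]))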

  14. QUASI-RANDOM TESTING OF COMPUTER SYSTEMS

    Directory of Open Access Journals (Sweden)

    S. V. Yarmolik

    2013-01-01

    Full Text Available Various modified random testing approaches have been proposed for computer system testing in the black box environment. Their effectiveness has been evaluated on typical failure patterns by employing three measures, namely, P-measure, E-measure and F-measure. Quasi-random testing, a modified version of random testing, has been proposed and analyzed. The quasi-random Sobol sequences and modified Sobol sequences are used as the test patterns. Some new methods for Sobol sequence generation have been proposed and analyzed.
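
    A small sketch of quasi-random test generation with scrambled Sobol points, here using SciPy's qmc module rather than the generation methods proposed in the article; the rectangular failure region is a toy failure pattern:

        import numpy as np
        from scipy.stats import qmc

        def failure_region(points):
            # toy "block" failure pattern: the program fails inside a
            # small rectangle of the 2-D input domain
            return (np.abs(points[:, 0] - 0.62) < 0.05) & \
                   (np.abs(points[:, 1] - 0.31) < 0.05)

        sampler = qmc.Sobol(d=2, scramble=True, seed=7)
        tests = sampler.random(n=512)          # 512 quasi-random test patterns
        hits = failure_region(tests)
        # F-measure flavor: index of the first test that exposes the failure
        print(f"first failure found at test #{np.argmax(hits)}" if hits.any()
              else "failure not detected")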

  15. Adapting tests of sign language assessment for other sign languages--a review of linguistic, cultural, and psychometric problems.

    Science.gov (United States)

    Haug, Tobias; Mann, Wolfgang

    2008-01-01

    Given the current lack of appropriate assessment tools for measuring deaf children's sign language skills, many test developers have used existing tests of other sign languages as templates to measure the sign language used by deaf people in their country. This article discusses factors that may influence the adaptation of assessment tests from one natural sign language to another. Two tests which have been adapted for several other sign languages are focused upon: the Test for American Sign Language and the British Sign Language Receptive Skills Test. A brief description is given of each test as well as insights from ongoing adaptations of these tests for other sign languages. The problems reported in these adaptations were found to be grounded in linguistic and cultural differences, which need to be considered for future test adaptations. Other reported shortcomings of test adaptation are related to the question of how well psychometric measures transfer from one instrument to another.

  16. Adaptive security systems -- Combining expert systems with adaptive technologies

    International Nuclear Information System (INIS)

    Argo, P.; Loveland, R.; Anderson, K.

    1997-01-01

    The Adaptive Multisensor Integrated Security System (AMISS) uses a variety of computational intelligence techniques to reason from raw sensor data through an array of processing layers to arrive at an assessment for alarm/alert conditions based on human behavior within a secure facility. In this paper, the authors give an overview of the system and briefly describe some of its major components. This system is currently under development and testing in a realistic facility setting.

  17. Adaptation of MPDATA Heterogeneous Stencil Computation to Intel Xeon Phi Coprocessor

    Directory of Open Access Journals (Sweden)

    Lukasz Szustak

    2015-01-01

    Full Text Available The multidimensional positive definite advection transport algorithm (MPDATA) belongs to the group of nonoscillatory forward-in-time algorithms and performs a sequence of stencil computations. MPDATA is one of the major parts of the dynamic core of the EULAG geophysical model. In this work, we outline an approach to adaptation of the 3D MPDATA algorithm to the Intel MIC architecture. In order to utilize available computing resources, we propose the (3+1)D decomposition of MPDATA heterogeneous stencil computations. This approach is based on a combination of loop tiling and loop fusion techniques. It allows us to ease memory/communication bounds and better exploit the theoretical floating point efficiency of target computing platforms. An important method of improving the efficiency of the (3+1)D decomposition is partitioning of available cores/threads into work teams. It permits reducing inter-cache communication overheads. This method also increases opportunities for the efficient distribution of MPDATA computation onto available resources of the Intel MIC architecture, as well as Intel CPUs. We discuss preliminary performance results obtained on two hybrid platforms, containing two CPUs and an Intel Xeon Phi. The top-of-the-line Intel Xeon Phi 7120P gives the best performance results, and executes MPDATA almost 2 times faster than two Intel Xeon E5-2697v2 CPUs.

  18. Time-adaptive quantile regression

    DEFF Research Database (Denmark)

    Møller, Jan Kloppenborg; Nielsen, Henrik Aalborg; Madsen, Henrik

    2008-01-01

    An algorithm for time-adaptive quantile regression is presented. The algorithm is based on the simplex algorithm, and the linear optimization formulation of the quantile regression problem is given. The observations have been split to allow a direct use of the simplex algorithm. The simplex method and an updating procedure are combined into a new algorithm for time-adaptive quantile regression, which generates new solutions on the basis of the old solution, leading to savings in computation time. The suggested algorithm is tested against a static quantile regression model on a data set with wind power production, where the models combine splines and quantile regression. The comparison indicates superior performance for the time-adaptive quantile regression in all the performance parameters considered.
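
    The simplex-with-updating algorithm itself is not reproduced here, but the time-adaptive idea can be illustrated by tracking a quantile of a nonstationary series with stochastic subgradient steps on the pinball loss (a sketch; the step size and data are illustrative):

        import numpy as np

        def pinball_grad(residual, tau):
            # subgradient of the pinball (quantile) loss w.r.t. the prediction
            return np.where(residual >= 0, -tau, 1.0 - tau)

        def online_quantile(y, tau=0.9, lr=0.05):
            """Track the tau-quantile of a nonstationary series y."""
            q = y[0]
            track = np.empty(len(y))
            for t, obs in enumerate(y):
                q -= lr * pinball_grad(obs - q, tau)   # adapt toward quantile
                track[t] = q
            return track

        rng = np.random.default_rng(3)
        # nonstationary "wind power"-like series: drifting mean, changing spread
        y = np.concatenate([rng.normal(5, 1, 500), rng.normal(9, 2, 500)])
        q90 = online_quantile(y, tau=0.9)
        print(f"tracked 90% quantile at end: {q90[-1]:.2f}")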

  19. Recommendations for elaboration, transcultural adaptation and validation process of tests in Speech, Hearing and Language Pathology.

    Science.gov (United States)

    Pernambuco, Leandro; Espelt, Albert; Magalhães, Hipólito Virgílio; Lima, Kenio Costa de

    2017-06-08

    To present a guide with recommendations for the translation, adaptation, elaboration and validation process of tests in Speech and Language Pathology. The recommendations were based on international guidelines with a focus on the elaboration, translation, cross-cultural adaptation and validation process of tests. The recommendations were grouped into two charts, one with procedures for translation and transcultural adaptation and the other for obtaining evidence of validity, reliability and measures of accuracy of the tests. A guide with norms for the organization and systematization of the process of elaboration, translation, cross-cultural adaptation and validation of tests in Speech and Language Pathology was created.

  20. Large-Scale Assessment of a Fully Automatic Co-Adaptive Motor Imagery-Based Brain Computer Interface.

    Directory of Open Access Journals (Sweden)

    Laura Acqualagna

    Full Text Available In the last years Brain Computer Interface (BCI) technology has benefited from the development of sophisticated machine learning methods that let the user operate the BCI after a few trials of calibration. One remarkable example is the recent development of co-adaptive techniques that have proved to extend the use of BCIs also to people not able to achieve successful control with the standard BCI procedure. Especially for BCIs based on the modulation of the Sensorimotor Rhythm (SMR) these improvements are essential, since a non-negligible percentage of users is unable to operate SMR-BCIs efficiently. In this study we evaluated for the first time a fully automatic co-adaptive BCI system on a large scale. A pool of 168 participants naive to BCIs operated the co-adaptive SMR-BCI in one single session. Different psychological interventions were performed prior to the BCI session in order to investigate how motor coordination training and relaxation could influence BCI performance. A neurophysiological indicator based on the Power Spectral Density (PSD) was extracted from the recording of a few minutes of resting state brain activity and tested as a predictor of BCI performance. Results show that high accuracies in operating the BCI could be reached by the majority of the participants before the end of the session. BCI performance could be significantly predicted by the neurophysiological indicator, consolidating the validity of the previously developed model. Nevertheless, we still found about 22% of users with performance significantly lower than the threshold of efficient BCI control at the end of the session. Inter-subject variability being still the major problem of BCI technology, we point out crucial issues for those who did not achieve sufficient control. Finally, we propose valid developments to move a step forward toward the applicability of the promising co-adaptive methods.
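
    A crude stand-in for such a PSD-based indicator, computing relative band power from a resting-state channel with Welch's method (the band limits, sampling rate, and normalization are assumptions of this sketch, not the published predictor):

        import numpy as np
        from scipy.signal import welch

        def smr_band_power(eeg, fs=250.0, band=(8.0, 15.0)):
            """Relative band power of a single resting-state EEG channel.

            Power in the SMR/alpha band divided by broadband (2-40 Hz) power.
            """
            freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))
            in_band = (freqs >= band[0]) & (freqs <= band[1])
            broad = (freqs >= 2.0) & (freqs <= 40.0)
            # uniform frequency spacing, so plain sums suffice for the ratio
            return float(psd[in_band].sum() / psd[broad].sum())

        rng = np.random.default_rng(5)
        t = np.arange(0, 120, 1 / 250.0)              # 2 min of "resting EEG"
        eeg = rng.normal(0, 1, t.size) + 1.5 * np.sin(2 * np.pi * 10 * t)
        print(f"relative SMR-band power: {smr_band_power(eeg):.3f}")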

  1. A Look at Computer-Assisted Testing Operations. The Illinois Series on Educational Application of Computers, No. 12e.

    Science.gov (United States)

    Muiznieks, Viktors; Dennis, J. Richard

    In computer assisted test construction (CATC) systems, the computer is used to perform the mechanical aspects of testing while the teacher retains control over question content. Advantages of CATC systems include question banks, decreased importance of test item security, computer analysis and response to student test answers, item analysis…

  2. Comparison of tests of accommodation for computer users.

    Science.gov (United States)

    Kolker, David; Hutchinson, Robert; Nilsen, Erik

    2002-04-01

    With the increased use of computers in the workplace and at home, optometrists are finding more patients presenting with symptoms of Computer Vision Syndrome. Among these symptomatic individuals, research indicates that accommodative disorders are the most common vision finding. A prepresbyopic group (N = 30) and a presbyopic group (N = 30) were selected from a private practice. Assignment to a group was determined by age, accommodative amplitude, and near visual acuity with their distance prescription. Each subject was given a thorough vision and ocular health examination, then administered several nearpoint tests of accommodation at a computer working distance. All the tests produced similar results in the presbyopic group. For the prepresbyopic group, the tests yielded very different results. To effectively treat symptomatic VDT users, optometrists must assess the accommodative system along with the binocular and refractive status. For presbyopic patients, all nearpoint tests studied will yield virtually the same result. However, the method of testing accommodation, as well as the test stimulus presented, will yield significantly different responses for prepresbyopic patients. Previous research indicates that a majority of patients prefer the higher plus prescription yielded by the Gaussian image test.

  3. Failing Tests: Commentary on "Adapting Educational Measurement to the Demands of Test-Based Accountability"

    Science.gov (United States)

    Thissen, David

    2015-01-01

    In "Adapting Educational Measurement to the Demands of Test-Based Accountability" Koretz takes the time-honored engineering approach to educational measurement, identifying specific problems with current practice and proposing minimal modifications of the system to alleviate those problems. In response to that article, David Thissen…

  4. Method and system for rendering and interacting with an adaptable computing environment

    Science.gov (United States)

    Osbourn, Gordon Cecil [Albuquerque, NM; Bouchard, Ann Marie [Albuquerque, NM

    2012-06-12

    An adaptable computing environment is implemented with software entities termed "s-machines", which self-assemble into hierarchical data structures capable of rendering and interacting with the computing environment. A hierarchical data structure includes a first hierarchical s-machine bound to a second hierarchical s-machine. The first hierarchical s-machine is associated with a first layer of a rendering region on a display screen and the second hierarchical s-machine is associated with a second layer of the rendering region overlaying at least a portion of the first layer. A screen element s-machine is linked to the first hierarchical s-machine. The screen element s-machine manages data associated with a screen element rendered to the display screen within the rendering region at the first layer.

  5. Research on the Random Shock Vibration Test Based on the Filter-X LMS Adaptive Inverse Control Algorithm

    Directory of Open Access Journals (Sweden)

    Wang Wei

    2016-01-01

    Full Text Available This paper presents the theory and algorithms of adaptive inverse control, pointing out that an adaptive inverse control strategy can effectively eliminate the influence of noise on system control. A frequency-domain filter-X LMS adaptive inverse control algorithm is proposed and applied to the random shock vibration control process of a two-exciter hydraulic vibration test system, and the realization of the adaptive inverse control strategy in random shock vibration testing is summarized. Closed-loop and field tests show that the frequency-domain filter-X LMS adaptive inverse control algorithm achieves high-precision control in random shock vibration tests.
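
    For reference, a single-channel time-domain filtered-x LMS loop looks as follows (a sketch only; the paper uses a frequency-domain variant on a hydraulic rig, and the secondary-path model here is assumed known and exact):

        import numpy as np

        def fxlms(x, d, s_path, n_taps=32, mu=0.005):
            """Single-channel time-domain filtered-x LMS.

            x      : reference signal
            d      : disturbance measured at the error sensor
            s_path : impulse response of the (assumed known) secondary path S(z)

            The adaptive filter output passes through S(z) before reaching the
            error sensor, so the reference is pre-filtered by S(z) in the
            weight update; that is the "filtered-x" part.
            """
            w = np.zeros(n_taps)              # adaptive filter W(z)
            x_buf = np.zeros(n_taps)          # reference history for W
            fx_buf = np.zeros(n_taps)         # filtered-reference history
            y_buf = np.zeros(len(s_path))     # actuator-output history for S
            xs_buf = np.zeros(len(s_path))    # reference history for S
            e = np.empty(len(x))
            for n in range(len(x)):
                x_buf = np.roll(x_buf, 1)
                x_buf[0] = x[n]
                y_buf = np.roll(y_buf, 1)
                y_buf[0] = w @ x_buf                 # actuator command
                xs_buf = np.roll(xs_buf, 1)
                xs_buf[0] = x[n]
                fx_buf = np.roll(fx_buf, 1)
                fx_buf[0] = s_path @ xs_buf          # filtered reference
                e[n] = d[n] - s_path @ y_buf         # residual at sensor
                w += mu * e[n] * fx_buf              # LMS weight update
            return e

        rng = np.random.default_rng(11)
        x = rng.normal(size=20000)                   # reference excitation
        primary = np.array([0.0, 0.4, 0.25, -0.1])   # unknown primary path
        d = np.convolve(x, primary)[: len(x)]
        s_path = np.array([0.0, 0.8, 0.3])           # secondary-path model
        e = fxlms(x, d, s_path)
        print(f"residual power, first/last 1000 samples: "
              f"{np.mean(e[:1000] ** 2):.4f} / {np.mean(e[-1000:] ** 2):.4f}")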

  6. Smartphone adapters for digital photomicrography.

    Science.gov (United States)

    Roy, Somak; Pantanowitz, Liron; Amin, Milon; Seethala, Raja R; Ishtiaque, Ahmed; Yousem, Samuel A; Parwani, Anil V; Cucoranu, Ioan; Hartman, Douglas J

    2014-01-01

    Photomicrographs in Anatomic Pathology provide a means of quickly sharing information from a glass slide for consultation, education, documentation and publication. While static image acquisition historically involved the use of a permanently mounted camera unit on a microscope, such cameras may be expensive, need to be connected to a computer, and often require proprietary software to acquire and process images. Another novel approach for capturing digital microscopic images is to use smartphones coupled with the eyepiece of a microscope. Recently, several smartphone adapters have emerged that allow users to attach mobile phones to the microscope. The aim of this study was to test the utility of these various smartphone adapters. We surveyed the market for adapters to attach smartphones to the ocular lens of a conventional light microscope. Three adapters (Magnifi, Skylight and Snapzoom) were tested. We assessed the designs of these adapters and their effectiveness at acquiring static microscopic digital images. All adapters facilitated the acquisition of digital microscopic images with a smartphone. The optimal adapter was dependent on the type of phone used. The Magnifi adapters for iPhone were incompatible when using a protective case. The Snapzoom adapter was easiest to use with iPhones and other smartphones even with protective cases. Smartphone adapters are inexpensive and easy to use for acquiring digital microscopic images. However, they require some adjustment by the user in order to optimize focus and obtain good quality images. Smartphone microscope adapters provide an economically feasible method of acquiring and sharing digital pathology photomicrographs.

  7. Simulation Research on Adaptive Control of a Six-degree-of-freedom Material-testing Machine

    Directory of Open Access Journals (Sweden)

    Dan Wang

    2014-02-01

    Full Text Available This paper presents an adaptive controller equipped with a stiffness estimation method for a novel material-testing machine, in order to alleviate the performance degradation caused by stiffness variation of the tested specimen. The dynamic model of the proposed machine is built using the Kane method, and the kinematic model is established with a closed-form solution. The stiffness estimation method is developed based on the recursive least-squares method and the proposed stiffness equivalent matrix. The control performance of the adaptive controller is simulated in detail. The simulation results illustrate that the proposed controller can greatly improve the control performance of the target material-testing machine through online stiffness estimation and adaptive parameter tuning, especially in low-cycle fatigue (LCF) and high-cycle fatigue (HCF) tests.

  8. Comparison of Rigid and Adaptive Methods of Propagating Gross Tumor Volume Through Respiratory Phases of Four-Dimensional Computed Tomography Image Data Set

    International Nuclear Information System (INIS)

    Ezhil, Muthuveni; Choi, Bum; Starkschall, George; Bucci, M. Kara; Vedam, Sastry; Balter, Peter

    2008-01-01

    Purpose: To compare three different methods of propagating the gross tumor volume (GTV) through the respiratory phases that constitute a four-dimensional computed tomography image data set. Methods and Materials: Four-dimensional computed tomography data sets of 20 patients who had undergone definitive hypofractionated radiotherapy to the lung were acquired. The GTV regions of interest (ROIs) were manually delineated on each phase of the four-dimensional computed tomography data set. The ROI from the end-expiration phase was propagated to the remaining nine phases of respiration using the following three techniques: (1) rigid image registration using in-house software, (2) rigid image registration using research software from a commercial radiotherapy planning system vendor, and (3) rigid image registration followed by deformable adaptation originally intended for organ-at-risk delineation using the same software. The internal GTVs generated from the various propagation methods were compared with the manual internal GTV using the normalized Dice similarity coefficient (DSC) index. Results: The normalized DSC index of 1.01 ± 0.06 (SD) for rigid propagation using the in-house software program was identical to the normalized DSC index of 1.01 ± 0.06 for rigid propagation achieved with the vendor's research software. Adaptive propagation yielded poorer results, with a normalized DSC index of 0.89 ± 0.10 (paired t test, p < 0.001). Conclusion: Propagation of the GTV ROIs through the respiratory phases using rigid-body registration is an acceptable method within a 1-mm margin of uncertainty. The adaptive organ-at-risk propagation method was not applicable to propagating GTV ROIs, resulting in an unacceptable reduction of the volume and distortion of the ROIs.
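
    The normalized DSC comparisons rest on the standard Dice overlap between binary masks, which is straightforward to compute (a sketch with toy volumes):

        import numpy as np

        def dice(a, b):
            """Dice similarity coefficient of two binary masks: 2|A∩B|/(|A|+|B|)."""
            a, b = a.astype(bool), b.astype(bool)
            denom = a.sum() + b.sum()
            return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

        # toy 3-D ROIs: a manual contour vs. a 1-voxel-shifted "propagated" one
        manual = np.zeros((40, 40, 40), dtype=bool)
        manual[10:30, 10:30, 10:30] = True
        propagated = np.roll(manual, shift=1, axis=0)
        print(f"DSC = {dice(manual, propagated):.3f}")   # close to, but below, 1.0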

  9. Adaptive tight frame based medical image reconstruction: a proof-of-concept study for computed tomography

    International Nuclear Information System (INIS)

    Zhou, Weifeng; Cai, Jian-Feng; Gao, Hao

    2013-01-01

    A popular approach for medical image reconstruction has been through sparsity regularization, assuming the targeted image can be well approximated by sparse coefficients under some properly designed system. The wavelet tight frame is such a widely used system, due to its capability for sparsely approximating piecewise-smooth functions such as medical images. However, using a fixed system may not always be optimal for reconstructing a variety of diversified images. Recently, methods based on adaptive over-complete dictionaries that are specific to structures of the targeted images have demonstrated their superiority for image processing. This work develops an adaptive wavelet tight frame method for image reconstruction. The proposed scheme first constructs an adaptive wavelet tight frame that is task specific, and then reconstructs the image of interest by solving an l1-regularized minimization problem using the constructed adaptive tight frame system. The proof-of-concept study is performed for computed tomography (CT), and the simulation results suggest that the adaptive tight frame method improves the reconstructed CT image quality over the traditional tight frame method.

  10. Automatic Delineation of On-Line Head-And-Neck Computed Tomography Images: Toward On-Line Adaptive Radiotherapy

    International Nuclear Information System (INIS)

    Zhang Tiezhi; Chi Yuwei; Meldolesi, Elisa; Yan Di

    2007-01-01

    Purpose: To develop and validate a fully automatic region-of-interest (ROI) delineation method for on-line adaptive radiotherapy. Methods and Materials: On-line adaptive radiotherapy requires a robust and automatic image segmentation method to delineate ROIs in on-line volumetric images. We have implemented an atlas-based image segmentation method to automatically delineate ROIs of head-and-neck helical computed tomography images. A total of 32 daily computed tomography images from 7 head-and-neck patients were delineated using this automatic image segmentation method. Manually drawn contours on the daily images were used as references in the evaluation of automatically delineated ROIs. Two methods were used in quantitative validation: (1) the dice similarity coefficient index, which indicates the overlapping ratio between the manually and automatically delineated ROIs; and (2) the distance transformation, which yields the distances between the manually and automatically delineated ROI surfaces. Results: Automatic segmentation showed agreement with manual contouring. For most ROIs, the dice similarity coefficient indexes were approximately 0.8. Similarly, the distance transformation evaluation results showed that the distances between the manually and automatically delineated ROI surfaces were mostly within 3 mm. The distances between two surfaces had a mean of 1 mm and standard deviation of <2 mm in most ROIs. Conclusion: With atlas-based image segmentation, it is feasible to automatically delineate ROIs on the head-and-neck helical computed tomography images in on-line adaptive treatments

  11. Evaluation of Computer Based Testing in lieu of Regular Examinations in Computer Literacy

    Science.gov (United States)

    Murayama, Koichi

    Because computer based testing (CBT) has many advantages compared with the conventional paper and pencil testing (PPT) examination method, CBT has begun to be used in various situations in Japan, such as in qualifying examinations and in the TOEFL. This paper describes the usefulness and the problems of CBT applied to a regular college examination. The regular computer literacy examinations for first-year students were held using CBT, and the results were analyzed. Responses to a questionnaire indicated that many students accepted CBT with no unpleasantness and considered CBT a positive factor that improved their motivation to study. CBT also decreased the work of faculty in terms of marking tests and data reduction.

  12. Laboratory tests for assessing adaptability and stickiness of dental composites.

    Science.gov (United States)

    Rosentritt, Martin; Buczovsky, Sebastian; Behr, Michael; Preis, Verena

    2014-09-01

    Handling (stickiness, adaptability) of a dental composite strongly influences the quality and success of a dental restoration. The purpose was to develop an in vitro test that allows for evaluating adaptability and stickiness. 15 dentists were asked to provide individual assessments (school scores 1-6) of five dental composites addressing adaptability and stickiness. Composites were applied with a dental plugger (d = 1.8 mm) in a class I cavity (human tooth 17). The tooth was fixed on a force gauge for simultaneous determination of application forces with varying storage (6/25°C) and application temperatures (6/25°C). On the basis of these data, tensile tests were performed with a dental plugger (application force 1 N/2 N; v = 35 mm/min) on PMMA or human tooth plates. Composite was dosed onto the tip of the plugger and applied. Application and unplugging were performed once, and unplugging forces (UF) and the length of the adhesive flags (LAF) were determined at different storage (6/25°C) and application temperatures (25/37°C). Unplugging work (UW) was calculated from the area of the UF and LAF data. The individual assessment revealed significantly different temperature-dependent application forces between 0.58 N and 2.23 N. Adaptability was assessed between 2.1 and 2.8 school scores. Stickiness varied significantly between the materials (scores: 2-3.2). UW differed significantly between the materials, with values between 3.20 N·mm and 37.83 N·mm. Between PMMA or tooth substrates, and between 1 N or 2 N application force, only small UW differences were found. The presented in vitro unplugging-work test allows for an in vitro estimation of the handling parameters adaptability and stickiness. Copyright © 2014 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.

  13. A Flexible Computational Framework Using R and Map-Reduce for Permutation Tests of Massive Genetic Analysis of Complex Traits.

    Science.gov (United States)

    Mahjani, Behrang; Toor, Salman; Nettelblad, Carl; Holmgren, Sverker

    2017-01-01

    In quantitative trait locus (QTL) mapping, the significance of putative QTL is often determined using permutation testing. The computational needs to calculate the significance level are immense; 10^4 up to 10^8 or even more permutations can be needed. We have previously introduced the PruneDIRECT algorithm for multiple QTL scans with epistatic interactions. This algorithm has specific strengths for permutation testing. Here, we present a flexible, parallel computing framework for identifying multiple interacting QTL using the PruneDIRECT algorithm, which uses the map-reduce model as implemented in Hadoop. The framework is implemented in R, a widely used software tool among geneticists. This enables users to rearrange algorithmic steps to adapt genetic models, search algorithms, and parallelization steps to their needs in a flexible way. Our work underlines the maturity of accessing distributed parallel computing for computationally demanding bioinformatics applications through building workflows within existing scientific environments. We investigate the PruneDIRECT algorithm, comparing its performance to exhaustive search and the DIRECT algorithm using our framework on a public cloud resource. We find that PruneDIRECT is vastly superior for permutation testing, performing 2×10^5 permutations for a 2D QTL problem in 15 hours using 100 cloud processes. We show that our framework scales out almost linearly for a 3D QTL search.
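
    The map-reduce structure of the permutation scan can be sketched in plain Python (multiprocessing standing in for Hadoop, and a simple marker-wise correlation scan standing in for PruneDIRECT):

        import numpy as np
        from multiprocessing import Pool

        def perm_batch(args):
            """Map step: max scan statistic for a batch of label permutations."""
            seed, n_perm, genotypes, phenotype = args
            rng = np.random.default_rng(seed)
            stats = np.empty(n_perm)
            for i in range(n_perm):
                y = rng.permutation(phenotype)
                # marker-wise cross-product as a simple QTL scan statistic
                r = (genotypes * (y - y.mean())).sum(axis=0)
                stats[i] = np.abs(r).max()
            return stats

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            genotypes = rng.integers(0, 2, size=(200, 500)).astype(float)
            genotypes -= genotypes.mean(axis=0)       # center each marker
            phenotype = rng.normal(size=200)
            jobs = [(seed, 250, genotypes, phenotype) for seed in range(8)]
            with Pool(4) as pool:
                null_max = np.concatenate(pool.map(perm_batch, jobs))  # reduce
            print(f"5% genome-wide threshold: {np.quantile(null_max, 0.95):.2f}")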

  14. A new adaptive testing algorithm for shortening health literacy assessments

    Directory of Open Access Journals (Sweden)

    Currie Leanne M

    2011-08-01

    Full Text Available Abstract Background Low health literacy has a detrimental effect on health outcomes, as well as ability to use online health resources. Good health literacy assessment tools must be brief to be adopted in practice; test development from the perspective of item-response theory requires pretesting on large participant populations. Our objective was to develop a novel classification method for developing brief assessment instruments that does not require pretesting on large numbers of research participants, and that would be suitable for computerized adaptive testing. Methods We present a new algorithm that uses principles of measurement decision theory (MDT) and Shannon's information theory. As a demonstration, we applied it to a secondary analysis of data sets from two assessment tests: a study that measured patients' familiarity with health terms (52 participants, 60 items) and a study that assessed health numeracy (165 participants, 8 items). Results In the familiarity data set, the method correctly classified 88.5% of the subjects, and the average length of test was reduced by about 50%. In the numeracy data set, for a two-class classification scheme, 96.9% of the subjects were correctly classified with a more modest reduction in test length of 35.7%; a three-class scheme correctly classified 93.8% with a 17.7% reduction in test length. Conclusions MDT-based approaches are a promising alternative to approaches based on item-response theory, and are well-suited for computerized adaptive testing in the health domain.
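
    A minimal version of an MDT classifier with an entropy-based stopping rule (two classes and known per-class response probabilities are assumptions of this sketch, not the published algorithm):

        import numpy as np

        def mdt_classify(responses, p_correct, prior=(0.5, 0.5), stop_entropy=0.2):
            """Measurement-decision-theory adaptive classification.

            p_correct[c][i]: probability that a member of class c answers item i
            correctly (assumed known from expert judgment or small calibration).
            Items are administered in order; testing stops once the Shannon
            entropy of the class posterior drops below `stop_entropy` bits.
            """
            post = np.array(prior, dtype=float)
            for i, r in enumerate(responses):
                like = np.array([p[i] if r else 1.0 - p[i] for p in p_correct])
                post = like * post
                post /= post.sum()                          # Bayes update
                entropy = -(post * np.log2(post + 1e-12)).sum()
                if entropy < stop_entropy:
                    return int(post.argmax()), i + 1, post  # class, items used
            return int(post.argmax()), len(responses), post

        p_correct = [np.full(20, 0.8),     # "adequate literacy" class
                     np.full(20, 0.35)]    # "low literacy" class
        responses = [1, 1, 0, 1, 1, 1, 1, 0, 1, 1]
        cls, used, post = mdt_classify(responses, p_correct)
        print(f"class {cls} after {used} items, posterior {post.round(3)}")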

  15. Capabilities of wind tunnels with two-adaptive walls to minimize boundary interference in 3-D model testing

    Science.gov (United States)

    Rebstock, Rainer; Lee, Edwin E., Jr.

    1989-01-01

    An initial wind tunnel test was made to validate a new wall adaptation method for 3-D models in test sections with two adaptive walls. The first part of the adaptation strategy is an on-line assessment of wall interference at the model position. The wall-induced blockage was very small at all test conditions. Lift interference occurred at higher angles of attack with the walls set aerodynamically straight. The adaptation of the top and bottom tunnel walls is aimed at achieving a correctable flow condition. The blockage was virtually zero throughout the wing planform after the wall adjustment. The lift curve measured with the walls adapted agreed very well with interference-free data for Mach 0.7, regardless of the vertical position of the wing in the test section. The 2-D wall adaptation can significantly improve the correctability of 3-D model data. Nevertheless, residual spanwise variations of wall interference are inevitable.

  16. An adaptive multi-spline refinement algorithm in simulation based sailboat trajectory optimization using onboard multi-core computer systems

    Directory of Open Access Journals (Sweden)

    Dębski Roman

    2016-06-01

    Full Text Available A new dynamic programming based parallel algorithm adapted to on-board heterogeneous computers for simulation based trajectory optimization is studied in the context of “high-performance sailing”. The algorithm uses a new discrete space of continuously differentiable functions called the multi-splines as its search space representation. A basic version of the algorithm is presented in detail (pseudo-code, time and space complexity, search space auto-adaptation properties). Possible extensions of the basic algorithm are also described. The presented experimental results show that contemporary heterogeneous on-board computers can be effectively used for solving simulation based trajectory optimization problems. These computers can be considered micro high performance computing (HPC) platforms: they offer high performance while remaining energy and cost efficient. The simulation based approach can potentially give highly accurate results since the mathematical model that the simulator is built upon may be as complex as required. The approach described is applicable to many trajectory optimization problems due to its black-box represented performance measure and use of OpenCL.

  17. Computer-Based English Language Testing in China: Present and Future

    Science.gov (United States)

    Yu, Guoxing; Zhang, Jing

    2017-01-01

    In this special issue on high-stakes English language testing in China, the two articles on computer-based testing (Jin & Yan; He & Min) highlight a number of consistent, ongoing challenges and concerns in the development and implementation of the nationwide IB-CET (Internet Based College English Test) and institutional computer-adaptive…

  18. Effect of Preparation Depth on the Marginal and Internal Adaptation of Computer-aided Design/Computer-assisted Manufacture Endocrowns.

    Science.gov (United States)

    Gaintantzopoulou, M D; El-Damanhoury, H M

    The aim of the study was to evaluate the effect of preparation depth and intraradicular extension on the marginal and internal adaptation of computer-aided design/computer-assisted manufacture (CAD/CAM) endocrown restorations. Standardized preparations were made in resin endodontic tooth models (Nissin Dental), with an intracoronal preparation depth of 2 mm (group H2), with extra 1- (group H3) or 2-mm (group H4) intraradicular extensions in the root canals (n=12). Vita Enamic polymer-infiltrated ceramic-network material endocrowns were fabricated using the CEREC AC CAD/CAM system and were seated on the prepared teeth. Specimens were evaluated by microtomography. Horizontal and vertical tomographic sections were recorded and reconstructed by using the CTSkan software (TView v1.1, Skyscan). The surface/void volume (S/V) in the region of interest was calculated. Marginal gap (MG), absolute marginal discrepancy (MD), and internal marginal gap were measured at various measuring locations and calculated in microscale (μm). Marginal and internal discrepancy data (μm) were analyzed with nonparametric Kruskal-Wallis analysis of variance by ranks with Dunn's post hoc test, whereas S/V data were analyzed by one-way analysis of variance and Bonferroni multiple comparisons (α=0.05). Significant differences were found in MG, MD, and internal gap width values between the groups, with H2 showing the lowest values of all groups. S/V calculations presented significant differences between H2 and the other two groups (H3 and H4) tested, with H2 again showing the lowest values. Increasing the intraradicular extension increased the marginal and internal gaps of endocrown restorations.

  19. Test and control computer user's guide for a digital beam former test system

    Science.gov (United States)

    Alexovich, Robert E.; Mallasch, Paul G.

    1992-01-01

    A Digital Beam Former Test System was developed to determine the effects of noise, interferers and distortions, and of digital implementations of beam forming as applied to the Tracking and Data Relay Satellite 2 (TDRS 2) architectures. The investigation of digital beam forming with application to TDRS 2 architectures, as described in TDRS 2 advanced concept design studies, was conducted by the NASA/Lewis Research Center for NASA/Goddard Space Flight Center. A Test and Control Computer (TCC) was used as the main controlling element of the Digital Beam Former Test System. The Test and Control Computer User's Guide for a Digital Beam Former Test System provides an organized description of the Digital Beam Former Test System commands. It is written for users who wish to conduct tests of the Digital Beam Forming Test processor using the TCC. The document describes the function, use, and syntax of the TCC commands available to the user, while summarizing and demonstrating the use of the commands within DOS batch files.

  20. A semiautomated computer-interactive dynamic impact testing system

    International Nuclear Information System (INIS)

    Alexander, D.J.; Nanstad, R.K.; Corwin, W.R.; Hutton, J.T.

    1989-01-01

    A computer-assisted semiautomated system has been developed for testing a variety of specimen types under dynamic impact conditions. The primary use of this system is for the testing of Charpy specimens. Full-, half-, and third-size specimens have been tested, both in the lab and remotely in a hot cell for irradiated specimens. Specimens are loaded into a transfer device which moves the specimen into a chamber, where a hot air gun is used to heat the specimen, or cold nitrogen gas is used for cooling, as required. The specimen is then quickly transferred from the furnace to the anvils and then broken. This system incorporates an instrumented tup to determine the change in voltage during the fracture process. These data are analyzed by the computer system after the test is complete. The voltage-time trace is recorded with a digital oscilloscope, transferred to the computer, and analyzed. The analysis program incorporates several unique features. It interacts with the operator and identifies the maximum voltage during the test, the amount of rapid fracture during the test (if any), and the end of the fracture process. The program then calculates the area to maximum voltage and the total area under the voltage-time curve. The data acquisition and analysis part of the system can also be used to conduct other dynamic testing. Dynamic tear and precracked specimens can be tested with an instrumented tup and analyzed in a similar manner. 3 refs., 7 figs

  1. Computationally efficient video restoration for Nyquist sampled imaging sensors combining an affine-motion-based temporal Kalman filter and adaptive Wiener filter.

    Science.gov (United States)

    Rucci, Michael; Hardie, Russell C; Barnard, Kenneth J

    2014-05-01

    In this paper, we present a computationally efficient video restoration algorithm to address both blur and noise for a Nyquist sampled imaging system. The proposed method utilizes a temporal Kalman filter followed by a correlation-model based spatial adaptive Wiener filter (AWF). The Kalman filter employs an affine background motion model and novel process-noise variance estimate. We also propose and demonstrate a new multidelay temporal Kalman filter designed to more robustly treat local motion. The AWF is a spatial operation that performs deconvolution and adapts to the spatially varying residual noise left in the Kalman filter stage. In image areas where the temporal Kalman filter is able to provide significant noise reduction, the AWF can be aggressive in its deconvolution. In other areas, where less noise reduction is achieved with the Kalman filter, the AWF balances the deconvolution with spatial noise reduction. In this way, the Kalman filter and AWF work together effectively, but without the computational burden of full joint spatiotemporal processing. We also propose a novel hybrid system that combines a temporal Kalman filter and BM3D processing. To illustrate the efficacy of the proposed methods, we test the algorithms on both simulated imagery and video collected with a visible camera.
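
    The division of labor, temporal recursion first and spatial adaptation to the residual noise second, can be shown in a heavily simplified per-pixel form (this sketch assumes a static background with no motion model and replaces the paper's deconvolving AWF with plain local Wiener shrinkage):

        import numpy as np
        from scipy.ndimage import uniform_filter

        def temporal_kalman(frames, q=1e-4, r=1e-2):
            """Per-pixel scalar Kalman filter (static background, no motion)."""
            est = frames[0].astype(float)
            p = np.full_like(est, r)
            for z in frames[1:]:
                p = p + q                     # predict (random-walk state model)
                k = p / (p + r)               # Kalman gain
                est = est + k * (z - est)     # measurement update
                p = (1.0 - k) * p
            return est

        def adaptive_wiener(img, noise_var, win=5):
            """Lee-style local Wiener shrinkage toward the local mean."""
            mean = uniform_filter(img, win)
            sq_mean = uniform_filter(img * img, win)
            local_var = np.maximum(sq_mean - mean ** 2, 1e-12)
            gain = np.maximum(local_var - noise_var, 0.0) / local_var
            return mean + gain * (img - mean)

        rng = np.random.default_rng(2)
        clean = np.kron(rng.random((16, 16)), np.ones((8, 8)))   # blocky scene
        frames = [clean + rng.normal(0, 0.1, clean.shape) for _ in range(10)]
        restored = adaptive_wiener(temporal_kalman(frames),
                                   noise_var=0.1 ** 2 / 10)
        print(f"RMSE: noisy {np.sqrt(np.mean((frames[-1] - clean) ** 2)):.4f}, "
              f"restored {np.sqrt(np.mean((restored - clean) ** 2)):.4f}")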

  2. Automatically tuned adaptive differencing algorithm for 3-D SN implemented in PENTRAN

    International Nuclear Information System (INIS)

    Sjoden, G.; Courau, T.; Manalo, K.; Yi, C.

    2009-01-01

    We present an adaptive algorithm with an automated tuning feature to augment optimum differencing scheme selection for 3-D SN computations in Cartesian geometry. This adaptive differencing scheme has been implemented in the PENTRAN parallel SN code. Individual fixed zeroth-spatial-transport-moment based schemes, including the Diamond Zero (DZ), Directional Theta Weighted (DTW), and Exponential Directional Iterative (EDI) 3-D SN methods, were evaluated and compared with solutions generated using a code-tuned adaptive algorithm. Model problems considered include a fixed source slab problem (using reflected y- and z-axes) which contained mixed shielding and diffusive regions, and a 17 x 17 PWR assembly eigenvalue test problem; these problems were benchmarked against multigroup MCNP5 Monte Carlo computations. Both problems were effective in highlighting the performance of the adaptive scheme compared to single schemes, and demonstrated that the adaptive tuning handles exceptions to the standard DZ-DTW-EDI adaptive strategy. The tuning feature includes special scheme selection provisions for optically thin cells, and incorporates the ratio of the angular source density relative to the total angular collision density to best select the differencing method. Overall, the adaptive scheme demonstrated the best overall solution accuracy in the test problems. (authors)

  3. Translation and cultural adaptation of the Aguado Syntax Test (AST) into Brazilian Portuguese.

    Science.gov (United States)

    Baggio, Gustavo Inheta; Hage, Simone Rocha de Vasconcellos

    2017-12-07

    To perform the translation and cultural adaptation of the Aguado Syntax Test (AST) into Brazilian Portuguese, considering the linguistic and cultural reality of the language. The AST assesses early morphosyntactic development in children aged 3 to 7 in terms of understanding and expression of various types of structures such as sentences, pronouns, verbal voices, comparisons, prepositions and verbal inflection as to number, mood and tense. The process of translation and cultural adaptation followed four steps: 1) preparation of two translations; 2) synthesis of consensual translations; 3) backtranslation; and 4) verification of equivalence between the initial translations and backtranslations, which resulted in the final translated version. The whole process of translation and cultural adaptation revealed equivalence and reconciliation of the translated items, an almost complete semantic equivalence between the two translations, and an absence of consistent translation difficulties. The AST was translated and culturally adapted into Brazilian Portuguese, constituting the first step towards validation and standardization of the test.

  4. Translation, Cultural Adaptation and Validation of the Simple Shoulder Test to Spanish

    Science.gov (United States)

    Arcuri, Francisco; Barclay, Fernando; Nacul, Ivan

    2015-01-01

    Background: The validation of widely used scales facilitates comparison across international patient samples. Objective: The objective was to translate, culturally adapt and validate the Simple Shoulder Test into Argentinian Spanish. Methods: The Simple Shoulder Test was translated from English into Argentinian Spanish by two independent translators, translated back into English, and evaluated for accuracy by an expert committee to correct possible discrepancies. It was then administered to 50 patients with different shoulder conditions. Psychometric properties were analyzed, including internal consistency, measured with Cronbach's alpha, and test-retest reliability at 15 days, measured with the intraclass correlation coefficient. Results: The internal consistency was good, with a Cronbach's alpha of 0.808. The test-retest reliability index as measured by the intraclass correlation coefficient (ICC) was 0.835, evaluated as excellent. Conclusion: The Simple Shoulder Test translation and its cultural adaptation to Argentinian Spanish demonstrated adequate internal reliability and validity, ultimately allowing for its use in the comparison with international patient samples.

  5. Usability of an adaptive computer assistant that improves self-care and health literacy of older adults

    NARCIS (Netherlands)

    Blanson Henkemans, O.A.; Rogers, W.A.; Fisk, A.D.; Neerincx, M.A.; Lindenberg, J.; Mast, C.A.P.G. van der

    2008-01-01

    Objectives: We developed an adaptive computer assistant for the supervision of diabetics' self-care, to support limiting illness and need for acute treatment, and improve health literacy. This assistant monitors self-care activities logged in the patient's electronic diary. Accordingly, it provides

  6. Computer Based Test for Admission Selection at Politeknik Negeri Bengkalis

    Directory of Open Access Journals (Sweden)

    Agus Tedyyana

    2017-11-01

    Full Text Available Selection of new student candidates can be carried out with a Computer Based Test (CBT) application. The methods used include data collection techniques, system analysis, design modeling, implementation and testing. This research produced a CBT application in which questions drawn from the question bank are randomized with the Fisher-Yates Shuffle method, so that the same question is never presented twice. To secure the question data while it is connected to the network, the RSA cryptographic algorithm is used to encrypt and decrypt the questions before they are displayed. The software was designed with the waterfall model, the database with an entity relationship diagram, and the interface with hypertext markup language (HTML), Cascading Style Sheets (CSS) and jQuery; the system was implemented as a web application using the PHP programming language and a MySQL database. The network architecture used by the Computer Based Test application is a client-server model on a Local Area Network (LAN). Keywords: Computer Based Test, Fisher-Yates Shuffle, Cryptography, Local Area Network
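
    The Fisher-Yates Shuffle named here is a standard algorithm, so a minimal sketch is easy to give: shuffling the bank uniformly and taking the first n items yields a question set with no repeats by construction.

      import random

      def fisher_yates_shuffle(items):
          """In-place Fisher-Yates shuffle: uniform over all permutations."""
          a = list(items)
          for i in range(len(a) - 1, 0, -1):
              j = random.randint(0, i)       # 0 <= j <= i
              a[i], a[j] = a[j], a[i]
          return a

      def draw_questions(bank, n):
          """Draw n distinct questions from the bank."""
          return fisher_yates_shuffle(bank)[:n]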

  7. Temperature dependence of magnetic descriptors of Magnetic Adaptive Testing

    Czech Academy of Sciences Publication Activity Database

    Vértesy, G.; Uchimoto, T.; Tomáš, Ivan; Takagi, T.

    2010-01-01

    Roč. 46, č. 2 (2010), s. 509-512 ISSN 0018-9464 R&D Projects: GA ČR GA101/09/1323; GA AV ČR 1QS100100508 Institutional research plan: CEZ:AV0Z10100520 Keywords : magnetic NDE * magnetic adaptive testing * magnetic hysteresis Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 1.052, year: 2010

  8. Automated testing of health physics instruments

    International Nuclear Information System (INIS)

    Swinth, K.L.; Endres, A.W.; Hadley, R.T.; Kenoyer, J.L.

    1983-12-01

    A microcomputer-controlled CAMAC system has been adapted for automated testing of health physics survey instruments. Once the survey instrument is positioned, the system automatically performs tests for angular dependence or battery lifetime. Rotation of the instrument is performed by a computer-controlled stepping motor, while readout is performed by an autoranging digital voltmeter, and the data are stored on computer disks.

  9. Computer-Aided Test Flow in Core-Based Design

    NARCIS (Netherlands)

    Zivkovic, V.; Tangelder, R.J.W.T.; Kerkhoff, Hans G.

    2000-01-01

    This paper addresses efficient test-pattern generation in a core-based design. A consistent Computer-Aided Test (CAT) flow is proposed based on the required core-test strategy. It generates a test-pattern set for the embedded cores with high fault coverage and low DfT area overhead. The CAT flow has been applied to a few cores within the Philips Core Test Pilot IC project.

  10. Students' Attitude toward and Acceptability of Computerized Adaptive Testing in Medical School and their Effect on the Examinees' Ability

    Directory of Open Access Journals (Sweden)

    Mee Young Kim

    2005-06-01

    Full Text Available An examinee's ability can be evaluated precisely using computerized adaptive testing (CAT), which is shorter than written tests and more efficient in terms of the duration of the examination. We used CAT for the second General Examination of 98 senior students in medical college on November 27, 2004. We prepared 1,050 pre-calibrated test items according to item response theory, which had been used for the General Examination administered to senior students in 2003. The computer was programmed to pose questions until the standard error of the ability estimate was smaller than 0.01. To determine the students' attitude toward and evaluation of CAT, we conducted surveys before and after the examination via the Web. The mean of the students' ability estimates was 0.3513 and its standard deviation was 0.9097 (range -2.4680 to +2.5310). There was no significant difference in the ability estimates according to the responses of students to items concerning their experience with CAT, their ability to use a computer, or their anxiety before and after the examination (p>0.05). Many students were unhappy that they could not recheck their responses (49%), and some stated that there were too few examination items (24%). Of the students, 79% had no complaints concerning using a computer and 63% wanted to expand the use of CAT. These results indicate that CAT can be implemented in medical schools without causing difficulties for users.
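
    A compact sketch of the adaptive loop described here: administer the most informative remaining item, re-estimate ability, and stop once the standard error of the ability estimate falls below the threshold (0.01 in this study). The 2PL response model and the grid-based Bayesian (EAP) estimator are illustrative assumptions, not necessarily the configuration the authors used.

      import numpy as np

      def p_correct(theta, a, b):
          """2PL item response function (model choice is an assumption)."""
          return 1.0 / (1.0 + np.exp(-a * (theta - b)))

      def cat_session(answer, a, b, se_target=0.01):
          """answer(i) -> 0/1 response to item i; a, b are item parameter arrays."""
          grid = np.linspace(-4, 4, 161)
          post = np.exp(-0.5 * grid**2)          # N(0,1) prior on ability
          unused = set(range(len(a)))
          theta = se = None
          while unused:
              theta = (grid * post).sum() / post.sum()
              se = np.sqrt(((grid - theta)**2 * post).sum() / post.sum())
              if se < se_target:                 # stopping rule from the abstract
                  break
              p = p_correct(theta, a, b)         # Fisher information at theta
              info = a**2 * p * (1.0 - p)
              i = max(unused, key=lambda k: info[k])
              unused.remove(i)
              pg = p_correct(grid, a[i], b[i])
              post *= pg if answer(i) == 1 else (1.0 - pg)   # Bayesian update
          return theta, se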

  11. Using Tests Designed to Measure Individual Sensorimotor Subsystem Performance to Predict Locomotor Adaptability

    Science.gov (United States)

    Peters, B. T.; Caldwell, E. E.; Batson, C. D.; Guined, J. R.; DeDios, Y. E.; Stepanyan, V.; Gadd, N. E.; Szecsy, D. L.; Mulavara, A. P.; Seidler, R. D.

    2014-01-01

    Astronauts experience sensorimotor disturbances during the initial exposure to microgravity and during the readaptation phase following a return to a gravitational environment. These alterations may lead to disruption in the ability to perform mission-critical functions during and after these gravitational transitions. Astronauts show significant inter-subject variation in adaptive capability following gravitational transitions. The way each individual's brain synthesizes the available visual, vestibular and somatosensory information is likely the basis for much of the variation. Identifying the presence of biases in each person's use of information available from these sensorimotor subsystems and relating it to their ability to adapt to a novel locomotor task will allow us to customize a training program designed to enhance sensorimotor adaptability. Eight tests are being used to measure sensorimotor subsystem performance. Three of these use measures of body sway to characterize balance during varying sensorimotor challenges. The effect of vision is assessed by repeating conditions with eyes open and eyes closed. Standing on foam, or on a support surface that pitches to maintain a constant ankle angle, provides somatosensory challenges. Information from the vestibular system is isolated when vision is removed and the support surface is compromised, and it is challenged when the tasks are done while the head is in motion. The integration and dominance of visual information is assessed in three additional tests. The Rod & Frame Test measures the degree to which a subject's perception of the visual vertical is affected by the orientation of a tilted frame in the periphery. Locomotor visual dependence is determined by assessing how much an oscillating virtual visual world affects a treadmill-walking subject. In the third of the visual manipulation tests, subjects walk an obstacle course while wearing up-down reversing prisms. The two remaining tests include direct

  12. Independent validation testing of the FLAME computer code, Version 1.0

    International Nuclear Information System (INIS)

    Martian, P.; Chung, J.N.

    1992-07-01

    Independent testing of the FLAME computer code, Version 1.0, was conducted to determine if the code is ready for use in hydrological and environmental studies at Department of Energy sites. This report describes the technical basis, approach, and results of this testing. Validation tests (i.e., tests which compare field data to the computer-generated solutions) were used to determine the operational status of the FLAME computer code and were done on a qualitative basis through graphical comparisons of the experimental and numerical data. These tests were specifically designed to check: (1) correctness of the FORTRAN coding, (2) computational accuracy, and (3) suitability to simulating actual hydrologic conditions. This testing was performed using a structured evaluation protocol which consisted of: (1) independent applications, and (2) graduated difficulty of test cases. Three tests were used, ranging in complexity from simple one-dimensional steady-state flow field problems under near-saturated conditions to two-dimensional transient flow problems with very dry initial conditions.

  13. Detection of User Independent Single Trial ERPs in Brain Computer Interfaces: An Adaptive Spatial Filtering Approach

    DEFF Research Database (Denmark)

    Leza, Cristina; Puthusserypady, Sadasivan

    2017-01-01

    Brain Computer Interfaces (BCIs) use brain signals to communicate with the external world. The main challenges to address are speed, accuracy and adaptability. Here, a novel algorithm for a P300-based BCI spelling system is presented, specifically suited for single-trial detection of Event-Related Potentials (ERPs).

  14. Improving Inpatient Surveys: Web-Based Computer Adaptive Testing Accessed via Mobile Phone QR Codes.

    Science.gov (United States)

    Chien, Tsair-Wei; Lin, Weir-Sen

    2016-03-02

    The National Health Service (NHS) 70-item inpatient questionnaire surveys inpatients on their perceptions of their hospitalization experience. However, it imposes more burden on the patient than other similar surveys. The literature shows that computerized adaptive testing (CAT) based on item response theory can help shorten the item length of a questionnaire without compromising its precision. Our aim was to investigate whether CAT can be (1) efficient with item reduction and (2) used with quick response (QR) codes scanned by mobile phones. After downloading the 2008 inpatient survey data from the Picker Institute Europe website and analyzing the difficulties of this 70-item questionnaire, we used an author-made Excel program applying the Rasch partial credit model to simulate 1000 patients' true scores following a standard normal distribution. The CAT was compared to two other scenarios, answering all items (AAI) and the randomized selection method (RSM), as we investigated item length (efficiency) and measurement accuracy. The author-made Web-based CAT program for gathering patient feedback was effectively accessed from mobile phones by scanning the QR code. We found that the CAT required patients to answer fewer items than either AAI or RSM without compromising its measurement accuracy. A Web-based CAT inpatient survey accessed by scanning a QR code on a mobile phone was viable for gathering inpatient satisfaction responses. With advances in technology, patients can now be offered alternatives for providing feedback about hospitalization satisfaction. This Web-based CAT is a possible option in health care settings for reducing the number of survey items, as well as offering an innovative QR code access.

  15. Adaptive-Predictive Organ Localization Using Cone-Beam Computed Tomography for Improved Accuracy in External Beam Radiotherapy for Bladder Cancer

    International Nuclear Information System (INIS)

    Lalondrelle, Susan; Huddart, Robert; Warren-Oseni, Karole; Hansen, Vibeke Nordmark; McNair, Helen; Thomas, Karen; Dearnaley, David; Horwich, Alan; Khoo, Vincent

    2011-01-01

    Purpose: To examine patterns of bladder wall motion during high-dose hypofractionated bladder radiotherapy and to validate a novel adaptive planning method, A-POLO, to prevent subsequent geographic miss. Methods and Materials: Patterns of individual bladder filling were obtained with repeat computed tomography planning scans at 0, 15, and 30 minutes after voiding. A series of patient-specific plans corresponding to these time-displacement points was created. Pretreatment cone-beam computed tomography was performed before each fraction and assessed retrospectively for adaptive intervention. In fractions that would have required intervention, the most appropriate plan was chosen from the patient's 'library,' and the resulting target coverage was reassessed with repeat cone-beam computed tomography. Results: A large variation in patterns of bladder filling and interfraction displacement was seen. During radiotherapy, predominant translations occurred cranially (maximum 2.5 cm) and anteriorly (maximum 1.75 cm). No apparent explanation was found for this variation using pretreatment patient factors. A need for adaptive planning was demonstrated by 51% of fractions, and 73% of fractions would have been delivered correctly using A-POLO. The adaptive strategy improved target coverage and was able to account for intrafraction motion also. Conclusions: Bladder volume variation will result in geographic miss in a high proportion of delivered bladder radiotherapy treatments. The A-POLO strategy can be used to correct for this and can be implemented from the first fraction of radiotherapy; thus, it is particularly suited to hypofractionated bladder radiotherapy regimens.

  16. PA171 Containers on a Wood Pallet with Metal Top Adapter, Air Pressure Tests During MIL-STD-1660 Tests

    National Research Council Canada - National Science Library

    2004-01-01

    ... (PM-MAS) to conduct Air Pressure Tests during MIL-STD-1660, "Design Criteria for Ammunition Unit Loads" testing on the PA171 containers on a wood pallet with metal top adapter as manufactured by Alliant Tech...

  17. Adaptive Test Schemes for Control of Paratuberculosis in Dairy Cows.

    Directory of Open Access Journals (Sweden)

    Carsten Kirkeby

    Full Text Available Paratuberculosis is a chronic infection that in dairy cattle causes reduced milk yield, weight loss, and ultimately fatal diarrhea. Subclinical animals can excrete bacteria (Mycobacterium avium ssp. paratuberculosis, MAP) in feces and infect other animals. Farmers identify the infectious animals through a variety of test strategies, but are challenged by the lack of perfect tests. Frequent testing increases the sensitivity, but the costs of testing are a cause of concern for farmers. Here, we used a herd simulation model based on milk ELISA tests to evaluate the epidemiological and economic consequences of continuously adapting the sampling interval in response to the estimated true prevalence in the herd. The key results were that the true prevalence was greatly affected by the hygiene level and to some extent by the test frequency. Furthermore, the choice of prevalence that will be tolerated in a control scenario had a major impact on the true prevalence in the normal hygiene setting, but less so when the hygiene was poor. The net revenue is not greatly affected by the test strategy, because of the general variation in net revenues between farms. An exception to this is the low-hygiene herd, where frequent testing results in lower revenue. The probability of eradication is correlated with the testing frequency and the target prevalence during the control phase; it is low in the low-hygiene herd, and a test-and-cull strategy should probably not be the primary strategy in such a herd. Based on this study we suggest that, in order to control MAP, the standard Danish dairy farm should use an adaptive strategy where a short sampling interval of three months is used when the estimated true prevalence is above 1%, and otherwise a long sampling interval of one year.
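
    The adaptive rule suggested here is simple enough to state directly; the 1% trigger and the three-month/one-year intervals come from the abstract.

      def next_sampling_interval_months(estimated_true_prevalence):
          """Adaptive milk-ELISA scheme: sample every 3 months while the
          estimated true prevalence exceeds 1%, otherwise once a year."""
          return 3 if estimated_true_prevalence > 0.01 else 12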

  18. Adaptive hybrid brain-computer interaction: ask a trainer for assistance!

    Science.gov (United States)

    Müller-Putz, Gernot R; Steyrl, David; Faller, Josef

    2014-01-01

    In applying mental imagery brain-computer interfaces (BCIs) to end users, training is a key part of enabling novice users to gain control. In general learning situations, it is an established concept that a trainer assists a trainee to improve his/her aptitude in certain skills. In this work, we wanted to evaluate whether this concept can be applied in the context of event-related desynchronization (ERD) based, adaptive, hybrid BCIs. Hence, in a first session we merged the features of a high-aptitude BCI user, the trainer, and a novice user, the trainee, in a closed-loop BCI feedback task and automatically adapted the classifier over time. In a second session the trainees operated the system unassisted. Twelve healthy participants ran through this protocol. Along with the trainer, the trainees achieved a very high overall peak accuracy of 95.3%. In the second session, where users operated the BCI unassisted, they still achieved a high overall peak accuracy of 83.6%. Ten of twelve first-time BCI users achieved significantly better than chance accuracy. In conclusion, this trainer-trainee approach is very promising. Future research should investigate whether it is superior to conventional training approaches. This trainer-trainee concept could have potential for future application of BCIs to end users.

  19. Flight Test of an L(sub 1) Adaptive Controller on the NASA AirSTAR Flight Test Vehicle

    Science.gov (United States)

    Gregory, Irene M.; Xargay, Enric; Cao, Chengyu; Hovakimyan, Naira

    2010-01-01

    This paper presents results of a flight test of the L-1 adaptive control architecture designed to directly compensate for significant uncertain cross-coupling in nonlinear systems. The flight test was conducted on the subscale turbine powered Generic Transport Model that is an integral part of the Airborne Subscale Transport Aircraft Research system at the NASA Langley Research Center. The results presented are for piloted tasks performed during the flight test.

  20. Development of a computerized adaptive test for Schizotypy assessment.

    Directory of Open Access Journals (Sweden)

    Eduardo Fonseca-Pedrero

    Full Text Available BACKGROUND: Schizotypal traits in adolescents from the general population represent the behavioral expression of liability for psychotic disorders. Schizotypy assessment in this sector of the population has advanced considerably in the last few years; however, it is necessary to incorporate recent advances in psychological and educational measurement. OBJECTIVE: The main goal of this study was to develop a Computerized Adaptive Test (CAT) to evaluate schizotypy through "The Oviedo Questionnaire for Schizotypy Assessment" (ESQUIZO-Q) in non-clinical adolescents. METHODS: The final sample consisted of 3,056 participants, 1,469 males, with a mean age of 15.9 years (SD = 1.2). RESULTS: The results indicated that the ESQUIZO-Q scores presented adequate psychometric properties under both Classical Test Theory and Item Response Theory. The Information Function estimated using the Graded Response Model indicated that the item pool effectively assesses schizotypy at the high end of the latent trait. The correlation between the CAT total scores and the paper-and-pencil test was 0.92. The mean number of items presented in the CAT, with the standard error fixed at ≤ 0.30, was 34. CONCLUSION: The CAT showed adequate psychometric properties for schizotypy assessment in the general adolescent population. The ESQUIZO-Q adaptive version could be used as a screening method for the detection of adolescents at risk for psychosis in both educational and mental health settings.

  1. Adaptive designs for the one-sample log-rank test.

    Science.gov (United States)

    Schmidt, Rene; Faldum, Andreas; Kwiecien, Robert

    2017-09-22

    Traditional designs in phase IIa cancer trials are single-arm designs with a binary outcome, for example, tumor response. In some settings, however, a time-to-event endpoint might appear more appropriate, particularly in the presence of loss to follow-up. Then the one-sample log-rank test might be the method of choice. It allows the survival curve of the patients under treatment to be compared to a prespecified reference survival curve, which usually represents the expected survival under standard of care. In this work, convergence of the one-sample log-rank statistic to Brownian motion is proven using Rebolledo's martingale central limit theorem while accounting for staggered entry times of the patients. On this basis, a confirmatory adaptive one-sample log-rank test is proposed in which provision is made for data-dependent sample size reassessment. The focus is on applying the inverse normal method, which is done in two different ways. The first strategy exploits the independent increments property of the one-sample log-rank statistic. The second strategy is based on the patient-wise separation principle. It is shown by simulation that the proposed adaptive test might help to rescue an underpowered trial and at the same time lower the average sample number (ASN) under the null hypothesis as compared to a single-stage fixed sample design. © 2017, The International Biometric Society.
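
    The inverse normal method combines stage-wise p-values through weighted normal quantiles; a generic two-stage sketch follows, in which the equal weights and one-sided testing are assumptions rather than the paper's exact design.

      import numpy as np
      from scipy.stats import norm

      def inverse_normal_combination(p1, p2, w1=np.sqrt(0.5), w2=np.sqrt(0.5)):
          """Combine two stage-wise one-sided p-values (w1^2 + w2^2 = 1)."""
          z = w1 * norm.isf(p1) + w2 * norm.isf(p2)   # isf(p) = Phi^-1(1 - p)
          return norm.sf(z)                           # combined one-sided p-value

      # Example: combined p-value from stage-wise p-values 0.04 and 0.01.
      print(inverse_normal_combination(0.04, 0.01))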

  2. Pipelining Computational Stages of the Tomographic Reconstructor for Multi-Object Adaptive Optics on a Multi-GPU System

    KAUST Repository

    Charara, Ali; Ltaief, Hatem; Gratadour, Damien; Keyes, David E.; Sevin, Arnaud; Abdelfattah, Ahmad; Gendron, Eric; Morel, Carine; Vidal, Fabrice

    2014-01-01

    An instrument called MOSAIC has been proposed to perform multi-object spectroscopy using the Multi-Object Adaptive Optics (MOAO) technique. The core implementation of the simulation lies in the intensive computation of a tomographic reconstructor (TR), which is used

  3. THE PLUTO CODE FOR ADAPTIVE MESH COMPUTATIONS IN ASTROPHYSICAL FLUID DYNAMICS

    International Nuclear Information System (INIS)

    Mignone, A.; Tzeferacos, P.; Zanni, C.; Bodo, G.; Van Straalen, B.; Colella, P.

    2012-01-01

    We present a description of the adaptive mesh refinement (AMR) implementation of the PLUTO code for solving the equations of classical and special relativistic magnetohydrodynamics (MHD and RMHD). The current release exploits, in addition to the static grid version of the code, the distributed infrastructure of the CHOMBO library for multidimensional parallel computations over block-structured, adaptively refined grids. We employ a conservative finite-volume approach where primary flow quantities are discretized at the cell center in a dimensionally unsplit fashion using the Corner Transport Upwind method. Time stepping relies on a characteristic tracing step where piecewise parabolic method, weighted essentially non-oscillatory, or slope-limited linear interpolation schemes can be handily adopted. A characteristic decomposition-free version of the scheme is also illustrated. The solenoidal condition of the magnetic field is enforced by augmenting the equations with a generalized Lagrange multiplier providing propagation and damping of divergence errors through a mixed hyperbolic/parabolic explicit cleaning step. Among the novel features, we describe an extension of the scheme to include non-ideal dissipative processes, such as viscosity, resistivity, and anisotropic thermal conduction without operator splitting. Finally, we illustrate an efficient treatment of point-local, potentially stiff source terms over hierarchical nested grids by taking advantage of the adaptivity in time. Several multidimensional benchmarks and applications to problems of astrophysical relevance assess the potentiality of the AMR version of PLUTO in resolving flow features separated by large spatial and temporal disparities.

  4. Test bank to accompany Computers data and processing

    CERN Document Server

    Deitel, Harvey M

    1980-01-01

    Test Bank to Accompany Computers and Data Processing provides a variety of questions from which instructors can easily custom-tailor exams appropriate for their particular courses. This book contains over 4000 short-answer questions that span the full range of topics for an introductory computing course. The book is organized into five parts encompassing 19 chapters. It provides a very large number of questions so that instructors can produce different exams testing essentially the same topics in succeeding semesters. Three types of questions are included in this book, including multiple choice

  5. A Novel Adaptive Discrete Cuckoo Search Algorithm for parameter optimization in computer vision

    Directory of Open Access Journals (Sweden)

    Loubna Benchikhi

    2017-10-01

    Full Text Available Computer vision applications require choosing operators and their parameters in order to provide the best outcomes. Often, users draw on expert knowledge and must manually experiment with many combinations to find the best one. As performance, time and accuracy are important, it is necessary to automate parameter optimization at least for crucial operators. In this paper, a novel approach based on an adaptive discrete cuckoo search algorithm (ADCS) is proposed. It automates the process of algorithm setting and provides optimal parameters for vision applications. This work reconsiders a discretization problem to adapt the cuckoo search algorithm and presents the procedure of parameter optimization. Experiments on real examples and comparisons to other metaheuristic-based approaches, particle swarm optimization (PSO), reinforcement learning (RL) and ant colony optimization (ACO), show the efficiency of this novel method.
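
    A bare-bones discrete cuckoo-search skeleton shows the mechanics: a candidate generated by perturbing a random nest replaces a randomly chosen nest when it is better, and a fraction pa of the worst nests is abandoned each generation. This is a generic simplification, not the paper's ADCS with its adaptive discretization.

      import random

      def discrete_cuckoo_search(cost, domains, n_nests=15, pa=0.25, iters=200):
          """Minimize cost(params) over discrete domains (a list of value lists)."""
          rand_sol = lambda: [random.choice(d) for d in domains]
          def step(sol):                      # discrete stand-in for a Levy flight
              s = sol[:]
              i = random.randrange(len(domains))
              s[i] = random.choice(domains[i])
              return s
          nests = [rand_sol() for _ in range(n_nests)]
          for _ in range(iters):
              cuckoo = step(random.choice(nests))
              j = random.randrange(n_nests)
              if cost(cuckoo) < cost(nests[j]):
                  nests[j] = cuckoo
              nests.sort(key=cost)            # abandon the worst fraction pa
              for k in range(int((1 - pa) * n_nests), n_nests):
                  nests[k] = rand_sol()
          return min(nests, key=cost)

      # Toy usage: tune two hypothetical vision-operator parameters.
      best = discrete_cuckoo_search(lambda s: (s[0] - 5)**2 + (s[1] - 2)**2,
                                    [list(range(10)), list(range(8))])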

  6. A minimax sequential procedure in the context of computerized adaptive mastery testing

    NARCIS (Netherlands)

    Vos, Hendrik J.

    1997-01-01

    The purpose of this paper is to derive optimal rules for variable-length mastery tests in the case where three mastery classification decisions (nonmastery, partial mastery, and mastery) are distinguished. In a variable-length or adaptive mastery test, the decision is to classify a subject as a master, a partial master, or a nonmaster.

  7. Computer-controlled environmental test systems - Criteria for selection, installation, and maintenance.

    Science.gov (United States)

    Chapman, C. P.

    1972-01-01

    Applications for presently marketed, new computer-controlled environmental test systems are suggested. It is shown that the capital costs of these systems follow an exponential cost function curve that levels out as additional applications are implemented. Some test laboratory organization changes are recommended in terms of new personnel requirements, and facility modifications are considered in support of a computer-controlled test system. Software for computer-controlled test systems is discussed, and control loop speed constraints are defined for real-time control functions. Suitable input and output devices and memory storage device tradeoffs are also considered.

  8. Computer-Aided Test Flow in Core-Based Design

    NARCIS (Netherlands)

    Zivkovic, V.; Tangelder, R.J.W.T.; Kerkhoff, Hans G.

    2000-01-01

    This paper addresses test-pattern generation and fault coverage determination in core-based design. The basic core-test strategy that has to be applied in core-based design is stated in this work. A Computer-Aided Test (CAT) flow is proposed, resulting in accurate fault coverage of embedded cores.

  9. L(sub 1) Adaptive Control Design for NASA AirSTAR Flight Test Vehicle

    Science.gov (United States)

    Gregory, Irene M.; Cao, Chengyu; Hovakimyan, Naira; Zou, Xiaotian

    2009-01-01

    In this paper we present a new L(sub 1) adaptive control architecture that directly compensates for matched as well as unmatched system uncertainty. To evaluate the L(sub 1) adaptive controller, we take advantage of the flexible research environment with rapid prototyping and testing of control laws in the Airborne Subscale Transport Aircraft Research system at the NASA Langley Research Center. We apply the L(sub 1) adaptive control laws to the subscale turbine powered Generic Transport Model. The presented results are from a full nonlinear simulation of the Generic Transport Model and some preliminary pilot evaluations of the L(sub 1) adaptive control law.

  10. Flight Test Comparison of Different Adaptive Augmentations for Fault Tolerant Control Laws for a Modified F-15 Aircraft

    Science.gov (United States)

    Burken, John J.; Hanson, Curtis E.; Lee, James A.; Kaneshige, John T.

    2009-01-01

    This report describes the improvements and enhancements to a neural network based approach for directly adapting to aerodynamic changes resulting from damage or failures. This research is a follow-on effort to flight tests performed on the NASA F-15 aircraft as part of the Intelligent Flight Control System research effort. Previous flight test results demonstrated the potential for performance improvement under destabilizing damage conditions. Little or no improvement was provided under simulated control surface failures, however, and the adaptive system was prone to pilot-induced oscillations. An improved controller was designed to reduce the occurrence of pilot-induced oscillations and increase robustness to failures in general. This report presents an analysis of the neural networks used in the previous flight test, the improved adaptive controller, and the baseline case with no adaptation. Flight test results demonstrate significant improvement in performance by using the new adaptive controller compared with the previous adaptive system and the baseline system for control surface failures.

  11. Soft computing methods for geoidal height transformation

    Science.gov (United States)

    Akyilmaz, O.; Özlüdemir, M. T.; Ayan, T.; Çelik, R. N.

    2009-07-01

    Soft computing techniques, such as fuzzy logic and artificial neural network (ANN) approaches, have enabled researchers to create precise models for use in many scientific and engineering applications. Applications that can be employed in geodetic studies include the estimation of earth rotation parameters and the determination of mean sea level changes. Another important field of geodesy in which these computing techniques can be applied is geoidal height transformation. We report here our use of a conventional polynomial model, the Adaptive Network-based Fuzzy (or in some publications, Adaptive Neuro-Fuzzy) Inference System (ANFIS), an ANN and a modified ANN approach to approximate geoid heights. These approximation models have been tested on a number of test points. The results obtained through the transformation processes from ellipsoidal heights into local levelling heights have also been compared.
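
    The conventional polynomial model used as a baseline here can be illustrated with a least-squares polynomial surface fitted to geoid heights over planar coordinates; the degree and the data layout in this sketch are illustrative assumptions.

      import numpy as np

      def fit_poly_surface(x, y, n_geoid, degree=2):
          """Least-squares fit of N(x, y) = sum c_ij * x^i * y^j, i + j <= degree."""
          terms = [(i, j) for i in range(degree + 1)
                          for j in range(degree + 1 - i)]
          A = np.column_stack([x**i * y**j for i, j in terms])
          coef, *_ = np.linalg.lstsq(A, n_geoid, rcond=None)
          def predict(xq, yq):
              return sum(c * xq**i * yq**j for c, (i, j) in zip(coef, terms))
          return predict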

  12. Adaptive user interfaces

    CERN Document Server

    1990-01-01

    This book describes techniques for designing and building adaptive user interfaces, developed in the large AID project undertaken by the contributors. Key features: it describes one of the few large-scale adaptive interface projects in the world, and it outlines the principles of adaptivity in human-computer interaction.

  13. Accelerated Desensitization with Adaptive Attitudes and Test Gains with 5th Graders

    Science.gov (United States)

    Miller, Melanie; Morton, Jerome; Driscoll, Richard; Davis, Kai A.

    2006-01-01

    The study evaluates an easily-administered test-anxiety reduction program. An entire fifth grade was screened, and 36 students identified as test-anxious were randomly assigned to an Intervention or a non-participant Control group. The intervention was an accelerated desensitization and adaptive attitudes (ADAA) treatment which involved…

  14. A stand alone computer system to aid the development of mirror fusion test facility RF heating systems

    International Nuclear Information System (INIS)

    Thomas, R.A.

    1983-01-01

    The Mirror Fusion Test Facility (MFTF-B) control system architecture requires the Supervisory Control and Diagnostic System (SCDS) to communicate with a LSI-11 Local Control Computer (LCC) that in turn communicates via a fiber optic link to CAMAC based control hardware located near the machine. In many cases, the control hardware is very complex and requires a sizable development effort prior to being integrated into the overall MFTF-B system. One such effort was the development of the Electron Cyclotron Resonance Heating (ECRH) system. It became clear that a stand alone computer system was needed to simulate the functions of SCDS. This paper describes the hardware and software necessary to implement the SCDS Simulation Computer (SSC). It consists of a Digital Equipment Corporation (DEC) LSI-11 computer and a Winchester/Floppy disk operating under the DEC RT-11 operating system. All application software for MFTF-B is programmed in PASCAL, which allowed us to adapt procedures originally written for SCDS to the SSC. This nearly identical software interface means that software written during the equipment development will be useful to the SCDS programmers in the integration phase

  15. Adaptive phase measurements in linear optical quantum computation

    International Nuclear Information System (INIS)

    Ralph, T C; Lund, A P; Wiseman, H M

    2005-01-01

    Photon counting induces an effective non-linear optical phase shift in certain states derived by linear optics from single photons. Although this non-linearity is non-deterministic, it is sufficient in principle to allow scalable linear optics quantum computation (LOQC). The most obvious way to encode a qubit optically is as a superposition of the vacuum and a single photon in one mode, so-called 'single-rail' logic. Until now this approach was thought to be prohibitively expensive (in resources) compared to 'dual-rail' logic, where a qubit is stored by a photon across two modes. Here we attack this problem with real-time feedback control, which can realize a quantum-limited phase measurement on a single mode, as has been recently demonstrated experimentally. We show that with this added measurement resource, the resource requirements for single-rail LOQC are not substantially different from those of dual-rail LOQC. In particular, with adaptive phase measurements an arbitrary qubit state α|0⟩ + β|1⟩ can be prepared deterministically.

  16. Rotationally Adaptive Flight Test Surface

    Science.gov (United States)

    Barrett, Ron

    1999-01-01

    Research was conducted on a new design of flutter exciter vane using adaptive materials. This novel design is based on all-moving aerodynamic surface technology and consists of a structurally stiff main spar, a series of piezoelectric actuator elements, and an aerodynamic shell which is pivoted around the main spar. The work built upon current missile-type all-moving surface designs and changed them so they are better suited for flutter excitation through the transonic flight regime. The first portion of the research centered on aerodynamic and structural modeling of the system. USAF DatCom and vortex lattice codes were used to capture the fundamental aerodynamics of the vane. Finite element codes, laminated plate theory, and virtual work analyses were used to structurally model the aerodynamic vane and wing tip. Following the basic modeling, a flutter test vane was designed. Each component within the structure was designed to meet the design loads; the deflections were then maximized and the internal structure was laid out. In addition to the structure, a basic electrical control network was designed which is capable of driving a scaled exciter vane. The third and final stage of the main investigation involved the fabrication of a 1/4-scale vane. This scaled vane was used to verify kinematics and structural mechanics theories on all-moving actuation. Following assembly, a series of bench tests was conducted to determine frequency response and electrical, mechanical, and kinematic properties. Test results indicate peak-to-peak deflections of 1.1 deg with a corner frequency of just over 130 Hz.

  17. An Adaptive and Integrated Low-Power Framework for Multicore Mobile Computing

    Directory of Open Access Journals (Sweden)

    Jongmoo Choi

    2017-01-01

    Full Text Available Employing multicore processors in mobile computing, such as smartphones and IoT (Internet of Things) devices, is a double-edged sword. It provides the ample computing capability required by recent intelligent mobile services, including voice recognition, image processing, big data analysis, and deep learning. However, it consumes a great deal of power, which creates thermal hot spots and puts pressure on the energy resources of a mobile device. In this paper, we propose a novel framework that integrates two well-known low-power techniques, DPM (Dynamic Power Management) and DVFS (Dynamic Voltage and Frequency Scaling), for energy efficiency in multicore mobile systems. The key feature of the proposed framework is adaptability: by monitoring online resource usage such as CPU utilization and power consumption, the framework can orchestrate diverse DPM and DVFS policies according to workload characteristics. Experiments with real implementations on three mobile devices have shown that it can reduce power consumption by 22% to 79%, while negligibly affecting the performance of workloads.
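
    A toy orchestration loop in the spirit of this framework: monitor utilization and idle time, then choose either a DPM action (power-gating a long-idle core) or a DVFS level. All thresholds and the frequency table are invented for illustration.

      # Illustrative DPM + DVFS policy selection; thresholds and frequencies
      # are made-up values, not those of the proposed framework.
      FREQ_LEVELS_MHZ = [400, 800, 1200, 1600]

      def choose_action(cpu_util, core_idle_ms):
          if core_idle_ms > 50:                  # DPM: gate a long-idle core
              return ("dpm", "power_off_core")
          if cpu_util < 0.2:                     # DVFS: scale with demand
              return ("dvfs", FREQ_LEVELS_MHZ[0])
          if cpu_util < 0.5:
              return ("dvfs", FREQ_LEVELS_MHZ[1])
          if cpu_util < 0.8:
              return ("dvfs", FREQ_LEVELS_MHZ[2])
          return ("dvfs", FREQ_LEVELS_MHZ[3])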

  18. Adaptation of OCA-P, a probabilistic fracture-mechanics code, to a personal computer

    International Nuclear Information System (INIS)

    Ball, D.G.; Cheverton, R.D.

    1985-01-01

    The OCA-P probabilistic fracture-mechanics code can now be executed on a personal computer with 512 kilobytes of memory, a math coprocessor, and a hard disk. A user's guide for this particular adaptation has been prepared, and additional importance sampling techniques for OCA-P have been developed that allow the sampling of only the tails of selected distributions. Features have also been added to OCA-P that permit RTNDT to be used as an 'independent' variable in the calculation of P

  19. High Performance Computing Modernization Program Kerberos Throughput Test Report

    Science.gov (United States)

    2017-10-26

    Naval Research Laboratory, Washington, DC 20375-5320. Report NRL/MR/5524--17-9751: High Performance Computing Modernization Program Kerberos Throughput Test Report, by Daniel G. Gdula et al.

  20. A Randomized Rounding Approach for Optimization of Test Sheet Composing and Exposure Rate Control in Computer-Assisted Testing

    Science.gov (United States)

    Wang, Chu-Fu; Lin, Chih-Lung; Deng, Jien-Han

    2012-01-01

    Testing is an important stage of teaching as it can assist teachers in auditing students' learning results. A good test is able to accurately reflect the capability of a learner. Nowadays, Computer-Assisted Testing (CAT) is greatly improving traditional testing, since computers can automatically and quickly compose a proper test sheet to meet user…

  1. USING COMPUTER-BASED TESTING AS ALTERNATIVE ASSESSMENT METHOD OF STUDENT LEARNING IN DISTANCE EDUCATION

    Directory of Open Access Journals (Sweden)

    Amalia SAPRIATI

    2010-04-01

    Full Text Available This paper addresses the use of computer-based testing in distance education, based on the experience of Universitas Terbuka (UT), Indonesia. Computer-based testing has been developed at UT to meet specific needs of distance students, namely: students' inability to sit for the scheduled test, conflicting test schedules, and students' wish for the flexibility to retake examinations to improve their grades. In 2004, UT initiated a pilot project for the development of a system and program for the computer-based testing method. In 2005 and 2006, tryouts of the computer-based testing method were conducted in 7 Regional Offices that were considered to have sufficient supporting resources. The results of the tryouts revealed that students were enthusiastic about taking computer-based tests and expected that the method would be provided by UT as an alternative to the traditional paper-and-pencil test method. UT then implemented the computer-based testing method in 6 and 12 Regional Offices in 2007 and 2008, respectively. The computer-based testing was administered in the city of the designated Regional Office and was supervised by the Regional Office staff. The development of the computer-based testing began with tests using computers in a networked configuration. The system has been continually improved, and it currently uses devices linked to the internet or the World Wide Web. The construction of a test involves the generation and selection of test items from the item bank collection of the UT Examination Center, such that the combination of selected items complies with the test specification. Currently UT offers 250 courses involving the use of computer-based testing. Students expect that more courses will be offered with computer-based testing in Regional Offices within easy access by students.

  2. Specifying colours for colour vision testing using computer graphics.

    Science.gov (United States)

    Toufeeq, A

    2004-10-01

    This paper describes a novel test of colour vision using a standard personal computer, which is simple and reliable to perform. Twenty healthy individuals with normal colour vision and 10 healthy individuals with a red/green colour defect were tested binocularly at 13 selected points in the CIE (Commission Internationale de l'Eclairage, 1931) chromaticity triangle, representing the gamut of a computer monitor, where the x, y coordinates of the primary colour phosphors were known. The mean results from individuals with normal colour vision were compared to those with defective colour vision. Of the 13 points tested, five demonstrated consistently high sensitivity in detecting colour defects. The test may provide a convenient method for classifying colour vision abnormalities.
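
    Checking that a chosen (x, y) chromaticity actually lies inside the monitor's gamut, the triangle spanned by the phosphor primaries, is a small same-side test; the primary coordinates in this sketch are generic sRGB-like values, not those of the study's monitor.

      def in_gamut(p, r=(0.64, 0.33), g=(0.30, 0.60), b=(0.15, 0.06)):
          """True if chromaticity p = (x, y) lies inside the primaries' triangle."""
          def cross(a, b, c):                # signed area of triangle a-b-c
              return (a[0] - c[0]) * (b[1] - c[1]) - (b[0] - c[0]) * (a[1] - c[1])
          d1, d2, d3 = cross(p, r, g), cross(p, g, b), cross(p, b, r)
          has_neg = d1 < 0 or d2 < 0 or d3 < 0
          has_pos = d1 > 0 or d2 > 0 or d3 > 0
          return not (has_neg and has_pos)   # inside if all signs agree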

  3. Accelerated Desensitization and Adaptive Attitudes Interventions and Test Gains with Academic Probation Students

    Science.gov (United States)

    Driscoll, Richard; Holt, Bruce; Hunter, Lori

    2005-01-01

    The study evaluates the test-gain benefits of an accelerated desensitization and adaptive attitudes intervention for test-anxious students. College students were screened for high test anxiety. Twenty anxious students, half of them on academic probation, were assigned to an Intervention or to a minimal treatment Control group. The Intervention was…

  4. A Comparison of Procedures for Content-Sensitive Item Selection in Computerized Adaptive Tests.

    Science.gov (United States)

    Kingsbury, G. Gage; Zara, Anthony R.

    1991-01-01

    This simulation investigated two procedures that reduce differences between paper-and-pencil testing and computerized adaptive testing (CAT) by making CAT content sensitive. Results indicate that the price in terms of additional test items of using constrained CAT for content balancing is much smaller than that of using testlets. (SLD)

  5. Dark adaptation during transient hyperglycemia in type 2 diabetes

    DEFF Research Database (Denmark)

    Holfort, Stig Kraglund; Jackson, Gregory R; Larsen, Michael

    2010-01-01

    It was the purpose of the present study to examine dark adaptation in subjects with type 2 diabetes during transient hyperglycemia. Twenty-four subjects with type 2 diabetes and minimal diabetic retinopathy were randomized to undergo an oral glucose tolerance test (OGTT) or to remain fasting. Dark adaptometry was measured in one eye, chosen at random, using a computer-controlled dark adaptometer. Dark adaptation and capillary blood glucose were measured at baseline and 80 minutes into the OGTT/fasting test. Blood glucose remained stable throughout the examination in the 12 fasting subjects, whereas glycemia increased in the 12 OGTT subjects, from 8.6±2.1 mM at baseline to 21.1±3.6 mM after 80 min. In the OGTT group, four out of seven subjects with delayed dark adaptation at baseline reached normal values during hyperglycemia. All examined aspects of rod adaptation were accelerated by hyperglycemia (time

  6. A Framework for the Development of Computerized Adaptive Tests

    Directory of Open Access Journals (Sweden)

    Nathan A. Thompson

    2011-01-01

    Full Text Available A substantial amount of research has been conducted over the past 40 years on technical aspects of computerized adaptive testing (CAT), such as item selection algorithms, item exposure controls, and termination criteria. However, there is little literature providing practical guidance on the development of a CAT. This paper seeks to collate some of the available research methodologies into a general framework for the development of any CAT assessment.

  7. Computer-Aided Test Flow in Core-Based Design

    OpenAIRE

    Zivkovic, V.; Tangelder, R.J.W.T.; Kerkhoff, Hans G.

    2000-01-01

    This paper addresses test-pattern generation and fault coverage determination in core-based design. The basic core-test strategy that has to be applied in core-based design is stated in this work. A Computer-Aided Test (CAT) flow is proposed, resulting in accurate fault coverage of embedded cores. The CAT flow is then applied to a few cores within the Philips Core Test Pilot IC project.

  8. Design Of Computer Based Test Using The Unified Modeling Language

    Science.gov (United States)

    Tedyyana, Agus; Danuri; Lidyawati

    2017-12-01

    Admission selection at Politeknik Negeri Bengkalis, through the interest and talent search (PMDK), the joint admission test for state polytechnics (SB-UMPN) and the independent test (UM-Polbeng), was conducted using Paper-Based Tests (PBT). The Paper-Based Test model has some weaknesses: it wastes paper, questions can leak to the public, and test result data can be manipulated. This research aimed to create a Computer-Based Test (CBT) model using the Unified Modeling Language (UML), consisting of use case diagrams, activity diagrams and sequence diagrams. During the design of the application, particular attention was paid to securing the test questions before they are shown, by passing them through an encryption and decryption process based on the RSA cryptography algorithm. The questions drawn from the question bank were then randomized using the Fisher-Yates Shuffle method. The network architecture used in the Computer-Based Test application was a client-server model on a Local Area Network (LAN). The result of the design was a Computer-Based Test application for admission selection at Politeknik Negeri Bengkalis.
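
    The RSA encryption/decryption of question payloads can be illustrated with textbook RSA on toy primes; a real deployment would use a vetted cryptographic library with proper padding, and the numbers below are toy values.

      # Textbook RSA on toy primes, illustrating encryption of a question ID.
      p, q = 61, 53
      n, phi = p * q, (p - 1) * (q - 1)   # n = 3233, phi = 3120
      e = 17                              # public exponent, gcd(e, phi) = 1
      d = pow(e, -1, phi)                 # private exponent (Python 3.8+)

      question_id = 1234                  # message m, must satisfy m < n
      cipher = pow(question_id, e, n)     # c = m^e mod n
      plain = pow(cipher, d, n)           # m = c^d mod n
      assert plain == question_id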

  9. WRF4G project: Adaptation of WRF Model to Distributed Computing Infrastructures

    Science.gov (United States)

    Cofino, Antonio S.; Fernández Quiruelas, Valvanuz; García Díez, Markel; Blanco Real, Jose C.; Fernández, Jesús

    2013-04-01

    Nowadays Grid computing is a powerful computational tool that is ready to be used by the scientific community in different areas (such as biomedicine, astrophysics, climate, etc.). However, the use of these distributed computing infrastructures (DCIs) is not yet common practice in climate research, and only a few teams and applications in this area take advantage of them. Thus, the first objective of this project is to popularize the use of this technology in the atmospheric sciences area. To achieve this objective, one of the most widely used applications has been taken (WRF, a limited-area model and successor of the MM5 model), which has a user community of more than 8000 researchers worldwide. This community develops its research activity in different areas and could benefit from the advantages of Grid resources (case study simulations, regional hindcasts/forecasts, sensitivity studies, etc.). The WRF model is also used as input by the energy and natural hazards communities, which will therefore benefit as well. However, Grid infrastructures have some drawbacks for the execution of applications that make intensive use of CPU and memory for long periods of time. This makes it necessary to develop a specific framework (middleware) that encapsulates the application and provides appropriate services for the monitoring and management of the jobs and the data. Thus, the second objective of the project consists of the development of a generic adaptation of WRF for the Grid (WRF4G), to be distributed as open source and to be integrated in the official WRF development cycle. The use of this WRF adaptation should be transparent and useful for facing any of the previously described studies, and should avoid the problems of the Grid infrastructure. Moreover, it should simplify access to Grid infrastructures for research teams, and also free them from the technical and computational aspects of the use of the Grid. Finally, in order to

  10. Health adaptation policy for climate vulnerable groups: a 'critical computational linguistics' analysis.

    Science.gov (United States)

    Seidel, Bastian M; Bell, Erica

    2014-11-28

    Many countries are developing or reviewing national adaptation policy for climate change but the extent to which these meet the health needs of vulnerable groups has not been assessed. This study examines the adequacy of such policies for nine known climate-vulnerable groups: people with mental health conditions, Aboriginal people, culturally and linguistically diverse groups, aged people, people with disabilities, rural communities, children, women, and socioeconomically disadvantaged people. The study analyses an exhaustive sample of national adaptation policy documents from Annex 1 ('developed') countries of the United Nations Framework Convention on Climate Change: 20 documents from 12 countries. A 'critical computational linguistics' method was used involving novel software-driven quantitative mapping and traditional critical discourse analysis. The study finds that references to vulnerable groups are relatively little present or non-existent, as well as poorly connected to language about practical strategies and socio-economic contexts, both also little present. The conclusions offer strategies for developing policy that is better informed by a 'social determinants of health' definition of climate vulnerability, consistent with best practice in the literature and global policy prescriptions.

  11. A review of culturally adapted versions of the Oswestry Disability Index: the adaptation process, construct validity, test-retest reliability and internal consistency.

    Science.gov (United States)

    Sheahan, Peter J; Nelson-Wong, Erika J; Fischer, Steven L

    2015-01-01

    The Oswestry Disability Index (ODI) is a self-report-based outcome measure used to quantify the extent of disability related to low back pain (LBP), a substantial contributor to workplace absenteeism. The ODI tool has been adapted for use by patients in several non-English speaking nations. It is unclear, however, if these adapted versions of the ODI are as credible as the original ODI developed for English-speaking nations. The objective of this study was to conduct a review of the literature to identify culturally adapted versions of the ODI and to report on the adaptation process, construct validity, test-retest reliability and internal consistency of these ODIs. Following a pragmatic review process, data were extracted from each study with regard to these four outcomes. While most studies applied adaptation processes in accordance with best-practice guidelines, there were some deviations. However, all studies reported high-quality psychometric properties: group mean construct validity was 0.734 ± 0.094 (indicated via a correlation coefficient), test-retest reliability was 0.937 ± 0.032 (indicated via an intraclass correlation coefficient) and internal consistency was 0.876 ± 0.047 (indicated via Cronbach's alpha). Researchers can be confident when using any of these culturally adapted ODIs, or when comparing and contrasting results between cultures where these versions were employed. Implications for Rehabilitation: Low back pain is the second leading cause of disability in the world, behind only cancer. The Oswestry Disability Index (ODI) has been developed as a self-report outcome measure of low back pain for administration to patients. An understanding of the various cross-cultural adaptations of the ODI is important for more concerted multi-national research efforts. This review examines 16 cross-cultural adaptations of the ODI and should inform the work of health care and rehabilitation professionals.

  12. Adaptation of a Vocabulary Test from British Sign Language to American Sign Language

    Science.gov (United States)

    Mann, Wolfgang; Roy, Penny; Morgan, Gary

    2016-01-01

    This study describes the adaptation process of a vocabulary knowledge test for British Sign Language (BSL) into American Sign Language (ASL) and presents results from the first round of pilot testing with 20 deaf native ASL signers. The web-based test assesses the strength of deaf children's vocabulary knowledge by means of different mappings of…

  13. Cross-Mode Comparability of Computer-Based Testing (CBT) versus Paper-Pencil Based Testing (PPT): An Investigation of Testing Administration Mode among Iranian Intermediate EFL Learners

    Science.gov (United States)

    Khoshsima, Hooshang; Hosseini, Monirosadat; Toroujeni, Seyyed Morteza Hashemi

    2017-01-01

    The advent of technology has caused growing interest in using computers to convert conventional paper-and-pencil testing (henceforth PPT) into computer-based testing (henceforth CBT) in the field of education during the last decades. This constant promulgation of computers to reshape conventional tests into a computerized format permeated the…

  14. Computer board for radioactive ray test

    International Nuclear Information System (INIS)

    Zuo Mingfu

    1996-05-01

    The present status of the radioactive-ray test system for industrial applications and a newly designed computer board for overcoming the shortcomings of the current system are described. The functions, measurement principles and features of the board, as well as the test results for this board, are discussed. The board integrates many functions of the radioactive-ray test system, such as energy calibration and MCS. It also provides many other subordinate practical functions such as motor control, ADC and so on. The board combines two sets of test parts into one and therefore composes a powerful unit for the system. Not only can it replace all units in a normal test system for signal analysis, signal processing, data management, and motor control, but it can also be used in more complex test systems, such as those for double source/double energy/double channel testing, multichannel testing, position testing and core positioning. The board makes it easier for the test system to achieve miniaturization and computerization goals, and therefore improves the quality of the test and reduces the cost of the system. (10 refs., 8 figs.)

  15. Nondestructive evaluation of low carbon steel by magnetic adaptive testing

    Czech Academy of Sciences Publication Activity Database

    Vértesy, G.; Tomáš, Ivan; Kobayashi, S.

    2010-01-01

    Roč. 25, č. 2 (2010), s. 125-132 ISSN 1058-9759 R&D Projects: GA ČR GA102/06/0866; GA AV ČR 1QS100100508 Institutional research plan: CEZ:AV0Z10100520 Keywords : magnetic NDE * magnetic adaptive testing * steel * magnetic hysteresis Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 0.771, year: 2010

  16. Testing for adaptive evolution of the female reproductive protein ZPC in mammals, birds and fishes reveals problems with the M7-M8 likelihood ratio test.

    Science.gov (United States)

    Berlin, Sofia; Smith, Nick G C

    2005-11-10

    Adaptive evolution appears to be a common feature of reproductive proteins across a very wide range of organisms. A promising way of addressing the evolutionary forces responsible for this general phenomenon is to test for adaptive evolution in the same gene but among groups of species, which differ in their reproductive biology. One can then test evolutionary hypotheses by asking whether the variation in adaptive evolution is consistent with the variation in reproductive biology. We have attempted to apply this approach to the study of a female reproductive protein, zona pellucida C (ZPC), which has been previously shown by the use of likelihood ratio tests (LRTs) to be under positive selection in mammals. We tested for evidence of adaptive evolution of ZPC in 15 mammalian species, in 11 avian species and in six fish species using three different LRTs (M1a-M2a, M7-M8, and M8a-M8). The only significant findings of adaptive evolution came from the M7-M8 test in mammals and fishes. Since LRTs of adaptive evolution may yield false positives in some situations, we examined the properties of the LRTs by several different simulation methods. When we simulated data to test the robustness of the LRTs, we found that the pattern of evolution in ZPC generates an excess of false positives for the M7-M8 LRT but not for the M1a-M2a or M8a-M8 LRTs. This bias is strong enough to have generated the significant M7-M8 results for mammals and fishes. We conclude that there is no strong evidence for adaptive evolution of ZPC in any of the vertebrate groups we studied, and that the M7-M8 LRT can be biased towards false inference of adaptive evolution by certain patterns of non-adaptive evolution.
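    The M7-M8 comparison discussed above is a standard nested-model likelihood ratio test: twice the difference in maximized log-likelihoods is referred to a chi-square distribution whose degrees of freedom equal the number of extra parameters (two for M8 over M7). A minimal sketch of how such a test is evaluated, assuming the log-likelihoods have already been produced by a codon-model fit (e.g., codeml); the numerical values below are purely hypothetical:

```python
# Likelihood ratio test between nested codon models (e.g., M7 vs. M8).
# The log-likelihoods are assumed to come from an external fit (codeml);
# the values used in the example are hypothetical.
from scipy.stats import chi2

def lrt_pvalue(lnl_null, lnl_alt, df):
    """Return the LRT statistic and its asymptotic chi-square p-value."""
    stat = 2.0 * (lnl_alt - lnl_null)
    return stat, chi2.sf(stat, df)

# M8 adds two free parameters relative to M7, hence df = 2.
stat, p = lrt_pvalue(lnl_null=-2345.6, lnl_alt=-2341.2, df=2)
print(f"2*deltaLnL = {stat:.2f}, p = {p:.4f}")
```

    As the abstract cautions, the asymptotic null can be unreliable for this particular comparison, which is why the authors fall back on simulation to calibrate the test.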

  17. Human spaceflight and space adaptations: Computational simulation of gravitational unloading on the spine

    Science.gov (United States)

    Townsend, Molly T.; Sarigul-Klijn, Nesrin

    2018-04-01

    Living in reduced gravitational environments for a prolonged duration, such as a fly-by mission to Mars or an extended stay at the International Space Station, affects the human body - in particular, the spine. As the spine adapts to spaceflight, morphological and physiological changes cause the mechanical integrity of the spinal column to be compromised, potentially endangering internal organs, nervous health, and human body mechanical function. Therefore, a high fidelity computational model and simulation of the whole human spine was created and validated for the purpose of investigating the mechanical integrity of the spine in crew members during exploratory space missions. A spaceflight-exposed spine has been developed through the adaptation of a three-dimensional nonlinear finite element model, with the updated Lagrangian formulation, of a healthy ground-based human spine in vivo. Simulation of the porohyperelastic response of the intervertebral disc to mechanical unloading resulted in a model capable of accurately predicting spinal swelling/lengthening, spinal motion, and internal stress distribution. The curvature of this space-adaptation-exposed spine model was compared to a control terrestrial-based finite element model, indicating how the shape changed. Finally, potential injury sites for crew members are predicted for a typical 9-day mission.

  18. Radiographic test phantom for computed tomographic lung nodule analysis

    International Nuclear Information System (INIS)

    Zerhouni, E.A.

    1987-01-01

    This patent describes a method for evaluating a computed tomographic scan of a nodule in a lung of a human or non-human animal. The method comprises generating a computed tomograph of a transverse section of the animal containing lung and nodule tissue, and generating a second computed tomograph of a test phantom, a device which simulates that transverse section. The tissue-simulating portions of the device are constructed of materials having radiographic densities substantially identical to those of the corresponding tissues in the simulated transverse section, and have voids which simulate, in size and shape, the lung cavities in the transverse section. These voids contain a test reference nodule, constructed of a material of predetermined radiographic density, which simulates in size, shape and position within a lung cavity void the nodule in the transverse section of the animal. The respective tomographs are then compared.

  19. Sleep deprivation selectively disrupts top-down adaptation to cognitive conflict in the Stroop test.

    Science.gov (United States)

    Gevers, Wim; Deliens, Gaetane; Hoffmann, Sophie; Notebaert, Wim; Peigneux, Philippe

    2015-12-01

    Sleep deprivation is known to exert detrimental effects on various cognitive domains, including attention, vigilance and working memory. Seemingly at odds with these findings, prior studies repeatedly failed to evidence an impact of prior sleep deprivation on cognitive interference in the Stroop test, a hallmark paradigm in the study of cognitive control abilities. The present study further investigated the effect of sleep deprivation on cognitive control using an adapted version of the Stroop test that makes it possible to segregate top-down (attentional reconfiguration on incongruent items) and bottom-up (facilitated processing after repetitions in responses and/or features of stimuli) components of performance. Participants underwent a regular night of sleep or a night of total sleep deprivation before cognitive testing. Results disclosed that sleep deprivation selectively impairs top-down adaptation mechanisms: cognitive control no longer increased upon detection of response conflict at the preceding trial. In parallel, bottom-up abilities were found unaffected by sleep deprivation: beneficial effects of stimulus and response repetitions persisted. Changes in vigilance states due to sleep deprivation selectively impact cognitive control in the Stroop test by affecting top-down, but not bottom-up, mechanisms that guide adaptive behaviours. © 2015 European Sleep Research Society.

  20. Detection of person misfit in computerized adaptive tests with polytomous items

    NARCIS (Netherlands)

    van Krimpen-Stoop, Edith; Meijer, R.R.

    2000-01-01

    Item scores that do not fit an assumed item response theory model may cause the latent trait value to be estimated inaccurately. For computerized adaptive tests (CAT) with dichotomous items, several person-fit statistics for detecting nonfitting item score patterns have been proposed. Both for
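    For dichotomous items, a widely used person-fit statistic of the kind referred to above is the standardized log-likelihood lz; strongly negative values flag response patterns that are improbable under the fitted model. A self-contained sketch under a Rasch model (item difficulties and the response pattern are hypothetical):

```python
import numpy as np

def rasch_p(theta, b):
    """Rasch probability of a correct response."""
    return 1.0 / (1.0 + np.exp(-(theta - np.asarray(b, dtype=float))))

def lz_statistic(responses, theta, b):
    """Standardized log-likelihood person-fit statistic (lz)."""
    p = rasch_p(theta, b)
    u = np.asarray(responses, dtype=float)
    l0 = np.sum(u * np.log(p) + (1 - u) * np.log(1 - p))
    mean = np.sum(p * np.log(p) + (1 - p) * np.log(1 - p))
    var = np.sum(p * (1 - p) * np.log(p / (1 - p)) ** 2)
    return (l0 - mean) / np.sqrt(var)

# Aberrant pattern: easiest items missed, hardest items passed.
b = [-2.0, -1.5, -1.0, -0.5, 0.5, 1.0, 1.5, 2.0]
print(lz_statistic([0, 0, 0, 1, 1, 1, 1, 1], theta=0.0, b=b))
```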

  1. Adaption of the radiation dose for computed tomography of the body - back-ground for the dose adaption programme OmnimAs

    International Nuclear Information System (INIS)

    Nyman, Ulf; Kristiansson, Mattias; Leitz, Wolfram; Paahlstorp, Per-Aake

    2004-11-01

    When performing computed tomography examinations, the exposure factors are hardly ever adapted to the patient's size. One reason for that might be the lack of simple methods. This report describes the computer programme OmnimAs, which calculates how the exposure factors should be varied with the patient's perimeter (which can easily be measured with a measuring tape). The first approximation is to calculate the exposure values giving the same noise level in the image irrespective of the patient's size. A clinical evaluation has shown that this relationship has to be modified. One chapter describes the physical background behind the programme. Results calculated with OmnimAs are in good agreement with a number of published studies. Clinical experience shows the usability of OmnimAs. Finally, the correlation between several parameters and image quality/dose is discussed, along with how this correlation can be exploited for optimising CT examinations.
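    As a rough illustration of the underlying physics: to keep image noise constant, the tube charge must grow roughly exponentially with the patient's water-equivalent diameter, which is proportional to the measured perimeter. The sketch below is not OmnimAs's actual rule (the report notes that the constant-noise relationship had to be modified clinically), and the attenuation coefficient and reference values are illustrative assumptions only:

```python
import math

def adapted_mas(perimeter_cm, ref_perimeter_cm=90.0, ref_mas=100.0,
                mu_per_cm=0.19):
    """Scale the mAs so that image noise stays roughly constant.

    Assumes a water-equivalent circular cross-section (diameter = P/pi)
    and exponential attenuation with an effective coefficient mu;
    all numeric defaults are illustrative, not OmnimAs's values.
    """
    d = perimeter_cm / math.pi
    d_ref = ref_perimeter_cm / math.pi
    return ref_mas * math.exp(mu_per_cm * (d - d_ref))

for p in (70, 90, 110):
    print(p, "cm perimeter ->", round(adapted_mas(p), 1), "mAs")
```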

  2. Parallelization of Unsteady Adaptive Mesh Refinement for Unstructured Navier-Stokes Solvers

    Science.gov (United States)

    Schwing, Alan M.; Nompelis, Ioannis; Candler, Graham V.

    2014-01-01

    This paper explores the implementation of MPI parallelization in a Navier-Stokes solver using adaptive mesh refinement. Viscous and inviscid test problems are considered for the purpose of benchmarking, as are implicit and explicit time advancement methods. The main test problem for comparison includes effects from boundary layers and other viscous features and requires a large number of grid points for accurate computation. Experimental validation against double cone experiments in hypersonic flow is shown. The adaptive mesh refinement shows promise for a staple test problem in the hypersonic community. Extension to more advanced techniques for more complicated flows is described.

  3. Resource-adaptive cognitive processes

    CERN Document Server

    Crocker, Matthew W

    2010-01-01

    This book investigates the adaptation of cognitive processes to limited resources. The central topics of this book are heuristics considered as results of the adaptation to resource limitations, through natural evolution in the case of humans, or through artificial construction in the case of computational systems; the construction and analysis of resource control in cognitive processes; and an analysis of resource-adaptivity within the paradigm of concurrent computation. The editors integrated the results of a collaborative 5-year research project that involved over 50 scientists. After a mot

  4. Self-adaptive method to distinguish inner and outer contours of industrial computed tomography image for rapid prototype

    International Nuclear Information System (INIS)

    Duan Liming; Ye Yong; Zhang Xia; Zuo Jian

    2013-01-01

    A self-adaptive identification method is proposed for more accurate and efficient judgment of whether contours in industrial computed tomography (CT) slice images are inner or outer contours. The convexity-concavity of each single-pixel-wide closed contour is first identified with the angle method. Then, contours with concave vertices are distinguished as inner or outer contours with the ray method, and contours without concave vertices are distinguished with the extreme coordinate value method. The method automatically chooses between these techniques by identifying the convexity and concavity of the contours. Thus, the disadvantages of the single distinguishing methods, such as the ray method's time consumption and the extreme coordinate method's fallibility, can be avoided. The experiments prove the adaptability, efficiency, and accuracy of the self-adaptive method. (authors)
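    The ray method mentioned above is in essence a point-in-polygon test: cast a ray from a vertex of the contour and count how many times it crosses each other contour; an odd count means the contour is nested inside (an inner contour). A generic even-odd ray-casting sketch, not the paper's implementation:

```python
def point_in_polygon(pt, poly):
    """Even-odd rule: count crossings of a horizontal ray cast toward +x."""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):                      # edge spans the ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:
                inside = not inside
    return inside

def is_inner_contour(contour, other_contours):
    """A contour is 'inner' if a vertex of it lies inside another contour."""
    probe = contour[0]
    return any(point_in_polygon(probe, other) for other in other_contours)

outer = [(0, 0), (10, 0), (10, 10), (0, 10)]
hole = [(3, 3), (7, 3), (7, 7), (3, 7)]
print(is_inner_contour(hole, [outer]))   # True: nested, hence inner
print(is_inner_contour(outer, [hole]))   # False: outermost contour
```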

  5. Adaption and Standardization of the Test of Visual-Motor Skills Revised

    Directory of Open Access Journals (Sweden)

    Mozhgan Farahbod

    2004-06-01

    Objective: This research was carried out with the aim of adapting and standardizing the Test of Visual-Motor Skills-Revised for children and establishing its validity and reliability. Materials & Methods: A multi-stage sampling from the children of the city of Tehran resulted in a sample of 1281 subjects, ages 2;11 through 13;11. The test consisted of 23 geometric designs; each design was assessed through defined criteria and was scored as errors (weaknesses) and accuracies (strengths). For adaptation and standardization of this test, the examiner's manual and the test items were first translated into Farsi. The final form of the test was obtained after performing the pre-tryout and tryout stages and analysing the data with the classic model of reliability. Internal consistency coefficients of the subtests were obtained by Cronbach's alpha; time consistency of the subtests and compound scores was obtained by test-retest. Alpha coefficients for the compound scores were obtained by the Guilford formula, which is designed for estimating compound scores. To obtain the content validity, criterion-related validity and construct validity of the subtests and compound scores, appropriate methods were used. Results: The results obtained ensure the applicability of this test for the evaluation of the visual-motor skills of children in Tehran. Conclusion: According to the findings, this test can be used for disorders in eye-hand coordination and the identification of children with disorders in visual-motor skills. It can also be used to document the development of fine motor skills, especially visual-motor skills, in 3-14-year-old children.
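    For reference, the internal-consistency coefficient used above, Cronbach's alpha, is computed directly from an examinee-by-item score matrix as alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A short sketch on simulated data (all numbers hypothetical):

```python
import numpy as np

def cronbach_alpha(scores):
    """scores: 2-D array, rows = examinees, columns = items."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Simulated examinees: 10 dichotomous items driven by one latent ability.
rng = np.random.default_rng(0)
ability = rng.normal(size=(200, 1))
items = (ability + rng.normal(scale=1.0, size=(200, 10)) > 0).astype(int)
print(round(cronbach_alpha(items), 3))
```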

  6. Adaptive Test Schemes for Control of Paratuberculosis in Dairy Cows

    DEFF Research Database (Denmark)

    Kirkeby, Carsten Thure; Græsbøll, Kaare; Nielsen, Søren Saxmose

    2016-01-01

    Control programmes for paratuberculosis in dairy herds operate through a variety of test-strategies, but are challenged by the lack of perfect tests. Frequent testing increases the sensitivity but the costs of testing are a cause of concern for farmers. Here, we used a herd simulation model using milk ELISA tests to evaluate the epidemiological and economic consequences of continuously adapting the sampling interval in response to the estimated true prevalence in the herd. The key results were that the true prevalence was greatly affected by the hygiene level and to some extent by the test-frequency. Furthermore, the choice of prevalence that will be tolerated in a control scenario had a major impact on the true prevalence in the normal hygiene setting, but less so when the hygiene was poor. The net revenue is not greatly affected by the test-strategy, because of the general variation in net revenues between farms. An exception to this is the low hygiene herd, where...

  7. Nondestructive characterization of ductile cast iron by magnetic adaptive testing

    Czech Academy of Sciences Publication Activity Database

    Vértesy, G.; Uchimoto, T.; Tomáš, Ivan; Takagi, T.

    2010-01-01

    Roč. 322, č. 20 (2010), s. 3117-3121 ISSN 0304-8853 R&D Projects: GA ČR GA101/09/1323; GA AV ČR 1QS100100508 Institutional research plan: CEZ:AV0Z10100520 Keywords : magnetic NDE * magnetic adaptive testing * magnetic hysteresis * cast iron Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 1.689, year: 2010

  8. BRBN-T validation: adaptation of the Selective Reminding Test and Word List Generation

    Directory of Open Access Journals (Sweden)

    Mariana Rigueiro Neves

    2015-10-01

    Objective: This study aims to present the adaptation of the Selective Reminding Test (SRT) and Word List Generation (WLG) to the Portuguese population, within the validation of the Brief Repeatable Battery of Neuropsychological Tests (BRBN-T) for multiple sclerosis (MS) patients. Method: 66 healthy participants (54.5% female) recruited from the community volunteered to participate in this study. Results: A combination of procedures from Classical Test Theory (CTT) and Item Response Theory (IRT) was applied to item analysis and selection. For each SRT list, 12 words were selected, and 3 letters were chosen for WLG, to constitute the final versions of these tests for the Portuguese population. Conclusion: The combination of CTT and IRT maximized the decision-making process in the adaptation of the SRT and WLG to a different culture and language (Portuguese). The relevance of this study lies in the production of reliable standardized neuropsychological tests, so that they can be used to facilitate a more rigorous monitoring of the evolution of MS, as well as any therapeutic effects and cognitive rehabilitation.

  9. Computer aided instrumented Charpy test applied dynamic fracture toughness evaluation system

    International Nuclear Information System (INIS)

    Kobayashi, Toshiro; Niinomi, Mitsuo

    1986-01-01

    A microcomputer-aided data treatment system and a personal-computer-aided data analysis system were applied to the traditional instrumented Charpy impact test system. The analysis of the Charpy absorbed energies (E_i, E_p, E_t) and loads (P_y, P_m), and the evaluation of dynamic toughness through the whole fracture process, i.e. J_Id, the J-R curve and T_mat, were examined using the newly developed computer-aided instrumented Charpy impact test system. E_i, E_p, E_t, P_y and P_m were effectively analyzed using the moving average method and printed out automatically by the microcomputer-aided data treatment system. J_Id, the J-R curve and T_mat could be measured by the stop block test method. J_Id, the J-R curve and T_mat were then effectively estimated using the compliance changing rate method and the key curve method on the load-load point displacement curve of a single fatigue-cracked specimen by the personal-computer-aided data analysis system. (author)

  10. Adapting a receptive vocabulary test for preschool-aged Greek-speaking children.

    Science.gov (United States)

    Okalidou, Areti; Syrika, Asimina; Beckman, Mary E; Edwards, Jan R

    2011-01-01

    Receptive vocabulary is an important measure for language evaluations. Therefore, norm-referenced receptive vocabulary tests are widely used in several languages. However, a receptive vocabulary test has not yet been normed for Modern Greek. The aim was to adapt an American English vocabulary test, the Receptive One-Word Picture Vocabulary Test-II (ROWPVT-II), for Modern Greek for use with Greek-speaking preschool children. The list of 170 English words on ROWPVT-II was adapted by (1) developing two lists (A and B) of Greek words that would match either the target English word or another concept corresponding to one of the pictured objects in the four-picture array; and (2) determining a developmental order for the chosen Greek words for preschool-aged children. For the first task, adult word frequency measures were used to select the words for the Greek wordlist. For the second task, 427 children, 225 boys and 202 girls, ranging in age from 2;0 years to 5;11 years, were recruited from urban and suburban areas of Greece. A pilot study of the two word lists was performed with the aim of comparing an equal number of list A and list B responses for each age group and deriving a new developmental list order. The relative difficulty of each Greek word item, that is, its accuracy score, was calculated by taking the average proportion of correct responses across ages for that word. Subsequently, the word accuracy scores in the two lists were compared via regression analysis, which yielded a highly significant relationship (R(2) = 0.97); on this basis it was determined which word item from the two lists was the better fit. Finally, new starting levels (basals) were established for preschool ages. The revised word list can serve as the basis for adapting a receptive vocabulary test for Greek preschool-aged children. Further steps need to be taken when testing larger numbers of 2;0 to 5;11-year-old children on the revised word list for determination of norms. This effort will facilitate early identification and remediation

  11. An evaluation of computerized adaptive testing for general psychological distress: combining GHQ-12 and Affectometer-2 in an item bank for public mental health research.

    Science.gov (United States)

    Stochl, Jan; Böhnke, Jan R; Pickett, Kate E; Croudace, Tim J

    2016-05-20

    Recent developments in psychometric modeling and technology allow pooling well-validated items from existing instruments into larger item banks and their deployment through methods of computerized adaptive testing (CAT). Use of item response theory-based bifactor methods and integrative data analysis overcomes barriers in cross-instrument comparison. This paper presents the joint calibration of an item bank for researchers keen to investigate population variations in general psychological distress (GPD). Multidimensional item response theory was used on existing health survey data from the Scottish Health Education Population Survey (n = 766) to calibrate an item bank consisting of pooled items from the short common mental disorder screen (GHQ-12) and the Affectometer-2 (a measure of "general happiness"). Computer simulation was used to evaluate usefulness and efficacy of its adaptive administration. A bifactor model capturing variation across a continuum of population distress (while controlling for artefacts due to item wording) was supported. The numbers of items for different required reliabilities in adaptive administration demonstrated promising efficacy of the proposed item bank. Psychometric modeling of the common dimension captured by more than one instrument offers the potential of adaptive testing for GPD using individually sequenced combinations of existing survey items. The potential for linking other item sets with alternative candidate measures of positive mental health is discussed since an optimal item bank may require even more items than these.
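    The simulated adaptive administration evaluated in studies like this one typically follows a simple loop: re-estimate the trait after each response, administer the most informative unused item, and stop once the standard error reaches the target implied by the required reliability (on a standardized scale, SE = sqrt(1 - reliability)). A self-contained Rasch-based sketch with a hypothetical item bank, not the authors' calibrated GHQ-12/Affectometer-2 bank:

```python
import numpy as np

rng = np.random.default_rng(1)
bank_b = rng.uniform(-3, 3, size=60)          # hypothetical Rasch item bank
grid = np.linspace(-4, 4, 161)                # latent-trait grid
prior = np.exp(-0.5 * grid ** 2)              # N(0, 1) prior

def p(theta, b):
    """Rasch response probability."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def simulate_cat(true_theta, target_se=0.387, max_items=40):
    """EAP-scored CAT; SE 0.387 corresponds to reliability ~0.85."""
    posterior = prior.copy()
    used = np.zeros(len(bank_b), dtype=bool)
    while True:
        theta_hat = np.sum(grid * posterior) / posterior.sum()
        var = np.sum((grid - theta_hat) ** 2 * posterior) / posterior.sum()
        if np.sqrt(var) <= target_se or used.sum() >= max_items:
            return theta_hat, int(used.sum())
        # Rasch information peaks where difficulty matches ability:
        candidates = np.where(~used)[0]
        item = candidates[np.argmin(np.abs(bank_b[candidates] - theta_hat))]
        used[item] = True
        u = rng.random() < p(true_theta, bank_b[item])   # simulated answer
        like = p(grid, bank_b[item])
        posterior *= like if u else (1.0 - like)

print(simulate_cat(true_theta=1.0))   # (trait estimate, items administered)
```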

  12. COMPUTATION FORMAT computer codes X4TOC4 and PLOTC4. Implementing and Testing on a Personal Computer

    International Nuclear Information System (INIS)

    McLaughlin, P.K.

    1987-05-01

    This document describes the contents of the diskette containing the COMPUTATION FORMAT codes X4TOC4 and PLOTC4 by D.E. Cullen, and example data for use in implementing and testing these codes on a Personal Computer of the type IBM-PC/AT. Upon request the codes are available from the IAEA Nuclear Data Section, free of charge, on a single diskette. (author)

  13. A multiple objective test assembly approach for exposure control problems in Computerized Adaptive Testing

    Directory of Open Access Journals (Sweden)

    Theo J.H.M. Eggen

    2010-01-01

    Overexposure and underexposure of items in the bank are serious problems in operational computerized adaptive testing (CAT) systems. These exposure problems might result in item compromise, or point to a waste of investments. The exposure control problem can be viewed as a test assembly problem with multiple objectives. Information in the test has to be maximized, item compromise has to be minimized, and pool usage has to be optimized. In this paper, a multiple objectives method is developed to deal with both types of exposure problems. In this method, exposure control parameters based on observed exposure rates are implemented as weights for the information in the item selection procedure. The method does not need time-consuming simulation studies, and it can be implemented conditional on ability level. The method is compared with the Sympson-Hetter method for exposure control, with the Progressive method and with alpha-stratified testing. The results show that the method is successful in dealing with both kinds of exposure problems.
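    The core mechanism - attenuating the information of items whose observed exposure rate exceeds a target before selecting the next item - can be sketched in a few lines. This illustrates the general idea only; the weighting function and parameters used in the paper may differ:

```python
import numpy as np

def select_item(theta_hat, a, b, exposures, n_administrations,
                target_rate=0.25, used=None):
    """Pick the unused item maximizing exposure-weighted 2PL information."""
    p = 1.0 / (1.0 + np.exp(-a * (theta_hat - b)))
    info = a ** 2 * p * (1 - p)                      # Fisher information
    rates = exposures / max(n_administrations, 1)    # observed exposure rates
    # Items above the target exposure rate get proportionally down-weighted:
    weights = np.clip(target_rate / np.maximum(rates, 1e-9), 0.0, 1.0)
    score = info * weights
    if used is not None:
        score[used] = -np.inf
    return int(np.argmax(score))

rng = np.random.default_rng(7)
a = rng.uniform(0.8, 2.0, 50)                  # hypothetical discriminations
b = rng.normal(0.0, 1.0, 50)                   # hypothetical difficulties
exposures = rng.integers(0, 300, 50)           # counts over 1000 past tests
used = np.zeros(50, dtype=bool)
print(select_item(0.5, a, b, exposures, n_administrations=1000, used=used))
```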

  14. Development and testing of methods for adaptive image processing in odontology and medicine

    International Nuclear Information System (INIS)

    Sund, Torbjoern

    2005-01-01

    Medical diagnostic imaging has undergone radical changes during the last ten years. In the early 1990s, the medical imaging department was almost exclusively film-based. Today, all major hospitals have converted to digital acquisition and handling of their diagnostic imaging, or are in the process of conversion. It is therefore important to investigate whether diagnostic reading of digitally acquired images on computer display screens can match or even surpass film recording and viewing. At the same time, the digitalisation opens new possibilities for image processing, which may challenge the traditional way of studying medical images. The current work explores some of the possibilities of digital processing techniques, and evaluates the results both by quantitative methods (ROC analysis) and by subjective qualification by real users. Summary of papers: Paper I: Locally adaptive image binarization with a sliding window threshold was used for the detection of bone ridges in radiotherapy portal images. A new thresholding criterion suitable for incremental update within the sliding window was developed, and it was shown that the algorithm gave better results on difficult portal images than various publicly available adaptive thresholding routines. For small windows the routine was also faster than an adaptive implementation of the Otsu algorithm that uses interpolation between fixed tiles, and the resulting images had equal quality. Paper II: It was investigated whether contrast enhancement by non-interactive, sliding window adaptive histogram equalization could enhance the diagnostic quality of intra-oral radiographs in the dental clinic. Three dentists read 22 periapical and 12 bitewing storage phosphor (SP) radiographs. For the periapical readings they graded the quality of the examination with regard to visually locating the root apex. For the bitewing readings they registered all occurrences of approximal caries on a confidence scale. Each reading was first
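    Paper I's locally adaptive binarization belongs to the family of sliding-window threshold rules of the form T(x, y) = local mean + k * local standard deviation. The incremental-update criterion developed in the thesis is not reproduced here; the following is a generic sketch of that family on synthetic data:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_threshold(image, window=31, k=1.5):
    """Sliding-window binarization: keep pixels above mean + k * std.

    A positive k selects locally bright structures; the thesis's own
    threshold criterion and update scheme are not reproduced here.
    """
    img = image.astype(float)
    mean = uniform_filter(img, size=window)
    sq_mean = uniform_filter(img ** 2, size=window)
    std = np.sqrt(np.maximum(sq_mean - mean ** 2, 0.0))
    return img > mean + k * std

# Synthetic example: a bright ridge on a sloping background.
x = np.linspace(0.0, 100.0, 256)
background = np.tile(x, (256, 1))
ridge = np.zeros((256, 256))
ridge[120:136, :] = 40.0
binary = local_threshold(background + ridge)
print(binary[128].sum(), binary[0].sum())   # ridge row vs. background row
```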

  15. Development and testing of methods for adaptive image processing in odontology and medicine

    Energy Technology Data Exchange (ETDEWEB)

    Sund, Torbjoern

    2005-07-01

    Medical diagnostic imaging has undergone radical changes during the last ten years. In the early 1990s, the medical imaging department was almost exclusively film-based. Today, all major hospitals have converted to digital acquisition and handling of their diagnostic imaging, or are in the process of conversion. It is therefore important to investigate whether diagnostic reading of digitally acquired images on computer display screens can match or even surpass film recording and viewing. At the same time, the digitalisation opens new possibilities for image processing, which may challenge the traditional way of studying medical images. The current work explores some of the possibilities of digital processing techniques, and evaluates the results both by quantitative methods (ROC analysis) and by subjective qualification by real users. Summary of papers: Paper I: Locally adaptive image binarization with a sliding window threshold was used for the detection of bone ridges in radiotherapy portal images. A new thresholding criterion suitable for incremental update within the sliding window was developed, and it was shown that the algorithm gave better results on difficult portal images than various publicly available adaptive thresholding routines. For small windows the routine was also faster than an adaptive implementation of the Otsu algorithm that uses interpolation between fixed tiles, and the resulting images had equal quality. Paper II: It was investigated whether contrast enhancement by non-interactive, sliding window adaptive histogram equalization could enhance the diagnostic quality of intra-oral radiographs in the dental clinic. Three dentists read 22 periapical and 12 bitewing storage phosphor (SP) radiographs. For the periapical readings they graded the quality of the examination with regard to visually locating the root apex. For the bitewing readings they registered all occurrences of approximal caries on a confidence scale. Each reading was

  16. Communicative Language Testing: Implications for Computer Based Language Testing in French for Specific Purposes

    Science.gov (United States)

    García Laborda, Jesús; López Santiago, Mercedes; Otero de Juan, Nuria; Álvarez Álvarez, Alfredo

    2014-01-01

    Current evolutions of language testing have led to integrating computers in FSP assessments both in oral and written communicative tasks. This paper deals with two main issues: learners' expectations about the types of questions in FSP computer based assessments and the relation with their own experience. This paper describes the experience of 23…

  17. Tutoring system for nondestructive testing using computer

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jin Koo; Koh, Sung Nam [Joong Ang Inspection Co.,Ltd., Seoul (Korea, Republic of); Shim, Yun Ju; Kim, Min Koo [Dept. of Computer Engineering, Aju University, Suwon (Korea, Republic of)

    1997-10-15

    This paper introduces a multimedia tutoring system for nondestructive testing using a personal computer. Nondestructive testing, one of the chief methods for inspecting welds and many other components, has a technical basis that is very difficult for NDT inspectors to understand without wide experience, and considerable repeated education and training are necessary to maintain their knowledge. A tutoring system that can simulate NDT work under realistic conditions is suggested to solve this problem. The tutoring system shows the basic theories of nondestructive testing in a book style with video images and hyperlinks, and it offers practice sessions in which users can simulate the testing equipment. The book style and simulation practice provide an effective and individual environment for learning nondestructive testing.

  18. Tutoring system for nondestructive testing using computer

    International Nuclear Information System (INIS)

    Kim, Jin Koo; Koh, Sung Nam; Shim, Yun Ju; Kim, Min Koo

    1997-01-01

    This paper introduces a multimedia tutoring system for nondestructive testing using a personal computer. Nondestructive testing, one of the chief methods for inspecting welds and many other components, has a technical basis that is very difficult for NDT inspectors to understand without wide experience, and considerable repeated education and training are necessary to maintain their knowledge. A tutoring system that can simulate NDT work under realistic conditions is suggested to solve this problem. The tutoring system shows the basic theories of nondestructive testing in a book style with video images and hyperlinks, and it offers practice sessions in which users can simulate the testing equipment. The book style and simulation practice provide an effective and individual environment for learning nondestructive testing.

  19. Computational identification of adaptive mutants using the VERT system

    Directory of Open Access Journals (Sweden)

    Winkler James

    2012-04-01

    Background: Evolutionary dynamics of microbial organisms can now be visualized using the Visualizing Evolution in Real Time (VERT) system, in which several isogenic strains expressing different fluorescent proteins compete during adaptive evolution and are tracked using fluorescent cell sorting to construct a population history over time. Mutations conferring enhanced growth rates can be detected by observing changes in the fluorescent population proportions. Results: Using data obtained from several VERT experiments, we construct a hidden Markov-derived model to detect these adaptive events in VERT experiments without external intervention beyond initial training. Analysis of annotated data revealed that the model achieves consensus with human annotation for 85-93% of the data points when detecting adaptive events. A method to determine the optimal time point to isolate adaptive mutants is also introduced. Conclusions: The developed model offers a new way to monitor adaptive evolution experiments without the need for external intervention, thereby simplifying adaptive evolution efforts relying on population tracking. Future efforts to construct a fully automated system to isolate adaptive mutants may find the algorithm a useful tool.
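    The idea of labelling each time point of a population-proportion trajectory with a hidden regime (neutral drift versus adaptive expansion) can be illustrated with a generic two-state Gaussian HMM over per-generation changes. This is not the authors' trained model; the data are synthetic and the third-party hmmlearn package is an assumed dependency:

```python
import numpy as np
from hmmlearn import hmm   # third-party: pip install hmmlearn

# Synthetic trajectory: changes in one subpopulation's proportion, with a
# sustained increase (a hypothetical adaptive event) from generation 60 on.
rng = np.random.default_rng(2)
drift = rng.normal(0.000, 0.005, size=60)
sweep = rng.normal(0.020, 0.005, size=60)
changes = np.concatenate([drift, sweep]).reshape(-1, 1)

# Two hidden states, to be interpreted post hoc as drift vs. expansion.
model = hmm.GaussianHMM(n_components=2, covariance_type="diag",
                        n_iter=200, random_state=0)
model.fit(changes)
states = model.predict(changes)
print(states)   # the state label should switch near index 60
```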

  20. Adaptive Radiotherapy Planning on Decreasing Gross Tumor Volumes as Seen on Megavoltage Computed Tomography Images

    International Nuclear Information System (INIS)

    Woodford, Curtis; Yartsev, Slav; Dar, A. Rashid; Bauman, Glenn; Van Dyk, Jake

    2007-01-01

    Purpose: To evaluate gross tumor volume (GTV) changes for patients with non-small-cell lung cancer by using daily megavoltage (MV) computed tomography (CT) studies acquired before each treatment fraction on helical tomotherapy and to relate the potential benefit of adaptive image-guided radiotherapy to changes in GTV. Methods and Materials: Seventeen patients were prescribed 30 fractions of radiotherapy on helical tomotherapy for non-small-cell lung cancer at London Regional Cancer Program from Dec 2005 to March 2007. The GTV was contoured on the daily MVCT studies of each patient. Adapted plans were created using merged MVCT-kilovoltage CT image sets to investigate the advantages of replanning for patients with differing GTV regression characteristics. Results: Average GTV change observed over 30 fractions was -38%, ranging from -12 to -87%. No significant correlation was observed between GTV change and patient's physical or tumor features. Patterns of GTV changes in the 17 patients could be divided broadly into three groups with distinctive potential for benefit from adaptive planning. Conclusions: Changes in GTV are difficult to predict quantitatively based on patient or tumor characteristics. If changes occur, there are points in time during the treatment course when it may be appropriate to adapt the plan to improve sparing of normal tissues. If GTV decreases by greater than 30% at any point in the first 20 fractions of treatment, adaptive planning is appropriate to further improve the therapeutic ratio

  1. Some thermalhydraulics of closure head adapters in a 3 loops PWR

    Energy Technology Data Exchange (ETDEWEB)

    Hoffmann, F.; Daubert, O.; Hecker, M. [EDF/DER/National Hydraulics Laboratory, Chatou (France)] [and others

    1995-09-01

    In 1993 an R&D action, based on numerical simulations and experiments on the PWR's upper head, was initiated. This paper presents the test facility TRAVERSIN (a scale model of a 900 MW PWR adapter) and the calculations performed on the geometry of different upper head sections with the thermalhydraulic finite element code N3S, used for 2D and 3D computations. The paper presents the method followed to bring the adapter and upper head study to a successful conclusion. Two complementary approaches are performed to obtain global results on the complete fluid flow in the upper head and local results on the flow around the adapters of the closure head. A validation test case of these experimental and numerical tools is also presented.

  2. An Adaptive Genetic Association Test Using Double Kernel Machines.

    Science.gov (United States)

    Zhan, Xiang; Epstein, Michael P; Ghosh, Debashis

    2015-10-01

    Recently, gene set-based approaches have become very popular in gene expression profiling studies for assessing how genetic variants are related to disease outcomes. Since most genes are not differentially expressed, existing pathway tests considering all genes within a pathway suffer from considerable noise and power loss. Moreover, for a differentially expressed pathway, it is of interest to select important genes that drive the effect of the pathway. In this article, we propose an adaptive association test using double kernel machines (DKM), which can both select important genes within the pathway as well as test for the overall genetic pathway effect. This DKM procedure first uses the garrote kernel machines (GKM) test for the purposes of subset selection and then the least squares kernel machine (LSKM) test for testing the effect of the subset of genes. An appealing feature of the kernel machine framework is that it can provide a flexible and unified method for multi-dimensional modeling of the genetic pathway effect allowing for both parametric and nonparametric components. This DKM approach is illustrated with application to simulated data as well as to data from a neuroimaging genetics study.
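    The LSKM stage of such procedures rests on a variance-component score statistic of the form Q = r'Kr, where r are residuals under the null model and K is a kernel similarity matrix over the gene set. A minimal sketch with a Gaussian kernel and a permutation null in place of the asymptotic distribution (simulated data, not the paper's):

```python
import numpy as np

def gaussian_kernel(X, sigma=1.0):
    """Pairwise Gaussian (RBF) kernel matrix."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

def kernel_score_test(y, X, n_perm=2000, seed=0):
    """Score-type statistic Q = r'Kr with a permutation p-value."""
    rng = np.random.default_rng(seed)
    r = y - y.mean()                 # residuals from intercept-only null
    K = gaussian_kernel(X)
    q_obs = r @ K @ r
    exceed = 0
    for _ in range(n_perm):
        rp = rng.permutation(r)
        exceed += (rp @ K @ rp) >= q_obs
    return q_obs, (exceed + 1) / (n_perm + 1)

rng = np.random.default_rng(3)
X = rng.normal(size=(80, 10))               # expression of a 10-gene pathway
y = 0.8 * X[:, 0] + rng.normal(size=80)     # outcome driven by one gene
print(kernel_score_test(y, X))
```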

  3. Improving the utility of the fine motor skills subscale of the comprehensive developmental inventory for infants and toddlers: a computerized adaptive test.

    Science.gov (United States)

    Huang, Chien-Yu; Tung, Li-Chen; Chou, Yeh-Tai; Chou, Willy; Chen, Kuan-Lin; Hsieh, Ching-Lin

    2017-07-27

    This study aimed at improving the utility of the fine motor subscale of the Comprehensive Developmental Inventory for Infants and Toddlers (CDIIT) by developing a computerized adaptive test of fine motor skills. We built an item bank for the computerized adaptive test of fine motor skills from the fine motor subscale items of the CDIIT that fit the Rasch model. We also examined the psychometric properties and efficiency of the computerized adaptive test of fine motor skills with simulated computerized adaptive tests. Data from 1742 children with suspected developmental delays were retrieved. The mean scores of the fine motor subscale of the CDIIT increased with age group (mean scores = 1.36-36.97). The computerized adaptive test of fine motor skills contains 31 items meeting the Rasch model's assumptions (infit mean square = 0.57-1.21, outfit mean square = 0.11-1.17). For children of 6-71 months, the computerized adaptive test of fine motor skills had high Rasch person reliability (average reliability >0.90), high concurrent validity (rs = 0.67-0.99), adequate to excellent diagnostic accuracy (area under the receiver operating characteristic curve = 0.71-1.00), and large responsiveness (effect size = 1.05-3.93). The computerized adaptive test of fine motor skills used 48-84% fewer items than the fine motor subscale of the CDIIT. The computerized adaptive test of fine motor skills used fewer items for assessment but was as reliable and valid as the fine motor subscale of the CDIIT. Implications for Rehabilitation We developed a computerized adaptive test based on the Comprehensive Developmental Inventory for Infants and Toddlers (CDIIT) for assessing fine motor skills. The computerized adaptive test has been shown to be efficient because it uses fewer items than the original measure and automatically presents the results right after the test is completed. The computerized adaptive test is as reliable and valid as the CDIIT.
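    The infit and outfit mean squares quoted above are, respectively, information-weighted and unweighted averages of squared standardized residuals; values near 1 indicate items consistent with the Rasch model. A compact sketch on simulated dichotomous data:

```python
import numpy as np

def rasch_fit_statistics(U, theta, b):
    """Per-item infit/outfit mean squares for dichotomous Rasch data."""
    P = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
    W = P * (1 - P)                          # model variances
    Z2 = (U - P) ** 2 / W                    # squared standardized residuals
    outfit = Z2.mean(axis=0)                 # unweighted mean square
    infit = ((U - P) ** 2).sum(axis=0) / W.sum(axis=0)   # info-weighted
    return infit, outfit

rng = np.random.default_rng(4)
theta = rng.normal(size=500)                 # simulated abilities
b = np.linspace(-2, 2, 31)                   # 31 item difficulties
P = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
U = (rng.random(P.shape) < P).astype(float)  # model-consistent responses
infit, outfit = rasch_fit_statistics(U, theta, b)
print(infit.round(2)[:5], outfit.round(2)[:5])   # values near 1.0
```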

  4. Adaptation of a T3-uptake test and of radioimmunoassays for serum digoxin, thyroxine, and triiodothyronine to an automated radioimmunoassay system: "Centria"

    International Nuclear Information System (INIS)

    Ertingshausen, G.; Shapiro, S.I.; Green, G.; Zborowski, G.

    1975-01-01

    We report the adaptation of four radioassays to the prototype of an automated radioimmunoassay system ("Centria," Union Carbide). The system consists of three integrated modules: an automated pipettor, which dispenses samples and reagents; the key module, an incubator/separator, in which centrifugal force is used to initiate and terminate multiple radioassay incubations and separations simultaneously; and a gamma-counter/computer, which counts three tubes simultaneously and converts counts into concentration units. Radioimmunoassays for thyroxine, triiodothyronine, and digoxin were developed with use of well-characterized antibodies and of prepackaged Sephadex-containing columns to separate bound and free radioactive ligand. A triiodothyronine-uptake test in which the same kind of columns were used was also adapted to the instrument. Results for clinical samples compared favorably with those obtained by manual procedures. We report data on correlation between different methods and preliminary data on precision of the prototype system

  5. Adaptive Fault Tolerance for Many-Core Based Space-Borne Computing

    Science.gov (United States)

    James, Mark; Springer, Paul; Zima, Hans

    2010-01-01

    This paper describes an approach to providing software fault tolerance for future deep-space robotic NASA missions, which will require a high degree of autonomy supported by an enhanced on-board computational capability. Such systems have become possible as a result of the emerging many-core technology, which is expected to offer 1024-core chips by 2015. We discuss the challenges and opportunities of this new technology, focusing on introspection-based adaptive fault tolerance that takes into account the specific requirements of applications, guided by a fault model. Introspection supports runtime monitoring of the program execution with the goal of identifying, locating, and analyzing errors. Fault tolerance assertions for the introspection system can be provided by the user, domain-specific knowledge, or via the results of static or dynamic program analysis. This work is part of an on-going project at the Jet Propulsion Laboratory in Pasadena, California.

  6. Optimizing the Use of Response Times for Item Selection in Computerized Adaptive Testing

    Science.gov (United States)

    Choe, Edison M.; Kern, Justin L.; Chang, Hua-Hua

    2018-01-01

    Despite common operationalization, measurement efficiency of computerized adaptive testing should not only be assessed in terms of the number of items administered but also the time it takes to complete the test. To this end, a recent study introduced a novel item selection criterion that maximizes Fisher information per unit of expected response…
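    The criterion referred to above amounts to selecting, at the current ability estimate, the item that maximizes Fisher information per expected second rather than raw information. A sketch under a 2PL model with hypothetical item parameters and mean response times (the response-time model in the study is richer):

```python
import numpy as np

def info_per_time_selection(theta_hat, a, b, expected_time, used):
    """Select the unused item maximizing information / expected time."""
    p = 1.0 / (1.0 + np.exp(-a * (theta_hat - b)))
    info = a ** 2 * p * (1 - p)              # 2PL Fisher information
    score = info / expected_time             # the time-aware criterion
    score[used] = -np.inf
    return int(np.argmax(score))

rng = np.random.default_rng(5)
a = rng.uniform(0.8, 2.0, 40)                # hypothetical discriminations
b = rng.uniform(-2.5, 2.5, 40)               # hypothetical difficulties
t = rng.uniform(20.0, 90.0, 40)              # hypothetical mean seconds/item
used = np.zeros(40, dtype=bool)
print(info_per_time_selection(0.3, a, b, t, used))
```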

  7. Optimal Sequential Rules for Computer-Based Instruction.

    Science.gov (United States)

    Vos, Hans J.

    1998-01-01

    Formulates sequential rules for adapting the appropriate amount of instruction to learning needs in the context of computer-based instruction. Topics include Bayesian decision theory, threshold and linear-utility structure, psychometric model, optimal sequential number of test questions, and an empirical example of sequential instructional…
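    A sequential rule of the threshold-utility kind can be sketched with a Beta-Bernoulli posterior: keep presenting questions until the posterior probability of mastery is decisively high or low. The cutoffs below are illustrative, not the optimal values derived in the paper:

```python
from scipy.stats import beta

def sequential_decision(successes, trials, mastery_cutoff=0.7,
                        decide_prob=0.9, a0=1.0, b0=1.0):
    """Beta-Bernoulli sketch of a sequential mastery decision.

    Declare mastery once P(true proportion-correct > mastery_cutoff) is
    high, non-mastery once it is low, otherwise continue testing.
    """
    posterior = beta(a0 + successes, b0 + trials - successes)
    p_master = posterior.sf(mastery_cutoff)
    if p_master >= decide_prob:
        return "advance"
    if p_master <= 1.0 - decide_prob:
        return "remediate"
    return "continue testing"

print(sequential_decision(successes=10, trials=10))   # advance
print(sequential_decision(successes=4, trials=10))    # remediate
print(sequential_decision(successes=7, trials=10))    # continue testing
```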

  8. An overview of the activities of the OECD/NEA Task Force on adapting computer codes in nuclear applications to parallel architectures

    Energy Technology Data Exchange (ETDEWEB)

    Kirk, B.L. [Oak Ridge National Lab., TN (United States); Sartori, E. [OCDE/OECD NEA Data Bank, Issy-les-Moulineaux (France); Viedma, L.G. de [Consejo de Seguridad Nuclear, Madrid (Spain)

    1997-06-01

    Subsequent to the introduction of High Performance Computing in the developed countries, the Organization for Economic Cooperation and Development/Nuclear Energy Agency (OECD/NEA) created the Task Force on Adapting Computer Codes in Nuclear Applications to Parallel Architectures (under the guidance of the Nuclear Science Committee's Working Party on Advanced Computing) to study the growth area in supercomputing and its applicability to the nuclear community's computer codes. The result has been four years of investigation for the Task Force in different subject fields - deterministic and Monte Carlo radiation transport, computational mechanics and fluid dynamics, nuclear safety, atmospheric models and waste management.

  9. An overview of the activities of the OECD/NEA Task Force on adapting computer codes in nuclear applications to parallel architectures

    International Nuclear Information System (INIS)

    Kirk, B.L.; Sartori, E.; Viedma, L.G. de

    1997-01-01

    Subsequent to the introduction of High Performance Computing in the developed countries, the Organization for Economic Cooperation and Development/Nuclear Energy Agency (OECD/NEA) created the Task Force on Adapting Computer Codes in Nuclear Applications to Parallel Architectures (under the guidance of the Nuclear Science Committee's Working Party on Advanced Computing) to study the growth area in supercomputing and its applicability to the nuclear community's computer codes. The result has been four years of investigation for the Task Force in different subject fields - deterministic and Monte Carlo radiation transport, computational mechanics and fluid dynamics, nuclear safety, atmospheric models and waste management

  10. Individualized evaluation of cholinesterase inhibitors effects in dementia with adaptive cognitive testing

    NARCIS (Netherlands)

    Wouters, Hans; van Campen, Jos P. C. M.; Appels, Bregje A.; Beijnen, Jos H.; Zwinderman, Aeilko H.; van Gool, Willem A.; Schmand, Ben

    2016-01-01

    Computerized Adaptive Testing (CAT) of cognitive function selects, for every individual patient, only items of appropriate difficulty to estimate his or her level of cognitive impairment. Therefore, CAT has the potential to combine brevity with precision. We retrospectively examined the evaluation

  11. Individualized evaluation of cholinesterase inhibitors effects in dementia with adaptive cognitive testing

    NARCIS (Netherlands)

    Wouters, Hans; Van Campen, Jos P C M; Appels, Bregje A; Beijnen, Jos H; Zwinderman, Aeilko H; Van Gool, Willem A; Schmand, Ben

    2015-01-01

    Computerized Adaptive Testing (CAT) of cognitive function selects, for every individual patient, only items of appropriate difficulty to estimate his or her level of cognitive impairment. Therefore, CAT has the potential to combine brevity with precision. We retrospectively examined the evaluation

  12. Individualized evaluation of cholinesterase inhibitors effects in dementia with adaptive cognitive testing

    NARCIS (Netherlands)

    Wouters, H.; van Campen, J.P.C.M.; Appels, B.A.; Beijnen, J.H.; Zwinderman, A.H.; van Gool, W.A.; Schmand, B.

    Computerized Adaptive Testing (CAT) of cognitive function selects, for every individual patient, only items of appropriate difficulty to estimate his or her level of cognitive impairment. Therefore, CAT has the potential to combine brevity with precision. We retrospectively examined the evaluation

  13. A comparison of computerized adaptive testing and fixed-length short forms for the Prosthetic Limb Users Survey of Mobility (PLUS-M™).

    Science.gov (United States)

    Amtmann, Dagmar; Bamer, Alyssa M; Kim, Jiseon; Bocell, Fraser; Chung, Hyewon; Park, Ryoungsun; Salem, Rana; Hafner, Brian J

    2017-09-01

    New health status instruments can be administered by computerized adaptive test or short forms. The Prosthetic Limb Users Survey of Mobility (PLUS-M™) is a self-report measure of mobility for prosthesis users with lower limb loss. This study used the PLUS-M to examine advantages and disadvantages of computerized adaptive test and short forms. To compare scores obtained from computerized adaptive test to scores obtained from fixed-length short forms (7-item and 12-item) in order to provide guidance to researchers and clinicians on how to select the best form of administration for different uses. Cross-sectional, observational study. Individuals with lower limb loss completed the PLUS-M by computerized adaptive test and short forms. Administration time, correlations between the scores, and standard errors were compared. Scores and standard errors from the computerized adaptive test, 7-item short form, and 12-item short form were highly correlated and all forms of administration were efficient. Computerized adaptive test required less time to administer than either paper or electronic short forms; however, time savings were minimal compared to the 7-item short form. Results indicate that the PLUS-M computerized adaptive test is most efficient, and differences in scores between administration methods are minimal. The main advantage of the computerized adaptive test was more reliable scores at higher levels of mobility compared to short forms. Clinical relevance: Health-related item banks, like the Prosthetic Limb Users Survey of Mobility (PLUS-M™), can be administered by computerized adaptive testing (CAT) or as fixed-length short forms (SFs). Results of this study will help clinicians and researchers decide whether they should invest in a CAT administration system or whether SFs are more appropriate.

  14. A universal electronical adaptation of automats for biochemical analysis to a central processing computer by applying CAMAC-signals

    International Nuclear Information System (INIS)

    Schaefer, R.

    1975-01-01

    A universal expansion of a CAMAC-subsystem - BORER 3000 - for adapting analysis instruments in biochemistry to a processing computer is described. The possibility of standardizing input interfaces for lab instruments with such circuits is discussed and the advantages achieved by applying the CAMAC-specifications are described

  15. Computational hydrodynamics and optical performance of inductively-coupled plasma adaptive lenses

    Energy Technology Data Exchange (ETDEWEB)

    Mortazavi, M.; Urzay, J., E-mail: jurzay@stanford.edu; Mani, A. [Center for Turbulence Research, Stanford University, Stanford, California 94305-3024 (United States)

    2015-06-15

    This study addresses the optical performance of a plasma adaptive lens for aero-optical applications by using both axisymmetric and three-dimensional numerical simulations. Plasma adaptive lenses are based on the effects of free electrons on the phase velocity of incident light, which, in theory, can be used as a phase-conjugation mechanism. A closed cylindrical chamber filled with Argon plasma is used as a model lens into which a beam of light is launched. The plasma is sustained by applying a radio-frequency electric current through a coil that envelops the chamber. Four different operating conditions, ranging from low to high powers and induction frequencies, are employed in the simulations. The numerical simulations reveal complex hydrodynamic phenomena related to buoyant and electromagnetic laminar transport, which generate, respectively, large recirculating cells and wall-normal compression stresses in the form of local stagnation-point flows. In the axisymmetric simulations, the plasma motion is coupled with near-wall axial striations in the electron-density field, some of which propagate in the form of low-frequency traveling disturbances adjacent to vortical quadrupoles that are reminiscent of Taylor-Görtler flow structures in centrifugally unstable flows. Although the refractive-index fields obtained from axisymmetric simulations lead to smooth beam wavefronts, they are found to be unstable to azimuthal disturbances in three of the four three-dimensional cases considered. The azimuthal striations are optically detrimental, since they produce high-order angular aberrations that account for most of the beam wavefront error. A fourth case is computed at high input power and high induction frequency, which displays the best optical properties among all the three-dimensional simulations considered. In particular, the increase in induction frequency prevents local thermalization and leads to an axisymmetric distribution of electrons even after introduction of

  16. Computer Aided Education System SuperTest. Present and Prospective

    Directory of Open Access Journals (Sweden)

    2007-01-01

    This paper analyzes the testing and self-testing process for the Computer Aided Education System (CAES) SuperTest, used at the Academy of Economic Studies of Chisinau, Moldova, and recently implemented at the University of Bacau, Romania. We discuss here the future of this software from the Information Society and Knowledge Society point of view.

  17. ADAPTIVE E-LEARNING AND ITS EVALUATION

    Directory of Open Access Journals (Sweden)

    KOSTOLÁNYOVÁ, Katerina

    2012-12-01

    This paper introduces a complex plan for a complete system of individualized electronic instruction. The core of the system is a computer program to control teaching, the so-called "virtual teacher". The virtual teacher automatically adapts to individual students' characteristics and their learning style. It adapts to static as well as to dynamic characteristics of the student. To manage all this it needs a database of various styles and forms of teaching as well as a sufficient amount of information about the learning style, type of memory and other characteristics of the student. The information about these characteristics, the structure of data storage and its use by the virtual teacher are also part of this paper. We also outline a methodology of adaptive study materials. We define basic rules and forms to create adaptive study materials. This adaptive e-learning system was pilot tested with more than 50 students. These students filled in a learning style questionnaire at the beginning of the study and had the option to fill in an adaptive evaluation questionnaire at the end of the study. Results of these questionnaires were analyzed. Several conclusions were drawn from this analysis to refine the methodology of adaptive study materials.

  18. Development and test validation of a computational scheme for high-fidelity fluence estimations of the Swiss BWRs

    International Nuclear Information System (INIS)

    Vasiliev, A.; Wieselquist, W.; Ferroukhi, H.; Canepa, S.; Heldt, J.; Ledergerber, G.

    2011-01-01

    One of the current objectives within reactor analysis related projects at the Paul Scherrer Institut is the establishment of a comprehensive computational methodology for fast neutron fluence (FNF) estimations of reactor pressure vessels (RPV) and internals for both PWRs and BWRs. In the recent past, such an integral calculational methodology based on the CASMO-4/SIMULATE-3/MCNPX system of codes was developed for PWRs and validated against RPV scraping tests. Based on the very satisfactory validation results, the methodology was recently applied for predictive FNF evaluations of a Swiss PWR to support the national nuclear safety inspectorate in the framework of life-time estimations. Today, focus at PSI is given to developing a corresponding advanced methodology for high-fidelity FNF estimations of BWR reactors. In this paper, the preliminary steps undertaken in that direction are presented. To start, the concepts of the PWR computational scheme and its transfer/adaptation to BWR are outlined. Then, the modelling of a Swiss BWR characterized by very heterogeneous core designs is presented along with preliminary sensitivity studies carried out to assess the level of detail required for the complex core region. Finally, a first validation test case is presented on the basis of two dosimeter monitors irradiated during two recent cycles of the given BWR reactor. The achieved computational results show a satisfactory agreement with measured dosimeter data and thereby illustrate the feasibility of applying the PSI FNF computational scheme also to BWRs. Further sensitivity/optimization studies are nevertheless necessary in order to consolidate the scheme and to continuously increase the fidelity and reliability of the BWR FNF estimations. (author)

  19. Adaptive Multilevel Monte Carlo Simulation

    KAUST Repository

    Hoel, H; von Schwerin, E; Szepessy, A; Tempone, Raul

    2011-01-01

    This work introduces an adaptive hierarchy of non-uniform time discretizations, generated by an adaptive algorithm (Kyoung-Sook Moon et al. An adaptive algorithm for ordinary, stochastic and partial differential equations. In Recent Advances in Adaptive Computation, volume 383 of Contemp. Math., pages 325–343. Amer. Math. Soc., Providence, RI, 2005). This form of the adaptive algorithm generates stochastic, path-dependent time steps.

  20. Translation, Cultural Adaptation and Validation of the Simple Shoulder Test to Spanish

    OpenAIRE

    Arcuri, Francisco; Barclay, Fernando; Nacul, Ivan

    2015-01-01

    Background: The validation of widely used scales facilitates the comparison across international patient samples. Objective: The objective was to translate, culturally adapt and validate the Simple Shoulder Test into Argentinian Spanish. Methods: The Simple Shoulder Test was translated from English into Argentinian Spanish by two independent translators, translated back into English and evaluated for accuracy by an expert committee to correct the possible discrepancies. It was then administer...

  1. DEVELOPMENT AND ADAPTATION OF VORTEX REALIZABLE MEASUREMENT SYSTEM FOR BENCHMARK TEST WITH LARGE SCALE MODEL OF NUCLEAR REACTOR

    Directory of Open Access Journals (Sweden)

    S. M. Dmitriev

    2017-01-01

    Full Text Available The last decades of development in applied calculation methods for nuclear reactor thermal and hydraulic processes have been marked by the rapid growth of High Performance Computing (HPC), which has contributed to the active introduction of Computational Fluid Dynamics (CFD). The use of such programs to justify technical and economic parameters, and especially the safety of nuclear reactors, requires comprehensive verification of the mathematical models and CFD programs. The aim of this work was the development and adaptation of a measuring system with the characteristics necessary for its application in a verification test (experimental) facility. Its main objective is to study the mixing of coolant flows with different physical properties (for example, different concentrations of dissolved impurities) inside a large-scale reactor model. The basic method used for registration of the spatial concentration field in the mixing area is spatial conductometry. In the course of the work, a measurement complex was created, including spatial conductometric sensors, a system of secondary converters and software. Methods for calibration and normalization of measurement results were developed. Averaged concentration fields and nonstationary realizations of the measured local conductivity were obtained during the first experimental series, and spectral and statistical analyses of the realizations were carried out. The acquired data are compared with pretest CFD calculations performed with the ANSYS CFX program. A joint analysis of the obtained results made it possible to identify the main regularities of the process under study, and to demonstrate the capability of the designed measuring system to deliver experimental data of the «CFD-quality» required for verification. The completed adaptation of the spatial sensors allows a more extensive program of experimental tests, on the basis of which a databank and the necessary generalizations will be created

  2. Functional Task Test: 3. Skeletal Muscle Performance Adaptations to Space Flight

    Science.gov (United States)

    Ryder, Jeffrey W.; Wickwire, P. J.; Buxton, R. E.; Bloomberg, J. J.; Ploutz-Snyder, L.

    2011-01-01

    The functional task test is a multi-disciplinary study investigating how space-flight-induced changes to physiological systems impact functional task performance. Impairment of neuromuscular function would be expected to negatively affect functional performance of crewmembers following exposure to microgravity. This presentation reports the results of muscle performance testing in crewmembers. Functional task performance will be presented in the abstract "Functional Task Test 1: sensory motor adaptations associated with postflight alterations in astronaut functional task performance." METHODS: Muscle performance measures were obtained in crewmembers before and after short-duration space flight aboard the Space Shuttle and long-duration International Space Station (ISS) missions. The battery of muscle performance tests included leg press and bench press measures of isometric force, isotonic power and total work. Knee extension was used for the measurement of central activation and maximal isometric force. Upper and lower body force steadiness control were measured on the bench press and knee extension machine, respectively. Tests were implemented 60 and 30 days before launch, on landing day (Shuttle crew only), and 6, 10 and 30 days after landing. Seven Space Shuttle crew and four ISS crew have completed the muscle performance testing to date. RESULTS: Preliminary results for Space Shuttle crew reveal significant reductions in the leg press performance metrics of maximal isometric force, power and total work on R+0. Reductions in performance metrics were observed in returning Shuttle crew, and these adaptations are likely contributors to impaired functional tasks that are ambulatory in nature (see abstract Functional Task Test: 1). Interestingly, no significant changes in central activation capacity were detected. Therefore, impairments in muscle function in response to short-duration space flight are likely myocellular rather than neuromotor in nature.

  3. Evolution of a Computer-Based Testing Laboratory

    Science.gov (United States)

    Moskal, Patrick; Caldwell, Richard; Ellis, Taylor

    2009-01-01

    In 2003, faced with increasing growth in technology-based and large-enrollment courses, the College of Business Administration at the University of Central Florida opened a computer-based testing lab to facilitate administration of course examinations. Patrick Moskal, Richard Caldwell, and Taylor Ellis describe the development and evolution of the…

  4. Overview and current management of computerized adaptive testing in licensing/certification examinations

    Directory of Open Access Journals (Sweden)

    Dong Gi Seo

    2017-07-01

    Full Text Available Computerized adaptive testing (CAT) has been implemented in high-stakes examinations such as the National Council Licensure Examination-Registered Nurses in the United States since 1994. Subsequently, the National Registry of Emergency Medical Technicians in the United States adopted CAT for certifying emergency medical technicians in 2007. This was done with the goal of introducing the implementation of CAT for medical health licensing examinations. Most implementations of CAT are based on item response theory, which hypothesizes that both the examinee and items have their own characteristics that do not change. There are 5 steps for implementing CAT: first, determining whether the CAT approach is feasible for a given testing program; second, establishing an item bank; third, pretesting, calibrating, and linking item parameters via statistical analysis; fourth, determining the specification for the final CAT related to the 5 components of the CAT algorithm; and finally, deploying the final CAT after specifying all the necessary components. The 5 components of the CAT algorithm are as follows: item bank, starting item, item selection rule, scoring procedure, and termination criterion. CAT management includes content balancing, item analysis, item scoring, standard setting, practice analysis, and item bank updates. Remaining issues include the cost of constructing CAT platforms and deploying the computer technology required to build an item bank. In conclusion, in order to ensure more accurate estimations of examinees’ ability, CAT may be a good option for national licensing examinations. Measurement theory can support its implementation for high-stakes examinations.
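
    The five components listed above map naturally onto a small program. The following is a minimal, illustrative Python sketch of a CAT loop under a two-parameter logistic (2PL) IRT model, with maximum-information item selection, EAP scoring, and a standard-error termination criterion. The item bank, parameter ranges, and function names are invented for illustration and do not correspond to any operational examination; real deployments would add content balancing and item exposure control on top of this bare loop.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def prob_2pl(theta, a, b):
        """P(correct) under the two-parameter logistic (2PL) IRT model."""
        return 1.0 / (1.0 + np.exp(-a * (theta - b)))

    def item_information(theta, a, b):
        """Fisher information of a 2PL item at ability level theta."""
        p = prob_2pl(theta, a, b)
        return a * a * p * (1.0 - p)

    def eap_estimate(responses, grid=np.linspace(-4, 4, 161)):
        """EAP scoring: posterior mean and SD over a grid, standard-normal prior."""
        post = np.exp(-0.5 * grid ** 2)
        for u, a, b in responses:
            p = prob_2pl(grid, a, b)
            post = post * (p if u else 1.0 - p)
        post /= post.sum()
        theta = float((grid * post).sum())
        se = float(np.sqrt(((grid - theta) ** 2 * post).sum()))
        return theta, se

    def run_cat(bank_a, bank_b, answer, max_items=30, se_target=0.3):
        """Item bank, starting point, selection rule, scoring, termination."""
        used, responses = set(), []
        theta, se = 0.0, float("inf")        # starting point: the prior mean
        while len(responses) < max_items and se > se_target:
            info = [item_information(theta, a, b) if i not in used else -np.inf
                    for i, (a, b) in enumerate(zip(bank_a, bank_b))]
            j = int(np.argmax(info))         # maximum-information selection
            used.add(j)
            u = answer(j)                    # administer item j, observe 0/1
            responses.append((u, bank_a[j], bank_b[j]))
            theta, se = eap_estimate(responses)   # rescore after every item
        return theta, se, len(responses)

    # Toy run: a simulated examinee with true ability 1.0 on a random 200-item bank.
    bank_a = rng.uniform(0.8, 2.0, 200)
    bank_b = rng.uniform(-3.0, 3.0, 200)
    theta, se, n = run_cat(bank_a, bank_b,
                           lambda j: rng.random() < prob_2pl(1.0, bank_a[j], bank_b[j]))
    ```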

  5. Water System Adaptation To Hydrological Changes: Module 7, Adaptation Principles and Considerations

    Science.gov (United States)

    This course will introduce students to the fundamental principles of water system adaptation to hydrological changes, with emphasis on data analysis and interpretation, technical planning, and computational modeling. Starting with real-world scenarios and adaptation needs, the co...

  6. Design and tests of an adaptive focusing neutron guide

    International Nuclear Information System (INIS)

    Valicu, Roxana Georgiana

    2012-01-01

    This work contains the Monte Carlo simulations, as well as the first tests, of an adaptive focusing neutron guide for creating a focus that does not depend on the wavelength of the incoming neutrons. All known neutron guides have a rectangular shape, built out of four glass plates. The inner side of the guide is coated with a complex structure of metal layers, which reflects and guides the neutrons (in analogy with the reflection of light). For beam focusing, neutron guides with fixed curvature can be built. For most experiments it is important that the beam is focused onto a small surface of the sample. In the case of focusing guides with fixed curvature it has been observed that the focusing (dimension and position of the beam focus) is wavelength dependent. This is why, for measurements that are performed with different wavelengths, it is very important to change the curvature of the neutron guide in order to obtain optimal results. In this work we have designed, constructed and tested a guide whose curvature can be changed during the experiment. In this way we can obtain a variable curvature in the horizontal as well as in the vertical direction. For a curvature in the horizontal or vertical direction it is not necessary to move all four walls, only two of the opposed plates. The element that changes the curvature of the guide consists of an acting element (piezomotor) as well as a rod that is operated by the piezomotor and acts through a lever onto the plate. The action of a force and a consequent torsion moment at the free end of the plate changes the curvature of the whole plate in an almost parabolic way. Making use of the Monte Carlo simulations we were able to determine the optimal curvature for each wavelength of a neutron guide for the spectrometer TOFTOF installed at the Forschungsneutronenquelle Heinz Maier-Leibnitz (FRM II). First tests have shown that with an adaptive focusing guide one can gain up to a factor of three in intensity at

  7. Design and tests of an adaptive focusing neutron guide

    Energy Technology Data Exchange (ETDEWEB)

    Valicu, Roxana Georgiana

    2012-08-23

    This work contains the Monte Carlo simulations, as well as the first tests, of an adaptive focusing neutron guide for creating a focus that does not depend on the wavelength of the incoming neutrons. All known neutron guides have a rectangular shape, built out of four glass plates. The inner side of the guide is coated with a complex structure of metal layers, which reflects and guides the neutrons (in analogy with the reflection of light). For beam focusing, neutron guides with fixed curvature can be built. For most experiments it is important that the beam is focused onto a small surface of the sample. In the case of focusing guides with fixed curvature it has been observed that the focusing (dimension and position of the beam focus) is wavelength dependent. This is why, for measurements that are performed with different wavelengths, it is very important to change the curvature of the neutron guide in order to obtain optimal results. In this work we have designed, constructed and tested a guide whose curvature can be changed during the experiment. In this way we can obtain a variable curvature in the horizontal as well as in the vertical direction. For a curvature in the horizontal or vertical direction it is not necessary to move all four walls, only two of the opposed plates. The element that changes the curvature of the guide consists of an acting element (piezomotor) as well as a rod that is operated by the piezomotor and acts through a lever onto the plate. The action of a force and a consequent torsion moment at the free end of the plate changes the curvature of the whole plate in an almost parabolic way. Making use of the Monte Carlo simulations we were able to determine the optimal curvature for each wavelength of a neutron guide for the spectrometer TOFTOF installed at the Forschungsneutronenquelle Heinz Maier-Leibnitz (FRM II). First tests have shown that with an adaptive focusing guide one can gain up to a factor of three in intensity at

  8. Adaptive algebraic reconstruction technique

    International Nuclear Information System (INIS)

    Lu Wenkai; Yin Fangfang

    2004-01-01

    Algebraic reconstruction techniques (ART) are iterative procedures for reconstructing objects from their projections. It is proven that ART can be computationally efficient by carefully arranging the order in which the collected data are accessed during the reconstruction procedure and by adaptively adjusting the relaxation parameters. In this paper, an adaptive algebraic reconstruction technique (AART), which adopts the same projection access scheme as the multilevel scheme algebraic reconstruction technique (MLS-ART), is proposed. By introducing adaptive adjustment of the relaxation parameters during the reconstruction procedure, one-iteration AART can produce reconstructions with better quality than one-iteration MLS-ART. Furthermore, AART outperforms MLS-ART with improved computational efficiency
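
    To make the idea concrete, here is a toy Python sketch of ART with an adaptive relaxation parameter. It is not the AART algorithm of the paper (which follows the MLS-ART projection access order); it only illustrates the general mechanism of damping the relaxation when the residual stalls, with all parameter values chosen for illustration.

    ```python
    import numpy as np

    def art_sweep(A, b, x, relax):
        """One ART (Kaczmarz) sweep: relaxed projection of x onto each row's hyperplane."""
        for i in range(A.shape[0]):
            a = A[i]
            denom = a @ a
            if denom > 0.0:
                x = x + relax * (b[i] - a @ x) / denom * a
        return x

    def adaptive_art(A, b, n_sweeps=20, relax0=1.0, decay=0.7):
        """ART whose relaxation parameter is damped whenever the residual stalls."""
        x = np.zeros(A.shape[1])
        relax, prev_res = relax0, np.inf
        for _ in range(n_sweeps):
            x = art_sweep(A, b, x, relax)
            res = np.linalg.norm(A @ x - b)
            if res > 0.95 * prev_res:   # little progress: use gentler updates
                relax *= decay
            prev_res = res
        return x
    ```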

  9. Walk a Mile in My Shoes: Stakeholder Accounts of Testing Experience with a Computer-Administered Test

    Science.gov (United States)

    Fox, Janna; Cheng, Liying

    2015-01-01

    In keeping with the trend to elicit multiple stakeholder responses to operational tests as part of test validation, this exploratory mixed methods study examines test-taker accounts of an Internet-based (i.e., computer-administered) test in the high-stakes context of proficiency testing for university admission. In 2013, as language testing…

  10. Adaptive solution of the multigroup diffusion equation on irregular structured grids using a conforming finite element method formulation

    International Nuclear Information System (INIS)

    Ragusa, J. C.

    2004-01-01

    In this paper, a method for performing spatially adaptive computations in the framework of multigroup diffusion on 2-D and 3-D Cartesian grids is investigated. The numerical error, intrinsic to any computer simulation of physical phenomena, is monitored through an a posteriori error estimator. In a posteriori analysis, the computed solution itself is used to assess the accuracy. By efficiently estimating the spatial error, the entire computational process is controlled through successively adapted grids. Our analysis is based on a finite element solution of the diffusion equation. Bilinear test functions are used. The derived a posteriori error estimator is therefore based on the Hessian of the numerical solution. (authors)
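
    As a rough illustration of such a Hessian-based a posteriori indicator, the following Python sketch estimates a cell-wise error indicator on a uniform 2-D grid by finite differences and flags the worst cells for refinement. It stands in for, but is much simpler than, the finite element estimator of the paper; the periodic boundary handling via np.roll and the refinement fraction are simplifying assumptions.

    ```python
    import numpy as np

    def hessian_error_indicator(u, hx, hy):
        """Cell-wise indicator ~ h^2 * |Hessian| of a gridded solution u.

        Boundaries are handled by periodic wrap (np.roll) purely for brevity.
        """
        uxx = (np.roll(u, -1, 0) - 2.0 * u + np.roll(u, 1, 0)) / hx ** 2
        uyy = (np.roll(u, -1, 1) - 2.0 * u + np.roll(u, 1, 1)) / hy ** 2
        uxy = (np.roll(np.roll(u, -1, 0), -1, 1) - np.roll(np.roll(u, -1, 0), 1, 1)
               - np.roll(np.roll(u, 1, 0), -1, 1) + np.roll(np.roll(u, 1, 0), 1, 1)) / (4.0 * hx * hy)
        frob = np.sqrt(uxx ** 2 + 2.0 * uxy ** 2 + uyy ** 2)  # Frobenius norm of the Hessian
        return hx * hy * frob                                  # scale by cell area

    def cells_to_refine(u, hx, hy, fraction=0.2):
        """Flag the given fraction of cells with the largest estimated error."""
        eta = hessian_error_indicator(u, hx, hy)
        return eta >= np.quantile(eta, 1.0 - fraction)
    ```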

  11. Sustainable computational science

    DEFF Research Database (Denmark)

    Rougier, Nicolas; Hinsen, Konrad; Alexandre, Frédéric

    2017-01-01

    Computer science offers a large set of tools for prototyping, writing, running, testing, validating, sharing and reproducing results, however computational science lags behind. In the best case, authors may provide their source code as a compressed archive and they may feel confident their research...... workflows, in particular in peer-reviews. Existing journals have been slow to adapt: source codes are rarely requested, hardly ever actually executed to check that they produce the results advertised in the article. ReScience is a peer-reviewed journal that targets computational research and encourages...... the explicit replication of already published research, promoting new and open-source implementations in order to ensure that the original research can be replicated from its description. To achieve this goal, the whole publishing chain is radically different from other traditional scientific journals. ReScience...

  12. Some thermohydraulics of closure head adapters in a 3 loops PWR

    International Nuclear Information System (INIS)

    Hofmann, F.; Daubert, O.; Bertrand, C.; Hecker, M.; Arnoux-Guisse, F.; Bonnin, O.

    1995-12-01

    In 1993 an R and D action, based on numerical simulations and experiments on the PWR upper head, was initiated. This paper presents the test facility TRAVERSIN (a scale model of a 900 MW adapter) and calculations performed on the geometry of different upper head sections with the Thermalhydraulic Finite Element Code N3S, used for 2D and 3D computations. The paper presents the method followed to bring the adapter and upper head study to a successful conclusion. Two complementary approaches are used to obtain global results on the complete fluid flow in the upper head and local results on the flow around the closure head adapters. A validation test case for these experimental and numerical tools is also presented. (authors). 7 refs., 9 figs., 1 tab

  13. Harmony Search Based Parameter Ensemble Adaptation for Differential Evolution

    Directory of Open Access Journals (Sweden)

    Rammohan Mallipeddi

    2013-01-01

    Full Text Available In the differential evolution (DE) algorithm, depending on the characteristics of the problem at hand and the available computational resources, different strategies combined with a different set of parameters may be effective. In addition, a single, well-tuned combination of strategies and parameters may not guarantee optimal performance because different strategies combined with different parameter settings can be appropriate during different stages of the evolution. Therefore, various adaptive/self-adaptive techniques have been proposed to adapt the DE strategies and parameters during the course of evolution. In this paper, we propose a new parameter adaptation technique for DE based on an ensemble approach and the harmony search algorithm (HS). In the proposed method, an ensemble of parameters is randomly sampled which forms the initial harmony memory. The parameter ensemble evolves during the course of the optimization process by the HS algorithm. Each parameter combination in the harmony memory is evaluated by testing it on the DE population. The performance of the proposed adaptation method is evaluated using two recently proposed strategies (DE/current-to-pbest/bin and DE/current-to-gr_best/bin) as basic DE frameworks. Numerical results demonstrate the effectiveness of the proposed adaptation technique compared to the state-of-the-art DE based algorithms on a set of challenging test problems (CEC 2005).
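
    A minimal Python sketch of the general idea, not the authors' exact algorithm: a harmony memory of (F, CR) pairs is improvised from, and updated by, how many improvements each pair produces when driving one DE/rand/1/bin generation. Population sizes, parameter ranges, and the scoring rule are illustrative choices.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def de_generation(pop, fit, func, F, CR):
        """One DE/rand/1/bin generation; returns the number of improved individuals."""
        n, d = pop.shape
        improved = 0
        for i in range(n):
            r1, r2, r3 = rng.choice([j for j in range(n) if j != i], 3, replace=False)
            mutant = pop[r1] + F * (pop[r2] - pop[r3])
            cross = rng.random(d) < CR
            cross[rng.integers(d)] = True            # guarantee one crossover dimension
            trial = np.where(cross, mutant, pop[i])
            f_trial = func(trial)
            if f_trial < fit[i]:
                pop[i], fit[i] = trial, f_trial
                improved += 1
        return improved

    def hs_adapted_de(func, d=10, n=30, hm_size=5, gens=200, hmcr=0.9, par=0.3):
        """DE whose (F, CR) ensemble is maintained by a harmony-search loop."""
        pop = rng.uniform(-5.0, 5.0, (n, d))
        fit = np.array([func(x) for x in pop])
        hm = rng.uniform([0.1, 0.1], [1.0, 1.0], (hm_size, 2))  # harmony memory of (F, CR)
        hm_score = np.zeros(hm_size)
        for _ in range(gens):
            # Improvise a new (F, CR): reuse from memory, maybe pitch-adjust, else random.
            if rng.random() < hmcr:
                F, CR = hm[rng.integers(hm_size)]
                if rng.random() < par:
                    F = float(np.clip(F + rng.normal(0.0, 0.1), 0.1, 1.0))
                    CR = float(np.clip(CR + rng.normal(0.0, 0.1), 0.1, 1.0))
            else:
                F, CR = rng.uniform(0.1, 1.0, 2)
            score = de_generation(pop, fit, func, F, CR)  # evaluate on the DE population
            worst = int(np.argmin(hm_score))
            if score > hm_score[worst]:                   # replace the worst harmony
                hm[worst], hm_score[worst] = (F, CR), score
        return pop[np.argmin(fit)], float(fit.min())

    # Example: minimize the sphere function.
    best_x, best_f = hs_adapted_de(lambda x: float(np.sum(x ** 2)))
    ```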

  14. Design and Preliminary Testing of the International Docking Adapter's Peripheral Docking Target

    Science.gov (United States)

    Foster, Christopher W.; Blaschak, Johnathan; Eldridge, Erin A.; Brazzel, Jack P.; Spehar, Peter T.

    2015-01-01

    The International Docking Adapter's Peripheral Docking Target (PDT) was designed to allow a docking spacecraft to judge its alignment relative to the docking system. The PDT was designed to be compatible with relative sensors using visible cameras, thermal imagers, or Light Detection and Ranging (LIDAR) technologies. The conceptual design team tested prototype designs and materials to determine the contrast requirements for the features. This paper will discuss the design of the PDT, the methodology and results of the tests, and the conclusions pertaining to PDT design that were drawn from testing.

  15. An SDR-Based Real-Time Testbed for GNSS Adaptive Array Anti-Jamming Algorithms Accelerated by GPU

    Directory of Open Access Journals (Sweden)

    Hailong Xu

    2016-03-01

    Full Text Available Nowadays, software-defined radio (SDR) has become a common approach to evaluate new algorithms. However, in the field of Global Navigation Satellite System (GNSS) adaptive array anti-jamming, previous work has been limited due to the high computational power demanded by adaptive algorithms, and often lacks flexibility and configurability. In this paper, the design and implementation of an SDR-based real-time testbed for GNSS adaptive array anti-jamming accelerated by a Graphics Processing Unit (GPU) are documented. This testbed highlights itself as a feature-rich and extendible platform with great flexibility and configurability, as well as high computational performance. Both Space-Time Adaptive Processing (STAP) and Space-Frequency Adaptive Processing (SFAP) are implemented with a wide range of parameters. Raw data from as many as eight antenna elements can be processed in real-time in either an adaptive nulling or beamforming mode. To fully take advantage of the parallelism resource provided by the GPU, a batched method in programming is proposed. Tests and experiments are conducted to evaluate both the computational and anti-jamming performance. This platform can be used for research and prototyping, as well as a real product in certain applications.
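
    As a flavor of the per-snapshot computation such a testbed accelerates, here is a toy space-only (single-tap) adaptive nulling sketch in NumPy, using sample-matrix inversion with diagonal loading. The real testbed's STAP/SFAP processing, batching, and GPU kernels are far more involved; the function name and parameter values are illustrative assumptions.

    ```python
    import numpy as np

    def nulling_weights(snapshots, diag_load=1e-3):
        """Power-inversion adaptive nulling via sample-matrix inversion.

        snapshots: (n_elements, n_samples) complex baseband array data.
        Strong jammers dominate the sample covariance R, so solving R w = e1
        (distortionless on element 1) automatically steers nulls at them.
        """
        n, m = snapshots.shape
        R = snapshots @ snapshots.conj().T / m
        R += diag_load * np.trace(R).real / n * np.eye(n)   # diagonal loading
        e1 = np.zeros(n, dtype=complex)
        e1[0] = 1.0
        w = np.linalg.solve(R, e1)
        return w / w[0]                                     # keep unit gain on element 1

    # Applying the weights: y = w.conj() @ snapshots gives the jammer-suppressed output.
    ```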

  16. Systematic Testing should not be a Topic in the Computer Science Curriculum!

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    2003-01-01

    In this paper we argue that treating "testing" as an isolated topic is a wrong approach in computer science and software engineering teaching. Instead testing should pervade practical topics and exercises in the computer science curriculum to teach students the importance of producing software...

  17. Repetitive Domain-Referenced Testing Using Computers: the TITA System.

    Science.gov (United States)

    Olympia, P. L., Jr.

    The TITA (Totally Interactive Testing and Analysis) System algorithm for the repetitive construction of domain-referenced tests utilizes a compact data bank, is highly portable, is useful in any discipline, requires modest computer hardware, and does not present a security problem. Clusters of related keyphrases, statement phrases, and distractors…

  18. Computer-aided system for interactive psychomotor testing

    Science.gov (United States)

    Selivanova, Karina G.; Ignashchuk, Olena V.; Koval, Leonid G.; Kilivnik, Volodymyr S.; Zlepko, Alexandra S.; Sawicki, Daniel; Kalizhanova, Aliya; Zhanpeisova, Aizhan; Smailova, Saule

    2017-08-01

    Nowadays, research on psychomotor actions holds a special place in education, sports, medicine, psychology, etc. The development of a computer system for psychomotor testing could help solve many operational problems in psychoneurology and psychophysiology and also determine individual characteristics of fine motor skills. This is a particularly relevant issue when it comes to children, students and athletes, for the definition of personal and professional features. The article presents the dynamics of developing psychomotor skills and their application in the training process. The results of testing indicated a significant impact on the development of psychomotor skills.

  19. Genetic Algorithms for Case Adaptation

    Energy Technology Data Exchange (ETDEWEB)

    Salem, A M [Computer Science Dept, Faculty of Computer and Information Sciences, Ain Shams University, Cairo (Egypt); Mohamed, A H [Solid State Dept., (NCRRT), Cairo (Egypt)

    2008-07-01

    The case based reasoning (CBR) paradigm has been widely used to provide computer support for recalling and adapting known cases to novel situations. Case adaptation algorithms generally rely on knowledge bases and heuristics in order to change past solutions to solve new problems. However, case adaptation has always been a difficult process for engineers within the CBR cycle. Its difficulties can be attributed to its domain dependency and computational cost. In an effort to solve this problem, this research explores a general-purpose method that applies a genetic algorithm (GA) to CBR adaptation. It can therefore decrease the computational complexity of the search space in problems having a great dependency on their domain knowledge. The proposed model can be used to perform a variety of design tasks on a broad set of application domains. However, it has been implemented for tablet formulation as a domain of application. The proposed system has improved the performance of CBR design systems.

  20. Genetic Algorithms for Case Adaptation

    International Nuclear Information System (INIS)

    Salem, A.M.; Mohamed, A.H.

    2008-01-01

    The case based reasoning (CBR) paradigm has been widely used to provide computer support for recalling and adapting known cases to novel situations. Case adaptation algorithms generally rely on knowledge bases and heuristics in order to change past solutions to solve new problems. However, case adaptation has always been a difficult process for engineers within the CBR cycle. Its difficulties can be attributed to its domain dependency and computational cost. In an effort to solve this problem, this research explores a general-purpose method that applies a genetic algorithm (GA) to CBR adaptation. It can therefore decrease the computational complexity of the search space in problems having a great dependency on their domain knowledge. The proposed model can be used to perform a variety of design tasks on a broad set of application domains. However, it has been implemented for tablet formulation as a domain of application. The proposed system has improved the performance of CBR design systems.
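
    A minimal sketch of the idea in Python, assuming a case solution can be encoded as a real-valued parameter vector and that a domain fitness function is available: the GA population is seeded around the retrieved case's solution, so adaptation searches near past experience rather than from scratch. All names and operator settings are illustrative, not the authors' implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def ga_adapt(retrieved, fitness, pop_size=40, gens=100, sigma=0.1, elite=2):
        """Adapt a retrieved case's solution vector to a new problem with a simple GA.

        'fitness' scores a candidate against the new problem's requirements and is
        assumed to be maximized.
        """
        d = retrieved.size
        pop = retrieved + sigma * rng.normal(size=(pop_size, d))  # seed near the case
        for _ in range(gens):
            fit = np.array([fitness(x) for x in pop])
            pop = pop[np.argsort(fit)[::-1]]                 # best first
            next_pop = [pop[i].copy() for i in range(elite)] # elitism
            while len(next_pop) < pop_size:
                p1, p2 = pop[rng.integers(pop_size // 2, size=2)]  # parents: top half
                mask = rng.random(d) < 0.5                         # uniform crossover
                child = np.where(mask, p1, p2)
                if rng.random() < 0.3:                             # mutation
                    child = child + sigma * rng.normal(size=d)
                next_pop.append(child)
            pop = np.array(next_pop)
        fit = np.array([fitness(x) for x in pop])
        return pop[np.argmax(fit)]
    ```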

  1. Programs for Testing Processor-in-Memory Computing Systems

    Science.gov (United States)

    Katz, Daniel S.

    2006-01-01

    The Multithreaded Microbenchmarks for Processor-In-Memory (PIM) Compilers, Simulators, and Hardware are computer programs arranged in a series for use in testing the performances of PIM computing systems, including compilers, simulators, and hardware. The programs at the beginning of the series test basic functionality; the programs at subsequent positions in the series test increasingly complex functionality. The programs are intended to be used while designing a PIM system, and can be used to verify that compilers, simulators, and hardware work correctly. The programs can also be used to enable designers of these system components to examine tradeoffs in implementation. Finally, these programs can be run on non-PIM hardware (either single-threaded or multithreaded) using the POSIX pthreads standard to verify that the benchmarks themselves operate correctly. [POSIX (Portable Operating System Interface for UNIX) is a set of standards that define how programs and operating systems interact with each other. pthreads is a library of pre-emptive thread routines that comply with one of the POSIX standards.]

  2. Adaptive Multilevel Monte Carlo Simulation

    KAUST Repository

    Hoel, H

    2011-08-23

    This work generalizes a multilevel forward Euler Monte Carlo method introduced by Michael B. Giles (Oper. Res. 56(3):607–617, 2008) for the approximation of expected values depending on the solution to an Itô stochastic differential equation. That work proposed and analyzed a forward Euler multilevel Monte Carlo method based on a hierarchy of uniform time discretizations and control variates to reduce the computational effort required by a standard, single level, forward Euler Monte Carlo method. This work introduces an adaptive hierarchy of non-uniform time discretizations, generated by an adaptive algorithm introduced in (Anna Dzougoutov et al. Adaptive Monte Carlo algorithms for stopped diffusion. In Multiscale Methods in Science and Engineering, volume 44 of Lect. Notes Comput. Sci. Eng., pages 59–88. Springer, Berlin, 2005; Kyoung-Sook Moon et al. Stoch. Anal. Appl. 23(3):511–558, 2005; Kyoung-Sook Moon et al. An adaptive algorithm for ordinary, stochastic and partial differential equations. In Recent Advances in Adaptive Computation, volume 383 of Contemp. Math., pages 325–343. Amer. Math. Soc., Providence, RI, 2005). This form of the adaptive algorithm generates stochastic, path-dependent time steps and is based on a posteriori error expansions first developed in (Anders Szepessy et al. Comm. Pure Appl. Math. 54(10):1169–1214, 2001). Our numerical results for a stopped diffusion problem exhibit savings in the computational cost to achieve an accuracy of O(TOL): from O(TOL^-3) for a single level version of the adaptive algorithm to O((TOL^-1 log(TOL))^2).
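
    For orientation, the following Python sketch implements the plain (non-adaptive) multilevel telescoping estimator E[P_L] = E[P_0] + sum over l of E[P_l - P_{l-1}] for a forward Euler discretization of geometric Brownian motion, with coarse and fine paths coupled through shared Brownian increments. The adaptive, path-dependent time stepping that is the contribution of this work is not reproduced here; the model, payoff, and sample counts are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def euler_gbm(m, n_steps, T=1.0, s0=1.0, mu=0.05, sigma=0.2):
        """Terminal values of m forward Euler paths of geometric Brownian motion."""
        dt = T / n_steps
        s = np.full(m, s0)
        for _ in range(n_steps):
            s = s + mu * s * dt + sigma * s * rng.normal(0.0, np.sqrt(dt), m)
        return s

    def level_correction(m, n_coarse, T=1.0, s0=1.0, mu=0.05, sigma=0.2):
        """Coupled fine (2*n_coarse steps) and coarse (n_coarse steps) terminal values.

        The coarse path consumes the sum of each pair of fine Brownian increments,
        which is what makes the level correction have small variance.
        """
        dt = T / (2 * n_coarse)
        sf = np.full(m, s0)
        sc = np.full(m, s0)
        for _ in range(n_coarse):
            dw1 = rng.normal(0.0, np.sqrt(dt), m)
            dw2 = rng.normal(0.0, np.sqrt(dt), m)
            sf = sf + mu * sf * dt + sigma * sf * dw1                # two fine steps
            sf = sf + mu * sf * dt + sigma * sf * dw2
            sc = sc + mu * sc * (2 * dt) + sigma * sc * (dw1 + dw2)  # one coarse step
        return sf, sc

    def mlmc(payoff, L=5, m=20000, n0=4):
        """Telescoping estimator: coarsest level plus level-by-level corrections."""
        est = payoff(euler_gbm(m, n0)).mean()
        for l in range(1, L + 1):
            sf, sc = level_correction(m, n0 * 2 ** (l - 1))
            est += (payoff(sf) - payoff(sc)).mean()
        return est

    # Example: expected payoff of a European call with strike 1.0.
    print(mlmc(lambda s: np.maximum(s - 1.0, 0.0)))
    ```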

  3. Adaptation, test-retest reliability, and construct validity of the Physical Activity Neighborhood Environment Scale in Nigeria (PANES-N).

    Science.gov (United States)

    Oyeyemi, Adewale L; Sallis, James F; Oyeyemi, Adetoyeje Y; Amin, Mariam M; De Bourdeaudhuij, Ilse; Deforche, Benedicte

    2013-11-01

    This study adapted the Physical Activity Neighborhood Environment Scale (PANES) to the Nigerian context and assessed the test-retest reliability and construct validity of the Nigerian version (PANES-N). A multidisciplinary panel of experts adapted the original PANES to reflect the built and social environment of Nigeria. The adapted PANES was subjected to cognitive testing and test-retest reliability in a diverse sample of Nigerian adults (N = 132) from different neighborhood types. Intraclass Correlation Coefficients (ICCs) were used to assess test-retest reliability, and construct validity was investigated with Analysis of Covariance for differences in environmental attributes between neighborhoods. Four of the 17 items on the original PANES were significantly modified, 3 were removed and 2 new items were incorporated into the final version of the adapted PANES-N. Test-retest reliability was substantial to almost perfect (ICC = 0.62-1.00) for all items on the PANES-N, and residents of neighborhoods in the inner city reported higher residential density, land use mix and safety, but lower pedestrian facilities and aesthetics, than did residents of government reserved area/new layout neighborhoods. The PANES-N appears promising for assessing environmental perceptions related to physical activity in Nigeria, but further testing is required to assess its applicability across Africa.

  4. Using Artificial Intelligence to Control and Adapt Level of Difficulty in Computer Based, Cognitive Therapy – an Explorative Study

    DEFF Research Database (Denmark)

    Wilms, Inge Linda

    2011-01-01

    Prism Adaptation Therapy (PAT) is an intervention method in the treatment of the attention disorder neglect (Frassinetti, Angeli, Meneghello, Avanzi, & Ladavas, 2002; Rossetti, et al., 1998). The aim of this study was to investigate whether one session of PAT using a computer-attached touchscreen...

  5. Adaptive Management Plan for Sensitive Plant Species on the Nevada Test Site

    International Nuclear Information System (INIS)

    Wills, C. A.

    2001-01-01

    The Nevada Test Site supports numerous plant species considered sensitive because of their past or present status under the Endangered Species Act and with federal and state agencies. In 1998, the U.S. Department of Energy, Nevada Operations Office (DOE/NV) prepared a Resource Management Plan which commits to protect and conserve these sensitive plant species and to minimize cumulative impacts to them. This document presents the procedures of a long-term adaptive management plan which is meant to ensure that these goals are met. It identifies the parameters that are measured for all sensitive plant populations during long-term monitoring and the adaptive management actions which may be taken if significant threats to these populations are detected. This plan does not, however, identify the current list of sensitive plant species known to occur on the Nevada Test Site. The current species list and progress on their monitoring are reported annually by DOE/NV in the Resource Management Plan

  6. Digital computed radiography in industrial X-ray testing

    International Nuclear Information System (INIS)

    Osterloh, K.; Onel, Y.; Zscherpel, U.; Ewert, U.

    2001-01-01

    Computed radiography is used for X-ray testing in many industrial applications. There are different systems depending on the application, e.g. fast systems for detection of material inhomogeneities and slower systems with higher local resolution for detection of cracks and fine details, e.g. in highly stressed areas or in welded seams. The method is more dynamic than film methods, and digital image processing is possible during testing.

  7. Adaptive statistical iterative reconstruction for volume-rendered computed tomography portovenography. Improvement of image quality

    International Nuclear Information System (INIS)

    Matsuda, Izuru; Hanaoka, Shohei; Akahane, Masaaki

    2010-01-01

    Adaptive statistical iterative reconstruction (ASIR) is a reconstruction technique for computed tomography (CT) that reduces image noise. The purpose of our study was to investigate whether ASIR improves the quality of volume-rendered (VR) CT portovenography. Institutional review board approval, with waived consent, was obtained. A total of 19 patients (12 men, 7 women; mean age 69.0 years; range 25-82 years) suspected of having liver lesions underwent three-phase enhanced CT. VR image sets were prepared with both the conventional method and ASIR. The required time to make VR images was recorded. Two radiologists performed independent qualitative evaluations of the image sets. The Wilcoxon signed-rank test was used for statistical analysis. Contrast-noise ratios (CNRs) of the portal and hepatic vein were also evaluated. Overall image quality was significantly improved by ASIR (P<0.0001 and P=0.0155 for each radiologist). ASIR enhanced CNRs of the portal and hepatic vein significantly (P<0.0001). The time required to create VR images was significantly shorter with ASIR (84.7 vs. 117.1 s; P=0.014). ASIR enhances CNRs and improves image quality in VR CT portovenography. It also shortens the time required to create liver VR CT portovenographs. (author)

  8. Development of a computer-adaptive physical function instrument for Social Security Administration disability determination.

    Science.gov (United States)

    Ni, Pengsheng; McDonough, Christine M; Jette, Alan M; Bogusz, Kara; Marfeo, Elizabeth E; Rasch, Elizabeth K; Brandt, Diane E; Meterko, Mark; Haley, Stephen M; Chan, Leighton

    2013-09-01

    To develop and test an instrument to assess physical function for Social Security Administration (SSA) disability programs, the SSA-Physical Function (SSA-PF) instrument. Item response theory (IRT) analyses were used to (1) create a calibrated item bank for each of the factors identified in prior factor analyses, (2) assess the fit of the items within each scale, (3) develop separate computer-adaptive testing (CAT) instruments for each scale, and (4) conduct initial psychometric testing. Cross-sectional data collection; IRT analyses; CAT simulation. Telephone and Internet survey. Two samples: SSA claimants (n=1017) and adults from the U.S. general population (n=999). None. Model fit statistics, correlation, and reliability coefficients. IRT analyses resulted in 5 unidimensional SSA-PF scales: Changing & Maintaining Body Position, Whole Body Mobility, Upper Body Function, Upper Extremity Fine Motor, and Wheelchair Mobility for a total of 102 items. High CAT accuracy was demonstrated by strong correlations between simulated CAT scores and those from the full item banks. On comparing the simulated CATs with the full item banks, very little loss of reliability or precision was noted, except at the lower and upper ranges of each scale. No difference in response patterns by age or sex was noted. The distributions of claimant scores were shifted to the lower end of each scale compared with those of a sample of U.S. adults. The SSA-PF instrument contributes important new methodology for measuring the physical function of adults applying to the SSA disability programs. Initial evaluation revealed that the SSA-PF instrument achieved considerable breadth of coverage in each content domain and demonstrated noteworthy psychometric properties. Copyright © 2013 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  9. Adaptive Distributed Data Structure Management for Parallel CFD Applications

    KAUST Repository

    Frisch, Jerome

    2013-09-01

    Computational fluid dynamics (CFD) simulations require a lot of computing resources in terms of CPU time and memory in order to compute with reasonable physical accuracy. If only uniformly refined domains are applied, the number of computing cells grows rather fast when a small resolution is physically required. This can be remedied by applying adaptively refined grids. Unfortunately, the adaptive refinement procedures introduce errors which have to be taken into account. This paper focuses on implementation details of the applied adaptive data structure management and on a qualitative analysis of the introduced errors, by analysing a Poisson problem on the given data structure, which has to be solved in every time step of a CFD analysis. Furthermore, an adaptive CFD benchmark example is computed, showing the benefits of adaptive refinement as well as measurements of parallel data distribution and performance. © 2013 IEEE.

  10. A structure-based approach to evaluating product adaptability in adaptable design

    International Nuclear Information System (INIS)

    Cheng, Qiang; Liu, Zhifeng; Cai, Ligang; Zhang, Guojun; Gu, Peihua

    2011-01-01

    Adaptable design, as a new design paradigm, involves creating designs and products that can be easily changed to satisfy different requirements. In this paper, two types of product adaptability are proposed, essential adaptability and behavioral adaptability, and a model for product adaptability evaluation is developed by measuring each of them. The essential adaptability evaluation first analyzes the independencies of functional requirements and function modules based on axiomatic design, and then measures the adaptability of interfaces with three indices. The behavioral adaptability, reflected by the performance of adaptable requirements after adaptation, is measured based on the Kano model. Finally, the effectiveness of the proposed method is demonstrated by an illustrative example of the motherboard of a personal computer. The results show that the method can evaluate and reveal the adaptability of a product in essence, and is of directive significance for improving design and innovative design

  11. Refficientlib: an efficient load-rebalanced adaptive mesh refinement algorithm for high-performance computational physics meshes

    OpenAIRE

    Baiges Aznar, Joan; Bayona Roa, Camilo Andrés

    2017-01-01

    In this paper we present a novel algorithm for adaptive mesh refinement in computational physics meshes in a distributed memory parallel setting. The proposed method is developed for nodally based parallel domain partitions where the nodes of the mesh belong to a single processor, whereas the elements can belong to multiple processors. Some of the main features of the algorithm presented in this paper a...

  12. Effects of age and cognition on a cross-cultural paediatric adaptation of the Sniffin' Sticks Identification Test.

    Science.gov (United States)

    Bastos, Laís Orrico Donnabella; Guerreiro, Marilisa Mantovani; Lees, Andrew John; Warner, Thomas T; Silveira-Moriyama, Laura

    2015-01-01

    To study the effects of age and cognition on the performance of children aged 3 to 18 years on a culturally adapted version of the 16 item smell identification test from Sniffin' Sticks (SS16). A series of pilots were conducted on 29 children aged 3 to 18 years old and 23 adults to produce an adapted version of the SS16 suitable for Brazilian children (SS16-Child). A final version was applied to 51 children alongside a picture identification test (PIT-SS16-Child) to assess cognitive abilities involved in the smell identification task. In addition 20 adults performed the same tasks as a comparison group. The final adapted SS16-Child was applied to 51 children with a mean age of 9.9 years (range 3-18 years, SD=4.25 years), of which 68.3% were girls. There was an independent effect of age (p<0.05) and PIT-SS16-Child (p<0.001) on the performance on the SS16-Child, and older children reached the ceiling for scoring in the cognitive and olfactory test. Pre-school children had difficulties identifying items of the test. A cross-culturally adapted version of the SS16 can be used to test olfaction in children but interpretation of the results must take age and cognitive abilities into consideration.

  13. Effects of age and cognition on a cross-cultural paediatric adaptation of the Sniffin' Sticks Identification Test.

    Directory of Open Access Journals (Sweden)

    Laís Orrico Donnabella Bastos

    Full Text Available To study the effects of age and cognition on the performance of children aged 3 to 18 years on a culturally adapted version of the 16 item smell identification test from Sniffin' Sticks (SS16). A series of pilots were conducted on 29 children aged 3 to 18 years old and 23 adults to produce an adapted version of the SS16 suitable for Brazilian children (SS16-Child). A final version was applied to 51 children alongside a picture identification test (PIT-SS16-Child) to assess cognitive abilities involved in the smell identification task. In addition 20 adults performed the same tasks as a comparison group. The final adapted SS16-Child was applied to 51 children with a mean age of 9.9 years (range 3-18 years, SD=4.25 years), of which 68.3% were girls. There was an independent effect of age (p<0.05) and PIT-SS16-Child (p<0.001) on the performance on the SS16-Child, and older children reached the ceiling for scoring in the cognitive and olfactory test. Pre-school children had difficulties identifying items of the test. A cross-culturally adapted version of the SS16 can be used to test olfaction in children but interpretation of the results must take age and cognitive abilities into consideration.

  14. Development and Application of Detection Indices for Measuring Guessing Behaviors and Test-Taking Effort in Computerized Adaptive Testing

    Science.gov (United States)

    Chang, Shu-Ren; Plake, Barbara S.; Kramer, Gene A.; Lien, Shu-Mei

    2011-01-01

    This study examined the amount of time that different ability-level examinees spend on questions they answer correctly or incorrectly across different pretest item blocks presented on a fixed-length, time-restricted computerized adaptive testing (CAT). Results indicate that different ability-level examinees require different amounts of time to…

  15. Implementation of the Kids-CAT in clinical settings: a newly developed computer-adaptive test to facilitate the assessment of patient-reported outcomes of children and adolescents in clinical practice in Germany.

    Science.gov (United States)

    Barthel, D; Fischer, K I; Nolte, S; Otto, C; Meyrose, A-K; Reisinger, S; Dabs, M; Thyen, U; Klein, M; Muehlan, H; Ankermann, T; Walter, O; Rose, M; Ravens-Sieberer, U

    2016-03-01

    To describe the implementation process of a computer-adaptive test (CAT) for measuring health-related quality of life (HRQoL) of children and adolescents in two pediatric clinics in Germany. The study focuses on the feasibility and user experience with the Kids-CAT, particularly the patients' experience with the tool and the pediatricians' experience with the Kids-CAT Report. The Kids-CAT was completed by 312 children and adolescents with asthma, diabetes or rheumatoid arthritis. The test was applied during four clinical visits over a 1-year period. A feedback report with the test results was made available to the pediatricians. To assess both feasibility and acceptability, a multimethod research design was used. To assess the patients' experience with the tool, the children and adolescents completed a questionnaire. To assess the clinicians' experience, two focus groups were conducted with eight pediatricians. The children and adolescents indicated that the Kids-CAT was easy to complete. All pediatricians reported that the Kids-CAT was straightforward and easy to understand and integrate into clinical practice; they also expressed that routine implementation of the tool would be desirable and that the report was a valuable source of information, facilitating the assessment of self-reported HRQoL of their patients. The Kids-CAT was considered an efficient and valuable tool for assessing HRQoL in children and adolescents. The Kids-CAT Report promises to be a useful adjunct to standard clinical care with the potential to improve patient-physician communication, enabling pediatricians to evaluate and monitor their young patients' self-reported HRQoL.

  16. The use of computers for the performance and analysis of non-destructive testing

    International Nuclear Information System (INIS)

    Edelmann, X.; Pfister, O.

    1988-01-01

    Examples of the use of computers in non-destructive testing are presented, with ultrasonic testing especially addressed. The use of computers brings improvements for the user: the possibility of registering the reflector position, storage of test data, and help with documentation. The test can be automated. The introduction of expert systems is expected in the future. 8 figs., 12 refs

  17. Adapting the Get Yourself Tested Campaign to Reach Black and Latino Sexual-Minority Youth.

    Science.gov (United States)

    Garbers, Samantha; Friedman, Allison; Martinez, Omar; Scheinmann, Roberta; Bermudez, Dayana; Silva, Manel; Silverman, Jen; Chiasson, Mary Ann

    2016-09-01

    Culturally appropriate efforts are needed to increase sexually transmitted disease (STD) testing and care among Black and Latino sexual-minority youth, who are at high risk for STDs. Get Yourself Tested, a national testing campaign, has demonstrated success among youth, but it has yet to be assessed for relevance or impact among this population. This effort included (1) formative and materials-testing research through focus groups; (2) adaptation of existing Get Yourself Tested campaign materials to be more inclusive of Black and Latino sexual-minority youth; (3) a 3-month campaign in four venues of New York City, promoting STD testing at events and through mobile testing and online and social media platforms; (4) process evaluation of outreach activities; and (5) an outcome evaluation of testing at select campaign venues, using a preexperimental design. During the 3-month campaign period, the number of STD tests conducted at select campaign venues increased from a comparable 3-month baseline period. Although testing uptake through mobile vans remained low in absolute numbers, the van drew a high-prevalence sample, with positivity rates of 26.9% for chlamydia and 11.5% for gonorrhea. This article documents the process and lessons learned from adapting and implementing a local campaign for Black and Latino sexual-minority youth. © 2016 Society for Public Health Education.

  18. Addressing unmet need for HIV testing in emergency care settings: a role for computer-facilitated rapid HIV testing?

    Science.gov (United States)

    Kurth, Ann E; Severynen, Anneleen; Spielberg, Freya

    2013-08-01

    HIV testing in emergency departments (EDs) remains underutilized. The authors evaluated a computer tool to facilitate rapid HIV testing in an urban ED. Nonacute adult ED patients were randomly assigned to a computer tool (CARE) and rapid HIV testing before a standard visit (n = 258) or to a standard visit (n = 259) with chart access. The authors assessed intervention acceptability and compared noted HIV risks. Participants were 56% nonWhite and 58% male; median age was 37 years. In the CARE arm, nearly all (251/258) of the patients completed the session and received HIV results; four declined to consent to the test. HIV risks were reported by 54% of users; one participant was confirmed HIV-positive, and two were confirmed false-positive (seroprevalence 0.4%, 95% CI [0.01, 2.2]). Half (55%) of the patients preferred computerized rather than face-to-face counseling for future HIV testing. In the standard arm, one HIV test and two referrals for testing occurred. Computer-facilitated HIV testing appears acceptable to ED patients. Future research should assess cost-effectiveness compared with staff-delivered approaches.

  19. Computer Game Lugram - Version for Blind Children

    Directory of Open Access Journals (Sweden)

    V. Delić

    2011-06-01

    Full Text Available Computer games have undoubtedly become an integral part of educational activities of children. However, since computer games typically abound with audio and visual effects, most of them are completely useless for children with disabilities. Specifically, computer games dealing with the basics of geometry can contribute to mathematics education, but they require significant modifications in order to be suitable for visually impaired children. The paper presents the results of research on, and adaptation of, the educational computer game Lugram to the needs of completely blind children, as well as the testing of the prototype, whose results encourage further research and development in the same direction.

  20. Adaptive Lighting

    DEFF Research Database (Denmark)

    Petersen, Kjell Yngve; Søndergaard, Karin; Kongshaug, Jesper

    2015-01-01

    the investigations of lighting scenarios carried out in two test installations: White Cube and White Box. The test installations are discussed as large-scale experiential instruments. In these test installations we examine what could potentially occur when light using LED technology is integrated and distributed......Adaptive Lighting Adaptive lighting is based on a partial automation of the possibilities to adjust the colour tone and brightness levels of light in order to adapt to people’s needs and desires. IT support is key to the technical developments that afford adaptive control systems. The possibilities...... differently into an architectural body. We also examine what might occur when light is dynamic and able to change colour, intensity and direction, and when it is adaptive and can be brought into interaction with its surroundings. In short, what happens to an architectural space when artificial lighting ceases...

  1. Adaptive weighted anisotropic diffusion for computed tomography denoising

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Zhi; Silver, Michael D. [Toshiba Medical Research Institute USA, Inc., Vernon Hills, IL (United States); Noshi, Yasuhiro [Toshiba Medical System Corporation, Tokyo (Japan)

    2011-07-01

    With increasing awareness of radiation safety, dose reduction has become an important task of modern CT system development. This paper proposes an adaptive weighted anisotropic diffusion method and an adaptive weighted sharp source anisotropic diffusion method as image domain filters to potentially help dose reduction. Different from existing anisotropic diffusion methods, the proposed methods incorporate an edge-sensitive adaptive source term as part of the diffusion iteration. It provides better edge and detail preservation. Visual evaluation showed that the new methods can reduce noise substantially without apparent edge and detail loss. The quantitative evaluations also showed over 50% of noise reduction in terms of noise standard deviations, which is equivalent to over 75% of dose reduction for a normal dose image quality. (orig.)
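
    The following toy Python/NumPy sketch shows the general shape of such a filter: Perona-Malik anisotropic diffusion with an edge-sensitive source (fidelity) term added to each iteration. It is a stand-in under simplifying assumptions, not the authors' adaptive weighting; the conductance function, weighting, and parameter values are illustrative.

    ```python
    import numpy as np

    def anisotropic_diffusion(img, n_iter=20, kappa=30.0, dt=0.15, src_weight=0.1):
        """Perona-Malik-style diffusion with an edge-sensitive source term.

        The source term pulls the result back toward the input where gradients
        are strong, which helps preserve edges and fine detail.
        """
        u = img.astype(float).copy()
        for _ in range(n_iter):
            # One-sided differences to the four neighbours (periodic wrap for brevity).
            dn = np.roll(u, 1, 0) - u
            ds = np.roll(u, -1, 0) - u
            de = np.roll(u, 1, 1) - u
            dw = np.roll(u, -1, 1) - u
            # Perona-Malik conductance: small across strong edges.
            cn, cs = np.exp(-(dn / kappa) ** 2), np.exp(-(ds / kappa) ** 2)
            ce, cw = np.exp(-(de / kappa) ** 2), np.exp(-(dw / kappa) ** 2)
            grad_mag = np.sqrt(dn ** 2 + ds ** 2 + de ** 2 + dw ** 2)
            edge = grad_mag / (grad_mag.max() + 1e-12)     # 0 = flat ... 1 = edge
            source = src_weight * edge * (img - u)         # edge-sensitive fidelity
            u += dt * (cn * dn + cs * ds + ce * de + cw * dw) + source
        return u
    ```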

  2. Adaptive and Qualitative Changes in Encoding Strategy with Experience: Evidence from the Test-Expectancy Paradigm

    Science.gov (United States)

    Finley, Jason R.; Benjamin, Aaron S.

    2012-01-01

    Three experiments demonstrated learners' abilities to adaptively and qualitatively accommodate their encoding strategies to the demands of an upcoming test. Stimuli were word pairs. In Experiment 1, test expectancy was induced for either cued recall (of targets given cues) or free recall (of targets only) across 4 study-test cycles of the same…

  3. Psychometric evaluation of the EORTC computerized adaptive test (CAT) fatigue item pool

    DEFF Research Database (Denmark)

    Petersen, Morten Aa; Giesinger, Johannes M; Holzner, Bernhard

    2013-01-01

    Fatigue is one of the most common symptoms associated with cancer and its treatment. To obtain a more precise and flexible measure of fatigue, the EORTC Quality of Life Group has developed a computerized adaptive test (CAT) measure of fatigue. This is part of an ongoing project developing a CAT...

  4. Computer-based tests: The impact of test design and problem of equivalency

    Czech Academy of Sciences Publication Activity Database

    Květon, Petr; Jelínek, Martin; Vobořil, Dalibor; Klimusová, H.

    -, č. 23 (2007), s. 32-51 ISSN 0747-5632 R&D Projects: GA ČR(CZ) GA406/99/1052; GA AV ČR(CZ) KSK9058117 Institutional research plan: CEZ:AV0Z7025918 Keywords : Computer-based assessment * speeded test * equivalency Subject RIV: AN - Psychology Impact factor: 1.344, year: 2007

  5. HAL/S-360 compiler test activity report

    Science.gov (United States)

    Helmers, C. T.

    1974-01-01

    The levels of testing employed in verifying the HAL/S-360 compiler were as follows: (1) typical applications program case testing; (2) functional testing of the compiler system and its generated code; and (3) machine oriented testing of compiler implementation on operational computers. Details of the initial test plan and subsequent adaptation are reported, along with complete test results for each phase which examined the production of object codes for every possible source statement.

  6. A fully adaptive hybrid optimization of aircraft engine blades

    Science.gov (United States)

    Dumas, L.; Druez, B.; Lecerf, N.

    2009-10-01

    A new fully adaptive hybrid optimization method (AHM) has been developed and applied to an industrial problem in the field of the aircraft engine industry. Particular emphasis is placed on the adaptivity of the coupling between a global search by a population-based method (Genetic Algorithms or Evolution Strategies) and a local search by a descent method. On various analytical test cases, the AHM method outperforms the original global search method in terms of computational time and accuracy. The results obtained on the industrial case have also confirmed the interest of AHM for the design of new and original solutions in an affordable time.
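
    A toy Python sketch of one way to make such a coupling adaptive: an evolution-strategy-style global phase occasionally hands its best point to a local pattern search, and the rate of local search adapts to whether that succeeds. This illustrates the general mechanism only, not the AHM algorithm itself; all names and settings are invented for the example.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def local_descent(f, x, step=0.1, iters=50):
        """Greedy coordinate pattern search used as the local phase."""
        fx = f(x)
        for _ in range(iters):
            improved = False
            for d in range(x.size):
                for s in (step, -step):
                    y = x.copy()
                    y[d] += s
                    fy = f(y)
                    if fy < fx:
                        x, fx, improved = y, fy, True
            if not improved:
                step *= 0.5          # refine the stencil when stuck
        return x, fx

    def adaptive_hybrid(f, dim=5, pop=20, gens=100, p_local=0.1):
        """Global ES-style search whose local-search rate adapts to its success."""
        xs = rng.uniform(-5.0, 5.0, (pop, dim))
        fs = np.array([f(x) for x in xs])
        for _ in range(gens):
            # Global phase: keep the best half, mutate it to produce the other half.
            parents = xs[np.argsort(fs)[:pop // 2]]
            xs = np.vstack([parents, parents + rng.normal(0.0, 0.5, parents.shape)])
            fs = np.array([f(x) for x in xs])
            # Adaptive coupling: occasionally refine the best point locally.
            if rng.random() < p_local:
                x_loc, f_loc = local_descent(f, xs[np.argmin(fs)].copy())
                if f_loc < fs.min():                   # local phase succeeded
                    i = int(np.argmax(fs))
                    xs[i], fs[i] = x_loc, f_loc
                    p_local = min(0.5, p_local * 1.5)  # rely on it more often
                else:
                    p_local = max(0.02, p_local * 0.7) # back off
        return xs[np.argmin(fs)], float(fs.min())

    # Example: minimize the sphere function.
    best_x, best_f = adaptive_hybrid(lambda x: float(np.sum(x ** 2)))
    ```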

  7. Multiparty Computations

    DEFF Research Database (Denmark)

    Dziembowski, Stefan

    In this thesis we study a problem of doing Verifiable Secret Sharing (VSS) and Multiparty Computations in a model where private channels between the players and a broadcast channel are available. The adversary is active, adaptive and has an unbounded computing power. The thesis is based on two...... to a polynomial time black-box reduction, the complexity of adaptively secure VSS is the same as that of ordinary secret sharing (SS), where security is only required against a passive, static adversary. Previously, such a connection was only known for linear secret sharing and VSS schemes. We then show...... here and discuss other problems caused by the adaptiveness. All protocols in the thesis are formally specified and the proofs of their security are given. [1]Ronald Cramer, Ivan Damgård, Stefan Dziembowski, Martin Hirt, and Tal Rabin. Efficient multiparty computations with dishonest minority......
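
    The thesis relates adaptively secure VSS to ordinary secret sharing (SS). For background, here is a minimal Python sketch of the classic Shamir scheme, the standard instance of such an SS primitive. It is illustrative only and is not a protocol from the thesis; in particular it is neither verifiable nor adaptively secure.

    ```python
    import random

    PRIME = 2 ** 61 - 1   # a Mersenne prime large enough for toy secrets

    def share(secret, n, t):
        """Shamir (t, n) sharing: any t shares reconstruct, t-1 reveal nothing."""
        coeffs = [secret] + [random.randrange(PRIME) for _ in range(t - 1)]
        def f(x):
            acc = 0
            for c in reversed(coeffs):    # Horner evaluation of the polynomial
                acc = (acc * x + c) % PRIME
            return acc
        return [(i, f(i)) for i in range(1, n + 1)]

    def reconstruct(shares):
        """Lagrange interpolation at x = 0 over the prime field."""
        secret = 0
        for i, (xi, yi) in enumerate(shares):
            num, den = 1, 1
            for j, (xj, _) in enumerate(shares):
                if i != j:
                    num = num * (-xj) % PRIME
                    den = den * (xi - xj) % PRIME
            secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
        return secret

    # Example: shares = share(1234, n=5, t=3); reconstruct(shares[:3]) == 1234
    ```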

  8. Intelligent Adaptation and Personalization Techniques in Computer-Supported Collaborative Learning

    CERN Document Server

    Demetriadis, Stavros; Xhafa, Fatos

    2012-01-01

    Adaptation and personalization have been extensively studied in CSCL research community aiming to design intelligent systems that adaptively support eLearning processes and collaboration. Yet, with the fast development in Internet technologies, especially with the emergence of new data technologies and the mobile technologies, new opportunities and perspectives are opened for advanced adaptive and personalized systems. Adaptation and personalization are posing new research and development challenges to nowadays CSCL systems. In particular, adaptation should be focused in a multi-dimensional way (cognitive, technological, context-aware and personal). Moreover, it should address the particularities of both individual learners and group collaboration. As a consequence, the aim of this book is twofold. On the one hand, it discusses the latest advances and findings in the area of intelligent adaptive and personalized learning systems. On the other hand it analyzes the new implementation perspectives for intelligen...

  9. Independent verification and validation testing of the FLASH computer code, Version 3.0

    International Nuclear Information System (INIS)

    Martian, P.; Chung, J.N.

    1992-06-01

    Independent testing of the FLASH computer code, Version 3.0, was conducted to determine whether the code is ready for use in hydrological and environmental studies at various Department of Energy sites. This report describes the technical basis, approach, and results of this testing. Verification and validation tests were used to determine the operational status of the FLASH computer code. These tests were specifically designed to check the correctness of the FORTRAN coding, the computational accuracy, and the suitability for simulating actual hydrologic conditions. The testing was performed using a structured evaluation protocol consisting of blind testing, independent applications, and test cases of graduated difficulty. Both quantitative and qualitative testing were performed by evaluating relative root mean square values and graphical comparisons of the numerical, analytical, and experimental data. Four verification tests were used to check the computational accuracy and the correctness of the FORTRAN coding, and three validation tests were used to check the suitability for simulating actual conditions. These test cases ranged in complexity from simple 1-D saturated flow to 2-D variably saturated problems. The verification tests showed excellent quantitative agreement between the FLASH results and analytical solutions. The validation tests showed good qualitative agreement with the experimental data. Based on the results of this testing, it was concluded that the FLASH code is a versatile and powerful two-dimensional analysis tool for fluid flow. In conclusion, all aspects of the code that were tested, except for the unit gradient bottom boundary condition, were found to be fully operational and ready for use in hydrological and environmental studies.

  10. Unauthorised adaptation of computer programmes - is ...

    African Journals Online (AJOL)

    Haupt acquired copyright in the Data Explorer programme regardless of the fact that the programme was as a result of an unauthorised adaptation of the Project AMPS programme which belonged to Brewers Marketing Intelligence (Pty) Ltd. This case note inter alia analyses the possibility of an author being sued for ...

  11. An adaptive maneuvering logic computer program for the simulation of one-on-one air-to-air combat. Volume 1: General description

    Science.gov (United States)

    Burgin, G. H.; Fogel, L. J.; Phelps, J. P.

    1975-01-01

    A technique for computer simulation of air combat is described. Volume 1 describes the computer program and its development in general terms. Two versions of the program exist. Both incorporate a logic for selecting and executing air combat maneuvers with performance models of specific fighter aircraft. In the batch processing version, the flight paths of two aircraft engaged in interactive aerial combat and controlled by the same logic are computed. The real-time version permits human pilots to fly air-to-air combat against the adaptive maneuvering logic (AML) in the Langley Differential Maneuvering Simulator (DMS). Volume 2 consists of a detailed description of the computer programs.

  12. Computer Forensic Function Testing: Media Preparation, Write Protection And Verification

    Directory of Open Access Journals (Sweden)

    Yinghua (David) Guo

    2010-06-01

    The growth in the computer forensic field has created a demand for new software (or increased functionality of existing software) and a means to verify that this software is truly forensic, i.e. capable of meeting the requirements of the trier of fact. In this work, we review our previous work---a function-oriented testing framework for validation and verification of computer forensic tools. This framework consists of three parts: function mapping, requirements specification and reference set development. Through function mapping, we give a scientific and systematic description of the fundamentals of the computer forensic discipline, i.e. what functions are needed in the computer forensic investigation process. We focus this paper on the functions of media preparation, write protection and verification. Specifically, we complete the function mapping of these functions and specify their requirements. Based on this work, future work can be conducted to develop corresponding reference sets to test any tools that possess these functions.

  13. Examining sensory ability, feature matching and assessment-based adaptation for a brain-computer interface using the steady-state visually evoked potential.

    Science.gov (United States)

    Brumberg, Jonathan S; Nguyen, Anh; Pitt, Kevin M; Lorenz, Sean D

    2018-01-31

    We investigated how overt visual attention and oculomotor control influence successful use of a visual feedback brain-computer interface (BCI) for accessing augmentative and alternative communication (AAC) devices in a heterogeneous population of individuals with profound neuromotor impairments. BCIs are often tested within a single patient population, limiting generalization of results. This study focuses on examining individual sensory abilities with an eye toward possible interface adaptations to improve device performance. Five individuals with a range of neuromotor disorders participated in a four-choice BCI control task involving the steady-state visually evoked potential. The BCI graphical interface was designed to simulate a commercial AAC device to examine whether an integrated device could be used successfully by individuals with neuromotor impairment. All participants were able to interact with the BCI, and the highest performance was found for participants able to employ an overt visual attention strategy. For participants with visual deficits due to impaired oculomotor control, effective performance increased after accounting for mismatches between the graphical layout and participant visual capabilities. As BCIs are translated from research environments to clinical applications, the assessment of BCI-related skills will help facilitate proper device selection and provide individuals who use BCI the greatest likelihood of immediate and long-term communicative success. Overall, our results indicate that adaptations can be an effective strategy to reduce barriers and increase access to BCI technology. These efforts should be directed by comprehensive assessments for matching individuals to the most appropriate device to support their complex communication needs. Implications for Rehabilitation: Brain-computer interfaces using the steady-state visually evoked potential can be integrated with an augmentative and alternative communication device to provide access
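
    The record does not give the decoding algorithm, but a frequency-tagging decoder is the simplest way to illustrate how a four-choice SSVEP interface can work. The sketch below is a toy single-channel decoder under assumed parameters (sampling rate, stimulation frequencies and synthetic data are all illustrative); practical systems typically use stronger methods such as canonical correlation analysis.

        import numpy as np

        def ssvep_classify(eeg, fs, stim_freqs):
            # Pick the stimulation frequency with the most spectral power.
            freqs = np.fft.rfftfreq(len(eeg), 1.0 / fs)
            power = np.abs(np.fft.rfft(eeg - eeg.mean())) ** 2
            scores = [power[np.argmin(np.abs(freqs - f))] for f in stim_freqs]
            return stim_freqs[int(np.argmax(scores))]

        # Synthetic single-channel trial: the subject attends the 12 Hz target.
        fs = 250
        t = np.arange(0, 4, 1.0 / fs)
        rng = np.random.default_rng(0)
        eeg = np.sin(2 * np.pi * 12 * t) + 0.5 * rng.normal(size=t.size)
        print(ssvep_classify(eeg, fs, stim_freqs=[8.57, 10.0, 12.0, 15.0]))

    The paper's point about interface adaptation would enter here as a re-mapping of targets: choices that a given user cannot fixate reliably are moved or dropped, while the decoder itself stays the same.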

  14. Hierarchical adaptive experimental design for Gaussian process emulators

    International Nuclear Information System (INIS)

    Busby, Daniel

    2009-01-01

    Large computer simulators usually have complex, nonlinear input-output functions. This complicated input-output relation can be analyzed by global sensitivity analysis; however, this usually requires massive Monte Carlo simulations. To effectively reduce the number of simulations, statistical techniques such as Gaussian process emulators can be adopted. The accuracy and reliability of these emulators strongly depend on the experimental design, in which suitable evaluation points are selected. In this paper a new sequential design strategy called hierarchical adaptive design is proposed to obtain an accurate emulator using the least possible number of simulations. The hierarchical design proposed in this paper is tested on various standard analytic functions and on a challenging reservoir forecasting application. Comparisons with standard one-stage designs such as maximin Latin hypercube designs show that the hierarchical adaptive design produces a more accurate emulator with the same number of computer experiments. Moreover, a stopping criterion is proposed that allows the design to perform only the number of simulations necessary to reach the required approximation accuracy.
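
    The paper's hierarchical criterion is not reproduced in the record, so the following is a generic sequential-design sketch under assumed details (RBF kernel, fixed length-scale, variance-based point selection): fit a Gaussian process to the runs so far, evaluate the simulator where the predictive variance is largest, and stop once the maximum variance falls below a tolerance.

        import numpy as np

        def gp_posterior_var(X, Xstar, length=0.3, jitter=1e-6):
            # Predictive variance of a zero-mean GP with a unit-variance RBF kernel.
            def k(A, B):
                return np.exp(-0.5 * ((A[:, None] - B[None, :]) / length) ** 2)
            L = np.linalg.cholesky(k(X, X) + jitter * np.eye(len(X)))
            v = np.linalg.solve(L, k(X, Xstar))
            return 1.0 - np.sum(v ** 2, axis=0)

        def adaptive_design(simulator, candidates, n_init=4, tol=1e-2, seed=0):
            rng = np.random.default_rng(seed)
            X = rng.choice(candidates, size=n_init, replace=False)
            y = np.array([simulator(x) for x in X])
            while True:
                var = gp_posterior_var(X, candidates)
                i = int(np.argmax(var))          # most uncertain candidate point
                if var[i] < tol:                 # stopping criterion
                    return X, y
                X = np.append(X, candidates[i])  # run the simulator there
                y = np.append(y, simulator(candidates[i]))

        X, y = adaptive_design(np.sin, candidates=np.linspace(0, 2 * np.pi, 200))

    For a GP with fixed hyperparameters the predictive variance depends only on the input locations, so this loop is purely a design procedure; the responses y are used when the emulator is finally fit.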

  15. Framework for Computation Offloading in Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Dejan Kovachev

    2012-12-01

    The inherently limited processing power and battery lifetime of mobile phones hinder the execution of computationally intensive applications like content-based video analysis or 3D modeling. Offloading computationally intensive application parts from the mobile platform into a remote cloud infrastructure or nearby idle computers addresses this problem. This paper presents our Mobile Augmentation Cloud Services (MACS) middleware, which enables adaptive extension of Android application execution from a mobile client into the cloud. Applications are developed using the standard Android development pattern. The middleware does the heavy lifting of adaptive application partitioning, resource monitoring and computation offloading. These elastic mobile applications can run as usual mobile applications, but they can also use remote computing resources transparently. Two prototype applications using the MACS middleware demonstrate the benefits of the approach. The evaluation shows that applications which involve costly computations can benefit from offloading, with around 95% energy savings and significant performance gains compared to local execution only.
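
    The record does not detail the partitioning policy, but the core offloading decision in middleware of this kind reduces to a cost comparison. The sketch below is an illustrative model with made-up parameter names and numbers (a real system such as MACS would measure bandwidth, latency and device speed at run time):

        def should_offload(input_bytes, task_flops, device_flops_per_s,
                           cloud_flops_per_s, bandwidth_bytes_per_s, rtt_s=0.05):
            # Offload when remote compute plus transfer beats local execution time.
            local_time = task_flops / device_flops_per_s
            remote_time = (rtt_s + input_bytes / bandwidth_bytes_per_s
                           + task_flops / cloud_flops_per_s)
            return remote_time < local_time

        # e.g. a 5 MB video frame needing ~2 GFLOP of analysis over a 20 Mbit/s link
        print(should_offload(5e6, 2e9, 1e9, 5e10, 2.5e6))

    An energy-aware variant compares joules rather than seconds, which is where the reported energy savings for computation-heavy tasks come from.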

  16. Cross-Cultural adaptation of an instrument to computer accessibility evaluation for students with cerebral palsy

    Directory of Open Access Journals (Sweden)

    Gerusa Ferreira Lourenço

    2015-03-01

    The specific literature indicates that the successful education of children with cerebral palsy may require the implementation of appropriate assistive technology resources, allowing students to improve their performance and complete everyday tasks more efficiently and independently. To this end, these resources must be selected properly, which underlines the importance of an appropriate initial assessment of the child and of the possibilities of the resources available. The present study aimed to translate and theoretically adapt an American instrument that evaluates computer accessibility for people with cerebral palsy, in order to contextualize it for applicability to Brazilian students with cerebral palsy. The methodology involved the steps of translation and cross-cultural adaptation of this instrument, as well as the construction of a supplementary script for additional use of the instrument in the educational context. Translation procedures, theoretical and technical adaptation of the American instrument, and theoretical analysis (content and semantics) were carried out with the participation of professional experts in the special education area as adjudicators. The results pointed to the relevance of the translated instrument, in conjunction with the script, to the reality of professionals involved with the education of children with cerebral palsy, such as occupational therapists and special educators.

  17. An Application Development Platform for Neuromorphic Computing

    Energy Technology Data Exchange (ETDEWEB)

    Dean, Mark [University of Tennessee (UT); Chan, Jason [University of Tennessee (UT); Daffron, Christopher [University of Tennessee (UT); Disney, Adam [University of Tennessee (UT); Reynolds, John [University of Tennessee (UT); Rose, Garrett [University of Tennessee (UT); Plank, James [University of Tennessee (UT); Birdwell, John Douglas [University of Tennessee (UT); Schuman, Catherine D [ORNL

    2016-01-01

    Dynamic Adaptive Neural Network Arrays (DANNAs) are neuromorphic computing systems developed as a hardware-based approach to the implementation of neural networks. They feature highly adaptive and programmable structural elements, which model artificial neural networks with spiking behavior. We design them to solve problems using evolutionary optimization. In this paper, we highlight the current hardware and software implementations of DANNA, including their features, functionalities and performance. We then describe the development of an Application Development Platform (ADP) to support efficient application implementation and testing of DANNA-based solutions. We conclude with future directions.

  18. [Studies on the changes of adaptation with children in the dental setting. The relationship between the changes of adaptation and various psychological tests].

    Science.gov (United States)

    Uchida, T; Mukai, Y; Sasa, R

    1991-01-01

    The purpose of this study was to discover the changes in the adaptation of children to the dental setting, and the relationship between that adaptation and the personality of the child, the personality of the mother, and the mother-child relationship. The subjects were 60 two- to six-year-old children and their mothers who visited the Department of Pedodontics, School of Dentistry, Showa University. The results were as follows: 1) The changes in adaptation were classified into four groups: Continuous Adaptability (45.0%), Acquired Adaptability (18.3%), Continuous Inadaptability (16.7%) and Extreme Inadaptability (20.0%). 2) For the two- to three-year-old children, the inadaptability groups (Continuous Inadaptability and Extreme Inadaptability) showed no correlation between the change in adaptation and the personality of the child or the mother-child relationship. 3) For the four-year-old children, the Extreme Inadaptability group showed a connection between the change in adaptation and the various psychological tests. Concerning personality, the children showed elements of "dependence", "retrogression" and "maladaptation to school (kindergarten)". Concerning the mother-child relationship, there were elements of "anxiety", "dotage", "following blindly" and "disagreement". 4) No five- to six-year-old child showed Extreme Inadaptability, and the Continuous Inadaptability group among the five- to six-year-old children showed scarcely any problems. 5) The personality of the mother did not correlate with the change in the children's adaptation in the dental setting.

  19. Identifying genetic marker sets associated with phenotypes via an efficient adaptive score test

    KAUST Repository

    Cai, T.

    2012-06-25

    In recent years, genome-wide association studies (GWAS) and gene-expression profiling have generated a large number of valuable datasets for assessing how genetic variations are related to disease outcomes. With such datasets, it is often of interest to assess the overall effect of a set of genetic markers, assembled based on biological knowledge. Genetic marker-set analyses have been advocated as more reliable and powerful approaches compared with the traditional marginal approaches (Curtis and others, 2005. Pathways to the analysis of microarray data. TRENDS in Biotechnology 23, 429-435; Efroni and others, 2007. Identification of key processes underlying cancer phenotypes using biologic pathway analysis. PLoS One 2, 425). Procedures for testing the overall effect of a marker-set have been actively studied in recent years. For example, score tests derived under an Empirical Bayes (EB) framework (Liu and others, 2007. Semiparametric regression of multidimensional genetic pathway data: least-squares kernel machines and linear mixed models. Biometrics 63, 1079-1088; Liu and others, 2008. Estimation and testing for the effect of a genetic pathway on a disease outcome using logistic kernel machine regression via logistic mixed models. BMC bioinformatics 9, 292-2; Wu and others, 2010. Powerful SNP-set analysis for case-control genome-wide association studies. American Journal of Human Genetics 86, 929) have been proposed as powerful alternatives to the standard Rao score test (Rao, 1948. Large sample tests of statistical hypotheses concerning several parameters with applications to problems of estimation. Mathematical Proceedings of the Cambridge Philosophical Society, 44, 50-57). The advantages of these EB-based tests are most apparent when the markers are correlated, due to the reduction in the degrees of freedom. In this paper, we propose an adaptive score test which up- or down-weights the contributions from each member of the marker-set based on the Z-scores of
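
    The EB score tests cited above have closed-form null distributions; as a simpler hedged illustration of the adaptive idea (up-weighting markers whose marginal signals are strong), the toy below forms a weighted sum of squared marker scores and calibrates it by permutation. The weighting rule and all names are illustrative assumptions, not the test proposed in the paper.

        import numpy as np

        def adaptive_score_test(G, y, n_perm=2000, seed=0):
            # G: n x p genotype (marker) matrix, y: phenotype vector.
            rng = np.random.default_rng(seed)
            Gc = G - G.mean(axis=0)
            yc = y - y.mean()

            def stat(yv):
                score = Gc.T @ yv                                # per-marker score
                z = score / (np.linalg.norm(Gc, axis=0) * yv.std() + 1e-12)
                return float(np.sum(np.abs(z) * z ** 2))         # up-weight large |Z|

            obs = stat(yc)
            perms = [stat(rng.permutation(yc)) for _ in range(n_perm)]
            return (1 + sum(p >= obs for p in perms)) / (1 + n_perm)

        rng = np.random.default_rng(1)
        G = rng.integers(0, 3, (200, 10)).astype(float)   # synthetic genotypes
        y = 0.5 * G[:, 0] + rng.normal(size=200)          # marker 0 is causal
        print(adaptive_score_test(G, y))

    Permutation calibration sidesteps the degrees-of-freedom issue the abstract mentions, at the cost of extra computation; the paper's contribution is an analytic treatment of the adaptively weighted statistic.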

  20. Seismic proving test of process computer systems with a seismic floor isolation system

    International Nuclear Information System (INIS)

    Fujimoto, S.; Niwa, H.; Kondo, H.

    1995-01-01

    The authors have carried out seismic proving tests for process computer systems as a Nuclear Power Engineering Corporation (NUPEC) project sponsored by the Ministry of International Trade and Industry (MITI). This paper presents the seismic test results for evaluating the functional capabilities of process computer systems with a seismic floor isolation system. The seismic floor isolation system, used to isolate horizontal motion, was composed of a floor frame (13 m x 13 m), ball bearing units, and spring-damper units. A series of seismic excitation tests was carried out using a large-scale shaking table of NUPEC. From the test results, the functional capabilities during large earthquakes of computer systems with a seismic floor isolation system were verified.

  1. Application of computer techniques to charpy impact testing of irradiated pressure vessel steels

    International Nuclear Information System (INIS)

    Landow, M.P.; Fromm, E.O.; Perrin, J.S.

    1982-01-01

    A Rockwell AIM 65 microcomputer has been modified to control a remote Charpy V-notch impact test machine. It controls not only the handling and testing of the specimen but also the transfer and storage of instrumented Charpy test data. A system of electrical solenoid-activated pneumatic cylinders and switches provides the interface between the computer and the test apparatus. A command language has been designed that allows the operator to command checkout, test procedure, and data storage via the computer. Automatic compliance with ASTM test procedures is built into the program.

  2. Screening for cognitive impairment in older individuals. Validation study of a computer-based test.

    Science.gov (United States)

    Green, R C; Green, J; Harrison, J M; Kutner, M H

    1994-08-01

    This study examined the validity of a computer-based cognitive test that was recently designed to screen the elderly for cognitive impairment. Criterion-related validity was examined by comparing test scores of impaired patients and normal control subjects. Construct-related validity was computed through correlations between computer-based subtests and related conventional neuropsychological subtests. The setting was a university center for memory disorders. Participants were 52 patients with mild cognitive impairment by strict clinical criteria and 50 unimpaired, age- and education-matched control subjects. Control subjects were rigorously screened by neurological, neuropsychological, imaging, and electrophysiological criteria to identify and exclude individuals with occult abnormalities. Using a cut-off total score of 126, this computer-based instrument had a sensitivity of 0.83 and a specificity of 0.96. Using a prevalence estimate of 10%, predictive values, positive and negative, were 0.70 and 0.96, respectively. Computer-based subtests correlated significantly with conventional neuropsychological tests measuring similar cognitive domains. Thirteen (17.8%) of 73 volunteers with normal medical histories were excluded from the control group because of unsuspected abnormalities on standard neuropsychological tests, electroencephalograms, or magnetic resonance imaging scans. Computer-based testing is a valid screening methodology for the detection of mild cognitive impairment in the elderly, although this particular test has important limitations. Broader applications of computer-based testing will require extensive population-based validation. Future studies should recognize that normal control subjects without a history of disease who are typically used in validation studies may have a high incidence of unsuspected abnormalities on neurodiagnostic studies.

  3. Development of computerized adaptive testing (CAT) for the EORTC QLQ-C30 physical functioning dimension

    DEFF Research Database (Denmark)

    Petersen, Morten Aa; Groenvold, Mogens; Aaronson, Neil K

    2011-01-01

    Computerized adaptive test (CAT) methods, based on item response theory (IRT), enable a patient-reported outcome instrument to be adapted to the individual patient while maintaining direct comparability of scores. The EORTC Quality of Life Group is developing a CAT version of the widely used EORTC QLQ-C30. We present the development and psychometric validation of the item pool for the first of the scales, physical functioning (PF).

  4. Measurement of flat samples with rough surfaces by Magnetic Adaptive Testing

    Czech Academy of Sciences Publication Activity Database

    Tomáš, Ivan; Kadlecová, Jana; Vértesy, G.

    2012-01-01

    Roč. 48, č. 4 (2012), s. 1441-1444 ISSN 0018-9464. [Conference on Soft Magnetic Materials (SMM20) /20./. Kos Island, 18.09.2011-22.09.2011] R&D Projects: GA ČR GA101/09/1323 Institutional research plan: CEZ:AV0Z10100520 Keywords : magnetic contact * magnetic adaptive testing * magnetically open samples * magnetic NDE Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 1.422, year: 2012

  5. SAGE - MULTIDIMENSIONAL SELF-ADAPTIVE GRID CODE

    Science.gov (United States)

    Davies, C. B.

    1994-01-01

    SAGE, Self Adaptive Grid codE, is a flexible tool for adapting and restructuring both 2D and 3D grids. Solution-adaptive grid methods are useful tools for efficient and accurate flow predictions. In supersonic and hypersonic flows, strong gradient regions such as shocks, contact discontinuities, shear layers, etc., require careful distribution of grid points to minimize grid error and produce accurate flow-field predictions. SAGE helps the user obtain more accurate solutions by intelligently redistributing (i.e. adapting) the original grid points based on an initial or interim flow-field solution. The user then computes a new solution using the adapted grid as input to the flow solver. The adaptive-grid methodology poses the problem in an algebraic, unidirectional manner for multi-dimensional adaptations. The procedure is analogous to applying tension and torsion spring forces proportional to the local flow gradient at every grid point and finding the equilibrium position of the resulting system of grid points. The multi-dimensional problem of grid adaption is split into a series of one-dimensional problems along the computational coordinate lines. The reduced one dimensional problem then requires a tridiagonal solver to find the location of grid points along a coordinate line. Multi-directional adaption is achieved by the sequential application of the method in each coordinate direction. The tension forces direct the redistribution of points to the strong gradient region. To maintain smoothness and a measure of orthogonality of grid lines, torsional forces are introduced that relate information between the family of lines adjacent to one another. The smoothness and orthogonality constraints are direction-dependent, since they relate only the coordinate lines that are being adapted to the neighboring lines that have already been adapted. Therefore the solutions are non-unique and depend on the order and direction of adaption. Non-uniqueness of the adapted grid is
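
    The record describes the one-dimensional subproblem concretely: spring stiffnesses proportional to the local gradient and a tridiagonal solve for the equilibrium positions along each coordinate line. The sketch below implements one such pass under assumed details (the stiffness law and constants are illustrative, and SAGE itself also adds torsion terms coupling neighboring lines):

        import numpy as np
        from scipy.linalg import solve_banded

        def adapt_line(x, f, alpha=5.0):
            # Tension-spring pass along one coordinate line: stiffness grows with
            # the local gradient of f, so points cluster where f changes rapidly.
            w = 1.0 + alpha * np.abs(np.gradient(f, x))   # nodal stiffness
            wm = 0.5 * (w[:-1] + w[1:])                   # stiffness at midpoints
            n = len(x) - 2                                # interior unknowns
            ab = np.zeros((3, n))
            ab[0, 1:] = wm[1:-1]                          # super-diagonal
            ab[1, :] = -(wm[:-1] + wm[1:])                # main diagonal
            ab[2, :-1] = wm[1:-1]                         # sub-diagonal
            rhs = np.zeros(n)
            rhs[0] -= wm[0] * x[0]                        # fixed end points
            rhs[-1] -= wm[-1] * x[-1]
            x_new = x.copy()
            x_new[1:-1] = solve_banded((1, 1), ab, rhs)   # tridiagonal solve
            return x_new

        x = np.linspace(0.0, 1.0, 41)
        for _ in range(10):                               # relax to equilibrium
            x = adapt_line(x, np.tanh(20 * (x - 0.5)))    # steep gradient at 0.5

    Each pass solves the spring-equilibrium equations w_{i-1/2}(x_{i-1} - x_i) + w_{i+1/2}(x_{i+1} - x_i) = 0 for the interior points; sweeping such passes direction by direction gives the sequential multi-directional adaption the abstract describes.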

  6. Journal of Computer Science and Its Application - Vol 20, No 2 (2013)

    African Journals Online (AJOL)

    Journal of Computer Science and Its Application - Vol 20, No 2 (2013) ... Fuzzy analysis and adaptive anthropometry model for object identification in surveillance system ... Natural language processing techniques for automatic test questions ...

  7. Implementation and analysis of an adaptive multilevel Monte Carlo algorithm

    KAUST Repository

    Hoel, Hakon; Von Schwerin, Erik; Szepessy, Anders; Tempone, Raul

    2014-01-01

    We present an adaptive multilevel Monte Carlo (MLMC) method for weak approximations of solutions to Itô stochastic differential equations (SDE). The work [11] proposed and analyzed an MLMC method based on a hierarchy of uniform time discretizations and control variates to reduce the computational effort required by a single-level Euler-Maruyama Monte Carlo method from O(TOL^-3) to O(TOL^-2 log(TOL^-1)^2) for a mean square error of O(TOL^2). Later, the work [17] presented an MLMC method using a hierarchy of adaptively refined, non-uniform time discretizations, and, as such, it may be considered a generalization of the uniform time discretization MLMC method. This work improves the adaptive MLMC algorithms presented in [17] and it also provides mathematical analysis of the improved algorithms. In particular, we show that under some assumptions our adaptive MLMC algorithms are asymptotically accurate and essentially have the correct complexity but with improved control of the complexity constant factor in the asymptotic analysis. Numerical tests include one case with singular drift and one with stopped diffusion, where the complexity of a uniform single-level method is O(TOL^-4). For both these cases the results confirm the theory, exhibiting savings in the computational cost for achieving the accuracy O(TOL) from O(TOL^-3) for the adaptive single-level algorithm to essentially O(TOL^-2 log(TOL^-1)^2) for the adaptive MLMC algorithm. © 2014 by Walter de Gruyter Berlin/Boston 2014.
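
    The adaptive time-stepping analyzed in the paper is involved, but the underlying MLMC telescoping estimator is easy to show. The sketch below is a plain (uniform-step, non-adaptive) MLMC estimator of E[X_T] for geometric Brownian motion under illustrative parameters; the key structural point is that each level estimates E[P_l - P_{l-1}] with fine and coarse paths driven by the same Brownian increments.

        import numpy as np

        def mlmc_level(l, n_samples, T=1.0, x0=1.0, mu=0.05, sigma=0.2, seed=0):
            # Monte Carlo estimate of E[P_l - P_{l-1}] with coupled Euler-Maruyama
            # paths; the coarse path reuses the fine path's Brownian increments.
            rng = np.random.default_rng(seed)
            nf, dtf = 2 ** l, T / 2 ** l
            diffs = np.empty(n_samples)
            for i in range(n_samples):
                dW = np.sqrt(dtf) * rng.normal(size=nf)
                xf = xc = x0
                for k in range(nf):
                    xf += mu * xf * dtf + sigma * xf * dW[k]
                if l == 0:
                    diffs[i] = xf                    # level 0 is plain Monte Carlo
                    continue
                dtc = 2 * dtf
                for k in range(0, nf, 2):
                    xc += mu * xc * dtc + sigma * xc * (dW[k] + dW[k + 1])
                diffs[i] = xf - xc
            return diffs.mean()

        # Telescoping sum: E[P_L] = sum over levels of E[P_l - P_{l-1}].
        estimate = sum(mlmc_level(l, max(4000 // 4 ** l, 100), seed=l)
                       for l in range(5))

    Because the level differences shrink, ever fewer samples are needed on the fine levels, which is the source of the O(TOL^-2 log(TOL^-1)^2) complexity; the paper's adaptive variant additionally refines the time mesh non-uniformly where local error indicators are large.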

  8. Adaptive MPC based on MIMO ARX-Laguerre model.

    Science.gov (United States)

    Ben Abdelwahed, Imen; Mbarek, Abdelkader; Bouzrara, Kais

    2017-03-01

    This paper proposes a method for synthesizing an adaptive predictive controller using a reduced-complexity model. The latter is given by the projection of the ARX model onto Laguerre bases. The resulting model, entitled MIMO ARX-Laguerre, is characterized by an easy recursive representation. The adaptive predictive control law is computed based on multi-step-ahead finite-element predictors, identified directly from experimental input/output data. The model is tuned at each iteration by online identification algorithms for both the model parameters and the Laguerre poles. The proposed approach avoids the time-consuming numerical optimization algorithms associated with most common linear predictive control strategies, which makes it suitable for real-time implementation. The method is used to synthesize adaptive predictive controllers for the CSTR process benchmark and to test them in numerical simulations. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
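
    The online tuning step the abstract mentions is, generically, a recursive parameter update. The sketch below shows plain recursive least squares (RLS) for an ARX-type regression; it is a stand-in illustration, not the paper's ARX-Laguerre scheme (which also updates the Laguerre poles), and all names and numbers are assumptions.

        import numpy as np

        class RecursiveLeastSquares:
            # Estimates theta in y_t = phi_t . theta + noise, one sample at a time.
            def __init__(self, n_params, forgetting=0.99):
                self.theta = np.zeros(n_params)
                self.P = 1e3 * np.eye(n_params)     # large initial covariance
                self.lam = forgetting

            def update(self, phi, y):
                Pphi = self.P @ phi
                gain = Pphi / (self.lam + phi @ Pphi)
                self.theta = self.theta + gain * (y - phi @ self.theta)
                self.P = (self.P - np.outer(gain, Pphi)) / self.lam
                return self.theta

        # Hypothetical usage: identify y_t = a*y_{t-1} + b*u_{t-1} online.
        rls, rng = RecursiveLeastSquares(2), np.random.default_rng(0)
        y_prev, theta_true = 0.0, np.array([0.8, 0.5])
        for _ in range(500):
            u = rng.normal()
            phi = np.array([y_prev, u])
            y = theta_true @ phi + 0.01 * rng.normal()
            rls.update(phi, y)
            y_prev = y

    The forgetting factor discounts old data so the estimate can track slowly drifting dynamics, which is what makes the predictive controller adaptive.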

  9. Operational statistical analysis of the results of computer-based testing of students

    Directory of Open Access Journals (Sweden)

    Виктор Иванович Нардюжев

    2018-12-01

    The article is devoted to the statistical analysis of the results of computer-based testing for the evaluation of students' educational achievements. The issue is relevant because computer-based testing in Russian universities has become an important method for evaluating educational achievements and the quality of the modern educational process. The use of modern methods and programs for the statistical analysis of computer-based testing results, and for assessing the quality of the developed tests, is a pressing problem for every university teacher. The article shows how the authors solve this problem using their own program, "StatInfo". For several years the program has been successfully applied in a credit-based system of education at such technological stages as loading computer-based testing protocols into a database, forming queries, and generating reports, lists, and answer matrices for the statistical analysis of test item quality. The methodology, experience and some results of its usage by university teachers are described in the article. Related topics, such as test development, models, algorithms, technologies, and software for large-scale computer-based testing, have been discussed by the authors in their previous publications, which are given in the reference list.

  10. A case study of evolutionary computation of biochemical adaptation

    International Nuclear Information System (INIS)

    François, Paul; Siggia, Eric D

    2008-01-01

    Simulations of evolution have a long history, but their relation to biology is questioned because of the perceived contingency of evolution. Here we provide an example of a biological process, adaptation, where simulations are argued to come closer to biology. Adaptation is a common feature of sensory systems, and a plausible component of other biochemical networks, because it rescales upstream signals to facilitate downstream processing. We create random gene networks numerically by linking genes with interactions that model transcription, phosphorylation and protein-protein association. We define a fitness function for adaptation in terms of two functional metrics, and show that any reasonable combination of them will yield the same adaptive networks after repeated rounds of mutation and selection. Convergence to these networks is driven by positive selection and is thus fast. There is always a path in parameter space of continuously improving fitness that leads to perfect adaptation, implying that the actual mutation rates we use in the simulation do not bias the results. Our results imply a kinetic view of evolution, i.e., it favors gene networks that can be learned quickly from the random examples supplied by mutation. This formulation allows for deductive predictions of the networks realized in nature.
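
    As a hedged illustration of the mutate-and-select procedure (not the authors' network model or their exact metrics), the toy below evolves the rate constants of a small feed-forward circuit so that its output responds strongly to a step input yet returns close to baseline, combining two metrics (transient sensitivity and residual steady-state offset) into one fitness value. All equations and constants are assumptions.

        import numpy as np

        def simulate(k, I0=1.0, I1=2.0, T=10.0, dt=0.01):
            # Euler-integrate a two-node feed-forward circuit driven by a step input.
            k1, k2, k3, k4, k5 = k
            B = k1 * I0 / k2                       # pre-step steady state
            C = (k3 * I0 - k5 * B) / k4
            C0, peak = C, 0.0
            for _ in range(int(T / dt)):
                B += dt * (k1 * I1 - k2 * B)
                C += dt * (k3 * I1 - k4 * C - k5 * B)
                peak = max(peak, abs(C - C0))
            return peak, abs(C - C0)               # sensitivity, residual offset

        def fitness(k):
            peak, residual = simulate(k)
            return peak / (1e-3 + residual)        # reward strong, transient response

        rng = np.random.default_rng(0)
        pop = rng.uniform(0.1, 2.0, size=(40, 5))
        for _ in range(60):                         # mutation + truncation selection
            scores = np.array([fitness(k) for k in pop])
            elite = pop[np.argsort(scores)[-10:]]
            pop = np.clip(elite[rng.integers(0, 10, size=40)]
                          + rng.normal(0.0, 0.05, size=(40, 5)), 0.01, None)
        best = pop[np.argmax([fitness(k) for k in pop])]

    For this linear circuit the residual vanishes exactly when k3 = k5*k1/k2, so selection drives the population toward that incoherent feed-forward relation, a simple analogue of the convergence the paper reports for its richer networks.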

  11. USSR Report, Cybernetics, Computers and Automation Technology

    Science.gov (United States)

    1987-03-31

    …version of the system was tested by adapting PAL-11 and MACRO-11 assembly code for the "Elektronika-60" and "Elektronika-60M" computers; ASM-86 for the… GS, "On the Results of Evaluation of Insurance Payments in Collective and State Farms and Private Households," the actuarial analysis tables based…

  12. Security Considerations and Recommendations in Computer-Based Testing

    Directory of Open Access Journals (Sweden)

    Saleh M. Al-Saleem

    2014-01-01

    Many organizations and institutions around the globe are moving or planning to move their paper-and-pencil based testing to computer-based testing (CBT). However, this conversion will not be the best option for all kinds of exams and it will require significant resources. These resources may include the preparation of item banks, methods for test delivery, procedures for test administration, and last but not least test security. Security aspects may include but are not limited to the identification and authentication of the examinee, the risks that are associated with cheating on the exam, and the procedures related to test delivery to the examinee. This paper will mainly investigate the security considerations associated with CBT and will provide some recommendations for the security of these kinds of tests. We will also propose a palm-based biometric authentication system incorporated with a basic authentication system (username/password) in order to check the identity and authenticity of the examinee.

  13. Security considerations and recommendations in computer-based testing.

    Science.gov (United States)

    Al-Saleem, Saleh M; Ullah, Hanif

    2014-01-01

    Many organizations and institutions around the globe are moving or planning to move their paper-and-pencil based testing to computer-based testing (CBT). However, this conversion will not be the best option for all kinds of exams and it will require significant resources. These resources may include the preparation of item banks, methods for test delivery, procedures for test administration, and last but not least test security. Security aspects may include but are not limited to the identification and authentication of examinee, the risks that are associated with cheating on the exam, and the procedures related to test delivery to the examinee. This paper will mainly investigate the security considerations associated with CBT and will provide some recommendations for the security of these kinds of tests. We will also propose a palm-based biometric authentication system incorporated with basic authentication system (username/password) in order to check the identity and authenticity of the examinee.

  14. Computer control and data acquisition system for the R.F. Test Facility

    International Nuclear Information System (INIS)

    Stewart, K.A.; Burris, R.D.; Mankin, J.B.; Thompson, D.H.

    1986-01-01

    The Radio Frequency Test Facility (RFTF) at Oak Ridge National Laboratory, used to test and evaluate high-power ion cyclotron resonance heating (ICRH) systems and components, is monitored and controlled by a multicomponent computer system. This data acquisition and control system consists of three major hardware elements: (1) an Allen-Bradley PLC-3 programmable controller; (2) a VAX 11/780 computer; and (3) a CAMAC serial highway interface. Operating in LOCAL as well as REMOTE mode, the programmable logic controller (PLC) performs all the control functions of the test facility. The VAX computer acts as the operator's interface to the test facility by providing color mimic panel displays and allowing input via a trackball device. The VAX also provides archiving of trend data acquired by the PLC. Communications between the PLC and the VAX are via the CAMAC serial highway. Details of the hardware, software, and the operation of the system are presented in this paper.

  15. Quality based approach for adaptive face recognition

    Science.gov (United States)

    Abboud, Ali J.; Sellahewa, Harin; Jassim, Sabah A.

    2009-05-01

    Recent advances in biometric technology have pushed towards more robust and reliable systems. We aim to build systems that have low recognition errors and are less affected by variation in recording conditions. Recognition errors are often attributed to the usage of low-quality biometric samples. Hence, there is a need to develop new intelligent techniques and strategies to automatically measure/quantify the quality of biometric image samples and, if necessary, restore image quality according to the needs of the intended application. In this paper, we present no-reference image quality measures in the spatial domain that have an impact on face recognition. The first is called the symmetrical adaptive local quality index (SALQI) and the second is called the middle halve (MH). Also, an adaptive strategy has been developed to select the best way to restore the image quality, called symmetrical adaptive histogram equalization (SAHE). The main benefits of using quality measures for the adaptive strategy are: (1) avoidance of excessive unnecessary enhancement procedures that may cause undesired artifacts, and (2) reduced computational complexity, which is essential for real-time applications. We test the success of the proposed measures and adaptive approach for a wavelet-based face recognition system that uses the nearest neighbor classifier. We demonstrate noticeable improvements in the performance of the adaptive face recognition system over the corresponding non-adaptive scheme.

  16. Computer-facilitated rapid HIV testing in emergency care settings: provider and patient usability and acceptability.

    Science.gov (United States)

    Spielberg, Freya; Kurth, Ann E; Severynen, Anneleen; Hsieh, Yu-Hsiang; Moring-Parris, Daniel; Mackenzie, Sara; Rothman, Richard

    2011-06-01

    Providers in emergency care settings (ECSs) often face barriers to expanded HIV testing. We undertook formative research to understand the potential utility of a computer tool, "CARE," to facilitate rapid HIV testing in ECSs. Computer tool usability and acceptability were assessed among 35 adult patients, and provider focus groups were held, in two ECSs in Washington State and Maryland. The computer tool was usable by patients of varying computer literacy. Patients appreciated the tool's privacy and lack of judgment and their ability to reflect on HIV risks and create risk reduction plans. Staff voiced concerns regarding ECS-based HIV testing generally, including resources for follow-up of newly diagnosed people. Computer-delivered HIV testing support was acceptable and usable among low-literacy populations in two ECSs. Such tools may help circumvent some practical barriers associated with routine HIV testing in busy settings though linkages to care will still be needed.

  17. Applying modern psychometric techniques to melodic discrimination testing: Item response theory, computerised adaptive testing, and automatic item generation.

    Science.gov (United States)

    Harrison, Peter M C; Collins, Tom; Müllensiefen, Daniel

    2017-06-15

    Modern psychometric theory provides many useful tools for ability testing, such as item response theory, computerised adaptive testing, and automatic item generation. However, these techniques have yet to be integrated into mainstream psychological practice. This is unfortunate, because modern psychometric techniques can bring many benefits, including sophisticated reliability measures, improved construct validity, avoidance of exposure effects, and improved efficiency. In the present research we therefore use these techniques to develop a new test of a well-studied psychological capacity: melodic discrimination, the ability to detect differences between melodies. We calibrate and validate this test in a series of studies. Studies 1 and 2 respectively calibrate and validate an initial test version, while Studies 3 and 4 calibrate and validate an updated test version incorporating additional easy items. The results support the new test's viability, with evidence for strong reliability and construct validity. We discuss how these modern psychometric techniques may also be profitably applied to other areas of music psychology and psychological science in general.
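
    To make the adaptive-testing machinery concrete, here is a minimal sketch of the usual CAT loop under a two-parameter logistic (2PL) IRT model: estimate ability, administer the unused item with maximum Fisher information at that estimate, update, and repeat. The item bank, prior and fixed test length are illustrative assumptions, not the melodic discrimination test's calibration.

        import numpy as np

        def p_correct(theta, a, b):
            # 2PL item response function.
            return 1.0 / (1.0 + np.exp(-a * (theta - b)))

        def next_item(theta_hat, a, b, administered):
            # Maximum Fisher information item selection.
            p = p_correct(theta_hat, a, b)
            info = a ** 2 * p * (1 - p)
            info[list(administered)] = -np.inf
            return int(np.argmax(info))

        def update_theta(responses, a, b, grid=np.linspace(-4, 4, 161)):
            # EAP ability estimate under a standard-normal prior.
            log_post = -0.5 * grid ** 2
            for item, u in responses:
                p = p_correct(grid, a[item], b[item])
                log_post += np.log(p if u else 1.0 - p)
            w = np.exp(log_post - log_post.max())
            return float(np.sum(grid * w) / np.sum(w))

        rng = np.random.default_rng(0)
        a, b = rng.uniform(0.8, 2.0, 50), rng.normal(0.0, 1.0, 50)   # item bank
        theta_true, theta_hat = 0.7, 0.0
        responses, administered = [], set()
        for _ in range(15):                       # fixed-length test for brevity
            item = next_item(theta_hat, a, b, administered)
            administered.add(item)
            u = rng.random() < p_correct(theta_true, a[item], b[item])
            responses.append((item, bool(u)))
            theta_hat = update_theta(responses, a, b)

    Automatic item generation, as used in the paper, plugs into this loop by supplying calibrated items on demand, which also mitigates the exposure effects the abstract mentions.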

  18. Spatially adaptive hp refinement approach for PN neutron transport equation using spectral element method

    International Nuclear Information System (INIS)

    Nahavandi, N.; Minuchehr, A.; Zolfaghari, A.; Abbasi, M.

    2015-01-01

    Highlights: • A powerful hp-SEM refinement approach for the P_N neutron transport equation is presented. • The method provides great geometrical flexibility and lower computational cost. • Arbitrarily high-order and non-uniform meshes can be used. • Both a posteriori and a priori local error estimation approaches have been employed. • Highly accurate results are compared against other common adaptive and uniform grids. - Abstract: In this work we present the adaptive hp-SEM approach, which is obtained from the combination of the Spectral Element Method (SEM) and adaptive hp refinement. The SEM nodal discretization and adaptive hp grid refinement for the even-parity Boltzmann neutron transport equation create a powerful grid-refinement approach with highly accurate solutions. In this regard a computer code has been developed to solve the multi-group neutron transport equation in one-dimensional geometry using even-parity transport theory. The spatial dependence of the flux is represented via the SEM with Lobatto orthogonal polynomials. Two common error estimation approaches, a posteriori and a priori, have been implemented. The combination of the SEM nodal discretization method and adaptive hp grid refinement leads to highly accurate solutions. The efficiency of coarser meshes and the significant reduction in program runtime, in comparison with other common refinement methods and uniform meshing approaches, are demonstrated on several well-known transport benchmarks.

  19. Using the U.S. "Test of Financial Literacy" in Germany--Adaptation and Validation

    Science.gov (United States)

    Förster, Manuel; Happ, Roland; Molerov, Dimitar

    2017-01-01

    In this article, the authors present the adaptation and validation processes conducted to render the American "Test of Financial Literacy" (TFL) suitable for use in Germany (TFL-G). First, they outline the translation procedure followed and the various cultural adjustments made in line with international standards. Next, they present…

  20. Relative User Ratings of MMPI-2 Computer-Based Test Interpretations

    Science.gov (United States)

    Williams, John E.; Weed, Nathan C.

    2004-01-01

    There are eight commercially available computer-based test interpretations (CBTIs) for the Minnesota Multiphasic Personality Inventory-2 (MMPI-2), of which few have been empirically evaluated. Prospective users of these programs have little scientific data to guide choice of a program. This study compared ratings of these eight CBTIs. Test users…

  1. Conconi test adapted to the aquatic bicycle [Teste de Conconi adaptado para bicicleta aquática]

    Directory of Open Access Journals (Sweden)

    Jonas Neves Martins

    2007-10-01

    Regular physical exercise has been considered one of the mechanisms that improve health and quality of life. As a consequence of the growing demand for fitness centers, physical activities in the liquid environment, especially aquatic cycling, have increased in recent years. However, methods for the assessment and prescription of aerobic training on this type of equipment are still scarce. The objective of this study was to propose an adaptation of the test of Conconi et al. (1982) for the aquatic bicycle. Twenty-seven participants (24 ± 6 years, 171 ± 8 cm, 66 ± 12 kg; 15 male and 12 female) were assessed. The participants were submitted to a graded test on an aquatic bicycle, with an initial load of 50 RPM and increments of 3 RPM each minute, until exhaustion. HR was registered during the entire test. For data analysis, descriptive statistics were used, as well as Student's t-test for comparison between genders. The HR deflection point (HRDP) was identified in 85% of the subjects. There were no significant differences in HRmax (181 ± 12

  2. Strategies for Controlling Item Exposure in Computerized Adaptive Testing with the Generalized Partial Credit Model

    Science.gov (United States)

    Davis, Laurie Laughlin

    2004-01-01

    Choosing a strategy for controlling item exposure has become an integral part of test development for computerized adaptive testing (CAT). This study investigated the performance of six procedures for controlling item exposure in a series of simulated CATs under the generalized partial credit model. In addition to a no-exposure control baseline…

  3. Exaptation in human evolution: how to test adaptive vs exaptive evolutionary hypotheses.

    Science.gov (United States)

    Pievani, Telmo; Serrelli, Emanuele

    2011-01-01

    Palaeontologists Stephen J. Gould and Elisabeth Vrba introduced the term "exaptation" with the aim of improving and enlarging the scientific language available to researchers studying the evolution of any useful character, instead of calling it an "adaptation" by default, arriving at what Gould named an "extended taxonomy of fitness". With the extension to functional co-optations from non-adaptive structures ("spandrels"), the notion of exaptation expanded and revised the neo-Darwinian concept of "pre-adaptation" (which was misleading, for Gould and Vrba, in suggesting foreordination). Exaptation is neither a "saltationist" nor an "anti-Darwinian" concept and, since 1982, has been adopted by many researchers in evolutionary and molecular biology, and particularly in human evolution. Exaptation has also been contested. Objections include the "non-operationality objection". We analyze the possible operationalization of this concept in two recent studies, and identify six directions of empirical research that are necessary to test "adaptive vs. exaptive" evolutionary hypotheses. We then comment on a comprehensive survey of the literature (available online), and on the basis of this we make a quantitative and qualitative evaluation of the adoption of the term among scientists who study human evolution. We discuss the epistemic conditions that may have influenced the adoption and appropriate use of exaptation, and comment on the benefits of an "extended taxonomy of fitness" in present and future studies concerning human evolution.

  4. Effects of computer-based immediate feedback on foreign language listening comprehension and test-associated anxiety.

    Science.gov (United States)

    Lee, Shu-Ping; Su, Hui-Kai; Lee, Shin-Da

    2012-06-01

    This study investigated the effects of immediate feedback in computer-based foreign language listening comprehension tests, and on intrapersonal test-associated anxiety, in 72 English-major college students at a Taiwanese university. Computer-based foreign language listening comprehension tests designed in MOODLE, a dynamic e-learning environment, with or without immediate feedback, together with the State-Trait Anxiety Inventory (STAI), were administered and repeated after one week. The analysis indicated that immediate feedback during testing caused significantly higher anxiety and resulted in significantly higher listening scores than in the control group, which had no feedback. However, repeated feedback did not affect test anxiety or listening scores. Computer-based immediate feedback did not lower the debilitating effects of anxiety but enhanced students' intrapersonal eustress-like anxiety and probably improved their attention during listening tests. Computer-based tests with immediate feedback might help foreign language learners to increase attention in foreign language listening comprehension.

  5. Sample test cases using the environmental computer code NECTAR

    International Nuclear Information System (INIS)

    Ponting, A.C.

    1984-06-01

    This note demonstrates a few of the many different ways in which the environmental computer code NECTAR may be used. Four sample test cases are presented and described to show how NECTAR input data are structured. Edited output is also presented to illustrate the format of the results. Two test cases demonstrate how NECTAR may be used to study radio-isotopes not explicitly included in the code. (U.K.)

  6. Boltzmann Solver with Adaptive Mesh in Velocity Space

    International Nuclear Information System (INIS)

    Kolobov, Vladimir I.; Arslanbekov, Robert R.; Frolova, Anna A.

    2011-01-01

    We describe the implementation of a direct Boltzmann solver with an Adaptive Mesh in Velocity Space (AMVS) using a quad/octree data structure. The benefits of the AMVS technique are demonstrated for charged-particle transport in weakly ionized plasmas, where the collision integral is linear. We also describe the implementation of AMVS for the nonlinear Boltzmann collision integral. Test computations demonstrate both the advantages and the deficiencies of the current method for calculations of narrow-kernel distributions.
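
    The record only names the quad/octree structure, so the following is a minimal 2-D sketch of the idea under assumed details (the refinement indicator, tolerances and test distribution are illustrative): cells of velocity space are split recursively wherever the distribution function varies too much across the cell, so resolution concentrates around narrow kernels.

        import numpy as np

        class Cell:
            # One quadtree cell over a square patch of 2-D velocity space.
            def __init__(self, vx, vy, size):
                self.vx, self.vy, self.size = vx, vy, size
                self.children = []

            def refine(self, f, tol, min_size):
                # Split while f varies too much across the cell's corners.
                corners = [f(self.vx + dx, self.vy + dy)
                           for dx in (0.0, self.size) for dy in (0.0, self.size)]
                if max(corners) - min(corners) > tol and self.size > min_size:
                    h = self.size / 2.0
                    self.children = [Cell(self.vx + dx, self.vy + dy, h)
                                     for dx in (0.0, h) for dy in (0.0, h)]
                    for child in self.children:
                        child.refine(f, tol, min_size)

        def leaves(cell):
            if not cell.children:
                return [cell]
            return [leaf for c in cell.children for leaf in leaves(c)]

        # Narrow drifting Maxwellian: fine cells cluster around the beam velocity.
        f = lambda vx, vy: np.exp(-((vx - 1.0) ** 2 + (vy - 1.0) ** 2) / 0.05)
        root = Cell(-3.0, -3.0, 6.0)
        root.refine(f, tol=0.05, min_size=0.05)
        print(len(leaves(root)), "leaf cells")

    The octree case is the same construction with eight children per cell; the hard part the paper addresses is evaluating the (possibly nonlinear) collision integral on such a non-uniform velocity mesh.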

  7. Research on computer aided testing of pilot response to critical in-flight events

    Science.gov (United States)

    Giffin, W. C.; Rockwell, T. H.; Smith, P. J.

    1984-01-01

    Experiments on pilot decision making are described. The development of models of pilot decision making in critical in-flight events (CIFE) is emphasized. Work is reported on the development of: (1) a frame-system representation describing how pilots use their knowledge in a fault diagnosis task; (2) assessment of script norms, distance measures, and Markov models developed from computer-aided testing (CAT) data; and (3) performance ranking of subject data. It is demonstrated that interactive computer-aided testing, either by touch CRTs or personal computers, is a useful research and training device for measuring pilot information management in diagnosing system failures in simulated flight situations. Performance is dictated by knowledge of aircraft subsystems, initial pilot structuring of the failure symptoms and efficient testing of plausible causal hypotheses.

  8. The full spectrum of climate change adaptation: testing an analytical framework in Tyrolean mountain agriculture (Austria).

    Science.gov (United States)

    Grüneis, Heidelinde; Penker, Marianne; Höferl, Karl-Michael

    2016-01-01

    Our scientific view of climate change adaptation (CCA) is unsatisfying in many ways: it is often dominated by a modernistic perspective of planned pro-active adaptation, with a selective focus on measures directly responding to climate change impacts, and it is thus far from the real-life conditions of those who are actually affected by climate change. Farmers have to adapt to multiple changes simultaneously. Therefore, empirical climate change adaptation research also needs a more integrative perspective on real-life climate change adaptations. This has to include "hidden" adaptations, which are not explicitly and directly motivated by CCA but actually contribute to the sector's adaptability to climate change. The aim of the present study is to develop and test an analytic framework that contributes to a broader understanding of CCA and bridges the gap between scientific expertise and practical action. The framework distinguishes three types of CCA according to their climate-related motivations: explicit adaptations, multi-purpose adaptations, and hidden adaptations. Although agriculture is among the sectors that are most affected by climate change, results from the case study of Tyrolean mountain agriculture show that climate change is ranked behind other more pressing "real-life challenges" such as changing agricultural policies or market conditions. We identified numerous hidden adaptations which make a valuable contribution when dealing with climate change impacts. We conclude that these hidden adaptations not only have to be considered to get an integrative and more realistic view of CCA; they also provide a great opportunity for linking adaptation strategies to farmers' realities.

  9. Using a "time machine" to test for local adaptation of aquatic microbes to temporal and spatial environmental variation.

    Science.gov (United States)

    Fox, Jeremy W; Harder, Lawrence D

    2015-01-01

    Local adaptation occurs when different environments are dominated by different specialist genotypes, each of which is relatively fit in its local conditions and relatively unfit under other conditions. Analogously, ecological species sorting occurs when different environments are dominated by different competing species, each of which is relatively fit in its local conditions. The simplest theory predicts that spatial, but not temporal, environmental variation selects for local adaptation (or generates species sorting), but this prediction is difficult to test. Although organisms can be reciprocally transplanted among sites, doing so among times seems implausible. Here, we describe a reciprocal transplant experiment testing for local adaptation or species sorting of lake bacteria in response to both temporal and spatial variation in water chemistry. The experiment used a -80°C freezer as a "time machine." Bacterial isolates and water samples were frozen for later use, allowing transplantation of older isolates "forward in time" and newer isolates "backward in time." Surprisingly, local maladaptation predominated over local adaptation in both space and time. Such local maladaptation may indicate that adaptation, or the analogous species sorting process, fails to keep pace with temporal fluctuations in water chemistry. This hypothesis could be tested with more finely resolved temporal data. © 2014 The Author(s). Evolution © 2014 The Society for the Study of Evolution.

  10. A brain-computer interface controlled mail client.

    Science.gov (United States)

    Yu, Tianyou; Li, Yuanqing; Long, Jinyi; Wang, Cong

    2013-01-01

    In this paper, we propose a brain-computer interface (BCI) based mail client. This system is controlled by hybrid features extracted from scalp-recorded electroencephalographic (EEG) signals. We emulate the computer mouse by the motor imagery-based mu rhythm and the P300 potential. Furthermore, an adaptive P300 speller is included to provide a text input function. With this BCI mail client, users can receive, read and write mails, as well as attach files when writing. The system has been tested on 3 subjects. Experimental results show that mail communication with this system is feasible.

  11. Direct numerical simulation of bubbles with adaptive mesh refinement with distributed algorithms

    International Nuclear Information System (INIS)

    Talpaert, Arthur

    2017-01-01

    This PhD work presents the simulation of two-phase flows in the conditions of water-cooled nuclear reactors, at the scale of individual bubbles. To achieve that, we study several models for thermal-hydraulic flows and focus on a technique for the capture of the thin interface between the liquid and vapour phases. We review some possible techniques for Adaptive Mesh Refinement (AMR) and provide algorithmic and computational tools adapted to patch-based AMR, whose aim is to locally improve the precision in regions of interest. More precisely, we introduce a patch-covering algorithm designed with balanced parallel computing in mind. This approach lets us finely capture changes located at the interface, as we show for advection test cases as well as for models with hyperbolic-elliptic coupling. The computations we present also include the simulation of the incompressible Navier-Stokes system, which models the shape changes of the interface between two non-miscible fluids. (author) [fr]

  12. A Feedback Control Strategy for Enhancing Item Selection Efficiency in Computerized Adaptive Testing

    Science.gov (United States)

    Weissman, Alexander

    2006-01-01

    A computerized adaptive test (CAT) may be modeled as a closed-loop system, where item selection is influenced by trait level ([theta]) estimation and vice versa. When discrepancies exist between an examinee's estimated and true [theta] levels, nonoptimal item selection is a likely result. Nevertheless, examinee response behavior consistent with…

  13. Keyboards: from Typewriters to Tablet Computers

    Directory of Open Access Journals (Sweden)

    Gintautas Grigas

    2014-06-01

    The evolution of Lithuanian keyboards is reviewed. Keyboards are divided into three categories according to the flexibility of their adaptation for typing Lithuanian texts: (1) mechanical typewriter keyboards (hard to adapt), (2) electromechanical desktop or laptop computer keyboards, and (3) programmable touch-screen tablet computer keyboards (easily adaptable). It is discussed how they were adapted for the Lithuanian language, and solutions in other languages are compared. Both successful and unsuccessful solutions are discussed. The reasons for failures, as well as their negative impact on writing culture and the formation of bad habits in computer work, are analyzed. Recommendations on how to improve the current situation are presented.

  14. Cultural Adaptation of the Portuguese Version of the "Sniffin' Sticks" Smell Test: Reliability, Validity, and Normative Data.

    Science.gov (United States)

    Ribeiro, João Carlos; Simões, João; Silva, Filipe; Silva, Eduardo D; Hummel, Cornelia; Hummel, Thomas; Paiva, António

    2016-01-01

    The cross-cultural adaptation and validation of the Sniffin' Sticks test for the Portuguese population is described. Over 270 people participated in four experiments. In Experiment 1, 67 participants rated the familiarity of presented odors, and seven descriptors of the original test were adapted to the Portuguese context. In Experiment 2, the Portuguese version of the Sniffin' Sticks test was administered to 203 healthy participants. Older age, male gender and active smoking status were confirmed as confounding factors. The third experiment showed the validity of the Portuguese version of the Sniffin' Sticks test in discriminating healthy controls from patients with olfactory dysfunction. In Experiment 4, the test-retest reliability for both the composite score (r71 = 0.86) and the identification test (r71 = 0.62) was established. The Portuguese version of the Sniffin' Sticks test shows good validity and reliability and effectively distinguishes patients from healthy controls with high sensitivity and specificity. The identification test of the Portuguese version is a clinically suitable screening tool for routine Portuguese outpatient settings.

  15. Pepsi-SAXS : an adaptive method for rapid and accurate computation of small-angle X-ray scattering profiles

    OpenAIRE

    Grudinin , Sergei; Garkavenko , Maria; Kazennov , Andrei

    2017-01-01

    A new method called Pepsi-SAXS is presented that calculates small-angle X-ray scattering profiles from atomistic models. The method is based on the multipole expansion scheme and is significantly faster compared with other tested methods. In particular, using the Nyquist–Shannon–Kotelnikov sampling theorem, the multipole expansion order is adapted to the size of the model and the resolution of the experimental data. It is argued that by using the adaptive expansion ord...

  16. Unstructured mesh adaptivity for urban flooding modelling

    Science.gov (United States)

    Hu, R.; Fang, F.; Salinas, P.; Pain, C. C.

    2018-05-01

    Over the past few decades, urban floods have been gaining more attention due to their increase in frequency. To provide reliable flooding predictions in urban areas, various numerical models have been developed to perform high-resolution flood simulations. However, the use of high-resolution meshes across the whole computational domain causes a high computational burden. In this paper, a 2D control-volume and finite-element flood model using adaptive unstructured mesh technology has been developed. This adaptive unstructured mesh technique enables meshes to be adapted optimally in time and space in response to the evolving flow features, thus providing sufficient mesh resolution where and when it is required. It has the advantage of capturing the details of local flows and of the wetting and drying front while reducing the computational cost. Complex topographic features are represented accurately during the flooding process: for example, high-resolution meshes are placed around buildings and steep regions when the flood water reaches them. In this work, a flooding event that happened in 2002 in Glasgow, Scotland, United Kingdom, has been simulated to demonstrate the capability of the adaptive unstructured mesh flooding model. The simulations have been performed using both fixed and adaptive unstructured meshes, and the results have been compared with previously published 2D and 3D results. The presented method shows that the 2D adaptive mesh model provides accurate results while having a low computational cost.

  17. Computer-aided dispatch--traffic management center field operational test final detailed test plan : WSDOT deployment

    Science.gov (United States)

    2003-10-01

    The purpose of this document is to expand upon the evaluation components presented in "Computer-aided dispatch--traffic management center field operational test final evaluation plan : WSDOT deployment". This document defines the objective, approach,...

  18. The advanced computational testing and simulation toolkit (ACTS)

    International Nuclear Information System (INIS)

    Drummond, L.A.; Marques, O.

    2002-01-01

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  19. The advanced computational testing and simulation toolkit (ACTS)

    Energy Technology Data Exchange (ETDEWEB)

    Drummond, L.A.; Marques, O.

    2002-05-21

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  20. Computed Tomography Image Quality Evaluation of a New Iterative Reconstruction Algorithm in the Abdomen (Adaptive Statistical Iterative Reconstruction-V) a Comparison With Model-Based Iterative Reconstruction, Adaptive Statistical Iterative Reconstruction, and Filtered Back Projection Reconstructions.

    Science.gov (United States)

    Goodenberger, Martin H; Wagner-Bartak, Nicolaus A; Gupta, Shiva; Liu, Xinming; Yap, Ramon Q; Sun, Jia; Tamm, Eric P; Jensen, Corey T

    The purpose of this study was to compare abdominopelvic computed tomography images reconstructed with adaptive statistical iterative reconstruction-V (ASIR-V) with model-based iterative reconstruction (Veo 3.0), ASIR, and filtered back projection (FBP). Abdominopelvic computed tomography scans for 36 patients (26 males and 10 females) were reconstructed using FBP, ASIR (80%), Veo 3.0, and ASIR-V (30%, 60%, 90%). Mean ± SD patient age was 32 ± 10 years, with mean ± SD body mass index of 26.9 ± 4.4 kg/m². Images were reviewed by 2 independent readers in a blinded, randomized fashion. Hounsfield unit, noise, and contrast-to-noise ratio (CNR) values were calculated for each reconstruction algorithm for further comparison. Phantom evaluation of low-contrast detectability (LCD) and high-contrast resolution was performed. Adaptive statistical iterative reconstruction-V 30%, ASIR-V 60%, and ASIR 80% were generally superior qualitatively compared with ASIR-V 90%, Veo 3.0, and FBP; measured CNR values included 5.54 ± 2.39, 8.78 ± 3.15, and 3.49 ± 1.77, and ASIR 80% showed the worst spatial resolution. Adaptive statistical iterative reconstruction-V 30% and ASIR-V 60% provided the best combination of qualitative and quantitative performance. Adaptive statistical iterative reconstruction 80% was equivalent qualitatively, but demonstrated inferior spatial resolution and LCD.

  1. Usability testing of Avoiding Diabetes Thru Action Plan Targeting (ADAPT) decision support for integrating care-based counseling of pre-diabetes in an electronic health record.

    Science.gov (United States)

    Chrimes, Dillon; Kitos, Nicole R; Kushniruk, Andre; Mann, Devin M

    2014-09-01

    Usability testing can be used to evaluate human-computer interaction (HCI) and communication in shared decision making (SDM) for patient-provider behavioral change and behavioral contracting. Traditional evaluations of usability using scripted or mock patient scenarios with think-aloud protocol analysis provide a way to identify HCI issues. In this paper we describe the application of these methods in the evaluation of the Avoiding Diabetes Thru Action Plan Targeting (ADAPT) tool, and test the usability of the tool to support the ADAPT framework for integrated care counseling of pre-diabetes. The think-aloud protocol analysis typically does not provide an assessment of how patient-provider interactions are affected in "live" clinical workflow or whether a tool is successful. Therefore, "near-live" clinical simulations involving applied simulation methods were used to complement the think-aloud results. This complementary usability technique was used to test the end-user HCI and tool performance by more closely mimicking the clinical workflow and capturing interaction sequences, along with assessing the functionality of computer module prototypes within the clinician workflow. We expected this method to complement the think-aloud analysis and yield different usability findings. Together, this mixed-method evaluation provided comprehensive and realistic feedback for iterative refinement of the ADAPT system prior to implementation. The study employed two phases of testing of a new interactive ADAPT tool that embedded an evidence-based shared goal-setting component into the primary care workflow for pre-diabetes counseling within a commercial physician office electronic health record (EHR). Phase I applied usability testing that involved "think-aloud" protocol analysis of eight primary care providers interacting with several scripted clinical scenarios. Phase II used "near-live" clinical simulations of five providers interacting with standardized

  2. Error Decomposition and Adaptivity for Response Surface Approximations from PDEs with Parametric Uncertainty

    KAUST Repository

    Bryant, C. M.; Prudhomme, S.; Wildey, T.

    2015-01-01

    In this work, we investigate adaptive approaches to control errors in response surface approximations computed from numerical approximations of differential equations with uncertain or random data and coefficients. The adaptivity of the response surface approximation is based on a posteriori error estimation, and the approach relies on the ability to decompose the a posteriori error estimate into contributions from the physical discretization and the approximation in parameter space. Errors are evaluated in terms of linear quantities of interest using adjoint-based methodologies. We demonstrate that a significant reduction in the computational cost required to reach a given error tolerance can be achieved by refining the dominant error contributions rather than uniformly refining both the physical and stochastic discretization. Error decomposition is demonstrated for a two-dimensional flow problem, and adaptive procedures are tested on a convection-diffusion problem with discontinuous parameter dependence and a diffusion problem, where the diffusion coefficient is characterized by a 10-dimensional parameter space.

  3. A Design Methodology for Computer Security Testing

    OpenAIRE

    Ramilli, Marco

    2013-01-01

    The field of "computer security" is often considered something in between Art and Science. This is partly due to the lack of widely agreed and standardized methodologies to evaluate the degree of the security of a system. This dissertation intends to contribute to this area by investigating the most common security testing strategies applied nowadays and by proposing an enhanced methodology that may be effectively applied to different threat scenarios with the same degree of effectiveness. ...

  4. Normalised subband adaptive filtering with extended adaptiveness on degree of subband filters

    Science.gov (United States)

    Samuyelu, Bommu; Rajesh Kumar, Pullakura

    2017-12-01

    This paper proposes an adaptive normalised subband adaptive filtering (NSAF) scheme to improve NSAF performance. In the proposed NSAF, adaptiveness is extended beyond existing variants in two ways: first, the step size is made adaptive, and second, the selection of subbands is made adaptive. Hence, the proposed NSAF is termed here a variable step-size-based NSAF with selected subbands (VS-SNSAF). Experimental investigations are carried out to demonstrate the performance (in terms of convergence) of the VS-SNSAF against the conventional NSAF and its state-of-the-art adaptive variants. The results report the superior performance of VS-SNSAF over the traditional NSAF and its variants. Its stability and robustness against noise are also established, along with an account of its computational complexity.
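
    The step-size adaptation underlying such variants can be illustrated with a fullband normalized LMS filter whose step size is driven by the smoothed error power: adaptation is aggressive while misadjustment is large and gentle near convergence. This Python sketch is a simplified stand-in under stated assumptions, not the authors' VS-SNSAF, which additionally operates in subbands and adaptively selects which subbands to update.

```python
import numpy as np

def vs_nlms(x, d, order=16, mu_min=0.05, mu_max=1.0, eps=1e-8):
    """Fullband NLMS with a simple error-driven variable step size."""
    w = np.zeros(order)
    e = np.zeros(len(x))
    err_pow = sig_pow = eps
    for n in range(order - 1, len(x)):
        u = x[n - order + 1:n + 1][::-1]      # regressor x[n], ..., x[n-order+1]
        e[n] = d[n] - w @ u
        err_pow = 0.99 * err_pow + 0.01 * e[n] ** 2   # smoothed error power
        sig_pow = 0.99 * sig_pow + 0.01 * d[n] ** 2   # smoothed signal power
        # Large relative error -> large step (fast tracking); small -> gentle.
        mu = np.clip(mu_max * err_pow / sig_pow, mu_min, mu_max)
        w += mu * e[n] * u / (u @ u + eps)            # normalized update
    return w, e

# Identify an unknown 16-tap FIR system from noisy observations.
rng = np.random.default_rng(1)
h = rng.standard_normal(16)
x = rng.standard_normal(20000)
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))
w, _ = vs_nlms(x, d)
print("coefficient error:", np.linalg.norm(w - h))
```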

  5. The students’ ability in mathematical literacy for the quantity, and the change and relationship problems on the PISA adaptation test

    Science.gov (United States)

    Julie, Hongki; Sanjaya, Febi; Yudhi Anggoro, Ant.

    2017-09-01

    One of the purposes of this study was to describe the solution profiles of junior high school students on a PISA adaptation test. The procedure followed by the researchers to achieve this objective was (1) adapting the PISA test, (2) validating the adapted PISA test, (3) asking junior high school students to take the adapted PISA test, and (4) building the students' solution profiles. The PISA problems for mathematics can be classified into four areas, namely quantity, space and shape, change and relationship, and uncertainty. The results presented in this paper are those for the quantity and the change and relationship problems. The adapted PISA test consisted of fifteen questions: two for the quantity group, six for the space and shape group, three for the change and relationship group, and four for uncertainty. Subjects in this study were 18 students from 11 junior high schools in Yogyakarta, Central Java, and Banten. The study was qualitative in nature. For the first quantity problem, 38.89% of students achieved level 3. For the second quantity problem, 88.89% achieved level 2. For part a of the first change and relationship problem, 55.56% achieved level 5. For part b of the first change and relationship problem, 77.78% achieved level 2. For the second change and relationship problem, 38.89% achieved level 2.

  6. Design, realization and structural testing of a compliant adaptable wing

    International Nuclear Information System (INIS)

    Molinari, G; Arrieta, A F; Ermanni, P; Quack, M; Morari, M

    2015-01-01

    This paper presents the design, optimization, realization and testing of a novel wing morphing concept, based on distributed compliance structures, and actuated by piezoelectric elements. The adaptive wing features ribs with a selectively compliant inner structure, numerically optimized to achieve aerodynamically efficient shape changes while simultaneously withstanding aeroelastic loads. The static and dynamic aeroelastic behavior of the wing, and the effect of activating the actuators, is assessed by means of coupled 3D aerodynamic and structural simulations. To demonstrate the capabilities of the proposed morphing concept and optimization procedure, the wings of a model airplane are designed and manufactured according to the presented approach. The goal is to replace conventional ailerons, thus to achieve controllability in roll purely by morphing. The mechanical properties of the manufactured components are characterized experimentally, and used to create a refined and correlated finite element model. The overall stiffness, strength, and actuation capabilities are experimentally tested and successfully compared with the numerical prediction. To counteract the nonlinear hysteretic behavior of the piezoelectric actuators, a closed-loop controller is implemented, and its capability of accurately achieving the desired shape adaptation is evaluated experimentally. Using the correlated finite element model, the aeroelastic behavior of the manufactured wing is simulated, showing that the morphing concept can provide sufficient roll authority to allow controllability of the flight. The additional degrees of freedom offered by morphing can be also used to vary the plane lift coefficient, similarly to conventional flaps. The efficiency improvements offered by this technique are evaluated numerically, and compared to the performance of a rigid wing. (paper)

  7. Describing of elements IO field in a testing computer program

    Directory of Open Access Journals (Sweden)

    Igor V. Loshkov

    2017-01-01

    Full Text Available A standard for describing the process of displaying interactive windows on a computer monitor, through which questions are output and answers are input during computer testing, is presented in the article [11]. According to the proposed standard, this process is described with a format line containing element names, their parameters, and grouping and auxiliary symbols. Program objects are described using elements of the standard, the majority of which create input and output windows on the computer monitor. The aim of our research was to develop the minimum possible set of standard elements needed for mathematical and computer science testing. The choice of elements was conducted in parallel with the development and testing of the program that uses them; this approach made it possible to choose a sufficiently complete set of elements for testing in the fields of study mentioned above. Element names were selected so that, first, they indicate the element's function and, second, they coincide with the names of functionally similar elements in other programming languages. Parameters, their names, their assignments, and their accepted values are proposed for the elements, with the same naming principle applied to parameters as to elements. The parameters define the properties of objects: in particular, while the elements of the standard create windows, the parameters define object properties (location, size, appearance) and the sequence in which windows are created. All standard elements proposed in this article are collected in a table whose columns give their names and functions. Inside the table, the elements are grouped row by row into four sets: input elements, output elements, input

  8. Transmitted wavefront testing with large dynamic range based on computer-aided deflectometry

    Science.gov (United States)

    Wang, Daodang; Xu, Ping; Gong, Zhidong; Xie, Zhongmin; Liang, Rongguang; Xu, Xinke; Kong, Ming; Zhao, Jun

    2018-06-01

    Transmitted wavefront testing is in demand for the performance evaluation of transmission optics and transparent glass, and the achievable dynamic range is a key issue. A computer-aided deflectometric testing method with fringe projection is proposed for the accurate testing of transmitted wavefronts with a large dynamic range. Ray tracing of the modeled testing system is carried out to achieve virtual ‘null’ testing of transmitted wavefront aberrations. The ray aberration is obtained from the ray tracing result and the measured slope, with which the test wavefront aberration can be reconstructed. To eliminate testing-system modeling errors, a system geometry calibration based on computer-aided reverse optimization is applied to realize accurate testing. Both numerical simulation and experiments have been carried out to demonstrate the feasibility and high accuracy of the proposed testing method. The proposed method achieves a large dynamic range compared with the interferometric method, providing a simple, low-cost, and accurate way to test transmitted wavefronts from various kinds of optics and a large number of industrial transmission elements.

  9. Goal-recognition-based adaptive brain-computer interface for navigating immersive robotic systems

    Science.gov (United States)

    Abu-Alqumsan, Mohammad; Ebert, Felix; Peer, Angelika

    2017-06-01

    Objective. This work proposes principled strategies for self-adaptation in EEG-based brain-computer interfaces (BCIs) as a way out of the bandwidth bottleneck resulting from the considerable mismatch between the low-bandwidth interface and the bandwidth-hungry application, and as a way to enable fluent and intuitive interaction in embodiment systems. The main focus is laid upon inferring the hidden target goals of users while navigating in a remote environment as a basis for possible adaptations. Approach. To reason about possible user goals, a general user-agnostic Bayesian update rule is devised to be recursively applied upon the arrival of evidence, i.e. user input and user gaze. Experiments were conducted with healthy subjects within robotic embodiment settings to evaluate the proposed method. These experiments varied along three factors: the type of the robot/environment (simulated and physical), the type of the interface (keyboard or BCI), and the way goal recognition (GR) is used to guide a simple shared control (SC) driving scheme. Main results. Our results show that the proposed GR algorithm is able to track and infer the hidden user goals with relatively high precision and recall. Further, the realized SC driving scheme benefits from the output of the GR system and is able to reduce the user effort needed to accomplish the assigned tasks. Despite the fact that the BCI requires higher effort compared to the keyboard conditions, most subjects were able to complete the assigned tasks, and the proposed GR system is additionally shown able to handle the uncertainty in user input during SSVEP-based interaction. The SC application of the belief vector indicates that the benefits of the GR module are more pronounced for BCIs, compared to the keyboard interface. Significance. Being based on intuitive heuristics that model the behavior of the general population during the execution of navigation tasks, the proposed GR method can be used without prior tuning for the
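
    The recursive, user-agnostic Bayesian update at the core of such a goal-recognition module reduces to multiplying the current belief by the likelihood of each new piece of evidence and renormalizing. The goal set, likelihood values, and names below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def update_goal_belief(belief, likelihoods):
    """One recursive Bayesian update over a discrete set of goals.

    belief[g] is the prior probability of goal g; likelihoods[g] is
    p(observation | goal g) for the newly arrived evidence (a user
    command or a gaze sample). Returns the normalized posterior.
    """
    posterior = belief * likelihoods
    return posterior / posterior.sum()

# Three candidate navigation goals; gaze evidence repeatedly favours goal 1.
belief = np.full(3, 1 / 3)
for _ in range(5):
    belief = update_goal_belief(belief, np.array([0.2, 0.6, 0.2]))
print(belief)  # probability mass concentrates on goal 1
```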

  10. Computer-aided dispatch--traffic management center field operational test final test plans : state of Utah

    Science.gov (United States)

    2004-01-01

    The purpose of this document is to expand upon the evaluation components presented in "Computer-aided dispatch--traffic management center field operational test final evaluation plan : state of Utah". This document defines the objective, approach, an...

  11. Adaptive homodyne phase discrimination and qubit measurement

    International Nuclear Information System (INIS)

    Sarovar, Mohan; Whaley, K. Birgitta

    2007-01-01

    Fast and accurate measurement is a highly desirable, if not vital, feature of quantum computing architectures. In this work we investigate the usefulness of adaptive measurements in improving the speed and accuracy of qubit measurement. We examine a particular class of quantum computing architectures, ones based on qubits coupled to well-controlled harmonic oscillator modes (reminiscent of cavity QED), where adaptive schemes for measurement are particularly appropriate. In such architectures, qubit measurement is equivalent to phase discrimination for a mode of the electromagnetic field, and we examine adaptive techniques for doing this. In the final section we present a concrete example of applying adaptive measurement to the particularly well-developed circuit-QED architecture

  12. Adaptive finite element methods for differential equations

    CERN Document Server

    Bangerth, Wolfgang

    2003-01-01

    These Lecture Notes discuss concepts of `self-adaptivity' in the numerical solution of differential equations, with emphasis on Galerkin finite element methods. The key issues are a posteriori error estimation and its automatic mesh adaptation. Besides the traditional approach of energy-norm error control, a new duality-based technique, the Dual Weighted Residual method for goal-oriented error estimation, is discussed in detail. This method aims at economical computation of arbitrary quantities of physical interest by properly adapting the computational mesh. This is typically required in the design cycles of technical applications. For example, the drag coefficient of a body immersed in a viscous flow is computed, then it is minimized by varying certain control parameters, and finally the stability of the resulting flow is investigated by solving an eigenvalue problem. `Goal-oriented' adaptivity is designed to achieve these tasks with minimal cost. At the end of each chapter some exercises are posed in order ...

  13. Convergence acceleration of Navier-Stokes equation using adaptive wavelet method

    International Nuclear Information System (INIS)

    Kang, Hyung Min; Ghafoor, Imran; Lee, Do Hyung

    2010-01-01

    An efficient adaptive wavelet method is proposed for the enhancement of computational efficiency of the Navier-Stokes equations. The method is based on sparse point representation (SPR), which uses the wavelet decomposition and thresholding to obtain a sparsely distributed dataset. The threshold mechanism is modified in order to maintain the spatial accuracy of a conventional Navier-Stokes solver by adapting the threshold value to the order of spatial truncation error. The computational grid can be dynamically adapted to a transient solution to reflect local changes in the solution. The flux evaluation is then carried out only at the points of the adapted dataset, which reduces the computational effort and memory requirements. A stabilization technique is also implemented to avoid the additional numerical errors introduced by the threshold procedure. The numerical results of the adaptive wavelet method are compared with a conventional solver to validate the enhancement in computational efficiency of Navier-Stokes equations without the degeneration of the numerical accuracy of a conventional solver
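
    A minimal sketch of the decompose-threshold-reconstruct cycle behind a sparse point representation, using a one-level Haar transform in plain NumPy: detail coefficients below a threshold are dropped, so fine-grid points are retained only near sharp features. The fixed threshold here stands in for the paper's threshold adapted to the spatial truncation error, and all names are assumptions.

```python
import numpy as np

def haar_threshold(signal, thresh):
    """One-level Haar decomposition with hard thresholding.

    Small detail coefficients are zeroed, leaving a sparse set of
    significant points; an SPR solver keeps fine-grid points only
    where details survive, compressing the working dataset.
    """
    s = signal.reshape(-1, 2)
    approx = (s[:, 0] + s[:, 1]) / np.sqrt(2)
    detail = (s[:, 0] - s[:, 1]) / np.sqrt(2)
    detail[np.abs(detail) < thresh] = 0.0    # hard threshold
    # Reconstruct; pairs with zeroed details need no fine-grid points.
    rec = np.empty_like(signal, dtype=float)
    rec[0::2] = (approx + detail) / np.sqrt(2)
    rec[1::2] = (approx - detail) / np.sqrt(2)
    return rec, np.count_nonzero(detail)

x = np.linspace(0, 1, 256)
u = np.tanh((x - 0.5) / 0.02)                # sharp front, smooth elsewhere
rec, kept = haar_threshold(u, thresh=1e-3)
print(f"{kept} of {len(u) // 2} detail coefficients kept")
print("max reconstruction error:", np.abs(rec - u).max())
```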

  14. An Adaptive Classification Strategy for Reliable Locomotion Mode Recognition

    Directory of Open Access Journals (Sweden)

    Ming Liu

    2017-09-01

    Full Text Available Algorithms for locomotion mode recognition (LMR) based on surface electromyography and mechanical sensors have recently been developed and could be used for the neural control of powered prosthetic legs. However, the variations in input signals, caused by physical changes at the sensor interface and human physiological changes, may threaten the reliability of these algorithms. This study aimed to investigate the effectiveness of applying adaptive pattern classifiers for LMR. Three adaptive classifiers, i.e., entropy-based adaptation (EBA), LearnIng From Testing data (LIFT), and Transductive Support Vector Machine (TSVM), were compared and offline evaluated using data collected from two able-bodied subjects and one transfemoral amputee. The offline analysis indicated that the adaptive classifier could effectively maintain or restore the performance of the LMR algorithm when gradual signal variations occurred. EBA and LIFT were recommended because of their better performance and higher computational efficiency. Finally, the EBA was implemented for real-time human-in-the-loop prosthesis control. The online evaluation showed that the applied EBA effectively adapted to changes in input signals across sessions and yielded more reliable prosthesis control over time, compared with the LMR without adaptation. The developed novel adaptive strategy may further enhance the reliability of neurally-controlled prosthetic legs.

  15. A Method for the Comparison of Item Selection Rules in Computerized Adaptive Testing

    Science.gov (United States)

    Barrada, Juan Ramon; Olea, Julio; Ponsoda, Vicente; Abad, Francisco Jose

    2010-01-01

    In a typical study comparing the relative efficiency of two item selection rules in computerized adaptive testing, the common result is that they simultaneously differ in accuracy and security, making it difficult to reach a conclusion on which is the more appropriate rule. This study proposes a strategy to conduct a global comparison of two or…

  16. Branched Adaptive Testing with a Rasch-Model-Calibrated Test: Analysing Item Presentation's Sequence Effects Using the Rasch-Model-Based LLTM

    Science.gov (United States)

    Kubinger, Klaus D.; Reif, Manuel; Yanagida, Takuya

    2011-01-01

    Item position effects provoke serious problems within adaptive testing. This is because different testees are necessarily presented with the same item at different presentation positions, as a consequence of which comparing their ability parameter estimations in the case of such effects would not at all be fair. In this article, a specific…

  17. Comparing Postsecondary Marketing Student Performance on Computer-Based and Handwritten Essay Tests

    Science.gov (United States)

    Truell, Allen D.; Alexander, Melody W.; Davis, Rodney E.

    2004-01-01

    The purpose of this study was to determine if there were differences in postsecondary marketing student performance on essay tests based on test format (i.e., computer-based or handwritten). Specifically, the variables of performance, test completion time, and gender were explored for differences based on essay test format. Results of the study…

  18. The analog computation and contrast test of leaked electromagnetic noise in the klystron corridor

    International Nuclear Information System (INIS)

    Tao Xiaoping; Wang Guicheng

    2001-01-01

    In order to obtain a better understanding of the characteristics and location of the noise source, the leaked electromagnetic noise in the klystron corridor of NSRL has been calculated by analog simulation. The computational method and formula for the high-frequency leaked noise of the modulator are given. On-the-spot contrast tests have been made on the basis of the analog computation. The contrast test results show the reasonableness of the analog computation and thereby offer a theoretical basis for reducing noise leakage in the corridor.

  19. Computationally efficient implementation of sarse-tap FIR adaptive filters with tap-position control on intel IA-32 processors

    OpenAIRE

    Hirano, Akihiro; Nakayama, Kenji

    2008-01-01

    This paper presents a computationally efficient implementation of sparse-tap FIR adaptive filters with tap-position control on Intel IA-32 processors with single-instruction multiple-data (SIMD) capability. In order to overcome random-order memory access, which prevents vectorization, block-based processing and a re-ordering buffer are introduced. Dynamic register allocation and the use of memory-to-register operations help maximize the loop-unrolling level. Up to 66 percent speedup ...

  20. Adaptive Wavelet Coding Applied in a Wireless Control System.

    Science.gov (United States)

    Gama, Felipe O S; Silveira, Luiz F Q; Salazar, Andrés O

    2017-12-13

    Wireless control systems can sense, control and act on the information exchanged between the wireless sensor nodes in a control loop. However, the exchanged information becomes susceptible to the degenerative effects produced by the multipath propagation. In order to minimize the destructive effects characteristic of wireless channels, several techniques have been investigated recently. Among them, wavelet coding is a good alternative for wireless communications for its robustness to the effects of multipath and its low computational complexity. This work proposes an adaptive wavelet coding whose parameters of code rate and signal constellation can vary according to the fading level and evaluates the use of this transmission system in a control loop implemented by wireless sensor nodes. The performance of the adaptive system was evaluated in terms of bit error rate (BER) versus E b / N 0 and spectral efficiency, considering a time-varying channel with flat Rayleigh fading, and in terms of processing overhead on a control system with wireless communication. The results obtained through computational simulations and experimental tests show performance gains obtained by insertion of the adaptive wavelet coding in a control loop with nodes interconnected by wireless link. These results enable the use of this technique in a wireless link control loop.
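
    The fading-driven adaptation of code rate and signal constellation can be pictured as a mode-selection table keyed to the estimated channel quality. The SNR thresholds and mode set in this sketch are invented for illustration; the scheme above adapts wavelet-coding parameters rather than these exact modes.

```python
def select_mode(snr_db):
    """Pick a (code rate, constellation) pair from the estimated SNR.

    Illustrative thresholds only: deep fades get the most robust mode
    (low rate, small constellation), while good channels trade
    robustness for spectral efficiency.
    """
    if snr_db < 5:
        return 1 / 2, "BPSK"
    if snr_db < 12:
        return 2 / 3, "QPSK"
    return 3 / 4, "16-QAM"

for snr in (2, 8, 20):
    rate, constellation = select_mode(snr)
    print(f"SNR {snr:2d} dB -> rate {rate:.2f}, {constellation}")
```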

  1. Adaptive Wavelet Coding Applied in a Wireless Control System

    Directory of Open Access Journals (Sweden)

    Felipe O. S. Gama

    2017-12-01

    Full Text Available Wireless control systems can sense, control and act on the information exchanged between the wireless sensor nodes in a control loop. However, the exchanged information becomes susceptible to the degenerative effects produced by the multipath propagation. In order to minimize the destructive effects characteristic of wireless channels, several techniques have been investigated recently. Among them, wavelet coding is a good alternative for wireless communications for its robustness to the effects of multipath and its low computational complexity. This work proposes an adaptive wavelet coding whose parameters of code rate and signal constellation can vary according to the fading level and evaluates the use of this transmission system in a control loop implemented by wireless sensor nodes. The performance of the adaptive system was evaluated in terms of bit error rate (BER) versus E b / N 0 and spectral efficiency, considering a time-varying channel with flat Rayleigh fading, and in terms of processing overhead on a control system with wireless communication. The results obtained through computational simulations and experimental tests show performance gains obtained by insertion of the adaptive wavelet coding in a control loop with nodes interconnected by wireless link. These results enable the use of this technique in a wireless link control loop.

  2. Adaptive mesh refinement for storm surge

    KAUST Repository

    Mandli, Kyle T.; Dawson, Clint N.

    2014-01-01

    An approach to utilizing adaptive mesh refinement algorithms for storm surge modeling is proposed. Currently numerical models exist that can resolve the details of coastal regions but are often too costly to be run in an ensemble forecasting framework without significant computing resources. The application of adaptive mesh refinement algorithms substantially lowers the computational cost of a storm surge model run while retaining much of the desired coastal resolution. The approach presented is implemented in the GeoClaw framework and compared to ADCIRC for Hurricane Ike along with observed tide gauge data and the computational cost of each model run. © 2014 Elsevier Ltd.

  3. Adaptive mesh refinement for storm surge

    KAUST Repository

    Mandli, Kyle T.

    2014-03-01

    An approach to utilizing adaptive mesh refinement algorithms for storm surge modeling is proposed. Currently numerical models exist that can resolve the details of coastal regions but are often too costly to be run in an ensemble forecasting framework without significant computing resources. The application of adaptive mesh refinement algorithms substantially lowers the computational cost of a storm surge model run while retaining much of the desired coastal resolution. The approach presented is implemented in the GeoClaw framework and compared to ADCIRC for Hurricane Ike along with observed tide gauge data and the computational cost of each model run. © 2014 Elsevier Ltd.

  4. Anticipated simulation of Angra-1 start-up physical tests

    International Nuclear Information System (INIS)

    Fernandes, V.B.; Perrotta, J.A.; Silva Ipojuca, T. da; Ponzoni Filho, P.

    1981-01-01

    Some results foreseen by the Department of Nuclear Fuel (DCN.O) in Furnas for the measurements to be performed during the start-up integrated tests of Angra-1 are presented. All the forecasting is based on DCN.O's own correlation methodology, developed from basic physical principles, using computer codes developed by the authors or public computer codes adapted to this methodology. (E.G.) [pt]

  5. Experimental and Computational Study of Ductile Fracture in Small Punch Tests

    Directory of Open Access Journals (Sweden)

    Betül Gülçimen Çakan

    2017-10-01

    Full Text Available A unified experimental-computational study on ductile fracture initiation and propagation during small punch testing is presented. Tests are carried out at room temperature with unnotched disks of different thicknesses where large-scale yielding prevails. In thinner specimens, the fracture occurs with severe necking under membrane tension, whereas for thicker ones a through thickness shearing mode prevails changing the crack orientation relative to the loading direction. Computational studies involve finite element simulations using a shear modified Gurson-Tvergaard-Needleman porous plasticity model with an integral-type nonlocal formulation. The predicted punch load-displacement curves and deformed profiles are in good agreement with the experimental results.

  6. Experimental and Computational Study of Ductile Fracture in Small Punch Tests.

    Science.gov (United States)

    Gülçimen Çakan, Betül; Soyarslan, Celal; Bargmann, Swantje; Hähner, Peter

    2017-10-17

    A unified experimental-computational study on ductile fracture initiation and propagation during small punch testing is presented. Tests are carried out at room temperature with unnotched disks of different thicknesses where large-scale yielding prevails. In thinner specimens, the fracture occurs with severe necking under membrane tension, whereas for thicker ones a through thickness shearing mode prevails changing the crack orientation relative to the loading direction. Computational studies involve finite element simulations using a shear modified Gurson-Tvergaard-Needleman porous plasticity model with an integral-type nonlocal formulation. The predicted punch load-displacement curves and deformed profiles are in good agreement with the experimental results.

  7. Modeling of heterogeneous elastic materials by the multiscale hp-adaptive finite element method

    Science.gov (United States)

    Klimczak, Marek; Cecot, Witold

    2018-01-01

    We present an enhancement of the multiscale finite element method (MsFEM) by combining it with the hp-adaptive FEM. Such a discretization-based homogenization technique is a versatile tool for modeling heterogeneous materials with fast oscillating elasticity coefficients. No assumption on periodicity of the domain is required. In order to avoid direct, so-called overkill mesh computations, a coarse mesh with effective stiffness matrices is used and special shape functions are constructed to account for the local heterogeneities at the micro resolution. The automatic adaptivity (hp-type at the macro resolution and h-type at the micro resolution) increases efficiency of computation. In this paper details of the modified MsFEM are presented and a numerical test performed on a Fichera corner domain is presented in order to validate the proposed approach.

  8. Using a micro computer based test bank

    International Nuclear Information System (INIS)

    Hamel, R.T.

    1987-01-01

    Utilizing a micro computer based test bank offers a training department many advantages and can have a positive impact upon training procedures and examination standards. Prior to data entry, Training Department management must pre-review the examination questions and answers to ensure compliance with examination standards and to verify the validity of all questions. Management must adhere to the TSD format since all questions require an enabling objective numbering scheme. Each question is entered under the enabling objective upon which it is based. Then the question is selected via the enabling objective. This eliminates any instructor bias because a random number generator chooses the test question. However, the instructor may load specific questions to create an emphasis theme for any test. The examination, answer and cover sheets are produced and printed within minutes. The test bank eliminates the large amount of time that is normally required for an instructor to formulate an examination. The need for clerical support is reduced by the elimination of typing examinations and also by the software's ability to maintain and generate student/course lists, attendance sheets, and grades. Software security measures limit access to the test bank, and the impromptu method used to generate and print an examination enhance its security
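
    The selection scheme described above amounts to questions stored under their enabling objective and drawn by a random number generator; a minimal sketch with a hypothetical bank and function names follows.

```python
import random

# Hypothetical in-memory test bank keyed by enabling objective number,
# mirroring the scheme described above: questions are entered under and
# selected via their enabling objective, and a random draw removes
# instructor bias.
bank = {
    "EO-1.1": ["Q1", "Q2", "Q3"],
    "EO-1.2": ["Q4", "Q5"],
    "EO-2.1": ["Q6", "Q7", "Q8", "Q9"],
}

def build_exam(objectives, rng=random.Random()):
    """Draw one question per requested enabling objective."""
    return {eo: rng.choice(bank[eo]) for eo in objectives}

print(build_exam(["EO-1.1", "EO-2.1"], random.Random(42)))
```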

  9. Reducing adapter synthesis to controller synthesis

    NARCIS (Netherlands)

    Gierds, C.; Mooij, A.J.; Wolf, K.

    2012-01-01

    Service-oriented computing aims to create complex systems by composing less-complex systems, called services. Since services can be developed independently, the integration of services requires an adaptation mechanism for bridging any incompatibilities. Behavioral adapters aim to adjust the

  10. 3D fast adaptive correlation imaging for large-scale gravity data based on GPU computation

    Science.gov (United States)

    Chen, Z.; Meng, X.; Guo, L.; Liu, G.

    2011-12-01

    In recent years, large-scale gravity data sets have been collected and employed to enhance the gravity problem-solving abilities of tectonics studies in China. Aiming at the large-scale data and the requirement of rapid interpretation, previous authors have carried out a lot of work, including fast gradient-module inversion and Euler deconvolution depth inversion, 3-D physical property inversion using stochastic subspaces and equivalent storage, and fast inversion using wavelet transforms and a logarithmic barrier method. So it can be said that 3-D gravity inversion has been greatly improved in the last decade. Many authors have added different kinds of a priori information and constraints to deal with nonuniqueness, using models composed of a large number of contiguous cells of unknown property, and obtained good results. However, due to long computation times, instability, and other shortcomings, 3-D physical property inversion has not yet been widely applied to large-scale data. In order to achieve 3-D interpretation with high efficiency and precision for geological and ore bodies and obtain their subsurface distribution, there is an urgent need for a fast and efficient inversion method for large-scale gravity data. As an entirely new geophysical inversion method, 3D correlation imaging has developed rapidly thanks to the advantages of requiring no a priori information and demanding little computer memory. This method was proposed to image the distribution of equivalent excess masses of anomalous geological bodies with high resolution both longitudinally and transversely. In order to transform the equivalent excess masses into real density contrasts, we adopt adaptive correlation imaging for gravity data. After each 3D correlation imaging step, we convert the equivalent masses into density contrasts according to the linear relationship, and then carry out forward gravity calculation for each rectangular cell. Next, we compare the forward gravity data with the real data, and

  11. Introducing Computer-Based Testing in High-Stakes Exams in Higher Education: Results of a Field Experiment.

    Science.gov (United States)

    Boevé, Anja J; Meijer, Rob R; Albers, Casper J; Beetsma, Yta; Bosker, Roel J

    2015-01-01

    The introduction of computer-based testing in high-stakes examining in higher education is developing rather slowly due to institutional barriers (the need for extra facilities, ensuring test security) and teacher and student acceptance. From the existing literature it is unclear whether computer-based exams will produce results similar to paper-based exams and whether student acceptance can change as a result of administering computer-based exams. In this study, we compared results from a computer-based and a paper-based exam in a sample of psychology students and found no differences in total scores across the two modes. Furthermore, we investigated student acceptance and change in acceptance of computer-based examining. After taking the computer-based exam, fifty percent of the students preferred paper-and-pencil exams over computer-based exams and about a quarter preferred a computer-based exam. We conclude that computer-based exam total scores are similar to paper-based exam scores, but that for the acceptance of high-stakes computer-based exams it is important that students practice and get familiar with this new mode of test administration.

  12. Water System Adaptation To Hydrological Changes: Module 12, Models and Tools for Stormwater and Wastewater System Adaptation

    Science.gov (United States)

    This course will introduce students to the fundamental principles of water system adaptation to hydrological changes, with emphasis on data analysis and interpretation, technical planning, and computational modeling. Starting with real-world scenarios and adaptation needs, the co...

  13. Adaptive hybrid mesh refinement for multiphysics applications

    International Nuclear Information System (INIS)

    Khamayseh, Ahmed; Almeida, Valmor de

    2007-01-01

    The accuracy and convergence of computational solutions of mesh-based methods is strongly dependent on the quality of the mesh used. We have developed methods for optimizing meshes that are composed of elements of arbitrary polygonal and polyhedral type. We present in this research the development of r-h hybrid adaptive meshing technology tailored to application areas relevant to multi-physics modeling and simulation. Solution-based adaptation methods are used to reposition mesh nodes (r-adaptation) or to refine the mesh cells (h-adaptation) to minimize solution error. The numerical methods perform either r-adaptive mesh optimization or h-adaptive mesh refinement on the initial isotropic or anisotropic meshes to equidistribute a weighted geometric and/or solution error function. We have successfully introduced r-h adaptivity to a least-squares method with spherical harmonics basis functions for the solution of the spherical shallow atmosphere model used in climate modeling. In addition, application of this technology also covers a wide range of disciplines in computational sciences, most notably, time-dependent multi-physics, multi-scale modeling and simulation

  14. Psychometric Evaluation of the Italian Adaptation of the Test of Inferential and Creative Thinking

    Science.gov (United States)

    Faraci, Palmira; Hell, Benedikt; Schuler, Heinz

    2016-01-01

    This article describes the psychometric properties of the Italian adaptation of the "Analyse des Schlussfolgernden und Kreativen Denkens" (ASK; Test of Inferential and Creative Thinking) for measuring inferential and creative thinking. The study aimed to (a) supply evidence for the factorial structure of the instrument, (b) describe its…

  15. Partial update least-square adaptive filtering

    CERN Document Server

    Xie, Bei

    2014-01-01

    Adaptive filters play an important role in the fields related to digital signal processing and communication, such as system identification, noise cancellation, channel equalization, and beamforming. In practical applications, the computational complexity of an adaptive filter is an important consideration. The Least Mean Square (LMS) algorithm is widely used because of its low computational complexity (O(N)) and simplicity in implementation. The least squares algorithms, such as Recursive Least Squares (RLS), Conjugate Gradient (CG), and Euclidean Direction Search (EDS), can converge faster a

  16. Applications of NLP Techniques to Computer-Assisted Authoring of Test Items for Elementary Chinese

    Science.gov (United States)

    Liu, Chao-Lin; Lin, Jen-Hsiang; Wang, Yu-Chun

    2010-01-01

    The authors report an implemented environment for computer-assisted authoring of test items and provide a brief discussion about the applications of NLP techniques for computer assisted language learning. Test items can serve as a tool for language learners to examine their competence in the target language. The authors apply techniques for…

  17. Adaptive filtering and change detection

    CERN Document Server

    Gustafsson, Fredrik

    2003-01-01

    Adaptive filtering is a classical branch of digital signal processing (DSP). Industrial interest in adaptive filtering grows continuously with the increase in computer performance that allows ever more complex algorithms to be run in real-time. Change detection is a type of adaptive filtering for non-stationary signals and is also the basic tool in fault detection and diagnosis. Often considered separate subjects, Adaptive Filtering and Change Detection bridges a gap in the literature with a unified treatment of these areas, emphasizing that change detection is a natural extensi

  18. Computer-aided testing and operational aids for PARR-1 nuclear reactor

    International Nuclear Information System (INIS)

    Ansari, S.A.

    1990-01-01

    The utilization of the plant computer of the Pakistan Research Reactor (PARR-1) for automatic periodic testing of the reactor's nuclear instrumentation is described. Computer algorithms have been developed for on-line acquisition and real-time processing of nuclear channel signals. The mean value, standard deviation, and probability distribution of each nuclear channel signal are obtained in real time, and the computer generates a warning message if the signal error exceeds the maximum permissible error; in this way a faulty channel is automatically identified. Other real-time algorithms are also described that assist the operator in safe reactor operation by automatically computing the approach to criticality during reactor start-up and the control rod worth
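
    The periodic-test idea might be sketched as follows: compute the mean and standard deviation of the acquired channel samples in real time and warn when the deviation from the expected value exceeds the maximum permissible error. Thresholds, values, and names are illustrative assumptions, not the PARR-1 algorithms.

```python
import numpy as np

def check_channel(samples, expected, max_error):
    """Mean, standard deviation, and an out-of-tolerance warning.

    A minimal stand-in for the periodic test described above: signal
    statistics are computed on acquired samples and a warning flags a
    channel whose deviation exceeds the permissible error.
    """
    mean, std = samples.mean(), samples.std()
    if abs(mean - expected) > max_error:
        print(f"WARNING: channel drift {mean - expected:+.3f} exceeds tolerance")
    return mean, std

rng = np.random.default_rng(7)
healthy = rng.normal(10.0, 0.1, 1000)    # nominal channel reading
faulty = rng.normal(10.6, 0.1, 1000)     # drifted (faulty) channel
check_channel(healthy, expected=10.0, max_error=0.5)
check_channel(faulty, expected=10.0, max_error=0.5)
```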

  19. Applying computerized adaptive testing to the Negative Acts Questionnaire-Revised: Rasch analysis of workplace bullying.

    Science.gov (United States)

    Ma, Shu-Ching; Chien, Tsair-Wei; Wang, Hsiu-Hung; Li, Yu-Chi; Yui, Mei-Shu

    2014-02-17

    Workplace bullying is a prevalent problem in contemporary workplaces that has adverse effects on both the victims of bullying and organizations. With the rapid development of computer technology in recent years, there is an urgent need to establish whether item response theory-based computerized adaptive testing (CAT) can be applied to measure exposure to workplace bullying. The purpose of this study was to evaluate the relative efficiency and measurement precision of a CAT-based test for hospital nurses compared to traditional nonadaptive testing (NAT). Under the preliminary condition of a single domain derived from the scale, a CAT module bullying scale model with polytomously scored items is provided as an example for evaluation purposes. A total of 300 nurses were recruited and responded to the 22-item Negative Acts Questionnaire-Revised (NAQ-R). All NAT (or CAT-selected) items were calibrated with the Rasch rating scale model, and all respondents were randomly selected for a comparison of the advantages of CAT and NAT in efficiency and precision by paired t tests and the area under the receiver operating characteristic curve (AUROC). The NAQ-R is a unidimensional construct that can be applied to measure exposure to workplace bullying through CAT-based administration. Nursing measures derived from both tests (CAT and NAT) were highly correlated (r=.97) and their measurement precisions were not statistically different (P=.49), as expected. CAT required fewer items than NAT (an efficiency gain of 32%), suggesting a reduced burden for respondents. There were significant differences in work tenure between the 2 groups (bullied and nonbullied) at a cutoff point of 6 years at 1 worksite. Respondents with logits greater than -4.2 (or >30 in summation) were defined as highly likely to have been bullied in the workplace (AUROC 0.75, 95% CI 0.68-0.79). With CAT-based administration of the NAQ-R for nurses, respondent burden was substantially reduced without compromising measurement precision.

  20. Adaptive learning with covariate shift-detection for motor imagery-based brain–computer interface

    OpenAIRE

    Raza, H; Cecotti, H; Li, Y; Prasad, G

    2015-01-01

    A common assumption in traditional supervised learning is the similar probability distribution of data between the training phase and the testing/operating phase. When transitioning from the training to testing phase, a shift in the probability distribution of input data is known as a covariate shift. Covariate shifts commonly arise in a wide range of real-world systems such as electroencephalogram-based brain–computer interfaces (BCIs). In such systems, there is a necessity for continuous mo...
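
    A covariate shift of the kind described here can be flagged by comparing the operating-phase input distribution with statistics frozen at training time; the sketch below z-tests windowed means against the training mean. This is a generic illustration under stated assumptions, not the authors' method.

```python
import numpy as np

def detect_shift(stream, window=200, z_crit=3.0):
    """Flag a covariate shift when the running input mean drifts.

    A minimal sketch: the mean of each incoming window is compared,
    via a z-statistic, with statistics frozen at training time. Real
    BCI pipelines monitor richer feature distributions, but the
    train/test mismatch being tested is the same.
    """
    train = stream[:window]
    mu, se = train.mean(), train.std(ddof=1) / np.sqrt(window)
    for start in range(window, len(stream) - window, window):
        z = (stream[start:start + window].mean() - mu) / se
        if abs(z) > z_crit:
            return start   # first window where the shift is detected
    return None

rng = np.random.default_rng(5)
signal = np.concatenate([rng.normal(0, 1, 600), rng.normal(0.8, 1, 600)])
print("shift detected at sample", detect_shift(signal))
```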

  1. A Bayesian sequential design with adaptive randomization for 2-sided hypothesis test.

    Science.gov (United States)

    Yu, Qingzhao; Zhu, Lin; Zhu, Han

    2017-11-01

    Bayesian sequential and adaptive randomization designs are gaining popularity in clinical trials thanks to their potential to reduce the number of required participants and save resources. We propose a Bayesian sequential design with adaptive randomization rates so as to more efficiently attribute newly recruited patients to different treatment arms. In this paper, we consider 2-arm clinical trials. Patients are allocated to the 2 arms with a randomization rate chosen to achieve minimum variance for the test statistic. Algorithms are presented to calculate the optimal randomization rate, critical values, and power for the proposed design. Sensitivity analysis is implemented to check the influence on the design of changing the prior distributions. Simulation studies are applied to compare the proposed method and traditional methods in terms of power and actual sample sizes. Simulations show that, when the total sample size is fixed, the proposed design can obtain greater power and/or a smaller actual sample size than the traditional Bayesian sequential design. Finally, we apply the proposed method to a real data set and compare the results with the Bayesian sequential design without adaptive randomization in terms of sample sizes; the proposed method can further reduce the required sample size. Copyright © 2017 John Wiley & Sons, Ltd.
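
    The abstract does not spell out the allocation formula, but a standard choice consistent with "minimum variance for the test statistic" in a two-arm difference-of-means setting is Neyman allocation, where the randomization rate to an arm is proportional to that arm's standard deviation. The sketch below, with simulated interim data, is an illustrative assumption rather than the authors' exact rule.

```python
import numpy as np

def neyman_allocation(sd1, sd2):
    """Randomization rate to arm 1 that minimizes Var(mean1 - mean2).

    For a fixed total sample size, allocating patients in proportion to
    the arm standard deviations minimizes the variance of the
    difference-in-means statistic; with interim data the SDs are
    re-estimated and the rate updated adaptively.
    """
    return sd1 / (sd1 + sd2)

# Interim data from the two arms (simulated here for illustration).
rng = np.random.default_rng(3)
arm1 = rng.normal(0.0, 2.0, 40)    # noisier arm
arm2 = rng.normal(0.5, 1.0, 40)
rate = neyman_allocation(arm1.std(ddof=1), arm2.std(ddof=1))
print(f"randomize the next patient to arm 1 with probability {rate:.2f}")
```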

  2. Person Fit Based on Statistical Process Control in an Adaptive Testing Environment. Research Report 98-13.

    Science.gov (United States)

    van Krimpen-Stoop, Edith M. L. A.; Meijer, Rob R.

    Person-fit research in the context of paper-and-pencil tests is reviewed, and some specific problems regarding person fit in the context of computerized adaptive testing (CAT) are discussed. Some new methods are proposed to investigate person fit in a CAT environment. These statistics are based on Statistical Process Control (SPC) theory. A…

  3. Relationship Between Ocular Surface Disease Index, Dry Eye Tests, and Demographic Properties in Computer Users

    Directory of Open Access Journals (Sweden)

    Hüseyin Simavlı

    2014-03-01

    Objectives: The aim of the present study is to evaluate the ocular surface disease index (OSDI) in computer users and to investigate the correlations of this index with dry eye tests and demographic properties. Materials and Methods: In this prospective study, 178 subjects with an age range of 20-40 years who spent most of their daily life in front of a computer were included. All participants underwent a complete ophthalmologic examination including basal secretion test, tear break-up time test, and ocular surface staining. In addition, all patients completed the OSDI questionnaire. Results: A total of 178 volunteers (101 female, 77 male) with a mean age of 28.8±4.5 years were included in the study. Mean time of computer use was 7.7±1.9 (5-14) hours/day, and mean computer use period was 71.1±39.7 (4-204) months. Mean OSDI score was 44.1±24.7 (0-100). There was a significant negative correlation between the OSDI score and the tear break-up time test in the right (p=0.005, r=-0.21) and the left eyes (p=0.003, r=-0.22). There was a significant positive correlation between the OSDI score and gender (p=0.014, r=0.18) and daily computer usage time (p=0.008, r=0.2). In addition, there was a significant positive correlation between the OSDI score and the ocular surface staining pattern in the right (p=0.03, r=0.16) and the left eyes (p=0.03, r=0.17). Age, smoking, type of computer, use of glasses, presence of symptoms, and basal secretion test were not found to be correlated with the OSDI score. Conclusions: Long-term computer use causes ocular surface problems. The OSDI was found to be correlated with the tear break-up time test, gender, daily computer usage time, and ocular surface staining pattern in computer users. (Turk J Ophthalmol 2014; 44: 115-8)

  4. Efficient p-value evaluation for resampling-based tests

    KAUST Repository

    Yu, K.

    2011-01-05

    The resampling-based test, which often relies on permutation or bootstrap procedures, has been widely used for statistical hypothesis testing when the asymptotic distribution of the test statistic is unavailable or unreliable. It requires repeated calculation of the test statistic on a large number of simulated data sets for its significance level assessment, and thus can become very computationally intensive. Here, we propose an efficient p-value evaluation procedure by adapting the stochastic approximation Markov chain Monte Carlo algorithm. The new procedure can be used easily for estimating the p-value for any resampling-based test. We show through numerical simulations that the proposed procedure can be 100-500,000 times as efficient (in terms of computing time) as the standard resampling-based procedure when evaluating a test statistic with a small p-value (e.g. less than 10^-6). With its computational burden reduced by this proposed procedure, the versatile resampling-based test becomes computationally feasible for a much wider range of applications. We demonstrate the application of the new method by applying it to a large-scale genetic association study of prostate cancer.
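
    For contrast, the standard procedure that the proposed method accelerates can be sketched as a plain two-sample permutation test (illustrative only). Note how the number of permutations, and hence the cost, must scale roughly as 1/p to resolve a small p-value, which is exactly the burden the SAMC-based procedure avoids.

```python
import numpy as np

def perm_pvalue(x, y, n_perm=10_000, rng=None):
    """Standard permutation p-value for a difference in means: the fraction of
    shuffled statistics at least as extreme as the observed one."""
    rng = rng or np.random.default_rng()
    obs = abs(np.mean(x) - np.mean(y))
    pooled = np.concatenate([x, y])
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        a, b = pooled[: len(x)], pooled[len(x):]
        hits += abs(np.mean(a) - np.mean(b)) >= obs
    return (hits + 1) / (n_perm + 1)   # add-one estimate avoids p = 0
```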

  5. Computational analysis in support of the SSTO flowpath test

    Science.gov (United States)

    Duncan, Beverly S.; Trefny, Charles J.

    1994-10-01

    A synergistic approach of combining computational methods and experimental measurements is used in the analysis of a hypersonic inlet. There are four major focal points within this study which examine the boundary layer growth on a compression ramp upstream of the cowl lip of a scramjet inlet. Initially, the boundary layer growth on the NASP Concept Demonstrator Engine (CDE) is examined. The follow-up study determines the optimum diverter height required by the SSTO Flowpath test to best duplicate the CDE results. These flow field computations are then compared to the experimental measurements and the mass average Mach number is determined for this inlet.

  6. Adaptive x-ray threat detection using sequential hypotheses testing with fan-beam experimental data (Conference Presentation)

    Science.gov (United States)

    Thamvichai, Ratchaneekorn; Huang, Liang-Chih; Ashok, Amit; Gong, Qian; Coccarelli, David; Greenberg, Joel A.; Gehm, Michael E.; Neifeld, Mark A.

    2017-05-01

    We employ an adaptive measurement system, based on a sequential hypothesis testing (SHT) framework, for detecting material-based threats using experimental data acquired on an X-ray testbed system. This testbed employs a 45-degree fan-beam geometry and 15 views over a 180-degree span to generate energy-sensitive X-ray projection data. Using this testbed system, we acquire multiple-view projection data for 200 bags. We consider an adaptive measurement design where the X-ray projection measurements are acquired in a sequential manner and the adaptation occurs through the choice of the optimal "next" source/view system parameter. Our analysis of such an adaptive measurement design using the experimental data demonstrates a 3x-7x reduction in the probability of error relative to a static measurement design. Here the static measurement design refers to the operational system baseline that corresponds to a sequential measurement using all the available sources/views. We also show that by using adaptive measurements it is possible to reduce the number of sources/views by nearly 50% compared to a system that relies on static measurements.
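
    The decision core of an SHT detector is typically Wald's sequential probability ratio test, sketched below under assumed error rates. The paper's contribution, adaptively choosing the next source/view to measure, sits on top of a core like this and is not reproduced here.

```python
import numpy as np

def sprt(llr_stream, alpha=0.01, beta=0.01):
    """Wald's SPRT: accumulate per-measurement log-likelihood ratios until a
    decision threshold is crossed; alpha/beta are placeholder error rates."""
    upper = np.log((1 - beta) / alpha)    # declare threat (accept H1)
    lower = np.log(beta / (1 - alpha))    # declare benign (accept H0)
    s, n = 0.0, 0
    for llr in llr_stream:                # one term per projection measurement
        n += 1
        s += llr
        if s >= upper:
            return "threat", n
        if s <= lower:
            return "benign", n
    return "undecided", n                 # ran out of sources/views
```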

  7. Development of a lack of appetite item bank for computer-adaptive testing (CAT)

    DEFF Research Database (Denmark)

    Thamsborg, Lise Laurberg Holst; Petersen, Morten Aa; Aaronson, Neil K

    2015-01-01

    to 12 lack of appetite items. CONCLUSIONS: Phases 1-3 resulted in 12 lack of appetite candidate items. Based on field testing (phase 4), the psychometric characteristics of the items will be assessed and the final item bank will be generated. This CAT item bank is expected to provide precise...

  8. Computer-based testing of the modified essay question: the Singapore experience.

    Science.gov (United States)

    Lim, Erle Chuen-Hian; Seet, Raymond Chee-Seong; Oh, Vernon M S; Chia, Boon-Lock; Aw, Marion; Quak, Seng-Hock; Ong, Benjamin K C

    2007-11-01

    The modified essay question (MEQ), featuring an evolving case scenario, tests a candidate's problem-solving and reasoning ability, rather than mere factual recall. Although it is traditionally conducted as a pen-and-paper examination, our university has run the MEQ using computer-based testing (CBT) since 2003. We describe our experience with running the MEQ examination using the IVLE, or integrated virtual learning environment (https://ivle.nus.edu.sg), provide a blueprint for universities intending to conduct computer-based testing of the MEQ, and detail how our MEQ examination has evolved since its inception. An MEQ committee, comprising specialists in key disciplines from the departments of Medicine and Paediatrics, was formed. We utilized the IVLE, developed for our university in 1998, as the online platform on which we ran the MEQ. We calculated the number of man-hours (academic and support staff) required to run the MEQ examination using either a computer-based or a pen-and-paper format. With the support of our university's information technology (IT) specialists, we have successfully run the MEQ examination online, twice a year, since 2003. Initially, we conducted the examination with short-answer questions only, but have since expanded the MEQ examination to include multiple-choice and extended matching questions. A total of 1268 man-hours was spent in preparing for and running the MEQ examination using CBT, compared with 236.5 man-hours using a pen-and-paper format. Although CBT is more labour-intensive, our students and staff prefer it to the pen-and-paper format. The MEQ can be conducted using a computer-based testing scenario, which offers several advantages over a pen-and-paper format. We hope to increase the number of questions and incorporate audio and video files, featuring clinical vignettes, into the MEQ examination in the near future.

  9. A Review of Models for Computer-Based Testing. Research Report 2011-12

    Science.gov (United States)

    Luecht, Richard M.; Sireci, Stephen G.

    2011-01-01

    Over the past four decades, there has been incremental growth in computer-based testing (CBT) as a viable alternative to paper-and-pencil testing. However, the transition to CBT is neither easy nor inexpensive. As Drasgow, Luecht, and Bennett (2006) noted, many design engineering, test development, operations/logistics, and psychometric changes…

  10. Grid adaptation using chimera composite overlapping meshes

    Science.gov (United States)

    Kao, Kai-Hsiung; Liou, Meng-Sing; Chow, Chuen-Yen

    1994-01-01

    The objective of this paper is to perform grid adaptation using composite overlapping meshes in regions of large gradients to accurately capture the salient features during computation. The chimera grid scheme, a multiple overset mesh technique, is used in combination with a Navier-Stokes solver. The numerical solution is first converged to a steady state based on an initial coarse mesh. Solution-adaptive enhancement is then performed by using a secondary fine grid system which oversets on top of the base grid in the high-gradient region, but without requiring the mesh boundaries to join in any special way. Communication through boundary interfaces between these separated grids is carried out by trilinear interpolation. Applications to the Euler equations for shock reflections and to a shock wave/boundary layer interaction problem are tested. With the present method, the salient features are well resolved.
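
    Since inter-grid communication is explicitly attributed to trilinear interpolation, here is a compact, generic sketch of that operation for a donor grid with unit spacing (an illustration of the standard formula, not the authors' solver code):

```python
import numpy as np

def trilinear(field, x, y, z):
    """Trilinear interpolation of a donor-grid value at point (x, y, z), given
    in the donor grid's index coordinates (unit spacing, interior points)."""
    i, j, k = int(x), int(y), int(z)
    u, v, w = x - i, y - j, z - k            # local coordinates in [0, 1)
    c = field[i:i + 2, j:j + 2, k:k + 2]     # the 8 surrounding donor nodes
    return ((1-u)*(1-v)*(1-w)*c[0,0,0] + u*(1-v)*(1-w)*c[1,0,0]
          + (1-u)*v*(1-w)*c[0,1,0] + (1-u)*(1-v)*w*c[0,0,1]
          + u*v*(1-w)*c[1,1,0] + u*(1-v)*w*c[1,0,1]
          + (1-u)*v*w*c[0,1,1] + u*v*w*c[1,1,1])
```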

  11. Adaptive Robot Control – An Experimental Comparison

    Directory of Open Access Journals (Sweden)

    Francesco Alonge

    2012-11-01

    This paper deals with an experimental comparison of stable adaptive controllers for robotic manipulators based on Model-Based Adaptive, Neural Network, and Wavelet-Based control. The control methods were compared with each other in terms of computational efficiency, the need for an accurate mathematical model of the manipulator, and tracking performance. An original management algorithm for the Wavelet Network control scheme has been designed, with the aim of constructing the net automatically during trajectory tracking, without the need to tune it to the trajectory itself. Experimental tests, carried out on a planar two-link manipulator, show that the Wavelet-Based control scheme, with the new management algorithm, outperforms the conventional Model-Based schemes in the presence of structural uncertainties in the mathematical model of the robot, without pre-training and more efficiently than the Neural Network approach.

  12. Full-Scaled Advanced Systems Testbed: Ensuring Success of Adaptive Control Research Through Project Lifecycle Risk Mitigation

    Science.gov (United States)

    Pavlock, Kate M.

    2011-01-01

    The National Aeronautics and Space Administration's Dryden Flight Research Center completed flight testing of adaptive controls research on the Full-Scale Advanced Systems Testbed (FAST) in January of 2011. The research addressed technical challenges involved with reducing risk in an increasingly complex and dynamic national airspace. Specific challenges lie with the development of validated, multidisciplinary, integrated aircraft control design tools and techniques to enable safe flight in the presence of adverse conditions such as structural damage, control surface failures, or aerodynamic upsets. The testbed is an F-18 aircraft serving as a full-scale vehicle to test and validate adaptive flight control research, and it lends significant confidence to the development, maturation, and acceptance process of incorporating adaptive control laws into follow-on research and the operational environment. The experimental systems integrated into FAST were designed to allow for flexible yet safe flight-test evaluation and validation of modern adaptive control technologies and revolve around two major hardware upgrades: the modification of Production Support Flight Control Computers (PSFCC) and the integration of two fourth-generation Airborne Research Test Systems (ARTS). Post-hardware-integration verification and validation provided the foundation for safe flight test of Nonlinear Dynamic Inversion and Model Reference Aircraft Control adaptive control law experiments. To ensure success of flight in terms of cost, schedule, and test results, emphasis on risk management was incorporated into the early stages of design and flight-test planning and continued through the execution of each flight-test mission. Specific consideration was given to incorporating safety features within the hardware and software to alleviate user demands, as well as into test processes and training to reduce human-factor impacts on safe and successful flight test. This paper describes the research configuration

  13. Bacterial computing: a form of natural computing and its applications.

    Science.gov (United States)

    Lahoz-Beltra, Rafael; Navarro, Jorge; Marijuán, Pedro C

    2014-01-01

    The capability to establish adaptive relationships with the environment is an essential characteristic of living cells. Bacterial computing and bacterial intelligence are two general traits manifested in adaptive behaviors that respond to surrounding environmental conditions. These two traits have generated a variety of theoretical and applied approaches. As the different systems of bacterial signaling and the different modes of genetic change become better known and more carefully explored, the full adaptive possibilities of bacteria can be studied from new angles. For instance, there appear instances of molecular "learning" within the mechanisms of evolution. More concretely, looking specifically at the time dimension, the bacterial mechanisms of learning and evolution appear as two different and related mechanisms of adaptation to the environment: the former in somatic time, the latter in evolutionary time. The present chapter reviews the possible application of both kinds of mechanisms to prokaryotic molecular computing schemes as well as to the solution of real-world problems.

  14. Adaptive-weighted total variation minimization for sparse data toward low-dose x-ray computed tomography image reconstruction.

    Science.gov (United States)

    Liu, Yan; Ma, Jianhua; Fan, Yi; Liang, Zhengrong

    2012-12-07

    Previous studies have shown that by minimizing the total variation (TV) of the to-be-estimated image with some data and other constraints, piecewise-smooth x-ray computed tomography (CT) images can be reconstructed from sparse-view projection data without introducing notable artifacts. However, due to the piecewise constant assumption for the image, a conventional TV minimization algorithm often suffers from over-smoothness on the edges of the resulting image. To mitigate this drawback, we present an adaptive-weighted TV (AwTV) minimization algorithm in this paper. The presented AwTV model is derived by considering the anisotropic edge property among neighboring image voxels, where the associated weights are expressed as an exponential function and can be adaptively adjusted by the local image-intensity gradient for the purpose of preserving the edge details. Inspired by the previously reported TV-POCS (projection onto convex sets) implementation, a similar AwTV-POCS implementation was developed to minimize the AwTV subject to data and other constraints for the purpose of sparse-view low-dose CT image reconstruction. To evaluate the presented AwTV-POCS algorithm, both qualitative and quantitative studies were performed by computer simulations and phantom experiments. The results show that the presented AwTV-POCS algorithm can yield images with several notable gains, in terms of noise-resolution tradeoff plots and full-width at half-maximum values, as compared to the corresponding conventional TV-POCS algorithm.
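
    The abstract specifies the AwTV weights as an exponential function of the local intensity gradient. A minimal 2D sketch of that weighting, with a placeholder edge-scale parameter delta, might look as follows; the paper embeds a term of this kind inside a POCS reconstruction loop, which is not shown here.

```python
import numpy as np

def awtv(img, delta=0.005):
    """Adaptive-weighted TV of a 2D image: each neighboring-pixel difference d
    is weighted by exp(-(d/delta)^2), so large gradients (edges) are penalized
    less than small ones, preserving edge detail."""
    dx = np.diff(img, axis=0)                # vertical differences
    dy = np.diff(img, axis=1)                # horizontal differences
    wx = np.exp(-(dx / delta) ** 2)
    wy = np.exp(-(dy / delta) ** 2)
    return np.sum(np.sqrt(wx[:, :-1] * dx[:, :-1] ** 2
                          + wy[:-1, :] * dy[:-1, :] ** 2))
```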

  15. Efficient computation of the elastography inverse problem by combining variational mesh adaption and a clustering technique

    International Nuclear Information System (INIS)

    Arnold, Alexander; Bruhns, Otto T; Reichling, Stefan; Mosler, Joern

    2010-01-01

    This paper is concerned with an efficient implementation suitable for the elastography inverse problem. More precisely, the novel algorithm allows us to compute the unknown stiffness distribution in soft tissue from the measured displacement field while considerably reducing the numerical cost compared to previous approaches. This is realized by combining and further elaborating variational mesh adaption with a clustering technique similar to those known from digital image compression. Within the variational mesh adaption, the underlying finite element discretization is only locally refined if this leads to a considerable improvement of the numerical solution. Additionally, the numerical complexity is reduced by the aforementioned clustering technique, in which the parameters describing the stiffness of the respective soft tissue are sorted according to a predefined number of intervals. By doing so, the number of unknowns associated with the elastography inverse problem can be chosen explicitly. A positive side effect of this method is the reduction of artificial noise in the data (smoothing of the solution). The performance and the rate of convergence of the resulting numerical formulation are critically analyzed by numerical examples.
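
    As a rough illustration of the clustering step described above (a hypothetical sketch, not the authors' code): per-element stiffness values are sorted into a predefined number of intervals, so the number of unknowns in the inverse problem is chosen explicitly.

```python
import numpy as np

def cluster_stiffness(values, n_intervals):
    """Quantize per-element stiffness values into a fixed number of intervals,
    replacing each value by its interval mean; the number of unknowns in the
    inverse problem then equals n_intervals."""
    edges = np.linspace(values.min(), values.max(), n_intervals + 1)
    labels = np.clip(np.digitize(values, edges) - 1, 0, n_intervals - 1)
    means = np.array([values[labels == c].mean() if np.any(labels == c)
                      else edges[c] for c in range(n_intervals)])
    return means[labels], labels
```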

  16. Gender, general theory of crime and computer crime: an empirical test.

    Science.gov (United States)

    Moon, Byongook; McCluskey, John D; McCluskey, Cynthia P; Lee, Sangwon

    2013-04-01

    Regarding the gender gap in computer crime, studies consistently indicate that boys are more likely than girls to engage in various types of computer crime; however, few studies have examined the extent to which traditional criminology theories account for gender differences in computer crime and the applicability of these theories in explaining computer crime across gender. Using a panel of 2,751 Korean youths, the current study tests the applicability of the general theory of crime in explaining the gender gap in computer crime and assesses the theory's utility in explaining computer crime across gender. Analyses show that self-control theory performs well in predicting illegal use of others' resident registration number (RRN) online for both boys and girls, as predicted by the theory. However, low self-control, a dominant criminogenic factor in the theory, fails to mediate the relationship between gender and computer crime and is inadequate in explaining illegal downloading of software in both the boy and girl models. Theoretical implications of the findings and directions for future research are discussed.

  17. Computerized Tests. New practical and ethical challenges for Psychological Assessment

    Directory of Open Access Journals (Sweden)

    Gabriela Susana Lozzia

    2013-08-01

    The purpose of this study is to acquaint readers in our field with the new problems and solutions resulting from the application of computer systems to Psychological Assessment. This work discusses the appropriate implementation of computer-based and Internet-delivered testing and describes the new technologies that can be applied to Psychological Assessment: administration of traditional paper-and-pencil tests through computers, generation of automated reports, computerized adaptive tests, automated test construction, and automatic generation of items, as well as the specific guidelines and regulations governing the development of each of these areas. The study concludes with an outline of the current issues connected with the appropriate use of computerized tests, and encourages psychologists to keep debating and reflecting on these topics.

  18. Computer-Aided System of Virtual Testing of Gas Turbine Engines

    Directory of Open Access Journals (Sweden)

    Rybakov Viktor N.

    2016-01-01

    The article describes the concept of a virtual lab that includes a subsystem of gas turbine engine simulation, a subsystem of experiment planning, a subsystem of measurement error simulation, a subsystem of simulator identification, and others. The basis for the virtual lab's development is the computer-aided system of thermogasdynamic research and analysis "ASTRA". The features of the gas turbine engine transient-mode simulator are described. The principal difference between simulators of transient and stationary modes of gas turbine engines is that the energy balance of the compressor and turbine no longer applies. The computer-aided system of virtual gas turbine engine testing was created using the developed transient-mode simulator. This system solves the tasks of calculating operational (throttling, speed, climatic, altitude) characteristics, analyzing transient dynamics, and selecting optimal control laws. Besides, the system of virtual gas turbine engine testing is a clear demonstration of the gas turbine engine working process and of the regularities of engine element collaboration. The interface of the system is described in the article, and some screenshots of the interface elements are provided. The developed system of virtual gas turbine engine testing reduces the laboriousness of gas turbine engine testing. Its implementation in the learning process also allows for more diverse lab work and thereby improves the quality of training. A minimal sketch of the transient-mode idea appears below.
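
    As a hypothetical illustration of why the steady-state energy balance is replaced in transient simulation (this is not ASTRA's implementation): the turbine/compressor power surplus drives rotor acceleration, which is integrated in time. All numbers below are placeholders.

```python
def rotor_accel(p_turbine, p_compressor, inertia, omega):
    """Transient shaft dynamics: the power surplus accelerates the rotor,
    I * omega * domega/dt = P_turbine - P_compressor."""
    return (p_turbine - p_compressor) / (inertia * omega)

# Explicit Euler integration of shaft speed during a throttle step (sketch).
omega, dt = 800.0, 0.01                       # rad/s, s
for _ in range(100):
    omega += dt * rotor_accel(p_turbine=2.1e6, p_compressor=2.0e6,
                              inertia=1.2, omega=omega)
print(f"shaft speed after 1 s: {omega:.0f} rad/s")
```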

  19. Visualization of flaws within heavy section ultrasonic test blocks using high energy computed tomography

    International Nuclear Information System (INIS)

    House, M.B.; Ross, D.M.; Janucik, F.X.; Friedman, W.D.; Yancey, R.N.

    1996-05-01

    The feasibility of high-energy computed tomography (9 MeV) to detect volumetric and planar discontinuities in large pressure vessel mock-up blocks was studied. The data supplied by the manufacturer of the test blocks on the intended flaw geometry were compared to manual contact ultrasonic test and computed tomography test data. Subsequently, a visualization program was used to construct fully three-dimensional morphological information enabling interactive data analysis of the detected flaws. Density isosurfaces show the relative shape and location of the volumetric defects within the mock-up blocks. Such a technique may be used to qualify personnel or newly developed ultrasonic test methods without the associated high cost of destructive evaluation. Data are presented showing the capability of the volumetric data analysis program to overlay the computed tomography and destructive evaluation (serial metallography) data for a direct, three-dimensional comparison.

  20. Introducing Computer-Based Testing in High-Stakes Exams in Higher Education: Results of a Field Experiment

    Science.gov (United States)

    Boevé, Anja J.; Meijer, Rob R.; Albers, Casper J.; Beetsma, Yta; Bosker, Roel J.

    2015-01-01

    The introduction of computer-based testing in high-stakes examining in higher education is developing rather slowly due to institutional barriers (the need for extra facilities, ensuring test security) and teacher and student acceptance. From the existing literature it is unclear whether computer-based exams produce results similar to paper-based exams and whether student acceptance can change as a result of administering computer-based exams. In this study, we compared results from a computer-based and a paper-based exam in a sample of psychology students and found no differences in total scores across the two modes. Furthermore, we investigated student acceptance and change in acceptance of computer-based examining. After taking the computer-based exam, fifty percent of the students preferred paper-and-pencil exams over computer-based exams and about a quarter preferred a computer-based exam. We conclude that computer-based exam total scores are similar to paper-based exam scores, but that for the acceptance of high-stakes computer-based exams it is important that students practice and get familiar with this new mode of test administration. PMID:26641632