Chen, Wei James; Krajbich, Ian
Models of reinforcement learning (RL) are prevalent in the decision-making literature, but not all behavior seems to conform to the gradual convergence that is a central feature of RL. In some cases learning seems to happen all at once. Limited prior research on these "epiphanies" has shown evidence of sudden changes in behavior, but it remains unclear how such epiphanies occur. We propose a sequential-sampling model of epiphany learning (EL) and test it using an eye-tracking experiment. In the experiment, subjects repeatedly play a strategic game that has an optimal strategy. Subjects can learn over time from feedback but are also allowed to commit to a strategy at any time, eliminating all other options and opportunities to learn. We find that the EL model is consistent with the choices, eye movements, and pupillary responses of subjects who commit to the optimal strategy (correct epiphany) but not always of those who commit to a suboptimal strategy or who do not commit at all. Our findings suggest that EL is driven by a latent evidence accumulation process that can be revealed with eye-tracking data.
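A minimal sketch of the kind of sequential-sampling (evidence accumulation) process the EL model posits can be written as follows; the drift, noise, and threshold values here are illustrative assumptions, not the authors' fitted parameters.

```python
import random

def accumulate_evidence(drift=0.05, noise=0.3, threshold=2.99,
                        max_steps=10000, seed=0):
    """Accumulate noisy evidence toward a commitment bound.

    Returns (committed, steps): whether the accumulator crossed the
    positive bound (commitment to a strategy, an 'epiphany') and after
    how many evidence samples it did so.
    """
    rng = random.Random(seed)
    evidence = 0.0
    for step in range(1, max_steps + 1):
        evidence += drift + rng.gauss(0.0, noise)
        if evidence >= threshold:
            return True, step       # commit to the favored strategy
        if evidence <= -threshold:
            return False, step      # commit to the alternative
    return False, max_steps         # never committed

committed, steps = accumulate_evidence()
```

With positive drift, the accumulator tends to cross the positive bound sooner or later; with zero noise it crosses deterministically after `threshold / drift` samples.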
Reinkensmeyer, David J; Burdet, Etienne; Casadio, Maura; Krakauer, John W; Kwakkel, Gert; Lang, Catherine E; Swinnen, Stephan P; Ward, Nick S; Schweighofer, Nicolas
Despite progress in using computational approaches to inform medicine and neuroscience in the last 30 years, there have been few attempts to model the mechanisms underlying sensorimotor rehabilitation. We argue that a fundamental understanding of neurologic recovery, and as a result accurate predictions at the individual level, will be facilitated by developing computational models of the salient neural processes, including plasticity and learning systems of the brain, and integrating them into a context specific to rehabilitation. Here, we therefore discuss Computational Neurorehabilitation, a newly emerging field aimed at modeling plasticity and motor learning to understand and improve movement recovery of individuals with neurologic impairment. We first explain how the emergence of robotics and wearable sensors for rehabilitation is providing data that make development and testing of such models increasingly feasible. We then review key aspects of plasticity and motor learning that such models will incorporate. We proceed by discussing how computational neurorehabilitation models relate to the current benchmark in rehabilitation modeling - regression-based, prognostic modeling. We then critically discuss the first computational neurorehabilitation models, which have primarily focused on modeling rehabilitation of the upper extremity after stroke, and show how even simple models have produced novel ideas for future investigation. Finally, we conclude with key directions for future research, anticipating that soon we will see the emergence of mechanistic models of motor recovery that are informed by clinical imaging results and driven by the actual movement content of rehabilitation therapy as well as wearable sensor-based records of daily activity.
Calamaro, Shira; Jarosz, Gaja
Phonological rules create alternations in the phonetic realizations of related words. These rules must be learned by infants in order to identify the phonological inventory, the morphological structure, and the lexicon of a language. Recent work proposes a computational model for the learning of one kind of phonological alternation, allophony (Peperkamp, Le Calvez, Nadal, & Dupoux, 2006). This paper extends the model to account for learning of a broader set of phonological alternations and the formalization of these alternations as general rules. In Experiment 1, we apply the original model to new data in Dutch and demonstrate its limitations in learning nonallophonic rules. In Experiment 2, we extend the model to allow it to learn general rules for alternations that apply to a class of segments. In Experiment 3, the model is further extended to allow for generalization by context; we argue that this generalization must be constrained by linguistic principles.
Puviani, Luca; Rama, Sidita
Nowadays, the experimental study of emotional learning is commonly based on classical conditioning paradigms and models, which have been thoroughly investigated in the last century. Unfortunately, models based on classical conditioning are unable to explain or predict important psychophysiological phenomena, such as the failure of the extinction of emotional responses in certain circumstances (for instance, those observed in evaluative conditioning, in post-traumatic stress disorders, and in panic attacks). In this manuscript, starting from the experimental results available in the literature, a computational model of implicit emotional learning based both on prediction-error computation and on statistical inference is developed. The model quantitatively predicts (a) the occurrence of evaluative conditioning, (b) the dynamics and the resistance to extinction of traumatic emotional responses, and (c) the mathematical relation between classical conditioning and unconditioned stimulus revaluation. Moreover, we discuss how the derived computational model can lead to the development of new animal models for resistant-to-extinction emotional reactions and novel methodologies of emotion modulation.
Nachman, Ben (CERN, Geneva)
In this talk we present recent developments in the application of machine learning, computer vision, and probabilistic models to the analysis and interpretation of LHC events. First, we introduce the concept of jet images and computer vision techniques for jet tagging. Jet images connected jet substructure and tagging with the fields of computer vision and image processing for the first time, improving performance in identifying highly boosted W bosons with respect to state-of-the-art methods, and providing a new way to visualize the discriminant features of different classes of jets, adding a new capability to understand the physics within jets and to design more powerful jet-tagging methods. Second, we present fuzzy jets: a new paradigm for jet clustering using machine learning methods. Fuzzy jets view jet clustering as an unsupervised learning task and incorporate a probabilistic assignment of particles to jets to learn new features of the jet structure.
Today, diverse design-related disciplines are required to engage actively with the digitization of information and its potentials and side effects for educational processes. In Germany, technology didactics developed within vocational education, and computer science education within general education, both separate from media pedagogy, which exists as an after-school program; media education is not yet a school subject in Germany. In this paper, however, we argue for an interdisciplinary approach to learning about computational modeling in creative processes and aesthetic contexts, one that crosses the borders of programming technology and of arts and design processes in meaningful contexts. Educational scenarios using smart textile environments are introduced and reflected upon for project-based learning.
Seok, Soonhwa; DaCosta, Boaventura; Kinsell, Carolyn; Poggio, John C.; Meyen, Edward L.
This article proposes a computer-mediated intersensory learning model as an alternative to traditional instructional approaches for students with learning disabilities (LDs) in the inclusive classroom. Predominant practices of classroom inclusion today reflect the six principles of zero reject, nondiscriminatory evaluation, appropriate education,…
Carroll, Susanne E.
Criticizes the computer modelling experiments conducted by Sokolik and Smith (1992), which involved the learning of French gender attribution using connectionist architecture. The article argues that the experiments greatly oversimplified the complexity of gender learning, in that they were designed in such a way that knowledge that must be…
Tritrakan, Kasame; Kidrakarn, Pachoen; Asanok, Manit
The aim of this research is to develop a learning model that blends factors from the learning environment with engineering design concepts for a computer programming course, and to analyze the usage of the model. This study presents the design, implementation, and evaluation of the model. The research methodology is divided into three…
Jones, Thomas L.
A basic unsolved problem in science is that of understanding learning, the process by which people and machines use their experience in a situation to guide future action in similar situations. The ideas of Piaget, Pavlov, Hull, and other learning theorists, as well as previous heuristic programing models of human intelligence, stimulated this…
Sharp, Madeleine E; Foerde, Karin; Daw, Nathaniel D; Shohamy, Daphna
Patients with loss of dopamine due to Parkinson's disease are impaired at learning from reward. However, it remains unknown precisely which aspect of learning is impaired. In particular, learning from reward, or reinforcement learning, can be driven by two distinct computational processes. One involves habitual stamping-in of stimulus-response associations, hypothesized to arise computationally from 'model-free' learning. The other, 'model-based' learning, involves learning a model of the world that is believed to support goal-directed behaviour. Much work has pointed to a role for dopamine in model-free learning. But recent work suggests model-based learning may also involve dopamine modulation, raising the possibility that model-based learning may contribute to the learning impairment in Parkinson's disease. To directly test this, we used a two-step reward-learning task which dissociates model-free versus model-based learning. We evaluated learning in patients with Parkinson's disease tested ON versus OFF their dopamine replacement medication and in healthy controls. Surprisingly, we found no effect of disease or medication on model-free learning. Instead, we found that patients tested OFF medication showed a marked impairment in model-based learning, and that this impairment was remediated by dopaminergic medication. Moreover, model-based learning was positively correlated with a separate measure of working memory performance, raising the possibility of common neural substrates. Our results suggest that some learning deficits in Parkinson's disease may be related to an inability to pursue reward based on complete representations of the environment.
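The "model-free stamping-in" of stimulus-response values mentioned above is typically formalized as a temporal-difference update; the following is a generic sketch of that update rule, not the authors' two-step task model, with illustrative parameter values.

```python
def td_update(q, state, action, reward, alpha=0.1):
    """One model-free temporal-difference update: nudge the cached
    value of the (state, action) pair toward the received reward,
    with learning rate alpha."""
    key = (state, action)
    old = q.get(key, 0.0)
    q[key] = old + alpha * (reward - old)
    return q

q_values = {}
for _ in range(100):   # repeated rewarded pairings 'stamp in' the association
    td_update(q_values, "stimulus", "press", reward=1.0)
```

After many rewarded repetitions, the cached value converges toward the reward, with no explicit model of the environment involved.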
Omer Faruk Sozcu
The effects of computer-assisted and distance learning of geometric modeling and computer-aided geometric design are studied. It is shown that computer algebra systems and dynamic geometry environments can be considered excellent tools for teaching the mathematical concepts of the mentioned areas, and that distance education technologies are indispensable for consolidating successfully completed topics.
Kateryna R. Kolos
The study substantiates the need for a systematic study of the functioning of a computer-oriented learning environment in post-degree pedagogical education; it defines the term "functional model of a computer-oriented learning environment of post-degree pedagogical education"; and it builds such a functional model in accordance with the functions of business, information and communication technology, academic and administrative staff, and the peculiarities of teachers' training courses.
Borja, Ronaldo Israel
"Plasticity Modeling & Computation" is a textbook written specifically for students who want to learn the theoretical, mathematical, and computational aspects of inelastic deformation in solids…
Najnin, Shamima; Banerjee, Bonny
Cross-situational learning and social pragmatic theories are prominent mechanisms for learning word meanings (i.e., word-object pairs). In this paper, the role of reinforcement is investigated for early word-learning by an artificial agent. When exposed to a group of speakers, the agent comes to understand an initial set of vocabulary items belonging to the language used by the group. Both cross-situational learning and social pragmatic theory are taken into account. As social cues, joint attention and prosodic cues in caregiver's speech are considered. During agent-caregiver interaction, the agent selects a word from the caregiver's utterance and learns the relations between that word and the objects in its visual environment. The "novel words to novel objects" language-specific constraint is assumed for computing rewards. The models are learned by maximizing the expected reward using reinforcement learning algorithms [i.e., table-based algorithms: Q-learning, SARSA, SARSA-λ, and neural network-based algorithms: Q-learning for neural network (Q-NN), neural-fitted Q-network (NFQ), and deep Q-network (DQN)]. Neural network-based reinforcement learning models are chosen over table-based models for better generalization and quicker convergence. Simulations are carried out using mother-infant interaction CHILDES dataset for learning word-object pairings. Reinforcement is modeled in two cross-situational learning cases: (1) with joint attention (Attentional models), and (2) with joint attention and prosodic cues (Attentional-prosodic models). Attentional-prosodic models manifest superior performance to Attentional ones for the task of word-learning. The Attentional-prosodic DQN outperforms existing word-learning models for the same task.
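A toy version of the table-based word-object learning described above might look like the following; the pairings, reward scheme, and hyperparameters are invented for illustration and are far simpler than the CHILDES-based setup.

```python
import random

def learn_word_objects(pairs, episodes=1000, alpha=0.5, epsilon=0.2, seed=1):
    """Tabular Q-learning sketch for word-object association.
    On each episode the agent hears a word, picks an object
    (epsilon-greedy over its Q-table), and receives reward 1 if the
    pick matches the true pairing, else 0."""
    rng = random.Random(seed)
    words = list(pairs)
    objects = list(pairs.values())
    q = {(w, o): 0.0 for w in words for o in objects}
    for _ in range(episodes):
        w = rng.choice(words)
        if rng.random() < epsilon:
            o = rng.choice(objects)                          # explore
        else:
            o = max(objects, key=lambda obj: q[(w, obj)])    # exploit
        reward = 1.0 if pairs[w] == o else 0.0
        q[(w, o)] += alpha * (reward - q[(w, o)])
    return q

true_pairs = {"ball": "BALL", "cup": "CUP", "dog": "DOG"}
q = learn_word_objects(true_pairs)
```

After enough episodes, the highest-valued object for each word is its true referent; the neural network variants in the paper replace the table with a function approximator for better generalization.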
van Borkulo, S.P.; van Joolingen, W.R.; Savelsbergh, E.R.; de Jong, T.
Computer modeling has been widely promoted as a means to attain higher order learning outcomes. Substantiating these benefits, however, has been problematic due to a lack of proper assessment tools. In this study, we compared computer modeling with expository instruction, using a tailored assessment
Purpose: The purpose of this paper is to develop a quantitative model of problem solving performance of students in the computer-based mathematics learning environment. Design/methodology/approach: Regularized logistic regression was used to create a quantitative model of problem solving performance of students that predicts whether students can…
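Regularized logistic regression of the kind described can be sketched in a few lines of plain Python; the toy feature and labels below are invented, and a real model would use many predictors of problem-solving performance.

```python
import math

def train_logreg(xs, ys, lam=0.1, lr=0.5, epochs=500):
    """Fit weights and bias by gradient descent on the
    L2-regularized logistic loss."""
    n, d = len(xs), len(xs[0])
    w = [0.0] * d
    b = 0.0
    for _ in range(epochs):
        gw = [lam * wj for wj in w]       # gradient of the L2 penalty
        gb = 0.0
        for x, y in zip(xs, ys):
            z = sum(wj * xj for wj, xj in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # predicted success probability
            for j in range(d):
                gw[j] += (p - y) * x[j] / n
            gb += (p - y) / n
        w = [wj - lr * gj for wj, gj in zip(w, gw)]
        b -= lr * gb
    return w, b

# toy data: a single score feature; label 1 means the problem was solved
xs = [[0.0], [0.2], [0.4], [0.6], [0.8], [1.0]]
ys = [0, 0, 0, 1, 1, 1]
w, b = train_logreg(xs, ys)
```

The fitted model outputs a probability that a student solves a problem, which can be thresholded at 0.5 for a yes/no prediction.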
This paper proposes a computational modelling approach for investigating the interplay of learning and playing in serious games. A formal model is introduced that allows for studying the details of playing a serious game under diverse conditions. The dynamics of player action and motivation is based
Treur, J.; Lu, B.L.; et al.
In this paper a computational model is presented that models how dreaming is used to learn fear extinction. The approach addresses dreaming as internal simulation incorporating memory elements in the form of sensory representations and their associated fear. During dream episodes regulation of fear
Mirman, Daniel; Estes, Katharine Graf; Magnuson, James S.
Statistical learning mechanisms play an important role in theories of language acquisition and processing. Recurrent neural network models have provided important insights into how these mechanisms might operate. We examined whether such networks capture two key findings in human statistical learning. In Simulation 1, a simple recurrent network…
Within the mind there are a myriad of ideas that make sense within the bounds of everyday experience but do not reflect how the world actually exists; this is particularly true in the domain of science. Classroom learning with teacher explanation is a bridge through which these naive understandings can be brought in line with scientific reality. The purpose of this paper is to examine how the application of a Multiobjective Evolutionary Algorithm (MOEA) can work in concert with an existing computational model to effectively model critical thinking in the science classroom. An evolutionary algorithm is an algorithm that iteratively optimizes machine-learning-based computational models. The research question is: does the application of an evolutionary algorithm provide a means to optimize the Student Task and Cognition Model (STAC-M), and does the optimized model sufficiently represent and predict teaching and learning outcomes in the science classroom? Within this computational study, the authors outline and simulate the effect of teaching on the ability of a "virtual" student to solve a Piagetian task. Using the STAC-M, a computational model of student cognitive processing in science class developed in 2013, the authors complete a computational experiment that examines the role of cognitive retraining on student learning. Comparison of the STAC-M with and without the Multiobjective Evolutionary Algorithm shows greater success in solving the Piagetian science tasks after cognitive retraining with the algorithm. This illustrates the potential uses of cognitive and neuropsychological computational modeling in educational research. The authors also outline the limitations and assumptions of computational modeling.
Satoto Endar Nayono
One of the key competencies of graduates majoring in Civil Engineering and Planning Education, Faculty of Engineering, Yogyakarta State University (YSU) is the ability to plan buildings. CAD courses aim to train students to translate planning concepts into drawings. One of the obstacles faced in the course is that the concepts and drawings created by students often do not correspond to the standards used in the field. This study aims to develop a model of project-based learning so that students' drawings are more in line with actual field conditions. The study was carried out through the following stages: (1) pre-test, (2) planning of learning, (3) implementation of the project-based learning model, (4) monitoring and evaluation, (5) reflection and revision, (6) implementation of learning in the next cycle, and (7) evaluation of the learning outcomes. The study was conducted over four months in 2012 in the Department of Civil Engineering and Planning Education, Faculty of Engineering, YSU. The subjects were students who took the Computer Aided Design course. The data were analyzed using qualitative description and descriptive statistics. The results were: (1) the project-based learning model was proven to improve the learning process and the learning outcomes of students in the CAD course through building-planning drawing tasks for school buildings based on real field conditions; tasks were assigned at every meeting and improved based on feedback from the lecturers; and (2) the project-based learning model is easier to implement when accompanied by peer tutoring and the PAIKEM learning model.
Fiorini, Rodolfo A.; Dacquino, Gianfranco
GEOGINE (GEOmetrical enGINE), a state-of-the-art OMG (Ontological Model Generator) based on n-D tensor invariants for optimal synthetic representation, description, and learning of n-dimensional shape/texture, was presented at previous conferences. Improved computational algorithms based on the computational invariant theory of finite groups in Euclidean space, and a demo application, are presented here. Progressive automatic model generation is discussed. GEOGINE can be used as an efficient computational kernel for fast, reliable application development and delivery, mainly in advanced biomedical engineering, biometrics, intelligent computing, target recognition, content-based image retrieval, and data mining. An ontology can be regarded as a logical theory accounting for the intended meaning of a formal dictionary, i.e., its ontological commitment to a particular conceptualization of world objects. According to this approach, "n-D tensor calculus" can be considered a "formal language" for reliably computing optimized "n-dimensional tensor invariants" as specific object "invariant parameter and attribute words" for automated optimal synthetic object description by incremental model generation. The class of those "invariant parameter and attribute words" can be thought of as a specific "formal vocabulary" learned from a "generalized formal dictionary" of the "computational tensor invariants" language. Even object chromatic attributes can be effectively and reliably computed from object geometric parameters into robust colour shape-invariant characteristics. Any highly sophisticated application needing effective, robust capture and parameterization of object geometric/colour invariant attributes for reliable automated object learning and discrimination can benefit from the GEOGINE progressive automated model generation computational kernel.
Li, Xiumin; Wang, Wei; Xue, Fangzheng; Song, Yongduan
Recently there has been continuously increasing interest in building computational models of spiking neural networks (SNNs), such as the Liquid State Machine (LSM). Biologically inspired self-organized neural networks with neural plasticity can enhance computational performance, with the characteristic features of dynamical memory and recurrent connection cycles distinguishing them from the more widely used feedforward neural networks. Although a variety of computational models for brain-like learning and information processing have been proposed, modeling self-organized neural networks with multiple forms of neural plasticity remains an important open challenge. The main difficulties lie in the interplay among different neural plasticity rules and in understanding how the structure and dynamics of neural networks shape computational performance. In this paper, we propose a novel approach to developing LSM models with a biologically inspired self-organizing network based on two neural plasticity learning rules. The connectivity among excitatory neurons is adapted by spike-timing-dependent plasticity (STDP) learning; meanwhile, the degree of neuronal excitability is regulated to maintain a moderate average activity level by another learning rule: intrinsic plasticity (IP). Our study shows that an LSM with STDP+IP performs better than an LSM with a random SNN or with an SNN obtained by STDP alone. The noticeable improvement with the proposed method is due to better-reflected competition among different neurons in the developed SNN model, as well as more effectively encoded and processed relevant dynamic information through its learning and self-organizing mechanism. This result gives insights into the optimization of computational models of spiking neural networks with neural plasticity.
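The pair-based STDP rule referred to above is commonly written as an exponential function of the pre-post spike-time difference; the following sketch uses illustrative constants, not the paper's.

```python
import math

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based STDP: weight change as a function of the spike-time
    difference dt = t_post - t_pre (in milliseconds).

    Pre-before-post (dt > 0) potentiates the synapse; post-before-pre
    (dt < 0) depresses it; the magnitude decays exponentially with |dt|.
    """
    if dt > 0:
        return a_plus * math.exp(-dt / tau)
    elif dt < 0:
        return -a_minus * math.exp(dt / tau)
    return 0.0
```

In the paper's scheme this synaptic rule is complemented by intrinsic plasticity, which separately adjusts each neuron's excitability toward a target firing rate.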
Raicu, A.; Raicu, G.
The paper presents the most important aspects of innovation in engineering education using computer-assisted learning. The authors propose to raise the quality of engineering education programs of study to European standards. The use of computer-assisted learning methodologies is becoming an important resource in higher education. We intend to improve the concept of e-learning using virtual terminals, online support, and special training assisted through live seminars and interactive labs, in order to develop a virtual university model. We intend to encourage computer-assisted learning and innovation as sources of competitive advantage, to permit vision and learning analysis, and to identify new sources of technology and ideas. Our work is based on our university datasets collected during the last fifteen years using several e-learning systems. At Constanta Maritime University (CMU), eLearning and Knowledge Management Services (KMS) are very important, and we apply them effectively to achieve strategic objectives such as collaboration, sharing, and good practice. We have had experience in this field since 2000, using Moodle as the KMS in our university. The term KMS can be associated with open-source software, open standards, open protocols, and open-knowledge licenses, initiatives, and policies. The CMU Virtual Campus today has over 12,500 active users. Another experience of the authors is the implementation of the MariTrainer Wiki educational platform, based on Dokeos and DekiWiki, under the MARICOMP and MEP Leonardo da Vinci projects. We also present a case study under the EU-funded POSDRU project, in which the authors implemented another educational platform in technological high schools in Romania, used by over 1,000 teachers. Based on these large datasets, the study tries to improve the concept of e-learning teaching with new technologies, so that teaching and learning become interactive and live.
Preece, Daniel; Williams, Sarah B; Lam, Richard; Weller, Renate
Three-dimensional (3D) information plays an important part in medical and veterinary education. Appreciating complex 3D spatial relationships requires a strong foundational understanding of anatomy and mental 3D visualization skills. Novel learning resources have been introduced to anatomy training to achieve this. Objective evaluation of their comparative efficacies remains scarce in the literature. This study developed and evaluated the use of a physical model in demonstrating the complex spatial relationships of the equine foot. It was hypothesized that the newly developed physical model would be more effective for students to learn magnetic resonance imaging (MRI) anatomy of the foot than textbooks or computer-based 3D models. Third year veterinary medicine students were randomly assigned to one of three teaching aid groups (physical model; textbooks; 3D computer model). The comparative efficacies of the three teaching aids were assessed through students' abilities to identify anatomical structures on MR images. Overall mean MRI assessment scores were significantly higher in students utilizing the physical model (86.39%) compared with students using textbooks (62.61%) and the 3D computer model (63.68%) (P < 0.001), with no significant difference between the textbook and 3D computer model groups (P = 0.685). Student feedback was also more positive in the physical model group compared with both the textbook and 3D computer model groups. Our results suggest that physical models may hold a significant advantage over alternative learning resources in enhancing visuospatial and 3D understanding of complex anatomical architecture, and that 3D computer models have significant limitations with regards to 3D learning.
Sigala, Rodrigo; Haufe, Sebastian; Roy, Dipanjan; Dinse, Hubert R.; Ritter, Petra
During the past two decades, growing evidence indicates that brain oscillations in the alpha band (~10 Hz) not only reflect an "idle" state of cortical activity, but also take a more active role in the generation of complex cognitive functions. A recent study shows that more than 60% of the observed inter-subject variability in perceptual learning can be ascribed to ongoing alpha activity. This evidence indicates a significant role of alpha oscillations in perceptual learning and hence motivates exploring the potential underlying mechanisms. The purpose of this review is therefore to highlight existing evidence that ascribes a role to intrinsic alpha oscillations in shaping our ability to learn. In the review, we disentangle the alpha rhythm into different neural signatures that control information processing within individual functional building blocks of perceptual learning. We further highlight computational studies that shed light on potential mechanisms by which alpha oscillations may modulate information transfer and connectivity changes relevant for learning. To enable testing of these model-based hypotheses, we emphasize the need for multidisciplinary approaches combining assessment of behavior and multi-scale neuronal activity, active modulation of ongoing brain states, and computational modeling to reveal the mathematical principles of the complex neuronal interactions. In particular, we highlight the relevance of multi-scale modeling frameworks such as the one currently being developed by "The Virtual Brain" project.
Larsen, Lasse Juel; Majgaard, Gunver
This abstract focuses on the computer game design process in the education of engineers at the university level. We present a model for understanding the different layers in the game design process and an articulation of their intricate interconnectedness. Our motivation is propelled by our daily teaching practice of game design. We have observed a need for a design model that can quickly create an easily understandable overview of something as complex as the design process of computer games. This posed a problem: how do we present a broad overview of the game design process and at the same time make sure that the students learn to act and reflect like game designers? We feel our game design model managed just that. Our model entails a guideline for the computer game design process in its entirety, and at the same time conveys clear and easily understandable insight into a particular…
Han, Junwei; Chen, Changyuan; Shao, Ling; Hu, Xintao; Han, Jungong; Liu, Tianming
Generally, various visual media are unequally memorable by the human brain. This paper looks into a new direction of modeling the memorability of video clips and automatically predicting how memorable they are by learning from brain functional magnetic resonance imaging (fMRI). We propose a novel computational framework by integrating the power of low-level audiovisual features and brain activity decoding via fMRI. Initially, a user study experiment is performed to create a ground truth database for measuring video memorability, and a set of effective low-level audiovisual features is examined in this database. Then, human subjects' brain fMRI data are obtained while they are watching the video clips. The fMRI-derived features that convey the brain activity of memorizing videos are extracted using a universal brain reference system. Finally, because fMRI scanning is expensive and time-consuming, a computational model is learned on our benchmark dataset with the objective of maximizing the correlation between the low-level audiovisual features and the fMRI-derived features using joint subspace learning. The learned model can then automatically predict the memorability of videos without fMRI scans. Evaluations on publicly available image and video databases demonstrate the effectiveness of the proposed framework.
Yelbay Yilmaz, Yasemin
This study aimed at devising a vocabulary learning software that would help learners learn and retain vocabulary items effectively. Foundation linguistics and learning theories have been adapted to the foreign language vocabulary learning context using a computer software named Parole that was designed exclusively for this study. Experimental…
Aboutalib, Sarah S.; Mohamed, Aly A.; Zuley, Margarita L.; Berg, Wendie A.; Luo, Yahong; Wu, Shandong
Digital mammography screening is an important exam for the early detection of breast cancer and reduction in mortality. False positives leading to high recall rates, however, result in unnecessary negative consequences for patients and health care systems. To better aid radiologists, computer-aided tools can be utilized to improve the distinction between image classes and thus potentially reduce false recalls. The emergence of deep learning has shown promising results in the area of biomedical imaging data analysis. This study aimed to investigate deep learning and transfer learning methods that can improve digital mammography classification performance. In particular, we evaluated the effect of pre-training deep learning models with other imaging datasets in order to boost classification performance on a digital mammography dataset. Two types of datasets were used for pre-training: (1) a digitized film mammography dataset, and (2) a very large non-medical imaging dataset. By using either of these datasets to pre-train the network initially, and then fine-tuning with the digital mammography dataset, we found an increase in overall classification performance in comparison to a model without pre-training, with the very large non-medical dataset performing the best in improving the classification accuracy.
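The pre-train-then-fine-tune recipe this study evaluates can be sketched with a toy model: train a classifier on a large "source" dataset, then reuse its weights to warm-start training on a small "target" dataset. This is only an illustrative stand-in (plain logistic regression instead of a deep network, synthetic data instead of mammograms); all names are hypothetical.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_logistic(X, y, w=None, lr=0.1, steps=200):
    """Gradient-descent logistic regression; pass w to warm-start from
    pre-trained weights (the transfer-learning step)."""
    if w is None:
        w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = sigmoid(X @ w)
        w -= lr * X.T @ (p - y) / len(y)
    return w

def log_loss(X, y, w):
    p = np.clip(sigmoid(X @ w), 1e-9, 1 - 1e-9)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))
```

When the source and target tasks share structure, a few fine-tuning steps from the pre-trained weights beat the same number of steps from a cold start on the small target set.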
Nugraha, Muhamad Gina; Kaniawati, Ida; Rusdiana, Dadi; Kirana, Kartika Hajar
Among the purposes of physics learning at high school are to master physics concepts, to cultivate a scientific attitude (including a critical attitude), and to develop inductive and deductive reasoning skills. According to Ennis et al., inductive and deductive reasoning skills are part of critical thinking. Based on preliminary studies, both competences are poorly achieved: student learning outcomes are low, and learning processes are not conducive to cultivating critical thinking (teacher-centered learning). One learning model predicted to increase concept mastery and train critical thinking skills (CTS) is the inquiry learning model aided by computer simulations. In this model, students are given the opportunity to be actively involved in experiments and also receive good explanations through the computer simulations. In research with a randomized control group pretest-posttest design, we found that the inquiry learning model aided by computer simulations improves students' concept mastery significantly more than the conventional (teacher-centered) method. With the inquiry learning model aided by computer simulations, 20% of students had high CTS, 63.3% medium, and 16.7% low. CTS contributes greatly to students' concept mastery, with a correlation coefficient of 0.697, and contributes substantially to the enhancement of concept mastery, with a correlation coefficient of 0.603.
Friedrich, J.; Karslioglu, M. O.
The goal of this work has been to create an easy-to-use and simple-to-make learning tool for remote sensing at an introductory level. Many students struggle to comprehend what seems to be a very basic knowledge of digital images, image processing and image arithmetic, for example. Because professional programs are generally too complex and overwhelming for beginners and often not tailored to the specific needs of a course regarding functionality, a computer-assisted learning (CAL) program was developed based on the unified modeling language (UML), the present standard for object-oriented (OO) system development. A major advantage of this approach is an easier transition from modeling to coding of such an application, if modern UML tools are being used. After introducing the constructed UML model, its implementation is briefly described followed by a series of learning exercises. They illustrate how the resulting CAL tool supports students taking an introductory course in remote sensing at the author's institution.
Bernstein, A. V.; Burnaev, E. V.
Nowadays, machine learning has become one of the basic technologies used in solving various computer vision tasks such as feature detection, image segmentation, object recognition and tracking. In many applications, various complex systems such as robots are equipped with visual sensors from which they learn the state of the surrounding environment by solving corresponding computer vision tasks. Solutions of these tasks are used for making decisions about possible future actions. It is not surprising that when solving computer vision tasks we should take into account special aspects of their subsequent application in model-based predictive control. Reinforcement learning is one of the modern machine learning technologies in which learning is carried out through interaction with the environment. In recent years, reinforcement learning has been used both for solving applied tasks such as processing and analysis of visual information, and for solving specific computer vision problems such as filtering, extracting image features, localizing objects in scenes, and many others. The paper briefly describes the reinforcement learning technology and its use for solving computer vision problems.
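A minimal concrete instance of "learning through interaction with the environment", as surveyed in this abstract, is tabular Q-learning. The chain-world task and all parameters below are invented for illustration and are not from the paper.

```python
import numpy as np

def q_learning(n_states=6, episodes=500, alpha=0.5, gamma=0.9, eps=0.2, seed=0):
    """Tabular Q-learning on a 1-D chain: the agent moves left/right and
    receives reward 1 on reaching the rightmost state."""
    rng = np.random.default_rng(seed)
    Q = np.zeros((n_states, 2))        # actions: 0 = left, 1 = right
    for _ in range(episodes):
        s = 0
        while s != n_states - 1:
            # epsilon-greedy exploration of the environment
            a = rng.integers(2) if rng.random() < eps else int(np.argmax(Q[s]))
            s2 = max(s - 1, 0) if a == 0 else s + 1
            r = 1.0 if s2 == n_states - 1 else 0.0
            # temporal-difference update toward the bootstrapped target
            Q[s, a] += alpha * (r + gamma * Q[s2].max() - Q[s, a])
            s = s2
    return Q
```

After training, the greedy policy moves right from every non-terminal state, i.e. the agent has learned the optimal behavior purely from trial-and-error feedback.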
Erol, B.; Erol, S.
Precise determination of the local geoid is of particular importance for establishing height control in geodetic GNSS applications, since the classical leveling technique is too laborious. A geoid model can be accurately obtained employing properly distributed benchmarks having GNSS and leveling observations using an appropriate computing algorithm. Besides the classical multivariable polynomial regression equations (MPRE), this study attempts an evaluation of learning-based computing algorithms: artificial neural networks (ANNs), the adaptive network-based fuzzy inference system (ANFIS) and especially the wavelet neural networks (WNNs) approach in geoid surface approximation. These algorithms were developed parallel to advances in computer technologies and have recently been used for solving complex nonlinear problems in many applications. However, they are rather new in dealing with the precise modeling problem of the Earth's gravity field. In the scope of the study, these methods were applied to Istanbul GPS Triangulation Network data. The performances of the methods were assessed considering the validation results of the geoid models at the observation points. In conclusion, the ANFIS and WNN revealed higher prediction accuracies compared to the ANN and MPRE methods. Besides their prediction capabilities, these methods were also compared and discussed from a practical point of view in the conclusions.
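The classical MPRE baseline mentioned in this abstract amounts to a least-squares polynomial surface fit to the benchmark points. This sketch (function and variable names assumed, not from the paper) fits h(x, y) on scattered GNSS/leveling points and predicts at query locations.

```python
import numpy as np

def fit_poly_surface(x, y, h, degree=2):
    """Least-squares bivariate polynomial surface h(x, y): the MPRE-style
    baseline for interpolating geoid undulations from benchmark points."""
    terms = [(i, j) for i in range(degree + 1)
                    for j in range(degree + 1 - i)]     # all x^i y^j, i+j<=degree
    A = np.column_stack([x**i * y**j for i, j in terms])
    coeffs, *_ = np.linalg.lstsq(A, h, rcond=None)       # normal-equations solve

    def predict(xq, yq):
        Aq = np.column_stack([xq**i * yq**j for i, j in terms])
        return Aq @ coeffs

    return predict
```

The learning-based methods the study evaluates (ANN, ANFIS, WNN) replace this fixed polynomial basis with adaptively learned nonlinear basis functions, which is why they can fit a rougher geoid surface.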
Goh, Garrett B.; Hodas, Nathan O.; Vishnu, Abhinav [Advanced Computing, Mathematics, and Data Division, Pacific Northwest National Laboratory, 902 Battelle Blvd, Richland, Washington 99354]
The rise and fall of artificial neural networks is well documented in the scientific literature of both the fields of computer science and computational chemistry. Yet almost two decades later, we are now seeing a resurgence of interest in deep learning, a machine learning algorithm based on “deep” neural networks. Within the last few years, we have seen the transformative impact of deep learning in the computer science domain, notably in speech recognition and computer vision, to the extent that the majority of practitioners in those fields are now regularly eschewing prior established models in favor of deep learning models. In this review, we provide an introductory overview into the theory of deep neural networks and their unique properties as compared to traditional machine learning algorithms used in cheminformatics. By providing an overview of the variety of emerging applications of deep neural networks, we highlight their ubiquity and broad applicability to a wide range of challenges in the field, including QSAR, virtual screening, protein structure modeling, QM calculations, materials synthesis and property prediction. In reviewing the performance of deep neural networks, we observed a consistent outperformance against non-neural-network state-of-the-art models across disparate research topics, and deep neural network based models often exceeded the “glass ceiling” expectations of their respective tasks. Coupled with the maturity of GPU-accelerated computing for training deep neural networks and the exponential growth of chemical data on which to train these networks, we anticipate that deep learning algorithms will be a useful tool and may grow into a pivotal role for various challenges in the computational chemistry field.
Sanford, John F.; Naidu, Jaideep T.
The paper argues that mathematical modeling is the essence of computational thinking. Learning a computer language is a valuable assistance in learning logical thinking but of less assistance when learning problem-solving skills. The paper is third in a series and presents some examples of mathematical modeling using spreadsheets at an advanced…
Alresheed, Saleh; Leask, Marilyn; Raiker, Andrea
Computer-assisted language learning (CALL) technology and pedagogy have gained recognition globally for their success in supporting second language acquisition (SLA). In Saudi Arabia, the government aims to provide most educational institutions with computers and networking for integrating CALL into classrooms. However, the recognition of CALL's…
Baldi, P. [California Inst. of Tech., Pasadena, CA (United States)
This tutorial was one of eight tutorials selected to be presented at the Third International Conference on Intelligent Systems for Molecular Biology which was held in the United Kingdom from July 16 to 19, 1995. Computational tools are increasingly needed to process the massive amounts of data, to organize and classify sequences, to detect weak similarities, to separate coding from non-coding regions, and to reconstruct the underlying evolutionary history. The fundamental problem in machine learning is the same as in scientific reasoning in general, as well as statistical modeling: to come up with a good model for the data. In this tutorial four classes of models are reviewed. They are: Hidden Markov models; artificial neural networks; belief networks; and stochastic grammars. When dealing with DNA and protein primary sequences, Hidden Markov models are among the most flexible and powerful tools for alignments and database searches. In this tutorial, attention is focused on the theory of Hidden Markov models, and how to apply them to problems in molecular biology.
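As a concrete companion to the Hidden Markov model discussion above, here is a log-space Viterbi decoder, the standard algorithm for recovering the most likely hidden-state path (e.g. a coding/non-coding annotation of a sequence). The two-state example in the test is invented, not taken from the tutorial.

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Most likely hidden-state path of an HMM, computed in log space.
    obs: observed symbol indices; pi: initial probs (S,);
    A: transition matrix (S, S); B: emission matrix (S, O)."""
    S, T = len(pi), len(obs)
    logd = np.log(pi) + np.log(B[:, obs[0]])      # best log-prob ending in each state
    back = np.zeros((T, S), dtype=int)            # backpointers for the path
    for t in range(1, T):
        scores = logd[:, None] + np.log(A)        # scores[i, j]: come from i, go to j
        back[t] = np.argmax(scores, axis=0)
        logd = scores[back[t], np.arange(S)] + np.log(B[:, obs[t]])
    path = [int(np.argmax(logd))]                 # backtrace from the best final state
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]
```

With "sticky" transitions and state-biased emissions, the decoder segments a symbol run into contiguous state blocks, which is the mechanism behind HMM-based sequence annotation.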
Goh, Garrett B; Hodas, Nathan O; Vishnu, Abhinav
The rise and fall of artificial neural networks is well documented in the scientific literature of both computer science and computational chemistry. Yet almost two decades later, we are now seeing a resurgence of interest in deep learning, a machine learning algorithm based on multilayer neural networks. Within the last few years, we have seen the transformative impact of deep learning in many domains, particularly in speech recognition and computer vision, to the extent that the majority of expert practitioners in those fields are now regularly eschewing prior established models in favor of deep learning models. In this review, we provide an introductory overview into the theory of deep neural networks and their unique properties that distinguish them from traditional machine learning algorithms used in cheminformatics. By providing an overview of the variety of emerging applications of deep neural networks, we highlight their ubiquity and broad applicability to a wide range of challenges in the field, including quantitative structure activity relationship, virtual screening, protein structure prediction, quantum chemistry, materials design, and property prediction. In reviewing the performance of deep neural networks, we observed a consistent outperformance against non-neural-network state-of-the-art models across disparate research topics, and deep neural network-based models often exceeded the "glass ceiling" expectations of their respective tasks. Coupled with the maturity of GPU-accelerated computing for training deep neural networks and the exponential growth of chemical data on which to train these networks, we anticipate that deep learning algorithms will be a valuable tool for computational chemistry. © 2017 Wiley Periodicals, Inc.
Development and Study the Usage of Blended Learning Environment Model Using Engineering Design Concept Learning Activities to Computer Programming Courses for Undergraduate Students of Rajabhat Universities
The objectives of this research were to study and synthesise the components, to develop, and to study the usage of a blended learning environment model using engineering design concept learning activities in computer programming courses for undergraduate students of Rajabhat universities. The research methodology was divided into 3 phases. Phase I: surveying the present state, needs and problems in teaching computer programming among 52 lecturers, using in-depth interviews with 5 experienced lecturers; the model's elements were evaluated by 5 experts; the tools were a questionnaire, an interview form, and a model-elements assessment form. Phase II: developing the model of the blended learning environment and learning activities based on engineering design processes, confirmed by 8 experts; the tools were the draft of the learning environment, courseware, and assessment forms. Phase III: evaluating the effects of using the implemented environment; the samples were students formed into 2 groups by cluster random sampling, 25 people in the experiment group and 27 in the control group; the tools were the learning environment, courseware, and assessment tools. The statistics used in this research were means, standard deviation, dependent t-test, and one-way MANOVA. The results found that: (1) lecturers quite agreed with the physical, mental, social, and information learning environment, learning processes, and assessments; all needs were at a high level; however, physical environment problems were at a high level while other aspects were quite low; (2) the developed learning environment had 4 components: (a) 4 types of environments, (b) the inputs, which included the blended learning environment, learning motivation factors, and computer programming content, (c) the processes, which were analysis of state objectives, design of the learning environment and activities, development of the learning environment and testing materials, implementation, and evaluation, and (d) the outputs
Moon, Man-Ki; Jahng, Surng-Gahb; Kim, Tae-Yong
The aim of this research was to construct a motivational model which would stimulate voluntary and proactive learning using digital game methods offering players more freedom and control. The theoretical framework of this research lays the foundation for a pedagogical learning model based on digital games. We analyzed the game reward system, which…
Vivian V. Valentin
The evidence is now good that different memory systems mediate the learning of different types of category structures. In particular, declarative memory dominates rule-based (RB) category learning and procedural memory dominates information-integration (II) category learning. For example, several studies have reported that feedback timing is critical for II category learning, but not for RB category learning – results that have broad support within the memory systems literature. Specifically, II category learning has been shown to be best with feedback delays of 500 ms compared to delays of 0 and 1000 ms, and highly impaired with delays of 2.5 seconds or longer. In contrast, RB learning is unaffected by any feedback delay up to 10 seconds. We propose a neurobiologically detailed theory of procedural learning that is sensitive to different feedback delays. The theory assumes that procedural learning is mediated by plasticity at cortical-striatal synapses that are modified by dopamine-mediated reinforcement learning. The model captures the time course of the biochemical events in the striatum that cause synaptic plasticity, and thereby accounts for the empirical effects of various feedback delays on II category learning.
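The delay sensitivity described in this abstract can be caricatured with a single synaptic eligibility trace that decays between response and feedback. The toy model below (all constants and names invented) is not the authors' biochemical model; it only demonstrates the qualitative claim that an exponentially decaying trace makes long feedback delays degrade reinforcement learning.

```python
import numpy as np

def delayed_feedback_learning(delay_ms, trials=500, tau=1000.0, lr=0.3, seed=0):
    """Toy trace-decay reinforcement learning on a two-choice task where
    action 1 is rewarded. The eligibility trace decays exponentially
    (time constant tau, ms) between response and feedback, so the
    dopamine-gated weight update shrinks with the feedback delay.
    Returns the final probability of choosing the correct action."""
    rng = np.random.default_rng(seed)
    w = np.zeros(2)
    trace_strength = np.exp(-delay_ms / tau)     # residual eligibility at feedback
    for _ in range(trials):
        p = np.exp(w) / np.exp(w).sum()          # softmax action selection
        a = rng.choice(2, p=p)
        r = 1.0 if a == 1 else 0.0
        w[a] += lr * trace_strength * (r - p[a])  # reward signal x eligibility
    return np.exp(w)[1] / np.exp(w).sum()
```

With a 500 ms delay learning converges on the rewarded action, while a 5 s delay leaves the trace almost fully decayed and learning stalls, mirroring the II-learning impairment at long delays.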
Marek, Michael W.; Wu, Wen-Chi Vivian
This conceptual, interdisciplinary inquiry explores Complex Dynamic Systems as the concept relates to the internal and external environmental factors affecting computer assisted language learning (CALL). Based on the results obtained by de Rosnay ["World Futures: The Journal of General Evolution", 67(4/5), 304-315 (2011)], who observed…
Rapeepisarn, Kowit; Wong, Kok Wai; Fung, Chun Che; Khine, Myint Swe
When designing Educational Computer Games, designers usually consider target age, interactivity, interface and other related issues. They rarely explore the genres which should employ into one type of educational game. Recently, some digital game-based researchers made attempt to combine game genre with learning theory. Different researchers use…
Thalmeier, Dominik; Uhlmann, Marvin; Kappen, Hilbert J.; Memmesheimer, Raoul-Martin
Providing the neurobiological basis of information processing in higher animals, spiking neural networks must be able to learn a variety of complicated computations, including the generation of appropriate, possibly delayed reactions to inputs and the self-sustained generation of complex activity patterns, e.g. for locomotion. Many such computations require previous building of intrinsic world models. Here we show how spiking neural networks may solve these different tasks. Firstly, we derive constraints under which classes of spiking neural networks lend themselves as substrates of powerful general-purpose computing. The networks contain dendritic or synaptic nonlinearities and have a constrained connectivity. We then combine such networks with learning rules for outputs or recurrent connections. We show that this allows learning of even difficult benchmark tasks such as the self-sustained generation of desired low-dimensional chaotic dynamics or memory-dependent computations. Furthermore, we show how spiking networks can build models of external world systems and use the acquired knowledge to control them.
Alonso, Fernando; Manrique, Daniel; Vines, Jose M.
This paper presents a novel instructional model for e-learning and an evaluation study to determine the effectiveness of this model for teaching Java language programming to information technology specialists working for the Spanish Public Administration. This is a general-purpose model that combines objectivist and constructivist learning…
Dickes, Amanda Catherine; Sengupta, Pratim
In this paper, we investigate how elementary school students develop multi-level explanations of population dynamics in a simple predator-prey ecosystem, through scaffolded interactions with a multi-agent-based computational model (MABM). The term "agent" in an MABM indicates individual computational objects or actors (e.g., cars), and these…
Jin Joo eLee
We present a computational model capable of predicting, above human accuracy, the degree of trust a person has toward a novel partner by observing the trust-related nonverbal cues expressed in their social interaction. We summarize our prior work, in which we identify nonverbal cues that signal untrustworthy behavior and also demonstrate the human mind's readiness to interpret those cues to assess the trustworthiness of a social robot. We demonstrate that domain knowledge gained from our prior work using human-subjects experiments, when incorporated into the feature engineering process, permits a computational model to outperform both human predictions and a baseline model built in naiveté of this domain knowledge. We then present the construction of hidden Markov models to incorporate temporal relationships among the trust-related nonverbal cues. By interpreting the resulting learned structure, we observe that models built to emulate different levels of trust exhibit different sequences of nonverbal cues. From this observation, we derived sequence-based temporal features that further improve the accuracy of our computational model. Our multi-step research process presented in this paper combines the strength of experimental manipulation and machine learning to not only design a computational trust model but also to further our understanding of the dynamics of interpersonal trust.
Broekens, Douwe Joost
In this thesis we have studied the influence of emotion on learning. We have used computational modelling techniques to do so, more specifically, the reinforcement learning paradigm. Emotion is modelled as artificial affect, a measure that denotes the positiveness versus negativeness of a situation
Baston, Chiara; Ursino, Mauro
Basal ganglia (BG) are involved in many motor and cognitive tasks, such as action selection, and have a central role in many pathologies, primarily Parkinson's disease. In the present work, we use a recently developed biologically inspired BG model to analyze how the dopamine (DA) level can affect the temporal response during action selection, and the capacity to learn new actions following rewards and punishments. The model incorporates the 3 main pathways (direct, indirect and hyperdirect) involved in BG functioning. The behavior of 2 alternative networks (the first with normal DA levels, the second with reduced DA) is analyzed both in untrained conditions and during training performed over different epochs. The results show that reduced DA causes delayed temporal responses in the untrained network and difficulty in learning during training, characterized by the need for many more epochs. The results provide interesting hints for understanding the behavior of healthy and dopamine-depleted subjects, such as parkinsonian patients.
Jarnawi Afgani Dahlan; Yaya Sukjaya Kusumah; Mr Heri Sutarno
The focus of this research is the development of mathematics teaching and learning activities based on the application of computer software. The aims of the research are as follows: 1) to identify mathematics topics that are feasible to present via computer-based e-learning, 2) to design, develop, and implement computer-based e-learning in mathematics, and 3) to analyze the impact of computer-based e-learning on the enhancement of SMA students' higher-order mathematical thinking. All activ...
Shinohara, Ayumi; Miyano, Satoru
This paper considers computational learning from the viewpoint of teaching. We introduce a notion of teachability with which we establish a relationship between learnability and teachability. We also discuss the complexity issues of a teacher in relation to learning.
Recent work in the decision-making field offers an account of dual-system theory for the decision-making process. This theory holds that the process is conducted by two main controllers: a goal-directed system and a habitual system. In the reinforcement learning (RL) domain, habitual behaviors are connected with model-free methods, in which appropriate actions are learned through trial-and-error experience, whereas goal-directed behaviors are associated with model-based methods of RL, in which actions are selected using a model of the environment. Studies on cognitive control also suggest that during processes like decision-making, some cortical and subcortical structures work in concert to monitor the consequences of decisions and to adjust control according to current task demands. Here a computational model is presented based on dual-system theory and the cognitive control perspective of decision-making. The proposed model is used to simulate human performance on a variant of a probabilistic learning task. The basic proposal is that the brain implements a dual controller, while an accompanying monitoring system detects several kinds of conflict, including a hypothetical cost conflict. The simulation results address existing theories about two event-related potentials, namely error-related negativity (ERN) and feedback-related negativity (FRN), and explore the best account of them. Based on the results, some testable predictions are also presented. Copyright © 2015 Elsevier Ltd. All rights reserved.
Kasaie, Parastu; David Kelton, W; Ancona, Rachel M; Ward, Michael J; Froehle, Craig M; Lyons, Michael S
Computer simulation is a highly advantageous method for understanding and improving health care operations with a wide variety of possible applications. Most computer simulation studies in emergency medicine have sought to improve allocation of resources to meet demand or to assess the impact of hospital and other system policies on emergency department (ED) throughput. These models have enabled essential discoveries that can be used to improve the general structure and functioning of EDs. Theoretically, computer simulation could also be used to examine the impact of adding or modifying specific provider tasks. Doing so involves a number of unique considerations, particularly in the complex environment of acute care settings. In this paper, we describe conceptual advances and lessons learned during the design, parameterization, and validation of a computer simulation model constructed to evaluate changes in ED provider activity. We illustrate these concepts using examples from a study focused on the operational effects of HIV screening implementation in the ED. Presentation of our experience should emphasize the potential for application of computer simulation to study changes in health care provider activity and facilitate the progress of future investigators in this field. © 2017 by the Society for Academic Emergency Medicine.
Describes instructional uses of computer programs found in David Heiserman's book "Projects in Machine Intelligence for Your Home Computer." The programs feature "creatures" of various colors that move around within a rectangular white border. (JN)
Whiteson, S.; Wiering, M.; van Otterlo, M.
Algorithms for evolutionary computation, which simulate the process of natural selection to solve optimization problems, are an effective tool for discovering high-performing reinforcement-learning policies. Because they can automatically find good representations, handle continuous action spaces,
José Manuel Freixo Nunes; Teresa Margarida Loureiro Cardoso
Computational thinking can be thought of as an approach to problem solving which has been applied to different areas of learning and which has become an important field of investigation in the area of educational research.
Monthienvichienchai, Rachada; Sasse, M. Angela
This paper investigates how computer support for vicarious learning can be implemented by taking a principled approach to selecting and combining different media to capture educational dialogues. The main goal is to create vicarious learning materials of appropriate pedagogic content and production quality, and at the same time minimize the…
Ryan Thomas Philips
Cerebral vascular dynamics are generally thought to be controlled by neural activity in a unidirectional fashion. However, both computational modeling and experimental evidence point to feedback effects of vascular dynamics on neural activity. Vascular feedback in the form of glucose and oxygen controls neuronal ATP, either directly or via the agency of astrocytes, which in turn modulates neural firing. Recently, a detailed model of the neuron-astrocyte-vessel system has shown how vasomotion can modulate neural firing. Similarly, arguing from known cerebrovascular physiology, an approach known as the 'hemoneural hypothesis' postulates functional modulation of neural activity by vascular feedback. To instantiate this perspective, we present a computational model in which a network of 'vascular units' supplies energy to a neural network. The complex dynamics of the vascular network, modeled by a network of oscillators, turns neurons ON and OFF randomly. The informational consequence of such dynamics is explored in the context of an auto-encoder network. In the proposed model, each vascular unit supplies energy to a subset of hidden neurons of the auto-encoder network, which constitutes its 'projective field'. Neurons that receive adequate energy in a given trial have a reduced threshold, and thus are prone to fire. Dynamics of the vascular network are governed by changes in the reconstruction error of the auto-encoder network, interpreted as the neuronal demand. Vascular feedback causes random inactivation of a subset of hidden neurons in every trial. We observe that, under conditions of desynchronized vascular dynamics, the output reconstruction error is low and the feature vectors learnt are sparse and independent. Our earlier modeling study highlighted the link between desynchronized vascular dynamics and efficient energy delivery in skeletal muscle. We now show that desynchronized vascular dynamics leads to efficient training in an auto-encoder network.
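The random ON/OFF gating of hidden units described in this abstract can be mimicked, very crudely, with a tied-weight linear auto-encoder whose hidden units are masked at random each epoch. This is an assumption-laden stand-in for the paper's oscillator-driven model: here the mask is i.i.d. random rather than produced by vascular oscillators, and the network is linear for brevity.

```python
import numpy as np

def train_masked_autoencoder(X, n_hidden=8, p_on=0.8, lr=0.005, epochs=400, seed=0):
    """Tied-weight linear auto-encoder whose hidden units are randomly
    switched ON/OFF each epoch -- a crude stand-in for vascular units
    gating the 'projective fields' of hidden neurons."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = 0.1 * rng.normal(size=(d, n_hidden))
    for _ in range(epochs):
        mask = (rng.random(n_hidden) < p_on).astype(float)  # which units get energy
        H = (X @ W) * mask                  # only energized units fire
        err = H @ W.T - X                   # reconstruction error ~ neuronal demand
        # gradient of 0.5*||err||^2 w.r.t. W through both encoder and decoder
        grad = err.T @ H + X.T @ ((err @ W) * mask)
        W -= lr * grad / n
    return W
```

Despite a random subset of hidden units being silenced every epoch, training still drives the reconstruction error well below the trivial (all-zero output) baseline, the qualitative point of the gating scheme.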
The report contains a description and summary of computer-augmented learning devices and systems. The devices are of two general types: programmed instruction systems based on the teaching machines pioneered by Pressey and developed by Skinner, and the so-called "docile" systems that permit greater user direction with the computer under student…
If ubiquitous computing becomes a reality and is widely adopted, it will inevitably have an impact on education. This article reviews the background of ubiquitous computing and current research projects done involving educational "ubicomp." Finally it explores how ubicomp may and may not change education in both formal and informal settings and…
Computer Based ... universities, and later did system analysis, ... personal computers (PC) and low-cost software packages and tools. They can serve as a useful learning experience through student projects. Models are ... Let us consider a numerical example: to calculate the velocity of a trainer aircraft ...
Teachers are using computer programs in conjunction with many classroom staples such as art supplies, math manipulatives, and science reference books. Twelve software programs and related activities are described which teach visual and auditory memory and spatial relations, as well as subject areas such as anatomy and geography. (MT)
Arnautova, Yelena A.; Vila, Jorge A.; Martin, Osvaldo A.; Scheraga, Harold A.
The room-temperature X-ray structures of ubiquitin and of the RNA-binding domain of nonstructural protein 1 of influenza A virus, solved at 1.8 and 1.9 Å resolution, respectively, were used to investigate whether a set of conformations, rather than a single X-ray structure, provides better agreement with both the X-ray data and the observed 13Cα chemical shifts in solution. For this purpose, a set of new conformations for each of these proteins was generated by fitting them to the experimental X-ray data deposited in the PDB. For each of the generated structures, which show R and Rfree factors similar to those of the deposited X-ray structure, the 13Cα chemical shifts of all residues in the sequence were computed at the DFT level of theory. The sets of conformations were then evaluated by their ability to reproduce the observed 13Cα chemical shifts, using the conformational-average root-mean-square deviation (ca-r.m.s.d.). For ubiquitin, the computed set of conformations is a better representation of the observed 13Cα chemical shifts in terms of the ca-r.m.s.d. than a single X-ray-derived structure. However, for the RNA-binding domain of nonstructural protein 1 of influenza A virus, consideration of an ensemble of conformations does not improve the agreement with the observed 13Cα chemical shifts. Whether an ensemble of conformations rather than any single structure is a more accurate representation of a protein structure in the crystal, as well as of the observed 13Cα chemical shifts, is determined by the dispersion of coordinates, in terms of the all-atom r.m.s.d. among the generated models; these generated models satisfy the experimental X-ray data with
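One plausible reading of the ca-r.m.s.d. criterion used above (the exact definition is given in the paper; the formula below is an assumption for illustration) is to average the shifts predicted for each ensemble member, residue by residue, before taking the r.m.s.d. against the observed shifts:

```python
import numpy as np

def ca_rmsd(predicted, observed):
    """Conformational-average r.m.s.d.: average the chemical shifts predicted
    for each ensemble member, then take the r.m.s.d. to the observed shifts.
    predicted: (n_conformations, n_residues); observed: (n_residues,)."""
    mean_pred = predicted.mean(axis=0)
    return float(np.sqrt(np.mean((mean_pred - observed) ** 2)))
```

Under this reading, an ensemble whose members scatter symmetrically around the true shifts scores better than a typical single member, which is the ubiquitin result in miniature; when the scatter is biased, averaging does not help, as for the NS1 domain.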
The field of computational learning theory arose out of the desire to formally understand the process of learning. As potential applications to artificial intelligence became apparent, the new field grew rapidly. The learning of geometric objects became a natural area of study. The possibility of using learning techniques to compensate for unsolvability provided an attraction for individuals with an immediate need to solve such difficult problems. Researchers at the Center for Night Vision were interested in solving the problem of interpreting data produced by a variety of sensors. Current vision techniques, which have a strong geometric component, can be used to extract features. However, these techniques fall short of useful recognition of the sensed objects. One potential solution is to incorporate learning techniques into the geometric manipulation of sensor data. As a first step toward realizing such a solution, the Systems Research Center at the University of Maryland, in conjunction with the C...
Beatty, Ken; Nunan, David
The study reported here investigates collaborative learning at the computer. Ten pairs of students were presented with a series of comprehension questions about Mary Shelley's novel "Frankenstein or a Modern Prometheus" along with a CD-ROM, "Frankenstein Illuminated," containing the novel and a variety of source material. Five students worked with…
Wee, Loo Kang; Ning, Hwee Tiang
This paper presents the customization of Easy Java Simulation models, used with actual laboratory instruments, to create active experiential learning for measurements. The laboratory instruments are the vernier caliper and the micrometer. Three computer model design ideas that complement real equipment are discussed. These ideas involve (1) a…
.... With the goal of achieving robustness, our research at UCR is directed towards learning parameters, feedback, contexts, features, concepts, and strategies of IU algorithms for model-based object recognition...
Gao, Wei; Zhu, Linli
The gradient learning model has attracted considerable attention in view of its promising applications in statistics, data dimensionality reduction, and other fields. In this paper, we propose a new gradient learning model for ontology similarity measuring and ontology mapping in the multi-dividing setting. The sample error in this setting is estimated by virtue of the hypothesis space and the ontology dividing operator trick. Finally, two experiments on data from the plant and humanoid-robotics fields verify the efficiency of the new computational model for ontology similarity measurement and ontology mapping in the multi-dividing setting. PMID:25530752
Computational Neurogenetic Modeling is a student text, introducing the scope and problems of a new scientific discipline - Computational Neurogenetic Modeling (CNGM). CNGM is concerned with the study and development of dynamic neuronal models for modeling brain functions with respect to genes and dynamic interactions between genes. These include neural network models and their integration with gene network models. This new area brings together knowledge from various scientific disciplines, such as computer and information science, neuroscience and cognitive science, genetics and molecular biol
The research aims to assist the students of the Electrical and Electronics Department of Politeknik Negeri Jakarta (State Polytechnic of Jakarta, Indonesia) in learning technical English vocabulary. As technical students, they study ESP (English for Specific Purposes) and face obstacles in memorizing technical vocabulary, which is essential for reading and understanding manuals for the laboratory and workshop. Examples of English technical vocabulary include "generate", "pile", and "bench". The research outcome is software that benefits technical students, especially electrical and electronics students, who can use it to practice their vocabulary skills and become more skillful and knowledgeable. The software was designed using Rapid E-Learning Suite Version 5.2 and Flash CS3. It contains exercises on reading texts and reading-comprehension questions presented with multiple-choice answers. The software is handy and flexible: it is saved on a CD (compact disc), so students can take it with them and study anywhere and anytime; in other words, they have the flexibility to learn and practice English technical vocabulary. As a result, the students have found a way to overcome their problems in memorizing vocabulary. The product is easy to use and portable. It consists of three sections of exercises, and at the end of each exercise students are evaluated automatically by the scoring system. This encourages them to improve their scores through repetition, so the technical words are no longer a problem for them. Furthermore, the students can practice technical English vocabulary both at home and
The last few years have witnessed fast development on dictionary learning approaches for a set of visual computing tasks, largely due to their utilization in developing new techniques based on sparse representation. Compared with conventional techniques employing manually defined dictionaries, such as Fourier Transform and Wavelet Transform, dictionary learning aims at obtaining a dictionary adaptively from the data so as to support optimal sparse representation of the data. In contrast to conventional clustering algorithms like K-means, where a data point is associated with only one cluster c
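The contrast drawn above can be sketched in code: where K-means associates a data point with a single centroid, sparse coding over a dictionary represents a point as a combination of a few atoms. The following is a toy matching-pursuit sketch over a hand-picked (hypothetical) dictionary; it illustrates the representation, not any particular dictionary-learning algorithm.

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matching_pursuit(x, dictionary, n_atoms=2):
    """Greedy sparse coding: approximate x with a few dictionary atoms
    (unlike K-means, which would assign x to exactly one centroid)."""
    residual = list(x)
    code = {}
    for _ in range(n_atoms):
        # Pick the atom most correlated with the current residual.
        best = max(range(len(dictionary)),
                   key=lambda j: abs(dot(residual, dictionary[j])))
        coeff = dot(residual, dictionary[best])
        code[best] = code.get(best, 0.0) + coeff
        residual = [r - coeff * d for r, d in zip(residual, dictionary[best])]
    return code, residual

# Unit-norm atoms (a hand-picked dictionary, purely for illustration).
atoms = [[1.0, 0.0], [0.0, 1.0], [math.sqrt(0.5), math.sqrt(0.5)]]
code, residual = matching_pursuit([3.0, 4.0], atoms)
print(code)  # the signal is explained by more than one atom
```

In a real dictionary-learning method the atoms themselves would be fitted to the data (so that such sparse codes become accurate) rather than fixed in advance.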
Borja, Ronaldo I
There have been many excellent books written on the subject of plastic deformation in solids, but rarely can one find a textbook on this subject. “Plasticity Modeling & Computation” is a textbook written specifically for students who want to learn the theoretical, mathematical, and computational aspects of inelastic deformation in solids. It adopts a simple narrative style that is not mathematically overbearing, and has been written to emulate a professor giving a lecture on this subject inside a classroom. Each section is written to provide a balance between the relevant equations and the explanations behind them. Where relevant, sections end with one or more exercises designed to reinforce the understanding of the “lecture.” Color figures enhance the presentation and make the book very pleasant to read. For professors planning to use this textbook for their classes, the contents are sufficient for Parts A and B that can be taught in sequence over a period of two semesters or quarters.
Park, Jooyoung; Inoue, Atsushi
As users and consumers demand smarter devices, intelligent systems are being revolutionized by machine learning. Machine learning, as part of intelligent systems, is already one of the most critical components in everyday tools ranging from search engines and credit-card fraud detection to stock-market analysis. Machines can be trained to detect, diagnose, and solve a variety of problems automatically. Intelligent systems have made rapid progress in advancing the state of the art in machine learning based on smart and deep perception. Using machine learning, intelligent systems find wide application in automated speech recognition, natural language processing, medical diagnosis, bioinformatics, and robot locomotion. This book introduces how to handle substantial amounts of data, teach machines, and improve decision-making models, and it specializes in the development of advanced intelligent systems through machine learning. It...
Active Learning with Statistical Models. David A. Cohn, Zoubin Ghahramani, and Michael I. Jordan. MIT Department of Brain and Cognitive Sciences, Center for Biological and Computational Learning, A.I. Memo No. 1522, C.B.C.L. Paper No. 110, January 9, 1995. Supported by NSF grants ASC-9217041 and CDA-9309300. Keywords: AI, MIT, artificial intelligence, active learning, queries, locally weighted regression, LOESS, mixtures of Gaussians.
Liaw, S T; Marty, J J
To develop and evaluate a strategy to teach skills and issues associated with computers in the consultation. An overview lecture plus a workshop before and a workshop after practice placements, during the 10-week general practice (GP) term in the 5th year of the University of Melbourne medical course. Pre- and post-intervention study using a mix of qualitative and quantitative methods within a strategic evaluation framework. Self-reported attitudes and skills with clinical applications before, during and after the intervention. Most students had significant general computer experience but little in the medical area. They found the workshops relevant, interesting and easy to follow. The role-play approach facilitated students' learning of relevant communication and consulting skills and an appreciation of issues associated with using the information technology tools in simulated clinical situations to augment and complement their consulting skills. The workshops and exposure to GP systems were associated with an increase in the use of clinical software, more realistic expectations of existing clinical and medical record software and an understanding of the barriers to the use of computers in the consultation. The educational intervention assisted students to develop and express an understanding of the importance of consulting and communication skills in teaching and learning about medical informatics tools, hardware and software design, workplace issues and the impact of clinical computer systems on the consultation and patient care.
Dosher, Barbara; Lu, Zhong-Lin
Visual perceptual learning through practice or training can significantly improve performance on visual tasks. Originally seen as a manifestation of plasticity in the primary visual cortex, perceptual learning is more readily understood as improvements in the function of brain networks that integrate processes, including sensory representations, decision, attention, and reward, and balance plasticity with system stability. This review considers the primary phenomena of perceptual learning, theories of perceptual learning, and perceptual learning's effect on signal and noise in visual processing and decision. Models, especially computational models, play a key role in behavioral and physiological investigations of the mechanisms of perceptual learning and for understanding, predicting, and optimizing human perceptual processes, learning, and performance. Performance improvements resulting from reweighting or readout of sensory inputs to decision provide a strong theoretical framework for interpreting perceptual learning and transfer that may prove useful in optimizing learning in real-world applications.
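The reweighting account mentioned above can be illustrated with a minimal simulation: a decision unit reads out noisy sensory channels, and simple error-driven learning shifts weight onto the informative channel, improving accuracy over trials. This is a sketch of the general idea only, not any specific published model; all parameters and the channel structure are invented.

```python
import random

random.seed(0)

def simulate_reweighting(trials=2000, lr=0.05):
    """Toy reweighting model of perceptual learning: a readout of two
    sensory channels (one informative, one pure noise) is adapted by a
    delta rule, shifting weight toward the informative channel."""
    w = [0.1, 0.9]            # start with most weight on the noise channel
    correct = []
    for _ in range(trials):
        stim = random.choice([-1.0, 1.0])        # stimulus category
        ch = [stim + random.gauss(0, 0.5),       # informative channel
              random.gauss(0, 0.5)]              # noise channel
        y = w[0] * ch[0] + w[1] * ch[1]          # decision variable
        correct.append((y > 0) == (stim > 0))
        # Delta rule: nudge the readout toward the correct response.
        err = stim - y
        w = [w[i] + lr * err * ch[i] for i in range(2)]
    early = sum(correct[:200]) / 200
    late = sum(correct[-200:]) / 200
    return early, late, w

early, late, w = simulate_reweighting()
print(early, late, w)  # accuracy rises as weight shifts to channel 0
```

The sensory representations themselves never change here; only the readout weights do, which is the core of the reweighting interpretation of perceptual learning.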
Juan Gonzálo Restrepo Salazar
Computer programs are being used for teaching and learning pharmacology and physiology at the University of Antioquia in Medellín, Colombia. They should be more widely used, since they offer clear advantages over traditional systems of teaching: they allow direct presentation of models in motion, as well as a more active, interesting, and flexible way of learning; besides, they can save time and cut costs. Computer simulation models are learning programs for teaching subjects such as pharmacology and physiology to students in the health sciences and basic biomedical sciences. Simulation experiments can be used to support teaching and, in some circumstances, as an alternative to laboratory practicals. The computer technology now available allows the direct presentation of models in motion and makes possible a less passive, more efficient, and more interesting kind of learning. The simulated response of tissues is generated either from the results of actual experiments or from predictive models, and is presented on screen with high-resolution graphics comparable to real situations. Students can carry out simulated experiments, easily change their parameters, and obtain information just as if they had performed the experiment in the laboratory.
Baihui Yan; Qiao Zhou
To better develop and improve students' music learning, the authors propose a method of music learning based on computer software. Using computer music software to assist teaching is still a new field. We conducted an in-depth analysis of computer-enabled music learning and the status of music learning in secondary schools, obtaining specific analytical data. The survey data show that students have many cognitive problems in the current music classroom, and teachers have not yet found reasonable countermeasures. Against this background, the introduction of computer music software into music learning is a new trial that can not only cultivate students' initiative in music learning, but also enhance their ability to learn music. It is therefore concluded that computer-software-based music learning is of great significance for improving current music learning modes and means.
The CMS experiment at the LHC has developed a baseline Computing Model addressing the needs of a computing system capable of operating in the first years of LHC running. It is focused on a data model with heavy streaming at the raw-data level based on triggers, and on achieving maximum flexibility in the use of distributed computing resources. The CMS distributed Computing Model includes a Tier-0 centre at CERN, a CMS Analysis Facility at CERN, several Tier-1 centres located at large regional computing centres, and many Tier-2 centres worldwide. The workflows have been identified, along with a baseline architecture for the data management infrastructure. This model is also being tested in Grid Service Challenges of increasing complexity, coordinated with the Worldwide LHC Computing Grid community.
Hosam Farouk El-Sofany
Cloud computing is a new computing model in which grid computing, distributed computing, parallel computing, and virtualization technologies together define the shape of a new technology. It is the core technology of the next generation of network computing platforms; in the field of education in particular, cloud computing is the basic environment and platform for future e-learning. It provides secure data storage, convenient internet services, and strong computing power. This article focuses on the application of cloud computing in e-learning environments. The research study shows that the cloud platform is valuable for both students and instructors in achieving course objectives. The paper presents the nature, benefits, and services of cloud computing as a platform for an e-learning environment.
Fellous, J M; Linster, C
Computational modeling of neural substrates provides an excellent theoretical framework for the understanding of the computational roles of neuromodulation. In this review, we illustrate, with a large number of modeling studies, the specific computations performed by neuromodulation in the context of various neural models of invertebrate and vertebrate preparations. We base our characterization of neuromodulations on their computational and functional roles rather than on anatomical or chemical criteria. We review the main framework in which neuromodulation has been studied theoretically (central pattern generation and oscillations, sensory processing, memory and information integration). Finally, we present a detailed mathematical overview of how neuromodulation has been implemented at the single cell and network levels in modeling studies. Overall, neuromodulation is found to increase and control computational complexity.
Buscema, Massimo; Sacco, Pier Luigi
We propose an alternative approach to "deep" learning that is based on computational ecologies of structurally diverse artificial neural networks, and on dynamic associative memory responses to stimuli. Rather than focusing on massive computation of many different examples of a single situation, we opt for model-based learning and adaptive flexibility. Cross-fertilization of learning processes across multiple domains is the fundamental feature of human intelligence that must inform "new" artificial intelligence.
The Effects of Computer-Assisted Instruction Designed According to 7E Model of Constructivist Learning on Physics Student Teachers' Achievement, Concept Learning, Self-Efficacy Perceptions and Attitudes
Kocakaya, Serhat; Gonen, Selahattin
The purpose of this study was to investigate the effects of Computer-Assisted Instruction designed according to the 7E model of constructivist learning (CAI7E), related to the "electrostatic" topic, on physics student teachers' cognitive development, misconceptions, self-efficacy perceptions, and attitudes. The study was conducted in 2006-2007…
van der Maaten, L.J.P.; Boon, P.; Lange, G.; Paijmans, J.J.; Postma, E.
Until now, computer vision and machine learning techniques have barely contributed to the archaeological domain. The use of these techniques can support archaeologists in their assessment and classification of archaeological finds. The paper illustrates the use of computer vision techniques for
Caplan, B.; Morrison, A.; Moore, J. C.; Berkowitz, A. R.
Understanding water is central to understanding environmental challenges. Scientists use `big data' and computational models to develop knowledge about the structure and function of complex systems, and to make predictions about changes in climate, weather, hydrology, and ecology. Large environmental systems-related data sets and simulation models are difficult for high school teachers and students to access and make sense of. Comp Hydro, a collaboration across four states and multiple school districts, integrates computational thinking and data-related science practices into water systems instruction to enhance development of scientific model-based reasoning, through curriculum, assessment and teacher professional development. Comp Hydro addresses the need for 1) teaching materials for using data and physical models of hydrological phenomena, 2) building teachers' and students' comfort or familiarity with data analysis and modeling, and 3) infusing the computational knowledge and practices necessary to model and visualize hydrologic processes into instruction. Comp Hydro teams in Baltimore, MD and Fort Collins, CO are integrating teaching about surface water systems into high school courses focusing on flooding (MD) and surface water reservoirs (CO). This interactive session will highlight the successes and challenges of our physical and simulation models in helping teachers and students develop proficiency with computational thinking about surface water. We also will share insights from comparing teacher-led vs. project-led development of curriculum and our simulations.
Enin, S. S.; Omelchenko, E. Y.; Fomin, N. V.; Beliy, A. V.
This paper describes a computer model of an overhead crane system. The designed overhead crane system consists of hoisting, trolley, and crane mechanisms as well as a two-axis payload system. With the help of the differential equations of motion of the specified mechanisms, derived through the Lagrange equation of the second kind, an overhead crane computer model can be built. The computer model was implemented in Matlab. Transients of coordinate, linear speed, and motor torque for the trolley and crane mechanism systems were simulated. In addition, transients of payload sway were obtained with respect to the vertical axis. A trajectory of the trolley mechanism operating simultaneously with the crane mechanism is presented in the paper, as well as a two-axis trajectory of the payload. The designed computer model of an overhead crane is a useful means for studying positioning-control and anti-sway control systems.
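The payload-sway behaviour studied in such crane models can be sketched with a linearized single-pendulum approximation: trolley acceleration a(t) excites a pendulum of cable length L via theta'' = -(g/L)*theta - a(t)/L. This is a minimal illustration only, not the paper's full Lagrangian model; the cable length and acceleration profile are invented.

```python
def simulate_sway(accel_profile, L=5.0, g=9.81, dt=0.01):
    """Linearized payload sway of an overhead crane (toy model):
    trolley acceleration drives the pendulum equation
        theta'' = -(g/L)*theta - a(t)/L
    integrated with semi-implicit Euler."""
    theta, omega = 0.0, 0.0
    history = []
    for a in accel_profile:
        alpha = -(g / L) * theta - a / L   # angular acceleration
        omega += alpha * dt                # update angular velocity first
        theta += omega * dt                # then the sway angle
        history.append(theta)
    return history

# Accelerate the trolley for 2 s, then coast for 4 s:
# the payload keeps oscillating after the forcing stops (no damping).
profile = [1.0] * 200 + [0.0] * 400
sway = simulate_sway(profile)
print(max(abs(t) for t in sway))  # peak sway angle in radians
```

Anti-sway control schemes of the kind the paper targets aim to shape the acceleration profile so that this residual oscillation is cancelled rather than left to ring.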
Petraska, K. E.; McCabe, J. D.
This paper discusses early findings of an assessment of computing needs for NASA science, engineering and flight communities. The purpose of this assessment is to document a comprehensive set of computing needs that will allow us to better evaluate whether our computing assets are adequately structured to meet evolving demand. The early results are interesting, already pointing out improvements we can make today to get more out of the computing capacity we have, as well as potential game changing innovations for the future in how we apply information technology to science computing. Our objective is to learn how to leverage our resources in the best way possible to do more science for less money. Our approach in this assessment is threefold: Development of use case studies for science workflows; Creating a taxonomy and structure for describing science computing requirements; and characterizing agency computing, analysis, and visualization resources. As projects evolve, science data sets increase in a number of ways: in size, scope, timelines, complexity, and fidelity. Generating, processing, moving, and analyzing these data sets places distinct and discernable requirements on underlying computing, analysis, storage, and visualization systems. The initial focus group for this assessment is the Earth Science modeling community within NASA's Science Mission Directorate (SMD). As the assessment evolves, this focus will expand to other science communities across the agency. We will discuss our use cases, our framework for requirements and our characterizations, as well as our interview process, what we learned and how we plan to improve our materials after using them in the first round of interviews in the Earth Science Modeling community. We will describe our plans for how to expand this assessment, first into the Earth Science data analysis and remote sensing communities, and then throughout the full community of science, engineering and flight at NASA.
Neves, R.; Neves, M. L.; Teodoro, V.
Learning seismic tomography principles and techniques involves advanced physical and computational knowledge. In-depth learning of such computational skills is a difficult cognitive process that requires a strong background in physics, mathematics and computer programming. The corresponding learning environments and pedagogic methodologies should then involve sets of computational modelling activities with computer software systems which allow students to improve their mathematical or programming knowledge while focusing on the learning of seismic wave propagation and inverse theory. To reduce the level of cognitive opacity associated with mathematical or programming knowledge, several computer modelling systems have already been developed (Neves & Teodoro, 2010). Among such systems, Modellus is particularly well suited to achieve this goal because it is a domain-general environment for explorative and expressive modelling with the following main advantages: 1) easy and intuitive creation of mathematical models using standard mathematical notation; 2) simultaneous exploration of images, tables, graphs and object animations; 3) attribution of mathematical properties expressed in the models to animated objects; and 4) computation and display of mathematical quantities obtained from the analysis of images and graphs. Here we describe virtual simulations and educational exercises which give students an easy grasp of the fundamentals of seismic tomography. The simulations make the lecture more interactive and allow students to overcome their lack of advanced mathematical or programming knowledge and to focus on learning seismological concepts and processes, taking advantage of basic scientific computation methods and tools.
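The core inverse-theory idea behind such exercises (travel times constrain subsurface slowness) can be shown with a two-cell toy problem. The ray geometry and slowness values below are invented purely for illustration; real tomography solves a large, ill-posed least-squares version of the same relation t = sum of (path length x slowness) over cells.

```python
def travel_time(slowness, path_lengths):
    """Straight-ray travel time: t = sum over cells of length * slowness."""
    return sum(l * s for l, s in zip(path_lengths, slowness))

# Two cells, two rays crossing them differently (hypothetical geometry).
rays = [[1.0, 1.0],   # ray 1 crosses 1 km of each cell
        [2.0, 0.0]]   # ray 2 travels 2 km, all inside cell 1
true_slowness = [0.25, 0.5]   # s/km
times = [travel_time(true_slowness, r) for r in rays]

# Invert by hand: ray 2 determines cell 1 directly; ray 1 then gives cell 2.
s1 = times[1] / 2.0
s2 = times[0] - s1
print(s1, s2)  # recovers 0.25 and 0.5
```

In classroom tools like Modellus the same forward relation can be animated, letting students vary cell slownesses and watch the predicted travel times change before attempting the inversion.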
The article presents a model of the process of designing and implementing a cloud-oriented learning environment (CBLE) for the preparation of bachelors of computer science, which consists of seven stages: analysis; setting goals and objectives; formulating requirements for the cloud-oriented learning environment; modeling the CBLE; developing the CBLE; using the CBLE in the education of bachelors of computer science; and performance testing. Each stage contains sub-steps. The analysis stage is considered in three aspects: psychological, pedagogical, and technological. The requirements for the CBLE were formulated taking into account the content and objectives of the training; experience of using CBLEs; and the personal qualities, knowledge, skills, and abilities of students. The modeling phase was divided into sub-stages: development of a structural and functional model of the CBLE for the preparation of bachelors of computer science; development of a model of a cloud-oriented learning support system (COLSS); and development of a model of the interaction processes in the CBLE. The fifth stage was also divided into sub-steps: domain registration and customization of the appearance of the COLSS; definition of the disciplines provided by the curriculum for the preparation of bachelors of computer science; creation of teachers' and students' own workspaces; uploading of educational, methodological, and accompanying materials; and the choice of traditional and cloud-oriented forms, methods, and means of training. Verification of the functioning of the CBLE will be carried out in the following areas: the functioning of the CBLE, the results of students' educational activity, and the formation of students' information and communication competence.
The didactics of autonomous learning are changing under the influence of new technologies. Computer technology can cover all the functions that a teacher performs in personal contact with the learner. People organizing distance learning must realize all the possibilities offered by computers. Computers can take over and combine the functions of many tools and systems, e.g. the typewriter, video, and telephone. Thus the contents can be offered in the form of classic media by means of text, speech, pictures, etc. Computers take over data processing and function as study materials. A computer connected to a network can also function as a medium for interactive communication.
Pronskikh, V. S. [Fermilab
Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance, and they have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to a model's relation to the real world and its intended use. It has been argued that, because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction needs to be made between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimentation with them). Holding to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviating the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.
Evaluating the nature and extent of the influence of Computer Assisted Language Learning (CALL) on the quality of language learning is highly problematic, owing to the number and complexity of interacting variables involved in setting the items for teaching and learning languages. This paper identified and ...
Strijbos, J. -W.
Within the (Computer-Supported) Collaborative Learning (CS)CL research community, there has been an extensive dialogue on theories and perspectives on learning from collaboration, approaches to scaffold (script) the collaborative process, and most recently research methodology. In contrast, the issue of assessment of collaborative learning has…
He, Wu; Cernusca, Dan; Abdous, M'hammed
The use of distance courses in learning is growing exponentially. To better support faculty and students for teaching and learning, distance learning programs need to constantly innovate and optimize their IT infrastructures. The new IT paradigm called "cloud computing" has the potential to transform the way that IT resources are utilized and…
Grandi, C; Bonacorsi, D; Colling, D; Fisk, I; Girone, M
The CMS Computing Model was developed and documented in 2004. Since then the model has evolved to be more flexible and to take advantage of new techniques, but many of the original concepts remain and are in active use. In this presentation we discuss the changes planned for the restart of the LHC program in 2015, including changes planned in the use and definition of the computing tiers that were defined with the MONARC project. We present how we intend to use new services and infrastructure to provide more efficient and transparent access to the data, and we discuss plans to make better use of the computing capacity by scheduling more of the processor nodes, making better use of the disk storage, and using the network more intelligently.
Anitha, R; Lekshmi, R; Kumar, M; Bonato, Anthony; Graña, Manuel
This book contains cutting-edge research material presented by researchers, engineers, developers, and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security and Computational Models (ICC3) organized by PSG College of Technology, Coimbatore, India during December 19–21, 2013. The materials in the book include theory and applications for design, analysis, and modeling of computational intelligence and security. The book will be useful material for students, researchers, professionals, and academicians. It will help in understanding current research trends and findings and future scope of research in computational intelligence, cyber security, and computational models.
Chariker, Julia H.
A longitudinal experiment was conducted to explore computer-based learning of neuroanatomy. Using a realistic 3D graphical model of neuroanatomy, and sections derived from the model, exploratory graphical tools were integrated into interactive computer programs so as to allow adaptive exploration. 72 participants learned either sectional anatomy alone or whole anatomy followed by sectional anatomy. Sectional anatomy was explored either in perceptually continuous animation or discretely, as in the use of an anatomical atlas. Learning was measured longitudinally to a high performance criterion. After learning, transfer to biomedical images and long-term retention were tested. Learning whole anatomy prior to learning sectional anatomy led to a more efficient learning experience. Learners demonstrated high levels of transfer from whole anatomy to sectional anatomy and from sectional anatomy to complex biomedical images. All learning groups demonstrated high levels of retention at 2--3 weeks.
This paper presents a computing hosting system to provide virtual computing laboratories for learning activities. This system is based on hosting and virtualization technologies. All the components used in its development are free software tools. The computing lab model provided by the system is a more sustainable and scalable alternative than the traditional academic computing lab, and it requires lower costs of installation and operation.
Yang, Jianfeng; Shu, Hua; McCandliss, Bruce D.; Zevin, Jason D.
Learning to read any language requires learning to map among print, sound and meaning. Writing systems differ in a number of factors that influence both the ease and rate with which reading skill can be acquired, as well as the eventual division of labor between phonological and semantic processes. Further, developmental reading disability manifests differently across writing systems, and may be related to different deficits in constitutive processes. Here we simulate some aspects of reading acquisition in Chinese and English using the same model architecture for both writing systems. The contribution of semantic and phonological processing to literacy acquisition in the two languages is simulated, including specific effects of phonological and semantic deficits. Further, we demonstrate that similar patterns of performance are observed when the same model is trained on both Chinese and English as an "early bilingual." The results are consistent with the view that reading skill is acquired by the application of statistical learning rules to mappings among print, sound and meaning, and that differences in the typical and disordered acquisition of reading skill between writing systems are driven by differences in the statistical patterns of the writing systems themselves, rather than differences in cognitive architecture of the learner. PMID:24587693
Braithwaite, David W.; Pyke, Aryn A.; Siegler, Robert S.
Many children fail to master fraction arithmetic even after years of instruction, a failure that hinders their learning of more advanced mathematics as well as their occupational success. To test hypotheses about why children have so many difficulties in this area, we created a computational model of fraction arithmetic learning and presented it…
Duo, Sun; Song, Lu Xue
In recent years, e-learning as a learning system has become very popular. However, current e-learning systems cannot instruct students effectively because they do not consider the student's emotional state in the context of instruction. The emerging theory of "affective computing" can address this problem: it allows the computer's intelligence to go beyond the purely cognitive. In this paper, we construct an emotionally intelligent e-learning system based on affective computing. A dimensional model is put forward to recognize and analyze the student's emotional state, and a virtual teacher's avatar is offered to regulate the student's learning psychology, with a teaching style chosen according to his or her personality traits. A "man-to-man" learning environment is built to simulate traditional classroom pedagogy in the system.
This paper describes the present practical use of computers in two large beginning physics courses at the University of California, Irvine; discusses the versatility and desirability of computers in the field of education; and projects the possible future directions of computer-based learning. The advantages and disadvantages of educational…
Currently not many people would doubt that computers play an essential role in both public and private life in many countries. However, somewhat surprisingly, evidence of computer use is difficult to find in German state schools although other countries have managed to implement computer-based teaching and learning in their schools. This paper…
Ramasamy, Balakrishnan; Iyer, T S K V
Chaos Modelling with Computers: Unpredictable Behaviour of Deterministic Systems. General Article, Resonance – Journal of Science Education, Volume 1, Issue 5, May 1996, pp. 29–39.
Computer-assisted language learning (CALL) is an approach to language teaching and learning in which computer technology is used as an aid to the presentation, reinforcement and assessment of material to be learned, usually including a substantial interactive element. This book provides an up-to-date and comprehensive overview of…
Warriner, David Roy; Bayley, Martin; Shi, Yubing; Lawford, Patricia Victoria; Narracott, Andrew; Fenner, John
This study combined themes in cardiovascular modelling, clinical cardiology and e-learning to create an on-line environment that would assist undergraduate medical students in understanding key physiological and pathophysiological processes in the cardiovascular system. An interactive on-line environment was developed incorporating a lumped-parameter mathematical model of the human cardiovascular system. The model outputs were used to characterise the progression of key disease processes and allowed students to classify disease severity with the aim of improving their understanding of abnormal physiology in a clinical context. Access to the on-line environment was offered to students at all stages of undergraduate training as an adjunct to routine lectures and tutorials in cardiac pathophysiology. Student feedback was collected on this novel on-line material in the course of routine audits of teaching delivery. Medical students, irrespective of their stage of undergraduate training, reported that they found the models and the environment interesting and a positive experience. After exposure to the environment, there was a statistically significant improvement in student performance on a series of 6 questions based on cardiovascular medicine, with a 33% and 22% increase in the number of questions answered correctly, p < 0.0001 and p < 0.001 respectively. Considerable improvement was found in students' knowledge and understanding during assessment after exposure to the e-learning environment. Opportunities exist for development of similar environments in other fields of medicine, refinement of the existing environment and further engagement with student cohorts. This work combines some exciting and developing fields in medical education, but routine adoption of these types of tool will be possible only with the engagement of all stake-holders, from educationalists, clinicians, modellers to, most importantly, medical students.
Huber, Leonard N.
Discusses Piaget's pre-operational, concrete operational, and formal operational stages and shows how this information sheds light on how children approach computers and computing, particularly with the LOGO programming language. (JN)
Wong, Ka-Chun; Li, Yue; Peng, Chengbin; Moses, Alan M.; Zhang, Zhaolei
The protein–DNA interactions between transcription factors and transcription factor binding sites are essential activities in gene regulation. To decipher the binding codes, it is a long-standing challenge to understand the binding mechanism across different transcription factor DNA binding families. Past computational learning studies usually focus on learning and predicting the DNA binding residues on protein side. Taking into account both sides (protein and DNA), we propose and describe a computational study for learning the specificity-determining residue-nucleotide interactions of different known DNA-binding domain families. The proposed learning models are compared to state-of-the-art models comprehensively, demonstrating its competitive learning performance. In addition, we describe and propose two applications which demonstrate how the learnt models can provide meaningful insights into protein–DNA interactions across different DNA binding families.
combination with other factors which may enhance or ameliorate the ... form of computer-based learning which carries two important features: .... To take some commonplace examples, a ... photographs, and even full-motion video clips.
Marco Antonio Moreira
This paper presents a theoretical framework to promote and assist meaningful physics learning through computational models. Our proposal is based on the use of a tool, the AVM diagram, to design educational activities involving modeling and computer simulations. The idea is to provide a starting point for the construction and implementation of didactical approaches grounded in a coherent epistemological view of scientific modeling.
Chamon, C; Mucciolo, E R; Ruckenstein, A E; Yang, Z-C
Mappings of classical computation onto statistical mechanics models have led to remarkable successes in addressing some complex computational problems. However, such mappings display thermodynamic phase transitions that may prevent reaching solution even for easy problems known to be solvable in polynomial time. Here we map universal reversible classical computations onto a planar vertex model that exhibits no bulk classical thermodynamic phase transition, independent of the computational circuit. Within our approach the solution of the computation is encoded in the ground state of the vertex model and its complexity is reflected in the dynamics of the relaxation of the system to its ground state. We use thermal annealing with and without 'learning' to explore typical computational problems. We also construct a mapping of the vertex model into the Chimera architecture of the D-Wave machine, initiating an approach to reversible classical computation based on state-of-the-art implementations of quantum annealing.
Traffic in computer networks can be described as a complicated system. Such systems show non-linear features, and simulating their behaviour is difficult. Before deploying network equipment, users want to know the capability of their computer network; they do not want servers to be overloaded during temporary traffic peaks, when more requests arrive than the server is designed for. As a starting point for our study, a non-linear system model of network traffic is established to examine the behaviour of the planned network. The paper presents the setup of a non-linear simulation model that helps us observe dataflow problems in networks. This simple model captures the relationship between the competing traffic and the input and output dataflow. We also focus on measuring the bottleneck of the network, defined as the difference between the link capacity and the competing traffic volume on the link that limits end-to-end throughput. We validate the model using measurements on a working network. The results show that the initial model estimates the main behaviours and critical parameters of the network well. Based on this study, we propose to develop a new algorithm that experimentally determines and predicts the available parameters of the modelled network.
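The bottleneck definition used in this abstract (spare capacity = link capacity minus competing traffic, with the end-to-end bottleneck set by the tightest link) can be sketched directly. Everything below, including the function names and numbers, is an illustrative assumption, not the paper's actual simulation model:

```python
def available_bandwidth(capacity_mbps, competing_mbps):
    """Spare capacity on one link (floored at zero when oversubscribed)."""
    return max(capacity_mbps - competing_mbps, 0.0)

def end_to_end_bottleneck(links):
    """links: list of (capacity, competing traffic) pairs along a path.
    End-to-end throughput is limited by the link with the least spare capacity."""
    return min(available_bandwidth(c, x) for c, x in links)

# A fast link carrying heavy competing traffic can still be the bottleneck:
path = [(100.0, 40.0), (1000.0, 990.0), (100.0, 10.0)]
print(end_to_end_bottleneck(path))  # prints 10.0 (the loaded 1 Gb/s link)
```

This illustrates why the paper measures competing traffic rather than raw capacity: the nominally fastest link in the path can still limit throughput.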
Wei, Kunlin; Yan, Xiang; Kong, Gaiqing; Yin, Cong; Zhang, Fan; Wang, Qining; Kording, Konrad Paul
Over the past few decades, one of the most salient lifestyle changes for us has been the use of computers. For many of us, manual interaction with a computer occupies a large portion of our working time. Through neural plasticity, this extensive movement training should change our representation of movements (e.g., [1-3]), just like search engines affect memory. However, how computer use affects motor learning is largely understudied. Additionally, as virtually all participants in studies of perception and action are computer users, a legitimate question is whether insights from these studies bear the signature of computer-use experience. We compared non-computer users with age- and education-matched computer users in standard motor learning experiments. We found that people learned equally fast but that non-computer users generalized significantly less across space, a difference negated by two weeks of intensive computer training. Our findings suggest that computer-use experience has shaped our basic sensorimotor behaviors, and this influence should be considered whenever computer users are recruited as study participants.
Yu, Shimeng; Gao, Bin; Fang, Zheng; Yu, Hongyu; Kang, Jinfeng; Wong, H-S Philip
Hardware implementation of neuromorphic computing is attractive as a computing paradigm beyond conventional digital computing. In this work, we show that the SET (off-to-on) transition of metal oxide resistive switching memory becomes probabilistic under a weak programming condition. The switching variability of the binary synaptic device implements a stochastic learning rule. Such stochastic SET transitions were statistically measured and modeled for a simulation of a winner-take-all network for competitive learning. The simulation illustrates that with such stochastic learning, the orientation classification of input patterns can be effectively realized. The system performance metrics were compared between the conventional approach using analog synapses and the approach in this work, which employs binary synapses utilizing stochastic learning. The feasibility of using binary synapses in neuromorphic computing may relax the constraints on engineering continuous multilevel intermediate states and widens the material choice for synaptic device design.
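A minimal sketch of winner-take-all competitive learning with stochastic binary synapses, in the spirit of the probabilistic SET transition described above. The switching probabilities, network size, and update rule are illustrative assumptions, not the measured device model from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n_inputs, n_neurons = 16, 4
# Binary synaptic weights: each synapse is either off (0) or on (1).
w = rng.integers(0, 2, size=(n_neurons, n_inputs))

P_SET = 0.1    # assumed probability of an off->on (SET) switch per presentation
P_RESET = 0.05 # assumed probability of an on->off (RESET) switch per presentation

def wta_step(x):
    """One competitive-learning step: the neuron with the largest overlap wins,
    and its binary synapses switch stochastically toward the input pattern."""
    winner = int(np.argmax(w @ x))
    for j in range(n_inputs):
        if x[j] == 1 and w[winner, j] == 0 and rng.random() < P_SET:
            w[winner, j] = 1          # stochastic SET on active inputs
        elif x[j] == 0 and w[winner, j] == 1 and rng.random() < P_RESET:
            w[winner, j] = 0          # stochastic RESET on inactive inputs
    return winner

# Repeatedly present one pattern; the winner's weights drift toward it.
pattern = np.zeros(n_inputs, dtype=int)
pattern[:8] = 1
for _ in range(200):
    wta_step(pattern)
```

The point of the sketch is that no analog multilevel weight is needed: averaged over many presentations, the stochastic binary switching plays the role of a small continuous weight update.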
de Jong, Anthonius J.M.
The present volume presents the results of an inventory of elements of such a computer learning environment. This inventory was conducted within a DELTA project called SIMULATE. In the project a learning environment that provides intelligent support to learners and that has a simulation as its
Anglim, Jeromy; Wynton, Sarah K. A.
The current study used Bayesian hierarchical methods to challenge and extend previous work on subtask learning consistency. A general model of individual-level subtask learning was proposed focusing on power and exponential functions with constraints to test for inconsistency. To study subtask learning, we developed a novel computer-based booking…
Carl Aberg, Kristoffer; Doell, Kimberly C.; Schwartz, Sophie
Learning how to gain rewards (approach learning) and avoid punishments (avoidance learning) is fundamental for everyday life. While individual differences in approach and avoidance learning styles have been related to genetics and aging, the contribution of personality factors, such as traits, remains undetermined. Moreover, little is known about the computational mechanisms mediating differences in learning styles. Here, we used a probabilistic selection task with positive and negative feedbacks, in combination with computational modelling, to show that individuals displaying better approach (vs. avoidance) learning scored higher on measures of approach (vs. avoidance) trait motivation, but, paradoxically, also displayed reduced learning speed following positive (vs. negative) outcomes. These data suggest that learning different types of information depend on associated reward values and internal motivational drives, possibly determined by personality traits. PMID:27851807
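The kind of model this abstract describes, reinforcement learning that updates faster after positive than after negative outcomes (or vice versa), is commonly implemented with valence-dependent learning rates plus a softmax choice rule. A minimal sketch for a two-option probabilistic selection task; all parameter values and names are assumed for illustration, not taken from the paper:

```python
import math
import random

random.seed(1)
ALPHA_GAIN, ALPHA_LOSS = 0.3, 0.1  # separate learning rates for +/- prediction errors
BETA = 3.0                          # softmax inverse temperature

Q = {"A": 0.0, "B": 0.0}            # learned action values
P_REWARD = {"A": 0.8, "B": 0.2}     # hidden reward probabilities of the task

def choose():
    """Softmax choice between the two options."""
    p_a = 1.0 / (1.0 + math.exp(-BETA * (Q["A"] - Q["B"])))
    return "A" if random.random() < p_a else "B"

for _ in range(500):
    a = choose()
    r = 1.0 if random.random() < P_REWARD[a] else 0.0
    delta = r - Q[a]                           # reward prediction error
    alpha = ALPHA_GAIN if delta > 0 else ALPHA_LOSS
    Q[a] += alpha * delta                      # valence-dependent update
```

Fitting ALPHA_GAIN and ALPHA_LOSS separately per participant is one standard way such studies quantify individual differences in approach versus avoidance learning speed.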
Baldassarre, Gianluca; Mannella, Francesco; Fiore, Vincenzo G; Redgrave, Peter; Gurney, Kevin; Mirolli, Marco
Reinforcement (trial-and-error) learning in animals is driven by a multitude of processes. Most animals have evolved several sophisticated systems of 'extrinsic motivations' (EMs) that guide them to acquire behaviours allowing them to maintain their bodies, defend against threat, and reproduce. Animals have also evolved various systems of 'intrinsic motivations' (IMs) that allow them to acquire actions in the absence of extrinsic rewards. These actions are used later to pursue such rewards when they become available. Intrinsic motivations have been studied in Psychology for many decades and their biological substrates are now being elucidated by neuroscientists. In the last two decades, investigators in computational modelling, robotics and machine learning have proposed various mechanisms that capture certain aspects of IMs. However, we still lack models of IMs that attempt to integrate all key aspects of intrinsically motivated learning and behaviour while taking into account the relevant neurobiological constraints. This paper proposes a bio-constrained system-level model that contributes a major step towards this integration. The model focusses on three processes related to IMs and on the neural mechanisms underlying them: (a) the acquisition of action-outcome associations (internal models of the agent-environment interaction) driven by phasic dopamine signals caused by sudden, unexpected changes in the environment; (b) the transient focussing of visual gaze and actions on salient portions of the environment; (c) the subsequent recall of actions to pursue extrinsic rewards based on goal-directed reactivation of the representations of their outcomes. The tests of the model, including a series of selective lesions, show how the focussing processes lead to a faster learning of action-outcome associations, and how these associations can be recruited for accomplishing goal-directed behaviours. The model, together with the background knowledge reviewed in the paper
Schmitz, Birgit; Czauderna, André; Klemke, Roland; Specht, Marcus
Schmitz, B., Czauderna, A., Klemke, R., & Specht, M. (2011). Game based learning for computer science education. In G. van der Veer, P. B. Sloep, & M. van Eekelen (Eds.), Computer Science Education Research Conference (CSERC '11) (pp. 81-86). Heerlen, The Netherlands: Open Universiteit.
The personal computer is sparking a major historical change in the way people learn, a change that could lead to the disappearance of formal education as we know it. The computer can help resolve many of the difficulties now crippling education by enabling expert teachers and curriculum developers to prepare interactive and individualized…
Professionals responsible for the delivery of education and training using technology systems and platforms can facilitate complex learning through application of relevant strategies, principles and theories that support how learners learn and that support how curriculum should be designed in a technology based learning environment. Technological…
Frank, M; Pacheco, Andreu
This document is a first attempt to describe the LHCb computing model. The CPU power needed to process data for the event filter and reconstruction is estimated to be 2.2 × 10^6 MIPS. This will be installed at the experiment and will be reused during non data-taking periods for reprocessing. The maximal I/O of these activities is estimated to be around 40 MB/s. We have studied three basic models concerning the placement of the CPU resources for the other computing activities, Monte Carlo simulation (1.4 × 10^6 MIPS) and physics analysis (0.5 × 10^6 MIPS): CPU resources may either be located at the physicist's home lab, national computer centres (Regional Centres) or at CERN. The CPU resources foreseen for analysis are sufficient to allow 100 concurrent analyses. It is assumed that physicists will work in physics groups that produce analysis data at an average rate of 4.2 MB/s or 11 TB per month. However, producing these group analysis data requires reading capabilities of 660 MB/s. It is further assu...
Jeffrey D Fitzgerald
The multidimensional computations performed by many biological systems are often characterized with limited information about the correlations between inputs and outputs. Given this limitation, our approach is to construct the maximum noise entropy response function of the system, leading to a closed-form and minimally biased model consistent with a given set of constraints on the input/output moments; the result is equivalent to conditional random field models from machine learning. For systems with binary outputs, such as neurons encoding sensory stimuli, the maximum noise entropy models are logistic functions whose arguments depend on the constraints. A constraint on the average output turns the binary maximum noise entropy models into minimum mutual information models, allowing for the calculation of the information content of the constraints and an information theoretic characterization of the system's computations. We use this approach to analyze the nonlinear input/output functions in macaque retina and thalamus; although these systems have been previously shown to be responsive to two input dimensions, the functional form of the response function in this reduced space had not been unambiguously identified. A second order model based on the logistic function is found to be both necessary and sufficient to accurately describe the neural responses to naturalistic stimuli, accounting for an average of 93% of the mutual information with a small number of parameters. Thus, despite the fact that the stimulus is highly non-Gaussian, the vast majority of the information in the neural responses is related to first and second order correlations. Our results suggest a principled and unbiased way to model multidimensional computations and determine the statistics of the inputs that are being encoded in the outputs.
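A second-order logistic response function of the kind described above can be written compactly: the spiking probability is a logistic function whose argument contains constant, first-order, and second-order stimulus terms. A minimal sketch with assumed (not fitted) parameters; in the actual method these parameters would be determined by the moment constraints:

```python
import numpy as np

def second_order_logistic(s, a, h, J):
    """Second-order maximum-noise-entropy-style spiking probability:
    P(spike | s) = 1 / (1 + exp(-(a + h·s + sᵀ J s))).
    a (scalar), h (vector), J (symmetric matrix) are the model parameters."""
    z = a + h @ s + s @ J @ s
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative parameters for a 2-dimensional reduced stimulus space.
rng = np.random.default_rng(0)
h = np.array([1.0, -0.5])
J = np.array([[0.2, 0.1],
              [0.1, -0.3]])
s = rng.standard_normal(2)        # a random stimulus in the reduced space
p = second_order_logistic(s, -1.0, h, J)
```

With J set to zero this reduces to a first-order (linear-logistic) model, which is the comparison the abstract's "necessary and sufficient" claim is made against.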
Kopper, Claudio [NIKHEF, Science Park 105, 1098 XG Amsterdam (Netherlands)]
Completed in 2008, Antares is now the largest water Cherenkov neutrino telescope in the Northern Hemisphere. Its main goal is to detect neutrinos from galactic and extra-galactic sources. Due to the high background rate of atmospheric muons and the high level of bioluminescence, several on-line and off-line filtering algorithms have to be applied to the raw data taken by the instrument. To be able to handle this data stream, a dedicated computing infrastructure has been set up. The paper covers the main aspects of the current official Antares computing model. This includes an overview of on-line and off-line data handling and storage. In addition, the current usage of the “IceTray” software framework for Antares data processing is highlighted. Finally, an overview of the data storage formats used for high-level analysis is given.
Washington Luna Encalada
In recent years, we have seen a significant number of new technological ideas appearing in the literature discussing the future of education. For example, e-learning, cloud computing, social networking, virtual laboratories, virtual realities, virtual worlds, massive open online courses (MOOCs), and bring your own device (BYOD) are all new concepts of immersive and global education that have emerged in the educational literature. One of the greatest challenges presented to e-learning solutions is the reproduction of the benefits of an educational institution's physical laboratory. For a university without a computing lab, to obtain hands-on IT training with software, operating systems, networks, servers, storage, and cloud computing similar to that which could be received in a university campus computing lab, it is necessary to use a combination of technological tools. Such teaching tools must promote the transmission of knowledge, encourage interaction and collaboration, and ensure students obtain valuable hands-on experience. That, in turn, allows the universities to focus more on teaching and research activities than on the implementation and configuration of complex physical systems. In this article, we present a model for implementing ecosystems which allow universities to teach practical Information Technology (IT) skills. The model utilizes what is called a "social cloud", which utilizes all cloud computing services, such as Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS). Additionally, it integrates the cloud learning aspects of a MOOC and several aspects of social networking and support. Social clouds have striking benefits such as centrality, ease of use, scalability, and ubiquity, providing a superior learning environment when compared to that of a simple physical lab. The proposed model allows students to foster all the educational pillars such as learning to know, learning to be, learning
Our ability to learn from the outcomes of our actions and to adapt our decisions accordingly changes over the course of the human lifespan. In recent years, there has been an increasing interest in using computational models to understand developmental changes in learning and decision-making. Moreover, extensions of these models are currently applied to study socio-emotional influences on learning in different age groups, a topic that is of great relevance for applications in education and health psychology. In this article, we aim to provide an introduction to basic ideas underlying computational models of reinforcement learning and focus on parameters and model variants that might be of interest to developmental scientists. We then highlight recent attempts to use reinforcement learning models to study the influence of social information on learning across development. The aim of this review is to illustrate how computational models can be applied in developmental science, what they can add to our understanding of developmental mechanisms and how they can be used to bridge the gap between psychological and neurobiological theories of development.
Ian S Howard
Words are made up of speech sounds. Almost all accounts of child speech development assume that children learn the pronunciation of first language (L1) speech sounds by imitation, most claiming that the child performs some kind of auditory matching to the elements of ambient speech. However, there is evidence to support an alternative account, and we investigate the non-imitative child behavior and well-attested caregiver behavior that this account posits using Elija, a computational model of an infant. Through unsupervised active learning, Elija began by discovering motor patterns, which produced sounds. In separate interaction experiments, native speakers of English, French and German then played the role of his caregiver. In their first interactions with Elija, they were allowed to respond to his sounds if they felt this was natural. We analyzed the interactions through phonemic transcriptions of the caregivers' utterances and found that they interpreted his output within the framework of their native languages. Their form of response was almost always a reformulation of Elija's utterance into well-formed sounds of L1. Elija retained those motor patterns to which a caregiver responded and formed associations between his motor pattern and the response it provoked. Thus in a second phase of interaction, he was able to parse input utterances in terms of the caregiver responses he had heard previously, and respond using his associated motor patterns. This capacity enabled the caregivers to teach Elija to pronounce some simple words in their native languages, by his serial imitation of the words' component speech sounds. Overall, our results demonstrate that the natural responses and behaviors of human subjects to infant-like vocalizations can take a computational model from a biologically plausible initial state through to word pronunciation. This provides support for an alternative to current auditory matching hypotheses for how children learn to
Kim, Svetlana; Song, Su-Mi; Yoon, Yong-Ik
Context-aware technologies can make e-learning services smarter and more efficient, since context-aware services are based on the user's behavior. To add those technologies to existing e-learning services, a service architecture model is needed to transform the existing e-learning environment, which is situation-aware, into an environment that understands context as well. Context-awareness in e-learning may include awareness of the user profile and the terminal context. In this paper, we propose a new notion of service that provides context-awareness to smart learning content in a cloud computing environment. We introduce the elastic four smarts (E4S) concept--smart pull, smart prospect, smart content, and smart push--for cloud services, so that smart learning services become possible. The E4S focuses on meeting users' needs by collecting and analyzing user behavior, prospecting future services, building the corresponding contents, and delivering the contents through the cloud computing environment. User behavior can be collected through mobile devices such as smartphones that have built-in sensors. As a result, the proposed smart e-learning model in a cloud computing environment provides personalized and customized learning services to its users.
This paper explores a Radical Collaborative Approach in the global and centralized Rock-Art Database (RADB) project to find new ways to look at rock-art by making information more accessible and more visible through public contributions. It looks at rock-art through Key Performance Indicators (KPIs) identified in the latest Australian State of the Environment Reports to help develop a better understanding of rock-art within a broader cultural and Indigenous heritage context. Using a practice-led approach, the project develops a conceptual collaborative model that is deployed within the RADB Management System. Drawing on learning theory, human-based computation and participant motivation, the paper develops a procedure for deploying collaborative functions within the interface design of the RADB Management System. The paper presents the results of the collaborative model implementation and discusses considerations for the next iteration of the RADB Universe within an Agile Development Approach.
Ignatova, Zoya; Zimmermann, Karl-Heinz
In this excellent text, the reader is given a comprehensive introduction to the field of DNA computing. The book emphasizes computational methods to tackle central problems of DNA computing, such as controlling living cells, building patterns, and generating nanomachines.
RNA-binding proteins (RBPs) play important roles in the post-transcriptional control of RNAs. Identifying RBP binding sites and characterizing RBP binding preferences are key steps toward understanding the basic mechanisms of post-transcriptional gene regulation. Though numerous computational methods have been developed for modeling RBP binding preferences, discovering a complete structural representation of the RBP targets by integrating their available structural features in all three dimensions is still a challenging task. In this work, we develop a general and flexible deep learning framework for modeling structural binding preferences and predicting binding sites of RBPs, which takes (predicted) RNA tertiary structural information into account for the first time. Our framework constructs a unified representation that characterizes the structural specificities of RBP targets in all three dimensions, which can be further used to predict novel candidate binding sites and discover potential binding motifs. Through testing on the real CLIP-seq datasets, we have demonstrated that our deep learning framework can automatically extract effective hidden structural features from the encoded raw sequence and structural profiles, and predict accurate RBP binding sites. In addition, we have conducted the first study to show that integrating the additional RNA tertiary structural features can improve the model performance in predicting RBP binding sites, especially for the polypyrimidine tract-binding protein (PTB), which also provides new evidence to support the view that RBPs may have specific tertiary structural binding preferences. In particular, the tests on the internal ribosome entry site (IRES) segments yield satisfactory results with experimental support from the literature and further demonstrate the necessity of incorporating RNA tertiary structural information into the prediction model.
The source code of our approach can be found at https://github.com/thucombio/deepnet-rbp.
Käser, Tanja; Busetto, Alberto Giovanni; Solenthaler, Barbara; Baschera, Gian-Marco; Kohn, Juliane; Kucian, Karin; von Aster, Michael; Gross, Markus
This study introduces a student model and control algorithm, optimizing mathematics learning in children. The adaptive system is integrated into a computer-based training system for enhancing numerical cognition aimed at children with developmental dyscalculia or difficulties in learning mathematics. The student model consists of a dynamic…
Zuhrie, M. S.; Basuki, I.; Asto B, I. G. P.; Anifah, L.
The focus of the research is a teaching module which incorporates manufacturing, mechanical design planning, control through microprocessor technology, and maneuverability of the robot. Computer-interactive and computer-assisted learning are strategies that emphasize the use of computers and learning aids in teaching and learning activities. This research applied the 4-D model of research and development suggested by Thiagarajan et al. (1974). The 4-D model consists of four stages: Define, Design, Develop, and Disseminate. This research was conducted by applying a research-and-development design with the objective of producing a learning tool in the form of intelligent robot modules and kits based on computer-interactive learning and computer-assisted learning. Data from the Indonesia Robot Contest during the period 2009-2015 show that the developed modules confirm the fourth stage of the development method, the disseminate stage. The developed modules guide students to produce an intelligent robot tool for teaching based on computer-interactive learning and computer-assisted learning. Students' responses also showed positive feedback on the robotics module and computer-based interactive learning.
Online gaming has become a very popular leisure activity among adolescents. Research suggests that a small minority of adolescents may display problematic gaming behaviour and that some of these individuals may be addicted to online games, including those who have learning disabilities. This article begins by examining a case study of a 15-year old adolescent with a learning disability who appeared to be addicted to various computer and internet applications. Despite the potential negative ef...
This study examines the effect of three different computer integration models on pre-service mathematics teachers' beliefs about using computers in mathematics education. Participants included 104 pre-service mathematics teachers (36 second-year students in the Computer Oriented Model group, 35 fourth-year students in the Integrated Model (IM)…
Understanding how visual recognition is achieved in the human brain is one of the most fundamental questions in vision research. In this thesis I seek to tackle this problem from a neurocomputational modeling perspective. More specifically, I build machine learning-based models to simulate and explain cognitive phenomena related to human visual recognition, and I improve computational models using brain-inspired principles to excel at computer vision tasks.I first describe how a neurocomputat...
In this paper, we present the methodology for the introduction to scientific computing based on model-centered learning. We propose multiphase queueing systems as a basis for learning objects. We use Python and parallel programming for implementing the models and present the computer code and results of stochastic simulations.
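The abstract above proposes Python simulations of multiphase queueing systems as learning objects. As an illustrative sketch only (my own construction, not the authors' code), a minimal single-phase M/M/1 simulation with assumed arrival and service rates might look like:

```python
import random

def simulate_mm1(arrival_rate, service_rate, n_customers, seed=0):
    """Simulate a single-phase M/M/1 queue; return the mean time spent in the system."""
    rng = random.Random(seed)
    arrival, server_free, total_sojourn = 0.0, 0.0, 0.0
    for _ in range(n_customers):
        arrival += rng.expovariate(arrival_rate)   # Poisson arrival process
        start = max(arrival, server_free)          # wait if the server is busy
        server_free = start + rng.expovariate(service_rate)
        total_sojourn += server_free - arrival     # waiting time + service time
    return total_sojourn / n_customers

# For lambda = 0.5 and mu = 1.0, queueing theory predicts mean sojourn 1/(mu - lambda) = 2.0
print(simulate_mm1(0.5, 1.0, 50000))
```

Comparing the simulated mean sojourn time against the closed-form value is the kind of check such a learning object invites; a multiphase model would chain several such servers.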
Looks, Moshe; Herreshoff, Marcello; Hutchins, DeLesley; Norvig, Peter
Neural networks that compute over graph structures are a natural fit for problems in a variety of domains, including natural language (parse trees) and cheminformatics (molecular graphs). However, since the computation graph has a different shape and size for every input, such networks do not directly support batched training or inference. They are also difficult to implement in popular deep learning libraries, which are based on static data-flow graphs. We introduce a technique called dynamic batching…
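The core idea behind batching over variable-shaped computation graphs can be illustrated with a toy sketch (my own construction, not the paper's implementation): nodes at the same depth in a batch of trees have the same computational shape, so they can be grouped and evaluated as one batched operation per depth level.

```python
from collections import defaultdict

# Toy binary parse trees: leaves are ints, internal nodes are (left, right) tuples
def depth(node):
    return 0 if isinstance(node, int) else 1 + max(depth(node[0]), depth(node[1]))

def batch_by_depth(trees):
    """Group nodes of equal depth so each group can run as one batched op."""
    groups = defaultdict(list)
    def visit(node):
        groups[depth(node)].append(node)
        if not isinstance(node, int):
            visit(node[0])
            visit(node[1])
    for t in trees:
        visit(t)
    return groups

g = batch_by_depth([(1, (2, 3)), ((4, 5), (6, 7))])
print(sorted((d, len(ns)) for d, ns in g.items()))  # → [(0, 7), (1, 3), (2, 2)]
```

Instead of one small matrix operation per node, a depth level's nodes can then be processed in a single call, which is what makes batched training over irregular structures feasible.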
Selviandro , Nungki; Hasibuan , Zainal ,
Part 1: Information and Communication Technology - EurAsia Conference (ICT-EurAsia); International audience; Increasing research in information technology has had a positive impact on the world of education. The implementation of e-learning is one contribution of information technology to education, and e-learning has been adopted by several educational institutions in Indonesia. E-learning provides many benefits such as flexibility, divers...
McCauley, Stewart M; Christiansen, Morten H
Second-language learners rarely arrive at native proficiency in a number of linguistic domains, including morphological and syntactic processing. Previous approaches to understanding the different outcomes of first- versus second-language learning have focused on cognitive and neural factors. In contrast, we explore the possibility that children and adults may rely on different linguistic units throughout the course of language learning, with specific focus on the granularity of those units. Following recent psycholinguistic evidence for the role of multiword chunks in online language processing, we explore the hypothesis that children rely more heavily on multiword units in language learning than do adults learning a second language. To this end, we take an initial step toward using large-scale, corpus-based computational modeling as a tool for exploring the granularity of speakers' linguistic units. Employing a computational model of language learning, the Chunk-Based Learner, we compare the usefulness of chunk-based knowledge in accounting for the speech of second-language learners versus children and adults speaking their first language. Our findings suggest that while multiword units are likely to play a role in second-language learning, adults may learn less useful chunks, rely on them to a lesser extent, and arrive at them through different means than children learning a first language. Copyright © 2017 Cognitive Science Society, Inc.
Myers, Catherine E.; Smith, Ian M.; Servatius, Richard J.; Beck, Kevin D.
Avoidance behaviors, in which a learned response causes omission of an upcoming punisher, are a core feature of many psychiatric disorders. While reinforcement learning (RL) models have been widely used to study the development of appetitive behaviors, less attention has been paid to avoidance. Here, we present a RL model of lever-press avoidance learning in Sprague-Dawley (SD) rats and in the inbred Wistar Kyoto (WKY) rat, which has been proposed as a model of anxiety vulnerability. We focus on “warm-up,” transiently decreased avoidance responding at the start of a testing session, which is shown by SD but not WKY rats. We first show that a RL model can correctly simulate key aspects of acquisition, extinction, and warm-up in SD rats; we then show that WKY behavior can be simulated by altering three model parameters, which respectively govern the tendency to explore new behaviors vs. exploit previously reinforced ones, the tendency to repeat previous behaviors regardless of reinforcement, and the learning rate for predicting future outcomes. This suggests that several, dissociable mechanisms may contribute independently to strain differences in behavior. The model predicts that, if the “standard” inter-session interval is shortened from 48 to 24 h, SD rats (but not WKY) will continue to show warm-up; we confirm this prediction in an empirical study with SD and WKY rats. The model further predicts that SD rats will continue to show warm-up with inter-session intervals as short as a few minutes, while WKY rats will not show warm-up, even with inter-session intervals as long as a month. Together, the modeling and empirical data indicate that strain differences in warm-up are qualitative rather than just the result of differential sensitivity to task variables. Understanding the mechanisms that govern expression of warm-up behavior in avoidance may lead to better understanding of pathological avoidance, and potential pathways to modify these processes.
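The three parameters the abstract names (explore/exploit balance, response perseveration, and learning rate) can all be placed in a minimal tabular RL sketch of lever-press avoidance. The parameter values and task setup below are hypothetical illustrations, not the paper's model:

```python
import math
import random

def softmax_choice(q, persev, last_action, temp, rng):
    """Pick an action from Q-values plus a perseveration bonus for repeating the last action."""
    prefs = [q[a] + (persev if a == last_action else 0.0) for a in range(len(q))]
    exps = [math.exp(p / temp) for p in prefs]   # temp controls explore vs. exploit
    r, cum = rng.random() * sum(exps), 0.0
    for a, e in enumerate(exps):
        cum += e
        if r < cum:
            return a
    return len(q) - 1

def run_avoidance(trials=500, alpha=0.1, temp=0.5, persev=0.2, seed=1):
    """Toy avoidance task: action 1 (lever press) omits a punisher (-1); action 0 does nothing."""
    rng = random.Random(seed)
    q, last, presses = [0.0, 0.0], 0, 0
    for _ in range(trials):
        a = softmax_choice(q, persev, last, temp, rng)
        reward = 0.0 if a == 1 else -1.0   # pressing avoids the shock
        q[a] += alpha * (reward - q[a])    # delta-rule update with learning rate alpha
        last = a
        presses += a
    return presses / trials

print(run_avoidance())  # fraction of trials on which the avoidance response was made
```

Raising the temperature, lowering the learning rate, or changing the perseveration bonus each changes the acquisition curve independently, which is the kind of dissociation the abstract describes between strains.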
I review some work on models of quantum computing, optical implementations of these models, as well as the associated computational power. In particular, we discuss the circuit model and cluster state implementations using quantum optics with various encodings such as dual rail encoding, Gottesman-Kitaev-Preskill encoding, and coherent state encoding. Then we discuss intermediate models of optical computing such as boson sampling and its variants. Finally, we review some recent work in optical implementations of adiabatic quantum computing and analog optical computing. We also provide a brief description of the relevant aspects from complexity theory needed to understand the results surveyed.
Adriaans, P.; Adriaans, P.; van Benthem, J.
In the summer of 1956, a number of scientists gathered at the Dartmouth College in Hanover, New Hampshire. Their goal was to study human intelligence with the help of computers. Their central hypothesis was: "that every aspect of learning or any other feature of intelligence can in principle be so
Planinsic, G.; Gregorcic, B.; Etkina, E.
This paper introduces the readers to simple inquiry-based activities (experiments with supporting questions) that one can do with a computer scanner to help students learn and apply the concepts of relative motion in 1 and 2D, vibrational motion and the Doppler effect. We also show how to use these activities to help students think like…
Studies of the impact of various types of computer use on the results of learning and student motivation have indicated that the use of computers can increase learning motivation, and that computers can have a positive effect, a negative effect, or no effect at all on learning outcomes. Some results indicate that it is not computer use itself that…
Jones, Gary; Gobet, Fernand; Pine, Julian M.
The nonword repetition (NWR) test has been shown to be a good predictor of children's vocabulary size. NWR performance has been explained using phonological working memory, which is seen as a critical component in the learning of new words. However, no detailed specification of the link between phonological working memory and long-term memory…
Rey, Gunter Daniel
Undergraduate students (N = 97) used an introductory text and a computer simulation to learn fundamental concepts about statistical analyses (e.g., analysis of variance, regression analysis and General Linear Model). Each learner was randomly assigned to one cell of a 2 (with or without instructional advice) x 2 (with or without time advice) x 2…
Individual behavioral differences in humans have been linked to measurable differences in their mental activities, including differences in their implicit motives. In humans, individual differences in the strength of motives such as power, achievement and affiliation have been shown to have a significant impact on behavior in social dilemma games and during other kinds of strategic interactions. This paper presents agent-based computational models of power-, achievement- and affiliation-motivated individuals engaged in game-play. The first model captures learning by motivated agents during strategic interactions. The second model captures the evolution of a society of motivated agents. It is demonstrated that misperception, when it is a result of motivation, causes agents with different motives to play a given game differently. When motivated agents who misperceive a game are present in a population, higher explicit payoff can result for the population as a whole. The implications of these results are discussed, both for modeling human behavior and for designing artificial agents with certain salient behavioral characteristics.
Umam, K.; Mardi, S. N. S.; Hariadi, M.
Internet messenger (IM) has become an important educational technology component in college education: IM makes it possible for students to engage in learning and collaboration in smart virtual class learning (SVCL) using ubiquitous computing. However, models of IM-based smart virtual class learning using ubiquitous computing, and empirical evidence that would favor broad application to improve engagement and behavior, are still limited. In addition, the expectation that IM-based SVCL using ubiquitous computing could improve engagement and behavior in smart classes cannot be confirmed, because the majority of the reviewed studies followed instructional paradigms. This article aims to present a model of IM-based SVCL using ubiquitous computing and to show learners' experiences of improved engagement and behavior in learner-learner and learner-lecturer interactions. The method applied in this paper includes a design process and quantitative analysis techniques, with the purpose of identifying scenarios of ubiquitous computing and capturing learners' and lecturers' impressions of engagement and behavior and their contribution to learning.
Siegelmann-Danieli and H.T. Siegelmann, "Robust Artificial Life Via Artificial Programmed Death," Artificial Intelligence 172(6-7), April 2008: 884-898. Memory models are central to Artificial Intelligence and Machine Learning … beyond. The advances cited are a significant step toward creating Artificial Intelligence via neural networks at the human level. Our network …
Taylor, Estelle; Breed, Marnus; Hauman, Ilette; Homann, Armando
Our aim is to determine which teaching methods students in Computer Science and Information Systems prefer. There are in total 5 different paradigms (behaviorism, cognitivism, constructivism, design-based and humanism) with 32 models among them. Each model is unique and prescribes different learning methods. Recommendations are made on methods that…
Chariker, Julia H.; Naaz, Farah; Pani, John R.
A longitudinal experiment was conducted to evaluate the effectiveness of new methods for learning neuroanatomy with computer-based instruction. Using a three-dimensional graphical model of the human brain and sections derived from the model, tools for exploring neuroanatomy were developed to encourage "adaptive exploration". This is an…
An attempt is presented to make a statement about what a computer is and how it works from the perspective of physics. The single observation that computation can be a reversible process allows for the same kind of insight into computing as was obtained by Carnot's discovery that heat engines could be modelled as reversible processes. It allows us to bring computation into the realm of physics, where the power of physics allows us to ask and answer questions that seemed intractable from the viewpoint of computer science. Strangely enough, this effort makes it clear why computers get cheaper every year.
This book provides a glimpse of the diverse and important roles that modern computational technology is playing in various areas of biomechanics. It includes unique chapters on ab initio quantum mechanical, molecular dynamics and scale-coupling methods.
Kreijns, C.J.; Kirschner, P.A.; Jochems, W.M.G.
There is much positive research on computer-supported collaborative learning (CSCL) environments in asynchronous distributed learning groups (DLGs). There is also research that shows that contemporary CSCL environments do not completely fulfil expectations on supporting interactive group learning,
Havsteen, Inger; Christensen, Anders; Nielsen, Jens K
BACKGROUND: Computed tomographic angiography (CTA) is widely available in emergency rooms to assess acute stroke patients. To standardize readings and educate new readers, we developed a 3-step e-learning tool based on the test-teach-retest methodology in 2 acute stroke scenarios: vascular occlusion and "spot sign" in acute intracerebral hemorrhage. We hypothesized that an e-learning program enhances reading skills in physicians of varying experience. METHODS: We developed an HTML-based program with a teaching segment and 2 matching test segments. Tests were taken before and after teaching. … Readers identified the spot sign correctly 69% before versus 92% after teaching (P = .009) and reported a median self-perceived diagnostic certainty of 50% versus 75% (P = .030). Self-perceived diagnostic certainty revealed no significant increase for vascular occlusion. CONCLUSIONS: The e-learning program is a useful educational …
"The authors systematically develop a state-of-the-art analysis and modeling of time series. … this book is well organized and well written. The authors present various statistical models for engineers to solve problems in time series analysis. Readers no doubt will learn state-of-the-art techniques from this book." (Hsun-Hsien Chang, Computing Reviews, March 2012) "My favorite chapters were on dynamic linear models and vector AR and vector ARMA models." (William Seaver, Technometrics, August 2011) "… a very modern entry to the field of time-series modelling, with a rich reference list of the current literature."
Monendra Grover; Rajesh Kumar; Tapan Kumar Mondal; S. Rajkumar
Genetic erosion is a serious problem and computational models have been developed to prevent it. The computational modeling in this field not only includes (terrestrial) reserve design, but also decision modeling for related problems such as habitat restoration, marine reserve design, and nonreserve approaches to conservation management. Models have been formulated for evaluating tradeoffs between socioeconomic, biophysical, and spatial criteria in establishing marine reserves. The percolatio...
Training teachers to assess important components of self-regulated learning, such as learning strategies, is an important yet somewhat neglected aspect of the integration of self-regulated learning at school. Learning journals can be used to assess learning strategies in line with cyclical process models of self-regulated learning, allowing for rich formative feedback. Against this background, we developed a computer-based learning environment (CBLE) that trains teachers to assess learning strategies with learning journals. The contents of the CBLE and its instructional design were derived from theory. The CBLE was further shaped by research in a design-based manner. Finally, in two evaluation studies, student teachers (N1 = 44; N2 = 89) worked with the CBLE. We analyzed satisfaction, interest, usability, and assessment skills. Additionally, in evaluation study 2, effects of an experimental variation on motivation and assessment skills were tested. We found high satisfaction, interest, and good usability, as well as satisfying assessment skills, after working with the CBLE. Results show that teachers can be trained to assess learning strategies in learning journals. The developed CBLE offers new perspectives on how to support teachers in fostering learning strategies as a central component of effective self-regulated learning at school.
Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul
Models are playing important roles in design and analysis of chemicals-based products and the processes that manufacture them. Computer-aided methods and tools have the potential to reduce the number of experiments, which can be expensive and time consuming, and there is a benefit of working … development and application. The proposed work is a part of the project for development of methods and tools that will allow systematic generation, analysis and solution of models for various objectives. It will use the computer-aided modeling framework that is based on a modeling methodology, which combines … In this contribution, the concept of template-based modeling is presented and application is highlighted for the specific case of catalytic membrane fixed-bed models. The modeling template is integrated in a generic computer-aided modeling framework. Furthermore, modeling templates enable the idea of model reuse …
van Rossum, Mark C. W.; Shippi, Maria
One of our most intriguing mental abilities is the capacity to store information and recall it from memory. Computational neuroscience has been influential in developing models and concepts of learning and memory. In this tutorial review we focus on the interplay between learning and forgetting. We discuss recent advances in the computational description of the learning and forgetting processes on synaptic, neuronal, and systems levels, as well as recent data that open up new challenges for statistical physicists.
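One recurring idea in the computational literature on learning and forgetting is that memory traces decay on multiple timescales at once, with fast synaptic components forgetting quickly while slow components preserve a residual trace. A toy sketch of that idea (illustrative only; the decay rates are invented, not taken from the review):

```python
import math

def memory_trace(t, rates=(1.0, 0.25, 0.0625, 0.015625)):
    """Memory strength as a sum of components decaying on different timescales."""
    return sum(math.exp(-r * t) for r in rates) / len(rates)

# Fast components dominate early forgetting; slow components keep the trace alive
for t in (0, 1, 4, 16, 64):
    print(t, round(memory_trace(t), 3))
```

Superposing exponentials in this way yields a forgetting curve that decays more gradually than any single exponential, a shape often used to connect synaptic-level decay to behavioral forgetting data.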
Burgos, Daniel; Tattersall, Colin; Koper, Rob
Burgos, D., Tattersall, C., & Koper, E. J. R. (2007). Representing adaptive and adaptable Units of Learning. How to model personalized eLearning in IMS Learning Design. In B. Fernández Manjon, J. M. Sanchez Perez, J. A. Gómez Pulido, M. A. Vega Rodriguez & J. Bravo (Eds.), Computers and Education: E-learning - from theory to practice. Germany: Kluwer.
Leh, Amy S.C.; Kouba, Barbara; Davis, Dirk
Advanced technology makes 21st century learning, communities and interactions unique and leads people to an era of ubiquitous computing. The purpose of this article is to contribute to the discussion of learning in the 21st century. The paper will review literature on learning community, community learning, interaction, 21st century learning and…
Bao, Yukun; Xiong, Tao; Hu, Zhongyi; Kibelloh, Mboni
Explanations for the contradictory findings regarding the moderating effect of gender on computer self-efficacy in the adoption of e-learning/mobile learning remain limited. Recognizing the multilevel nature of computer self-efficacy (CSE), this study attempts to explore gender differences in the adoption of mobile learning by extending the Technology Acceptance Model (TAM) with general and specific CSE. Data collected from 137 university students were tested against the research model using the structur...
Itu, Lucian; Rapaka, Saikiran; Passerini, Tiziano; Georgescu, Bogdan; Schwemmer, Chris; Schoebinger, Max; Flohr, Thomas; Sharma, Puneet; Comaniciu, Dorin
Fractional flow reserve (FFR) is a functional index quantifying the severity of coronary artery lesions and is clinically obtained using an invasive, catheter-based measurement. Recently, physics-based models have shown great promise in being able to noninvasively estimate FFR from patient-specific anatomical information, e.g., obtained from computed tomography scans of the heart and the coronary arteries. However, these models have high computational demand, limiting their clinical adoption. In this paper, we present a machine-learning-based model for predicting FFR as an alternative to physics-based approaches. The model is trained on a large database of synthetically generated coronary anatomies, where the target values are computed using the physics-based model. The trained model predicts FFR at each point along the centerline of the coronary tree, and its performance was assessed by comparing the predictions against physics-based computations and against invasively measured FFR for 87 patients and 125 lesions in total. Correlation between machine-learning and physics-based predictions was excellent (0.9994, P …); the machine-learning algorithm achieved a sensitivity of 81.6%, a specificity of 83.9%, and an accuracy of 83.2%. The correlation was 0.729 (P …) against invasive assessment of FFR. Average execution time went down from 196.3 ± 78.5 s for the CFD model to ∼2.4 ± 0.44 s for the machine-learning model on a workstation with a 3.4-GHz Intel i7 8-core processor. Copyright © 2016 the American Physiological Society.
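The training strategy the abstract describes (fit a fast statistical surrogate to the outputs of a slow physics-based model on synthetic anatomies) can be sketched with a toy stand-in for the physics model. The features and the pressure-drop relation below are invented for illustration; they are not the paper's CFD model or feature set:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "anatomies": stenosis severity (0 to 0.8) and reference vessel radius (mm)
n = 2000
severity = rng.uniform(0.0, 0.8, n)
radius = rng.uniform(1.0, 4.0, n)

# Toy stand-in for the physics-based model: FFR falls with severity, mildly with small radius
ffr_physics = 1.0 - 0.9 * severity**2 - 0.02 / radius

# Surrogate: ordinary least squares on simple anatomical features
X = np.column_stack([np.ones(n), severity, severity**2, 1.0 / radius])
coef, *_ = np.linalg.lstsq(X, ffr_physics, rcond=None)
ffr_ml = X @ coef

corr = np.corrcoef(ffr_physics, ffr_ml)[0, 1]
print(round(float(corr), 3))  # → 1.0 (the toy relation lies exactly in the feature span)
```

The real surrogate is a nonlinear model over far richer centerline features, but the workflow is the same: generate anatomies, label them with the expensive model, then fit a cheap predictor that evaluates in seconds rather than minutes.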
Carlos Antonio Bruno da Silva
Computer games, and especially videogames, have proved to be an important trend in Brazilian children's play. They are part of the playful culture, which associates modern technology with traditional play while preserving the importance of the latter. Based on Vygotsky's and Chadwick's ideas, this work studies alternatives in the use of videogames by the occupational therapist, educator or parents, aiming at the prevention of learning difficulties through the acquisition of learning strategies. Sixty children were investigated with a dialectic, descriptive qualitative/quantitative approach. Semi-structured interviews, direct observation and focus groups were applied to this intentional sample. Of the 60 children playing in 3 videogame rental shops in Fortaleza-CE and Quixadá-CE, 30 were aged 4 to 6 years old and the other 30 were aged 7 and 8. Results indicate that requiring videogames to be played in groups favors the acquisition of learning and affective strategies, processing, and metacognition. Therefore, videogames can be considered an excellent resource for preventing learning difficulties, helping children engage with their reality.
The mathematical background for a multiport-network-solving program is described. A method for accurately numerically modeling an arbitrary, continuous, multiport transmission line is discussed. A modification to the transmission-line equations to accommodate multiple rf drives is presented. An improved model for the radio-frequency quadrupole (RFQ) accelerator that corrects previous errors is given. This model permits treating the RFQ as a true eight-port network for simplicity in interpreting the field distribution and ensures that all modes propagate at the same velocity in the high-frequency limit. The flexibility of the multiport model is illustrated by simple modifications to otherwise two-dimensional systems that permit modeling them as linear chains of multiport networks
Shimobaba, Tomoyoshi; Endo, Yutaka; Nishitsuji, Takashi; Takahashi, Takayuki; Nagahama, Yuki; Hasegawa, Satoki; Sano, Marie; Hirayama, Ryuji; Kakue, Takashi; Shiraki, Atsushi; Ito, Tomoyoshi
Computational ghost imaging (CGI) is a single-pixel imaging technique that exploits the correlation between known random patterns and the measured intensity of light transmitted (or reflected) by an object. Although CGI can obtain two- or three-dimensional images with a single or a few bucket detectors, the quality of the reconstructed images is reduced by noise due to the reconstruction of images from random patterns. In this study, we improve the quality of CGI images using deep learning. A deep neural network is used to automatically learn the features of noise-contaminated CGI images. After training, the network is able to predict low-noise images from new noise-contaminated CGI images.
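The correlation step that CGI rests on is compact enough to sketch directly: illuminate a hidden object with known random patterns, record only the total ("bucket") intensity per pattern, and reconstruct by correlating bucket fluctuations with the patterns. The object, pattern count, and sizes below are illustrative choices, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# A small binary "object" the single-pixel detector never sees directly.
side = 8
obj = np.zeros((side, side))
obj[2:6, 3:5] = 1.0

# Known random illumination patterns and their bucket (single-pixel) signals.
n_patterns = 4000
patterns = rng.random((n_patterns, side, side))
bucket = (patterns * obj).sum(axis=(1, 2))

# Classic ghost-imaging estimator: correlate bucket fluctuations with the
# patterns, recon(x, y) = <(B - <B>) * P(x, y)>.
recon = ((bucket - bucket.mean())[:, None, None] * patterns).mean(axis=0)
```

The reconstruction is proportional to the object plus statistical noise that shrinks as more patterns are used; the deep-learning step described in the abstract then learns to suppress that residual noise.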
Lewandowski, Beth E.; Griffin, Devon W.
The Digital Astronaut Project (DAP), within NASAs Human Research Program, develops and implements computational modeling for use in the mitigation of human health and performance risks associated with long duration spaceflight. Over the past decade, DAP developed models to provide insights into space flight related changes to the central nervous system, cardiovascular system and the musculoskeletal system. Examples of the models and their applications include biomechanical models applied to advanced exercise device development, bone fracture risk quantification for mission planning, accident investigation, bone health standards development, and occupant protection. The International Space Station (ISS), in its role as a testing ground for long duration spaceflight, has been an important platform for obtaining human spaceflight data. DAP has used preflight, in-flight and post-flight data from short and long duration astronauts for computational model development and validation. Examples include preflight and post-flight bone mineral density data, muscle cross-sectional area, and muscle strength measurements. Results from computational modeling supplement space physiology research by informing experimental design. Using these computational models, DAP personnel can easily identify both important factors associated with a phenomenon and areas where data are lacking. This presentation will provide examples of DAP computational models, the data used in model development and validation, and applications of the model.
This study attempted to test whether the use of computer-assisted language learning (CALL) and innovative collaborative learning could be more effective than traditional collaborative learning in improving students' English proficiency. A true experimental design was used. Four randomly assigned groups participated: a traditional collaborative learning group (TCLG, 34 students), an innovative collaborative learning group (ICLG, 31 students), a CALL traditional collaborative learning group (CALLTCLG, 32 students), and a CALL innovative collaborative learning group (CALLICLG, 31 students). TOEIC (Test of English for International Communication) listening, reading, speaking, and writing pre-test and post-test assessments were given to all students at an interval of sixteen weeks. Multivariate analysis of covariance (MANCOVA), multivariate analysis of variance (MANOVA), and analysis of variance (ANOVA) were used to analyze the data. The results revealed that students who used CALL had significantly better learning performance than those who did not, and students in innovative collaborative learning performed significantly better than those in traditional collaborative learning. Additionally, students using CALL innovative collaborative learning outperformed those in CALL traditional collaborative learning, those in innovative collaborative learning, and those in traditional collaborative learning.
The modelling of most environmental or industrial flow problems leads to very similar types of equations, and the considerable increase in computing capacity over the last ten years has allowed numerical models of growing complexity to be processed. The varied group of computer codes presented here is now a complementary tool to experimental facilities for studies in the field of fluid mechanics. Several codes applied in the nuclear field (reactors, cooling towers, exchangers, plumes, etc.) are presented, among others.
Stockwell, Glenn, Ed.
Computer-assisted language learning (CALL) is an approach to teaching and learning languages that uses computers and other technologies to present, reinforce, and assess material to be learned, or to create environments where teachers and learners can interact with one another and the outside world. This book provides a much-needed overview of the…
Chaos is one of the major scientific discoveries of our times. In fact many scientists … But there are other natural phenomena that are not predictable though … characteristics of chaos. … The position and velocity are all that are needed to determine the motion of a … a system of equations that modelled the earth's weather …
Payton, Jamie; Barnes, Tiffany; Buch, Kim; Rorrer, Audrey; Zuo, Huifang
This study is a follow-up to one published in computer science education in 2010 that reported preliminary results showing a positive impact of service learning on student attitudes associated with success and retention in computer science. That paper described how service learning was incorporated into a computer science course in the context of the Students & Technology in Academia, Research, and Service (STARS) Alliance, an NSF-supported broadening participation in computing initiative that aims to diversify the computer science pipeline through innovative pedagogy and inter-institutional partnerships. The current paper describes how the STARS Alliance has expanded to diverse institutions, all using service learning as a vehicle for broadening participation in computing and enhancing attitudes and behaviors associated with student success. Results supported the STARS model of service learning for enhancing computing efficacy and computing commitment and for providing diverse students with many personal and professional development benefits.
This book addresses patient-specific modeling. It integrates computational modeling, experimental procedures, imagine clinical segmentation and mesh generation with the finite element method (FEM) to solve problems in computational biomedicine and bioengineering. Specific areas of interest include cardiovascular problems, ocular and muscular systems and soft tissue modeling. Patient-specific modeling has been the subject of serious research over the last seven years and interest in the area is continually growing and this area is expected to further develop in the near future.
Moran, B.; Reaugh, J. E.
A computer model is described for predicting ductile fracture initiation and propagation. The computer fracture model is calibrated with simple and notched round-bar tension tests and a precracked compact tension test. The model is used to predict fracture initiation and propagation in a Charpy specimen, and the results are compared with experiments. The calibrated model provides a correlation between Charpy V-notch (CVN) fracture energy and any measure of fracture toughness, such as J/sub Ic/. A second, simpler empirical correlation was obtained using the energy to initiate fracture in the Charpy specimen rather than the total CVN energy, and the results were compared with the empirical correlation of Rolfe and Novak.
Nielsen, Mogens; Krukow, Karl; Sassone, Vladimiro
We recapture some of the arguments for trust-based technologies in ubiquitous computing, followed by a brief survey of some of the models of trust that have been introduced in this respect. Based on this, we argue for the need of more formal and foundational trust models.
Chamon, Claudio; Mucciolo, Eduardo; Ruckenstein, Andrei; Yang, Zhicheng
We present a planar vertex model that encodes the result of a universal reversible classical computation in its ground state. The approach involves Boolean variables (spins) placed on links of a two-dimensional lattice, with vertices representing logic gates. Large short-ranged interactions between at most two spins implement the operation of each gate. The lattice is anisotropic, with one direction corresponding to computational time and with transverse boundaries storing the computation's input and output. The model displays no finite-temperature phase transitions, including no glass transitions, independent of circuit. The computational complexity is encoded in the scaling of the relaxation rate into the ground state with the system size. We use thermal annealing and a novel and more efficient heuristic, "annealing with learning", to study various computational problems. To explore faster relaxation routes, we construct an explicit mapping of the vertex model onto the Chimera architecture of the D-Wave machine, initiating a novel approach to reversible classical computation based on quantum annealing.
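The thermal-annealing protocol mentioned above can be illustrated on a toy spin system: Metropolis updates under a gradually lowered temperature relax the system toward its ground state. The ferromagnetic Ising ring, its size, and the cooling schedule below are minimal illustrative stand-ins, not the paper's vertex model.

```python
import math
import random

random.seed(1)

# Toy problem: a ferromagnetic Ising ring with ground-state energy -n.
n = 12
spins = [random.choice([-1, 1]) for _ in range(n)]

def energy(s):
    # E = -sum_i s_i * s_{i+1} over the ring (periodic boundary).
    return -sum(s[i] * s[(i + 1) % n] for i in range(n))

initial_energy = energy(spins)

temperature = 2.0
while temperature > 0.02:
    for _ in range(200):                  # Metropolis attempts per temperature
        i = random.randrange(n)
        # Energy change from flipping spin i (neighbors on the ring).
        delta = 2 * spins[i] * (spins[i - 1] + spins[(i + 1) % n])
        if delta <= 0 or random.random() < math.exp(-delta / temperature):
            spins[i] = -spins[i]
    temperature *= 0.9                    # geometric cooling schedule

final_energy = energy(spins)
```

In the vertex-model setting the same loop runs over gate-constraint energies instead of nearest-neighbor bonds, and the "annealing with learning" heuristic biases which spins are proposed for flipping.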
Seyedeh Shohreh Alavi
Background and purpose: Presently, medical teaching has shifted from lecture-based to computer-based methods, and learning style may play a key role in attitudes toward learning with computers. The goal of this study was to examine the relationship between learning style and attitude toward computers among Iranian medical students. Methods: This cross-sectional study included 400 medical students. The Barsch learning style inventory and a questionnaire on attitude toward computers were sent to each student. Enthusiasm, anxiety, and overall attitude toward computers were compared among the different learning styles. Results: The response rate was 91.8%. The distribution of learning styles was 181 (49.3%) visual, 106 (28.9%) auditory, 27 (7.4%) tactual, and 53 (14.4%) overall. Visual learners were less anxious about computer use and showed a more positive attitude toward computers. Sex, age, and academic grade were not associated with students' attitude toward computers. Conclusions: Learning style is an important factor in medical students' attitude toward computers and should be considered in planning computer-based learning programs. Keywords: learning style, attitude, computer, medical student, anxiety, enthusiasm
Helikar, Tomáš; Cutucache, Christine E; Dahlquist, Lauren M; Herek, Tyler A; Larson, Joshua J; Rogers, Jim A
While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.
Sison, Raymund; Shimura, Masamichi
After identifying essential student modeling issues and machine learning approaches, this paper examines how machine learning techniques have been used to automate the construction of student models as well as the background knowledge necessary for student modeling. In the process, the paper sheds light on the difficulty, suitability, and potential of using machine learning for student modeling processes, and, to a lesser extent, the potential of using student modeling techniques in machine learning.
Dewiyanti, Silvia; Brand-Gruwel, Saskia; Jochems, Wim; Broers, Nick
Dewiyanti, S., Brand-Gruwel, S., Jochems, W., & Broers, N. (2007). Students' experiences with collaborative learning in asynchronous computer-supported collaborative learning environments. Computers in Human Behavior, 23, 496-514.
Sisson, Lee Hansen; And Others
This paper describes the use of commercially-available software for the Apple Computer to augment diagnostic evaluations of learning disabled children and to enhance "learning to learn" strategies at the application/transfer level of learning. A short rationale discusses levels of evaluation and learning, using a model that synthesizes the ideas…
Considering the rapidly growing amount of digital educational materials, only few of them bridge the gap between experimental learning environments and computer-based learning environments (Gardner, 1991). Observations from two cases in primary school and lower secondary school in the subject … with a prototype of a MOO storyline. The aim of the MOO storyline is to challenge the potential of dialogue, user involvement, and learning responsibility, and to use children's natural curiosity and motivation for game playing, especially when digital games involve other children. The paper proposes a model …, based on the narrative approach for experimental learning subjects, relying on ideas from Csikszentmihalyi's notion of flow (Csikszentmihalyi, 1991), storyline pedagogy (Meldgaard, 1994), and ideas from Howard Gardner (Gardner, 1991). The model forms the basis for educational games to be used in home…
Please cite this publication as: Kremenska, A. (2006). Computer Assisted Language Learning (CALL): Using Internet for Effective Language Learning. Proceedings of International Workshop in Learning Networks for Lifelong Competence Development, TENCompetence Conference. March 30th-31st, Sofia,
Mustard, John F.
In the fall of 1998, MacMillan Hall opened at Brown University to students. In MacMillan Hall was the new Computer Learning Center, since named the EarthLab, which was outfitted with high-end workstations and peripherals primarily focused on the use of remotely sensed and other spatial data in the environmental sciences. The NASA grant we received as part of the "Centers of Excellence in Applications of Remote Sensing to Regional and Global Integrated Environmental Assessments" was the primary source of funds to outfit this learning and research center. Since opening, we have expanded the range of learning and research opportunities and integrated a cross-campus network of disciplines whose members have come together to learn and use spatial data of all kinds. The EarthLab also forms a core of undergraduate, graduate, and faculty research on environmental problems that draw upon the unique perspective of remotely sensed data. Over the last two years, the EarthLab has been a center for research on the environmental impact of water resource use in arid regions, the impact of the green revolution on forest cover in India, the design of forest preserves in Vietnam, and detailed assessments of the utility of thermal and hyperspectral data for water quality analysis. It has also been used extensively for local environmental activities, in particular studies on the impact of lead on the health of urban children in Rhode Island. Finally, the EarthLab has also served as a key educational and analysis center for activities related to the Brown University Affiliated Research Center that is devoted to transferring university research to the private sector.
This study presents the ALMA environment (Adaptive Learning Models from texts and Activities). ALMA supports the processes of learning and assessment via: (1) texts differing in local and global cohesion for students with low, medium, and high background knowledge; (2) activities corresponding to different levels of comprehension which prompt the student to practically implement different text-reading strategies, with the recommended activity sequence adapted to the student's learning style; (3) an overall framework for informing, guiding, and supporting students in performing the activities; and (4) individualized support and guidance according to student-specific characteristics. ALMA also supports students in distance learning or in blended learning, in which students are submitted to face-to-face learning supported by computer technology. The adaptive techniques provided via ALMA are: (a) adaptive presentation and (b) adaptive navigation. Digital learning material, in accordance with the text comprehension model described by Kintsch, was introduced into the ALMA environment. This material can be exploited in either distance or blended learning.
Krukow, Karl; Nielsen, Mogens; Sassone, Vladimiro
We recapture some of the arguments for trust-based technologies in ubiquitous computing, followed by a brief survey of some of the models of trust that have been introduced in this respect. Based on this, we argue for the need of more formal and foundational trust models.
Besmann, Theodore M.
This chapter considers methods and techniques for computational modeling for nuclear materials with a focus on fuels. The basic concepts for chemical thermodynamics are described and various current models for complex crystalline and liquid phases are illustrated. Also included are descriptions of available databases for use in chemical thermodynamic studies and commercial codes for performing complex equilibrium calculations.
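The complex-equilibrium calculations described here boil down to minimizing a Gibbs energy over composition. A minimal sketch for an ideal two-component (A/B) solution follows; the chemical potentials and temperature are illustrative numbers, not data from the chapter's databases.

```python
import math

R = 8.314            # gas constant, J/(mol K)
T = 298.15           # temperature, K
mu_a, mu_b = 1000.0, 0.0   # assumed standard chemical potentials, J/mol

def gibbs(x):
    # Molar Gibbs energy of an ideal A/B solution at mole fraction x of A:
    # G(x) = x*mu_a + (1-x)*mu_b + RT*(x ln x + (1-x) ln(1-x))
    return (x * mu_a + (1 - x) * mu_b
            + R * T * (x * math.log(x) + (1 - x) * math.log(1 - x)))

# G is convex in x, so a simple ternary search locates the equilibrium.
lo, hi = 1e-9, 1.0 - 1e-9
for _ in range(200):
    m1 = lo + (hi - lo) / 3
    m2 = hi - (hi - lo) / 3
    if gibbs(m1) < gibbs(m2):
        hi = m2
    else:
        lo = m1
x_eq = (lo + hi) / 2

# Analytic check: dG/dx = 0 gives x = 1 / (1 + exp((mu_a - mu_b)/(R T))).
x_exact = 1.0 / (1.0 + math.exp((mu_a - mu_b) / (R * T)))
```

Production codes of the kind the chapter surveys solve the same minimization over many phases and species simultaneously, with assessed (non-ideal) Gibbs energy models in place of the ideal-solution term.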
Computer Based Modelling and Simulation – Modelling Deterministic Systems. N K Srinivasan. Resonance – Journal of Science Education, General Article, Volume 6, Issue 3, March 2001, pp. 46-54.
Trausan-Matu, S. (2008). Analysis and Assessment of Computer-Supported Collaborative Learning Conversations. Workshop presentation at the symposium Learning networks for professional. November, 14, 2008, Heerlen, Nederland: Open Universiteit Nederland.
Introduction of computer-supported collaborative learning (CSCL), specifically in an intercultural learning environment, creates both challenges and benefits. Among the challenges are the coordination of different attitudes, styles of communication, and patterns of behaving. Among the benefits are
Cantwell, George; Riesenhuber, Maximilian; Roeder, Jessica L; Ashby, F Gregory
The field of computational cognitive neuroscience (CCN) builds and tests neurobiologically detailed computational models that account for both behavioral and neuroscience data. This article leverages a key advantage of CCN, namely that it should be possible to interface different CCN models in a plug-and-play fashion, to produce a new and biologically detailed model of perceptual category learning. The new model was created from two existing CCN models: the HMAX model of visual object processing and the COVIS model of category learning. Using bitmap images as inputs and by adjusting only a couple of learning-rate parameters, the new HMAX/COVIS model provides impressively good fits to human category-learning data from two qualitatively different experiments that used different types of category structures and different types of visual stimuli. Overall, the model provides a comprehensive neural and behavioral account of basal ganglia-mediated learning. Copyright © 2017 Elsevier Ltd. All rights reserved.
While the need to adapt teaching to the needs of a student is generally acknowledged (see Corno and Snow, 1986, for a wide review of the literature), little is known about the impact of individual learner differences on the quality of learning attained within computer-based learning environments (CBLEs). What evidence there is appears to support the notion that individual differences have implications for the degree of success or failure experienced by students (Ford and Ford, 1992) and by trainee end-users of software packages (Bostrom et al., 1990). The problem is to identify the way in which specific individual characteristics of a student interact with particular features of a CBLE, and how the interaction affects the quality of the resultant learning. Teaching in a CBLE is likely to require a subset of teaching strategies different from that appropriate to more traditional environments, and the use of a machine may elicit different behaviours from those normally arising in a classroom context.
Background: Biomedical research is changing due to the rapid accumulation of experimental data at an unprecedented scale, revealing increasing degrees of complexity of biological processes. The life sciences are facing a transition from a descriptive to a mechanistic approach that reveals principles of cells, cellular networks, organs, and their interactions across several spatial and temporal scales. There are two conceptual traditions in biological computational modeling. The bottom-up approach emphasizes complex intracellular molecular models and is well represented within the systems biology community. On the other hand, the physics-inspired top-down modeling strategy identifies and selects features of (presumably) essential relevance to the phenomena of interest and combines available data in models of modest complexity. Results: The workshop, "ESF Exploratory Workshop on Computational Disease Modeling", examined the challenges that computational modeling faces in contributing to the understanding and treatment of complex multi-factorial diseases. Participants at the meeting agreed on two general conclusions. First, we identified the critical importance of developing analytical tools for dealing with model and parameter uncertainty. Second, the development of predictive hierarchical models spanning several scales beyond intracellular molecular networks was identified as a major objective. This contrasts with the current focus within the systems biology community on complex molecular modeling. Conclusion: During the workshop it became obvious that diverse scientific modeling cultures (from computational neuroscience, theory, data-driven machine-learning approaches, agent-based modeling, network modeling, and stochastic molecular simulations) would benefit from intense cross-talk on shared theoretical issues in order to make progress on clinically relevant problems.
Results of numerical modeling of dynamic problems are summed up in this article. Such problems are characteristic of various areas of human activity, in particular problem solving in ecology. The following problems are considered: computer modeling of dynamic effects on elastic-plastic bodies, calculation and determination of the performance of gas streams in gas-cleaning equipment, and modeling of biogas formation processes.
Computational and mathematical models provide us with the opportunities to investigate the complexities of real world problems. They allow us to apply our best analytical methods to define problems in a clearly mathematical manner and exhaustively test our solutions before committing expensive resources. This is made possible by assuming parameter(s) in a bounded environment, allowing for controllable experimentation, not always possible in live scenarios. For example, simulation of computational models allows the testing of theories in a manner that is both fundamentally deductive and experimental in nature. The main ingredients for such research ideas come from multiple disciplines and the importance of interdisciplinary research is well recognized by the scientific community. This book provides a window to the novel endeavours of the research communities to present their works by highlighting the value of computational modelling as a research tool when investigating complex systems. We hope that the reader...
Modeling Reality: How Computers Mirror Life covers a wide range of modern subjects in complex systems, suitable not only for undergraduate students who want to learn about modelling 'reality' by using computer simulations, but also for researchers who want to learn something about subjects outside of their majors and need a simple guide. Readers are not required to have specialized training before they start the book. Each chapter is organized so as to train the reader to grasp the essential idea of simulating phenomena and guide him/her towards more advanced areas. The topics presented in this textbook fall into two categories. The first is at graduate level, namely probability, statistics, information theory, graph theory, and the Turing machine, which are standard topics in the course of information science and information engineering departments. The second addresses more advanced topics, namely cellular automata, deterministic chaos, fractals, game theory, neural networks, and genetic algorithms. Several topics included here (neural networks, game theory, information processing, etc) are now some of the main subjects of statistical mechanics, and many papers related to these interdisciplinary fields are published in Journal of Physics A: Mathematical and General, so readers of this journal will be familiar with the subject areas of this book. However, each area is restricted to an elementary level and if readers wish to know more about the topics they are interested in, they will need more advanced books. For example, on neural networks, the text deals with the back-propagation algorithm for perceptron learning. Nowadays, however, this is a rather old topic, so the reader might well choose, for example, Introduction to the Theory of Neural Computation by J Hertz et al (Perseus books, 1991) or Statistical Physics of Spin Glasses and Information Processing by H Nishimori (Oxford University Press, 2001) for further reading. Nevertheless, this book is worthwhile
Blain, John M
Smoothly leads users into the subject of computer graphics through the Blender GUI. Blender, the free and open source 3D computer modeling and animation program, allows users to create and animate models and figures in scenes, compile feature movies, and interact with the models and create video games. Reflecting the latest version of Blender, The Complete Guide to Blender Graphics: Computer Modeling & Animation, 2nd Edition helps beginners learn the basics of computer animation using this versatile graphics program. This edition incorporates many new features of Blender, including developments…
Xie, Danfeng; Zhang, Lei; Bai, Li
Deep learning is a subfield of machine learning, which aims to learn a hierarchy of features from input data. Nowadays, researchers have intensively investigated deep learning algorithms for solving challenging problems in many areas such as image classification, speech recognition, signal processing, and natural language processing. In this study, we not only review typical deep learning algorithms in computer vision and signal processing but also provide detailed information on how to apply...
Forgasz, Helen J.
Equity and computer use for secondary mathematics learning was the focus of a three year study. In 2003, a survey was administered to a large sample of grade 7-10 students. Some of the survey items were aimed at determining home access to and ownership of computers, and students' attitudes to mathematics, computers, and computer use for…
Sorensen, Birgitte Holm
Describes a development project for the use of computer graphics and video in connection with an inservice training course for primary education teachers in Denmark. Topics addressed include research approaches to computers; computer graphics in learning processes; activities relating to computer graphics; the role of the teacher; and student…
Neeraj Choudhary; Nikhil Kumar Singh; Parmalik Singh
Computer profiling is used for computer forensic analysis. This paper proposes and elaborates on a novel model for use in computer profiling: the computer profiling object model. The computer profiling object model is an information model which models a computer as objects with various attributes and inter-relationships. Together, these provide the information necessary for a human investigator or an automated reasoning engine to make judgments as to the probable usage and evidentiary value of a computer…
Burgos, Daniel; Tattersall, Colin; Koper, Rob
Burgos, D., Tattersall, C., & Koper, E. J. R. (2007). Representing adaptive and adaptable Units of Learning. How to model personalized eLearning in IMS Learning Design. In B. Fernández Manjon, J. M. Sanchez Perez, J. A. Gómez Pulido, M. A. Vega Rodriguez & J. Bravo (Eds.), Computers and Education:
Tu, Jiyuan; Wong, Kelvin Kian Loong
This book discusses geometric and mathematical models that can be used to study fluid and structural mechanics in the cardiovascular system. Where traditional research methodologies in the human cardiovascular system are challenging due to its invasive nature, several recent advances in medical imaging and computational fluid and solid mechanics modelling now provide new and exciting research opportunities. This emerging field of study is multi-disciplinary, involving numerical methods, computational science, fluid and structural mechanics, and biomedical engineering. Certainly any new student or researcher in this field may feel overwhelmed by the wide range of disciplines that need to be understood. This unique book is one of the first to bring together knowledge from multiple disciplines, providing a starting point to each of the individual disciplines involved, attempting to ease the steep learning curve. This book presents elementary knowledge on the physiology of the cardiovascular system; basic knowl...
Kai-Lung Hua,1 Che-Hao Hsu,1 Shintami Chusnul Hidayati,1 Wen-Huang Cheng,2 Yu-Jen Chen3 1Department of Computer Science and Information Engineering, National Taiwan University of Science and Technology, 2Research Center for Information Technology Innovation, Academia Sinica, 3Department of Radiation Oncology, MacKay Memorial Hospital, Taipei, Taiwan Abstract: Lung cancer has a poor prognosis when not diagnosed early and unresectable lesions are present. The management of small lung nodules noted on computed tomography scan is controversial due to uncertain tumor characteristics. A conventional computer-aided diagnosis (CAD) scheme requires several image processing and pattern recognition steps to accomplish a quantitative tumor differentiation result. In such an ad hoc image analysis pipeline, every step depends heavily on the performance of the previous step. Accordingly, tuning of classification performance in a conventional CAD scheme is very complicated and arduous. Deep learning techniques, on the other hand, have the intrinsic advantage of automatic feature exploitation and seamless performance tuning. In this study, we attempted to simplify the image analysis pipeline of conventional CAD with deep learning techniques. Specifically, we introduced models of a deep belief network and a convolutional neural network in the context of nodule classification in computed tomography images. Two baseline methods with feature computing steps were implemented for comparison. The experimental results suggest that deep learning methods could achieve better discriminative results and hold promise in the CAD application domain. Keywords: nodule classification, deep learning, deep belief network, convolutional neural network
Caremoli, Ch.; Erhard, P.
Today's computers have the processing power to deliver detailed and global simulations of complex industrial processes such as the operation of a nuclear reactor core. So should we be producing new, global numerical models to take full advantage of this new-found power? If so, it would be a long-term job. There is, however, another solution; to couple the existing validated numerical models together so that they work as one. (authors)
The need for extended liver resection is increasing due to the growing incidence of liver tumors in aging societies. Individualized surgical planning is the key for identifying the optimal resection strategy and to minimize the risk of postoperative liver failure and tumor recurrence. Current computational tools provide virtual planning of liver resection by taking into account the spatial relationship between the tumor and the hepatic vascular trees, as well as the size of the future liver remnant. However, size and function of the liver are not necessarily equivalent. Hence, determining the future liver volume might misestimate the future liver function, especially in cases of hepatic comorbidities such as hepatic steatosis. A systems medicine approach could be applied, including biological, medical, and surgical aspects, by integrating all available anatomical and functional information of the individual patient. Such an approach holds promise for better prediction of postoperative liver function and hence improved risk assessment. This review provides an overview of mathematical models related to the liver and its function and explores their potential relevance for computational liver surgery. We first summarize key facts of hepatic anatomy, physiology, and pathology relevant for hepatic surgery, followed by a description of the computational tools currently used in liver surgical planning. Then we present selected state-of-the-art computational liver models potentially useful to support liver surgery. Finally, we discuss the main challenges that will need to be addressed when developing advanced computational planning tools in the context of liver surgery.
van der Kleij, Fabienne; Feskens, Remco C.W.; Eggen, Theodorus Johannes Hendrikus Maria
In this meta-analysis, we investigated the effects of methods for providing item-based feedback in a computer-based environment on students’ learning outcomes. From 40 studies, 70 effect sizes were computed, which ranged from −0.78 to 2.29. A mixed model was used for the data analysis. The results
E-learning systems usually require many hardware and software resources. Many educational institutions cannot afford such investments, and cloud computing offers a practical alternative. This paper examines the impact of using cloud computing for e-learning solutions.
Roberts, Tim, Ed.
"Computer-Supported Collaborative Learning in Higher Education" provides a resource for researchers and practitioners in the area of computer-supported collaborative learning (also known as CSCL); particularly those working within a tertiary education environment. It includes articles of relevance to those interested in both theory and practice in…
Mohammed M. Eissa
The medical domain has become one of the most important areas of research, owing to the huge amounts of medical information about the symptoms of diseases and how to distinguish between them to reach a correct diagnosis. Knowledge discovery models play a vital role in the refinement and mining of medical indicators, helping medical experts to make treatment decisions. This paper introduces four hybrid Rough–Granular Computing knowledge discovery models based on Rough Sets Theory, Artificial Neural Networks, Genetic Algorithms and Rough Mereology Theory. A comparative analysis of knowledge discovery models that use different techniques for data pre-processing, reduction, and data mining supports medical experts in extracting the main medical indicators, reducing misdiagnosis rates and improving decision-making for medical diagnosis and treatment. The proposed models utilized two medical datasets: a Coronary Heart Disease dataset and a Hepatitis C Virus dataset. The main purpose of this paper was to explore and evaluate the proposed models, based on the Granular Computing methodology, for knowledge extraction according to different evaluation criteria for the classification of medical datasets. Another purpose is to enhance the framework of KDD processes for supervised learning using the Granular Computing methodology.
Reng, Lars; Schoenau-Fog, Henrik
In this paper, we will introduce the Game Enhanced learning Model (GEM), which describes a range of game-oriented learning activities. The model is intended to give an overview of the possibilities of game-based learning in general and all the way up to purposive game productions. In the paper, we will describe the levels of the model, which is based on our experience in teaching professional game development at university level. Furthermore, we have been using the model to inspire numerous educators to improve their students' motivation and skills. The model presents various game-based learning activities, and depicts their required planning and expected outcome through eight levels. At its lower levels, the model contains the possibilities of using stand-alone analogue and digital games as teachers, utilizing games as a facilitator of learning activities, exploiting gamification and motivating...
construction, I stress the design of a test theory, namely, a learning algorithm. The learning algorithm is designed under such principles that users experience both 'elaborative rehearsal' (aspects in receptive and productive learning) and 'expanding rehearsal' (memory-based learning and repetitive act...
Adolescence is a period of life characterised by changes in learning and decision-making. Learning and decision-making do not rely on a unitary system, but instead require the coordination of different cognitive processes that can be mathematically formalised as dissociable computational modules. Here, we aimed to trace the developmental time-course of the computational modules responsible for learning from reward or punishment, and learning from counterfactual feedback. Adolescents and adults carried out a novel reinforcement learning paradigm in which participants learned the association between cues and probabilistic outcomes, where the outcomes differed in valence (reward versus punishment) and feedback was either partial or complete (either the outcome of the chosen option only, or the outcomes of both the chosen and unchosen options, was displayed). Computational strategies changed during development: whereas adolescents' behaviour was better explained by a basic reinforcement learning algorithm, adults' behaviour integrated increasingly complex computational features, namely a counterfactual learning module (enabling enhanced performance in the presence of complete feedback) and a value contextualisation module (enabling symmetrical reward and punishment learning). Unlike adults, adolescent performance did not benefit from counterfactual (complete) feedback. In addition, while adults learned symmetrically from both reward and punishment, adolescents learned from reward but were less likely to learn from punishment. This tendency to rely on rewards and not to consider alternative consequences of actions might contribute to our understanding of decision-making in adolescence.
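The contrast the abstract draws between the two strategies can be sketched as value updates: a basic learner that updates only the chosen option, and an "adult-like" learner with an added counterfactual module that also learns from the forgone outcome under complete feedback. The learning rate and option labels below are illustrative assumptions, not the authors' fitted model.

```python
def basic_rl_update(q, chosen, reward, alpha=0.1):
    """Basic reinforcement learning: move the chosen option's value
    toward the received reward; the unchosen option is untouched."""
    q = dict(q)
    q[chosen] += alpha * (reward - q[chosen])
    return q

def counterfactual_update(q, chosen, unchosen, reward, unchosen_reward, alpha=0.1):
    """Counterfactual module: with complete feedback, additionally update
    the unchosen option's value toward its (forgone) outcome."""
    q = basic_rl_update(q, chosen, reward, alpha)
    q[unchosen] += alpha * (unchosen_reward - q[unchosen])
    return q
```

With complete feedback, the counterfactual learner revises both options in one trial, which is one plausible reading of why adults (but not adolescents) benefited from complete feedback.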
Shaalan, Khaled F.
This paper describes the development of an intelligent computer-assisted language learning (ICALL) system for learning Arabic. This system could be used for learning Arabic by students at primary schools or by learners of Arabic as a second or foreign language. It explores the use of Natural Language Processing (NLP) techniques for learning…
This paper reviews learning theories, focusing on the strong interest in technology use for language learning. It is important to look at how technology has been used in the field thus far. The goals of this review are to understand how computers have been used in past years to support foreign language learning, and to explore any research evidence with regard to how computer technology can enhance language skills acquisition.
Bell, Thorsten; Urhahne, Detlef; Schanze, Sascha; Ploetzner, Rolf
Collaborative inquiry learning is one of the most challenging and exciting ventures for today's schools. It aims at bringing a new and promising culture of teaching and learning into the classroom where students in groups engage in self-regulated learning activities supported by the teacher. It is expected that this way of learning fosters students' motivation and interest in science, that they learn to perform steps of inquiry similar to scientists and that they gain knowledge on scientific processes. Starting from general pedagogical reflections and science standards, the article reviews some prominent models of inquiry learning. This comparison results in a set of inquiry processes being the basis for cooperation in the scientific network NetCoIL. Inquiry learning is conceived in several ways with emphasis on different processes. For an illustration of the spectrum, some main conceptions of inquiry and their focuses are described. In the next step, the article describes exemplary computer tools and environments from within and outside the NetCoIL network that were designed to support processes of collaborative inquiry learning. These tools are analysed by describing their functionalities as well as effects on student learning known from the literature. The article closes with challenges for further developments elaborated by the NetCoIL network.
Goldsby, Michael E.; Armstrong, Robert C.; Shneider, Max S.; Vanderveen, Keith; Ray, Jaideep; Heath, Zach; Allan, Benjamin A.
This report presents the results of our efforts to apply high-performance computing to entity-based simulations with a multi-use plugin for parallel computing. We use the term 'entity-based simulation' to describe a class of simulation which includes both discrete event simulation and agent based simulation. What simulations of this class share, and what differs from more traditional models, is that the result sought is emergent from a large number of contributing entities. Logistic, economic and social simulations are members of this class where things or people are organized or self-organize to produce a solution. Entity-based problems never have an a priori ergodic principle that will greatly simplify calculations. Because the results of entity-based simulations can only be realized at scale, scalable computing is de rigueur for large problems. Having said that, the absence of a spatial organizing principle makes the decomposition of the problem onto processors problematic. In addition, practitioners in this domain commonly use the Java programming language which presents its own problems in a high-performance setting. The plugin we have developed, called the Parallel Particle Data Model, overcomes both of these obstacles and is now being used by two Sandia frameworks: the Decision Analysis Center, and the Seldon social simulation facility. While the ability to engage U.S.-sized problems is now available to the Decision Analysis Center, this plugin is central to the success of Seldon. Because Seldon relies on computationally intensive cognitive sub-models, this work is necessary to achieve the scale necessary for realistic results. With the recent upheavals in the financial markets, and the inscrutability of terrorist activity, this simulation domain will likely need a capability with ever greater fidelity. High-performance computing will play an important part in enabling that greater fidelity.
We initiate a formal study of logical inferences in the context of the measure problem in cosmology, or what we call cosmic logic. We describe a simple computational model of cosmic logic suitable for analysis of, for example, discretized cosmological systems. The construction is based on a particular model of computation, developed by Alan Turing, with cosmic observers (CO), cosmic measures (CM) and cosmic symmetries (CS) described by Turing machines. CO machines always start with a blank tape and CM machines take a CO's Turing number (also known as description number or Gödel number) as input and output the corresponding probability. Similarly, CS machines take CO's Turing numbers as input, but output either one if the CO machines are in the same equivalence class or zero otherwise. We argue that CS machines are more fundamental than CM machines and, thus, should be used as building blocks in constructing CM machines. We prove the non-computability of a CS machine which discriminates between two classes of CO machines: mortal machines that halt in finite time and immortal machines that run forever. In the context of eternal inflation, this result implies that it is impossible to construct CM machines to compute probabilities on the set of all CO machines using cut-off prescriptions. The cut-off measures can still be used if the set is reduced to include only machines which halt after a finite and predetermined number of steps.
Ito, Makoto; Doya, Kenji
Computational models of reinforcement learning have recently been applied to analysis of brain imaging and neural recording data to identify neural correlates of specific processes of decision making, such as valuation of action candidates and parameters of value learning. However, for such model-based analysis paradigms, selecting an appropriate model is crucial. In this study we analyze the process of choice learning in rats using stochastic rewards. We show that "Q-learning," which is a standard reinforcement learning algorithm, does not adequately reflect the features of choice behaviors. Thus, we propose a generalized reinforcement learning (GRL) algorithm that incorporates the negative reward effect of reward loss and forgetting of values of actions not chosen. Using the Bayesian estimation method for time-varying parameters, we demonstrated that the GRL algorithm can predict an animal's choice behaviors as efficiently as the best Markov model. The results suggest the usefulness of the GRL for the model-based analysis of neural processes involved in decision making.
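The two extensions the abstract names, treating reward loss as a negative reward and forgetting the values of unchosen actions, can be sketched as a single update step. This is a minimal illustration in the spirit of the GRL idea, with assumed parameter names and values (alpha, kappa_loss, phi), not the paper's estimated model.

```python
def grl_update(q, chosen, outcome, alpha=0.2, kappa_loss=1.0, phi=0.05):
    """One generalized-RL update step (illustrative sketch):
    - the chosen action's value moves toward the outcome, with losses
      scaled by kappa_loss (the "negative reward effect" of reward loss);
    - values of actions not chosen decay toward zero (forgetting)."""
    new_q = {}
    for action, value in q.items():
        if action == chosen:
            r = outcome if outcome > 0 else kappa_loss * outcome
            new_q[action] = value + alpha * (r - value)
        else:
            new_q[action] = (1 - phi) * value  # forgetting of unchosen action
    return new_q
```

With kappa_loss = 1 and phi = 0 this reduces to a standard Q-learning value update, which is one way to see the GRL as a generalization of Q-learning.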
May, Dave A.; Spiegelman, Marc
Practitioners in computational geodynamics, as in many other branches of applied science, typically do not analyse the underlying PDEs being solved in order to establish the existence or uniqueness of solutions. Rather, such proofs are left to the mathematicians, and all too frequently these results lag far behind (in time) the applied research being conducted, are often unintelligible to the non-specialist, are buried in journals applied scientists simply do not read, or have simply not been proven. As practitioners, we are by definition pragmatic. Thus, rather than first analysing our PDEs, we first attempt to find approximate solutions by throwing all our computational methods and machinery at the given problem and hoping for the best. Typically this approach leads to a satisfactory outcome. Usually it is only if the numerical solutions "look odd" that we start delving deeper into the math. In this presentation I summarise our findings in relation to using pressure-dependent (Drucker-Prager type) flow laws in a simplified model of continental extension in which the material is assumed to be an incompressible, highly viscous fluid. Such assumptions represent the current mainstream adopted in computational studies of mantle and lithosphere deformation within our community. In short, we conclude that for the parameter range of cohesion and friction angle relevant to studying rocks, the incompressibility constraint combined with a Drucker-Prager flow law can result in problems which have no solution. This is proven by a 1D analytic model and convincingly demonstrated by 2D numerical simulations. To date, we do not have a robust "fix" for this fundamental problem. The intent of this submission is to highlight the importance of simple analytic models, highlight some of the dangers / risks of interpreting numerical solutions without understanding the properties of the PDE we solved, and lastly to stimulate discussions to develop an improved computational model of
Drie, J.P. van
Recent technological developments have provided new environments for learning, giving rise to the question of how characteristics of such new learning environments can facilitate the process of learning in specific domains. The focus of this thesis is on computer-supported collaborative learning
Bishop, Christopher M
Several decades of research in the field of machine learning have resulted in a multitude of different algorithms for solving a broad range of problems. To tackle a new application, a researcher typically tries to map their problem onto one of these existing methods, often influenced by their familiarity with specific algorithms and by the availability of corresponding software implementations. In this study, we describe an alternative methodology for applying machine learning, in which a bespoke solution is formulated for each new application. The solution is expressed through a compact modelling language, and the corresponding custom machine learning code is then generated automatically. This model-based approach offers several major advantages, including the opportunity to create highly tailored models for specific scenarios, as well as rapid prototyping and comparison of a range of alternative models. Furthermore, newcomers to the field of machine learning do not have to learn about the huge range of traditional methods, but instead can focus their attention on understanding a single modelling environment. In this study, we show how probabilistic graphical models, coupled with efficient inference algorithms, provide a very flexible foundation for model-based machine learning, and we outline a large-scale commercial application of this framework involving tens of millions of users. We also describe the concept of probabilistic programming as a powerful software environment for model-based machine learning, and we discuss a specific probabilistic programming language called Infer.NET, which has been widely used in practical applications.
Developing countries have a responsibility not merely to provide computers for schools, but also to foster a habit of infusing a variety of ways in which computers can be integrated in teaching-learning amongst the end users of these tools. Earlier research lacked a systematic study of the manner and the extent of computer use by teachers. The…
Hall, T.; Bannon, L.
In recent years, novel paradigms of computing have emerged, which enable computational power to be embedded in artefacts and in environments in novel ways. These developments may create new possibilities for using computing to enhance learning. This paper presents the results of a design process that set out to explore interactive techniques,…
Washington Luna Encalada; José Luis Castillo Sequera
In recent years, we have seen a significant number of new technological ideas appearing in literature discussing the future of education. For example, E-learning, cloud computing, social networking, virtual laboratories, virtual realities, virtual worlds, massive open online courses (MOOCs), and bring your own device (BYOD) are all new concepts of immersive and global education that have emerged in educational literature. One of the greatest challenges presented to e-learning solutions is the...
Xhafa, Fatos [Polytechnic Univ. of Catalonia, Barcelona (Spain). Dept. of Languages and Informatics Systems; Caballe, Santi; Daradoumis, Thanasis [Open Univ. of Catalonia, Barcelona (Spain). Dept. of Computer Sciences Multimedia and Telecommunications; Abraham, Ajith [Machine Intelligence Research Labs (MIR Labs), Auburn, WA (United States). Scientific Network for Innovation and Research Excellence; Juan Perez, Angel Alejandro (eds.) [Open Univ. of Catalonia, Barcelona (Spain). Dept. of Information Sciences
E-Learning has become one of the most widespread ways of distance teaching and learning. Technologies such as Web, Grid, and Mobile and Wireless networks are pushing teaching and learning communities to find new and intelligent ways of using these technologies to enhance teaching and learning activities. Indeed, these new technologies can play an important role in increasing the support to teachers and learners and in shortening learning and teaching time; yet, it is necessary to use intelligent techniques to take advantage of these new technologies to achieve the desired support to teachers and learners and enhance learners' performance in distributed learning environments. The chapters of this volume bring advances in using intelligent techniques for technology enhanced learning as well as development of e-Learning applications based on such techniques and supported by technology. Such intelligent techniques include clustering and classification for personalization of learning, intelligent context-aware techniques, adaptive learning, data mining techniques and ontologies in e-Learning systems, among others. Academics, scientists, software developers, teachers and tutors and students interested in e-Learning will find this book useful for their academic, research and practice activity. (orig.)
Mayer, Richard E.; Moreno, Roxana
Presents a cognitive theory of multimedia learning that draws on dual coding theory, cognitive load theory, and constructivist learning theory and derives some principles of instructional design for fostering multimedia learning. These include principles of multiple representation, contiguity, coherence, modality, and redundancy. (SLD)
Veldkamp, Bernard P.; Matteucci, Mariagiulia; Eggen, Theodorus Johannes Hendrikus Maria; De Wannemacker, Stefan; Clarebout, Geraldine; De Causmaecker, Patrick
A major goal in computerized learning systems is to optimize learning, while in computerized adaptive tests (CAT) efficient measurement of the proficiency of students is the main focus. There seems to be a common interest to integrate computerized adaptive item selection in learning systems and
Larsen, Lasse Juel; Løfgreen, Lars Bo
In the long-standing tradition of discounting digital technologies as a learning resource within the formal educational setting, computer games have often either been marked as distraction or totally ignored. However, as argued in the paradigmatic text by Shaffer, Squire, Halverson and Gee, Video Games and The Future of Learning, computer games do not only offer an interesting perspective on how "learners can understand complex concepts without losing the connection between abstract ideas and the real problems", but can as well cast "a glimpse into how we might create new and more powerful ways to learn in schools, communities, and workplaces – new ways to learn for a new Information Age". In line with this general approach to seeing computer games as a reservoir of learning strategies and potentials, this paper aims to examine how a specific computer game teaches us how to play the game.
Prata, David Nadler; Baker, Ryan S. J. d.; Costa, Evandro d. B.; Rose, Carolyn P.; Cui, Yue; de Carvalho, Adriana M. J. B.
This paper presents a model which can automatically detect a variety of student speech acts as students collaborate within a computer supported collaborative learning environment. In addition, an analysis is presented which gives substantial insight as to how students' learning is associated with students' speech acts, knowledge that will…
Bogataj, Kristina; Pucihar, Andreja; Sudzina, Frantisek
The paper presents a proposed research framework for identification of business model elements impacting Cloud Computing Adoption. We provide a definition of main Cloud Computing characteristics, discuss previous findings on factors impacting Cloud Computing Adoption, and investigate technology a...
Quantum Machine Learning bridges the gap between abstract developments in quantum computing and the applied research on machine learning. Paring down the complexity of the disciplines involved, it focuses on providing a synthesis that explains the most important machine learning algorithms in a quantum framework. Theoretical advances in quantum computing are hard to follow for computer scientists, and sometimes even for researchers involved in the field. The lack of a step-by-step guide hampers the broader understanding of this emergent interdisciplinary body of research. Quantum Machine L
Kurt A. April; Katja M. J. Goebel; Eddie Blass; Jonathan Foster-Pedley
This paper explores the value that computer and video games bring to learning and leadership, examining how games work as learning environments and the impact they have on personal development. The study looks at decisiveness, decision-making ability and styles, and at how this leadership-related skill is learnt through different paradigms. The paper compares the learning from a lecture to the learning from a designed computer game, both of which have the same content through the use of a s...
One of the major challenges in tissue engineering is the translation of biological knowledge on complex cell and tissue behavior into a predictive and robust engineering process. Mastering this complexity is an essential step towards clinical applications of tissue engineering. This volume discusses computational modeling tools that allow studying the biological complexity in a more quantitative way. More specifically, computational tools can help in: (i) quantifying and optimizing the tissue engineering product, e.g. by adapting scaffold design to optimize micro-environmental signals or by adapting selection criteria to improve homogeneity of the selected cell population; (ii) quantifying and optimizing the tissue engineering process, e.g. by adapting bioreactor design to improve quality and quantity of the final product; and (iii) assessing the influence of the in vivo environment on the behavior of the tissue engineering product, e.g. by investigating vascular ingrowth. The book presents examples of each...
Policymakers and education scholars recommend incorporating mathematical modeling into mathematics education. Limited implementation of modeling instruction in schools, however, has constrained research on how students learn to model, leaving unresolved debates about whether modeling should be reified and explicitly taught as a competence, whether…
Coley, Connor W; Green, William H; Jensen, Klavs F
Computer-aided synthesis planning (CASP) is focused on the goal of accelerating the process by which chemists decide how to synthesize small molecule compounds. The ideal CASP program would take a molecular structure as input and output a sorted list of detailed reaction schemes that each connect that target to purchasable starting materials via a series of chemically feasible reaction steps. Early work in this field relied on expert-crafted reaction rules and heuristics to describe possible retrosynthetic disconnections and selectivity rules but suffered from incompleteness, infeasible suggestions, and human bias. With the relatively recent availability of large reaction corpora (such as the United States Patent and Trademark Office (USPTO), Reaxys, and SciFinder databases), consisting of millions of tabulated reaction examples, it is now possible to construct and validate purely data-driven approaches to synthesis planning. As a result, synthesis planning has been opened to machine learning techniques, and the field is advancing rapidly. In this Account, we focus on two critical aspects of CASP and recent machine learning approaches to both challenges. First, we discuss the problem of retrosynthetic planning, which requires a recommender system to propose synthetic disconnections starting from a target molecule. We describe how the search strategy, necessary to overcome the exponential growth of the search space with increasing number of reaction steps, can be assisted through a learned synthetic complexity metric. We also describe how the recursive expansion can be performed by a straightforward nearest neighbor model that makes clever use of reaction data to generate high quality retrosynthetic disconnections. Second, we discuss the problem of anticipating the products of chemical reactions, which can be used to validate proposed reactions in a computer-generated synthesis plan (i.e., reduce false positives) to increase the likelihood of experimental success
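The nearest-neighbor idea described above, ranking recorded reactions by product similarity to the target so their disconnections can be reused, can be sketched with a toy similarity search. Everything below is an illustrative assumption: real CASP systems use chemistry-aware fingerprints (e.g. Morgan/ECFP via RDKit) rather than character n-grams, and real precedent records carry extracted reaction templates.

```python
def ngram_fingerprint(smiles, n=3):
    """Toy stand-in for a molecular fingerprint: the set of character
    n-grams of a SMILES string."""
    return {smiles[i:i + n] for i in range(len(smiles) - n + 1)}

def tanimoto(a, b):
    """Tanimoto (Jaccard) similarity between two fingerprint sets."""
    union = a | b
    return len(a & b) / len(union) if union else 0.0

def nearest_precedents(target_smiles, corpus, k=2):
    """Rank recorded reactions by product similarity to the target; the
    templates of the top hits suggest candidate retrosynthetic
    disconnections for the target."""
    fp = ngram_fingerprint(target_smiles)
    scored = sorted(corpus,
                    key=lambda rec: -tanimoto(fp, ngram_fingerprint(rec["product"])))
    return scored[:k]
```

A learned synthetic complexity metric, as mentioned in the abstract, would then score the proposed precursors to guide the recursive search toward purchasable starting materials.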
Ross, Steven M.; McCormick, Deborah
Manipulation of flowcharting was crossed with in-class computer access to examine flowcharting effects in the traditional lecture/laboratory setting and in a classroom setting where online time was replaced with manual simulation. Seventy-two high school students (24 male and 48 female) enrolled in a computer literacy course served as subjects.…
Ozgen, Kemal; Bindak, Recep
The purpose of this study is to identify the opinions of high school students, who have different learning styles, related to computer use in mathematics education. High school students' opinions on computer use in mathematics education were collected with both qualitative and quantitative approaches in the study conducted with a survey model. For…
Garcia-Robles, R.; Diaz-del-Rio, F.; Vicente-Diaz, S.; Linares-Barranco, A.
Problem-based learning (PBL) has proved to be a highly successful pedagogical model in many fields, although it is not that common in computer engineering. PBL goes beyond the typical teaching methodology by promoting student interaction. This paper presents a PBL trial applied to a course in a computer engineering degree at the University of…
The paper presents the authors' contributions to developing a computer code for the teaching of descriptive geometry using computer-aided learning techniques. The program was implemented using the programming interface and the 3D modeling capabilities of the AutoCAD system.
Sreeramana Aithal; Vaikunth Pai T
An ideal computing system is a computing system with ideal characteristics. The major components of such a hypothetical system, and their performance characteristics, can be studied as a model with predicted input, output, system, and environmental characteristics, using the identified objectives of computing, which can be used on any platform and any type of computing system, and for application automation, without modifications in the form of structure, hardware, and software coding by an exte...
Pearl, Lisa S; Sprouse, Jon
Given the growing prominence of computational modeling in the acquisition research community, we present a tutorial on how to use computational modeling to investigate learning strategies that underlie the acquisition process. This is useful for understanding both typical and atypical linguistic development. We provide a general overview of why modeling can be a particularly informative tool and some general considerations when creating a computational acquisition model. We then review a concrete example of a computational acquisition model for complex structural knowledge referred to as syntactic islands. This includes an overview of syntactic islands knowledge, a precise definition of the acquisition task being modeled, the modeling results, and how to meaningfully interpret those results in a way that is relevant for questions about knowledge representation and the learning process. Computational modeling is a powerful tool that can be used to understand linguistic development. The general approach presented here can be used to investigate any acquisition task and any learning strategy, provided both are precisely defined.
Van Niekerk, B
Constrained Model-based Reinforcement Learning (Benjamin van Niekerk, School of Computer Science, University of the Witwatersrand, South Africa; Andreas Damianou, Amazon.com, Cambridge, UK; Benjamin Rosman, Council for Scientific and Industrial Research, and School...). MULTIPLE SHOOTING: Using direct multiple shooting (Bock and Plitt, 1984), problem (1) can be transformed into a structured non-linear program (NLP). First, the time horizon [t0, t0 + T] is partitioned into N equal subintervals [tk, tk+1] for k = 0...
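The multiple-shooting transformation mentioned above can be illustrated with a short sketch: the node states at subinterval boundaries become decision variables, each subinterval is integrated independently under its own control, and continuity residuals tie the pieces together (an NLP solver would drive these residuals to zero). This is a generic illustration of Bock and Plitt's scheme, not code from the paper; the forward-Euler integrator and all names are my own choices.

```python
import numpy as np

def shooting_residuals(f, states, controls, t0, T, substeps=20):
    """Continuity residuals for direct multiple shooting.

    The horizon [t0, t0 + T] is split into N equal subintervals. Starting from
    each node state states[k], the ODE x' = f(t, x, u) is integrated with
    forward Euler under the piecewise-constant control controls[k]; the
    residual for subinterval k is the gap to the next node state states[k+1].
    """
    N = len(controls)
    h = T / N                       # subinterval length
    dt = h / substeps               # Euler step within a subinterval
    residuals = []
    for k in range(N):
        t = t0 + k * h
        x = np.array(states[k], dtype=float)
        for _ in range(substeps):
            x = x + dt * f(t, x, controls[k])
            t += dt
        residuals.append(x - np.asarray(states[k + 1], dtype=float))
    return np.array(residuals)
```

For a trajectory that is already consistent with the dynamics, every residual vanishes; during optimization the solver adjusts the node states and controls jointly, which is what gives multiple shooting its structured (banded) NLP.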
Zeng, Jia; Cheung, William K; Liu, Jiming
Latent Dirichlet allocation (LDA) is an important hierarchical Bayesian model for probabilistic topic modeling, which attracts worldwide interest and touches on many important applications in text mining, computer vision and computational biology. This paper represents the collapsed LDA as a factor graph, which enables the classic loopy belief propagation (BP) algorithm for approximate inference and parameter estimation. Although two commonly used approximate inference methods, namely variational Bayes (VB) and collapsed Gibbs sampling (GS), have gained great success in learning LDA, the proposed BP is competitive in both speed and accuracy, as validated by encouraging experimental results on four large-scale document datasets. Furthermore, the BP algorithm has the potential to become a generic scheme for learning variants of LDA-based topic models in the collapsed space. To this end, we show how to learn two typical variants of LDA-based topic models, namely author-topic models (ATM) and relational topic models (RTM), using BP based on the factor graph representations.
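As a concrete point of reference for the inference methods the abstract compares against, here is a minimal collapsed Gibbs sampler for LDA (the GS baseline, not the proposed BP algorithm); the variable names and hyperparameter defaults are illustrative.

```python
import numpy as np

def collapsed_gibbs_lda(docs, V, K, n_iter=100, alpha=0.1, beta=0.01, seed=0):
    """Collapsed Gibbs sampling for LDA.

    docs: list of documents, each a list of word ids in [0, V).
    Returns (theta, phi): per-document topic mixtures and per-topic word
    distributions, estimated from the final sample's counts.
    """
    rng = np.random.default_rng(seed)
    D = len(docs)
    z = [rng.integers(K, size=len(d)) for d in docs]   # topic assignments
    ndk = np.zeros((D, K))                             # doc-topic counts
    nkw = np.zeros((K, V))                             # topic-word counts
    nk = np.zeros(K)                                   # topic totals
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            k = z[d][i]
            ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
    for _ in range(n_iter):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]                            # remove current token
                ndk[d, k] -= 1; nkw[k, w] -= 1; nk[k] -= 1
                # Collapsed conditional: p(z = k | rest)
                p = (ndk[d] + alpha) * (nkw[:, w] + beta) / (nk + V * beta)
                k = rng.choice(K, p=p / p.sum())       # resample its topic
                z[d][i] = k
                ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
    theta = (ndk + alpha) / (ndk + alpha).sum(axis=1, keepdims=True)
    phi = (nkw + beta) / (nkw + beta).sum(axis=1, keepdims=True)
    return theta, phi
```

The paper's BP algorithm replaces the token-by-token stochastic resampling with deterministic message passing over the same collapsed counts, which is where its speed/accuracy trade-off comes from.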
Applying a Mobile Human Computer Interaction (MHCI) view to the domain of education using Mobile Learning (Mlearning), the research outlines its understanding of the influences and effects of different interactions on the use of mobile technology...
Valeria V. Gribova
A toolset for a medical computer learning simulator for ophthalmology with virtual reality, and its implementation, are considered in the paper. The simulator is designed for professional skills training of students at medical universities.
Voulodimos, Athanasios; Doulamis, Nikolaos; Doulamis, Anastasios; Protopapadakis, Eftychios
Over the last years deep learning methods have been shown to outperform previous state-of-the-art machine learning techniques in several fields, with computer vision being one of the most prominent cases. This review paper provides a brief overview of some of the most significant deep learning schemes used in computer vision problems, that is, Convolutional Neural Networks, Deep Boltzmann Machines and Deep Belief Networks, and Stacked Denoising Autoencoders. A brief account of their history, structure, advantages, and limitations is given, followed by a description of their applications in various computer vision tasks, such as object detection, face recognition, action and activity recognition, and human pose estimation. Finally, a brief overview is given of future directions in designing deep learning schemes for computer vision problems and the challenges involved therein.
Ramasamy, Vijayalakshmi; Sheen, Shina; Veeramani, C; Bonato, Anthony; Batten, Lynn
This book aims at promoting high-quality research by researchers and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security, and Computational Models (ICC3 2015), organized by PSG College of Technology, Coimbatore, India during December 17-19, 2015. The book covers innovations in broad areas of research such as computational modeling, computational intelligence, and cyber security. These emerging interdisciplinary research areas have helped to solve multifaceted problems and have gained a lot of attention in recent years. The book encompasses theory and applications, providing the design, analysis, and modeling of the aforementioned key areas.
High-Performance Computing and Modelling for Industrial Development (Dr Happy Sithole and Dr Onno Ubbink). Strategic context: high-performance computing (HPC) combined with machine learning and artificial intelligence presents opportunities to non...
Juana Marlen Tellez Reinoso
Learning image treatment, specifically in the Photoshop application, is one of the objectives in the Bachelor of Education in Computer Science degree, aimed at guaranteeing the preparation of students as future professionals, so that every citizen of our country can attain an integral general culture. With that purpose, a computer application of tutorial type, entitled "Learning Treatment to Image", is suggested.
Richards, Mike; Price, Blaine A.; Nuseibeh, Bashar
In this paper we present the approach adopted at the UK’s Open University for teaching computer security to large numbers of students at a distance through supported open learning. We discuss how the production of learning materials at the university has had to change to reflect the ever-increasing rate of technological, legislative and social change within the computing discipline, and how the university has had to rethink the role of the academic in the course development process. We argue ...
In this thesis, we investigate several aspects of the behaviour of liquid crystal molecules near interfaces using computer simulation. We briefly discuss experimental, theoretical, and computer simulation studies of some of the liquid crystal interfaces. We then describe three essentially independent research topics. The first of these concerns extensive simulations of a liquid crystal formed by long flexible molecules. We examined the bulk behaviour of the model and its structure. Studies of a film of smectic liquid crystal surrounded by vapour were also carried out. Extensive simulations were also done for a long-molecule/short-molecule mixture; studies were then carried out to investigate the liquid-vapour interface of the mixture. Next, we report the results of large scale simulations of soft spherocylinders of two different lengths. We examined the bulk coexistence of the nematic and isotropic phases of the model. Once the bulk coexistence behaviour was known, properties of the nematic-isotropic interface were investigated. This was done by fitting order parameter and density profiles to appropriate mathematical functions and calculating the biaxial order parameter. We briefly discuss the ordering at the interfaces and make attempts to calculate the surface tension. Finally, in our third project, we study the effects of different surface topographies on creating bistable nematic liquid crystal devices. This was carried out using a model based on the discretisation of the free energy on a lattice. We use simulation to find the lowest energy states and investigate whether they are degenerate in energy. We also test our model by studying the Frederiks transition and comparing with analytical and other simulation results.
Computers are now a major tool in research and development in almost all scientific and technological fields. Despite recent developments, this is far from true for learning environments in schools and most undergraduate studies. This thesis proposes a framework for designing curricula where computers, and computer modelling in particular, are a major tool for learning. The framework, based on research on learning science and mathematics and on computer user interfaces, assumes that: 1) learning is an active process of creating meaning from representations; 2) learning takes place in a community of practice where students learn both from their own effort and from external guidance; 3) learning is a process of becoming familiar with concepts, with links between concepts, and with representations; 4) direct manipulation user interfaces allow students to explore concrete-abstract objects such as those of physics and can be used by students with minimal computer knowledge. Physics is the science of constructing models and explanations about the physical world, and mathematical models are an important type of model that is difficult for many students. These difficulties can be rooted in the fact that most students do not have an environment where they can explore functions, differential equations and iterations as primary objects that model physical phenomena--as objects-to-think-with, reifying the formal objects of physics. The framework proposes that students should be introduced to modelling at a very early stage of learning physics and mathematics, two scientific areas that must be taught in a very closely related way, as they were developed from Galileo and Newton until the beginning of our century, before the rise of overspecialisation in science. At an early stage, functions are the main type of objects used to model real phenomena, such as motions. At a later stage, rates of change and equations with rates of change play an important role. This type of equations
Pam Wright highlights the role of technology in providing situated learning opportunities for preservice teachers to explore the role commercial computer games may have in primary education. In a study designed to assess the effectiveness of an online unit on gaming incorporated into a course on learning technologies, Wright found that thoughtful…
Whitaker, James Todd
I examined the effectiveness of self-directed learning and English learning with computer applications on college students in Bangkok, Thailand, in a control-group experimental-group pretest-posttest design. The hypothesis was tested using a t test: two-sample assuming unequal variances to establish the significance of mean scores between the two…
Edwards, Thomas O.
The development of learning theory and its application to computer-assisted instruction (CAI) are described. Among the early theoretical constructs thought to be important are E. L. Thorndike's connectionism, Neal Miller's theory of motivation, and B. F. Skinner's theory of operant conditioning. Early devices incorporating those…
Park, Young C.
The development of digital information transfer, storage, and communication methods has had a significant effect on education. The assimilation of pervasive computing and communication technologies marks another great step forward, with Ubiquitous Learning (U-learning) emerging for next-generation learners. In the evolutionary view the 5G (or…
Blake, J; Goodman, J
Games are a creative teaching strategy that enhances learning and problem solving. Gaming strategies are being used by the authors to make learning interesting, stimulating and fun. This article focuses on the development and implementation of computer games as an instructional strategy. Positive outcomes have resulted from the use of games in the classroom.
Zou, Bin; Xing, Minjie; Wang, Yuping; Sun, Mingyu; Xiang, Catherine H.
Computer-Assisted Foreign Language Teaching and Learning: Technological Advances highlights new research and an original framework that brings together foreign language teaching, experiments and testing practices that utilize the most recent and widely used e-learning resources. This comprehensive collection of research will offer linguistic…
The author taught a computer graphics course through a service-learning framework to undergraduate and graduate students in the spring of 2003 at Florida State University (FSU). The students in this course participated in learning a software program along with youths from a neighboring, low-income, primarily African-American community. Together,…
Nurjanah; Dahlan, J. A.
This study is motivated by the importance of self-regulated learning as an affective aspect that determines students' success in learning mathematics. The purpose of this research is to examine how junior high school students' self-regulated learning improves through computer-based learning, both overall and by school level. This research used a quasi-experimental method, because individual sample subjects were not randomly selected. The research design used is the Pretest-and-Posttest Control Group Design. Subjects in this study were grade VIII junior high school students in Bandung, taken from a high-level school (A) and a middle-level school (B). The results of this study showed that the increase in self-regulated learning among students who received computer-based learning is higher than among students who received conventional learning. School-level factors have a significant effect on the increase in students' self-regulated learning.
Rosalie J. Ingram
Computer systems can help simplify decision-making in managing forest ecosystems. We now have computer models that help make forest management decisions by predicting the changes associated with a particular management action. Models also help you evaluate alternatives. To be effective, the computer models must be reliable and appropriate for your situation.
Siu, Kin Wai Michael; Lam, Mei Seung
Although computer assisted learning (CAL) is becoming increasingly popular, people with visual impairment face greater difficulty in accessing computer-assisted learning facilities. This is primarily because most of the current CAL facilities are not visually impaired friendly. People with visual impairment also do not normally have access to…
This paper proposes a teaching method that utilizes video games in computer science education. The primary characteristic of this approach is that it utilizes video games as observational materials. The underlying idea is that by observing the computational behavior of a wide variety of video games, learners will easily grasp the fundamental architecture, theory, and technology of computers. The results of a case study indicate that the method enhances the motivation of students for...
This dissertation presents some developments in the Numerical Analysis of Partial Differential Equations (PDEs) describing the behavior of ferrofluids. The most widely accepted PDE model for ferrofluids is the Micropolar model proposed by R.E. Rosensweig. The Micropolar Navier-Stokes Equations (MNSE) is a subsystem of PDEs within the Rosensweig model. Being a simplified version of the much bigger system of PDEs proposed by Rosensweig, the MNSE are a natural starting point of this thesis. The MNSE couple linear velocity u, angular velocity w, and pressure p. We propose and analyze a first-order semi-implicit fully-discrete scheme for the MNSE, which decouples the computation of the linear and angular velocities, is unconditionally stable and delivers optimal convergence rates under assumptions analogous to those used for the Navier-Stokes equations. Moving on to the much more complex Rosensweig model, we provide a definition (approximation) for the effective magnetizing field h, and explain the assumptions behind this definition. Unlike previous definitions available in the literature, this new definition is able to accommodate the effect of external magnetic fields. Using this definition we set up the system of PDEs coupling linear velocity u, pressure p, angular velocity w, magnetization m, and magnetic potential ϕ. We show that this system is energy-stable and devise a numerical scheme that mimics the same stability property. We prove that solutions of the numerical scheme always exist and, under certain simplifying assumptions, that the discrete solutions converge. A notable outcome of the analysis of the numerical scheme for the Rosensweig model is the choice of finite element spaces that allow the construction of an energy-stable scheme. Finally, with the lessons learned from the Rosensweig model, we develop a diffuse-interface model describing the behavior of two-phase ferrofluid flows and present an energy-stable numerical scheme for this model. For a
Blikstein, Paulo; Worsley, Marcelo
New high-frequency multimodal data collection technologies and machine learning analysis techniques could offer new insights into learning, especially when students have the opportunity to generate unique, personalized artifacts, such as computer programs, robots, and solutions to engineering challenges. To date most of the work on learning analytics…
Gnaur, Dorina; Hüttel, Hans
This paper reports initial experiences with flipping the classroom in an undergraduate computer science course as part of an overall attempt to enhance the pedagogical support for student learning. Our findings indicate that, just as the flipped classroom implies, a shift of focus in the learning context influences the way students engage with the course and their learning strategies.
Lee, Cynthia; Yeung, Alexander Seeshing; Ip, Tiffany
Computer technology provides spaces and locales for language learning. However, learning style preference and demographic variables may affect the effectiveness of technology use for a desired goal. Adapting Reid's pioneering Perceptual Learning Style Preference Questionnaire (PLSPQ), this study investigated the relations of university students'…
Tormanen, Minna R. K.; Takala, Marjatta; Sajaniemi, Nina
This study examined whether audiovisual computer training without linguistic material had a remedial effect on different learning disabilities, like dyslexia and ADD (Attention Deficit Disorder). This study applied a pre-test-intervention-post-test design with students (N = 62) between the ages of 7 and 19. The computer training lasted eight weeks…
In this study, computer-based learning (CBL) was explored in the context of breastfeeding training for undergraduate Dietetic students. Aim: To adapt and validate an Indian computer-based undergraduate breastfeeding training module for use by South African undergraduate Dietetic students. Methods and materials: The ...
Nikolić Boško; Grbanović Nenad; Đorđević Jovan
The paper proposes a method for effective distance learning of computer architecture and organization. The proposed method is based on a software system that can be applied in any course in this field. Within this system, students can observe simulations of already created computer systems. The system also provides for the creation and simulation of switch systems.
Michel, Marije; Smith, Bryan
Though eye-tracking technology has been used in reading research for over 100 years, researchers have only recently begun to use it in studies of computer-assisted language learning (CALL). This chapter provides an overview of eye-tracking research to date, which is relevant to computer-mediated
Fais, Laurie; Wanderman, Richard
The paper describes the application of a computer-assisted writing program in a special high school for learning disabled and dyslexic students and reports on a study of the program's effectiveness. Particular advantages of the Macintosh Computer for such a program are identified including use of the mouse pointing tool, graphic icons to identify…
Wepner, Shelley B.; Cotter, Michelle
Notes that new literacies use computer graphics to tell a story, demonstrate a theory, or support a definition. Offers a functionality framework for assessing the value of computer graphics for early literacy learning. Provides ideas for determining the value of CD-ROM software and websites. Concludes that graphics that give text meaning or…
The Spastic Society of Karnataka, Bangalore
Study conducted by The Spastics Society of Karnataka on behalf of Azim Premji Foundation to assess the effectiveness of computers in enhancing learning for children with specific learning disabilities.
Shaker, Noor; Abou-Zleikha, Mohamed; Shaker, Mohammad
Learning models of player behavior has been the focus of several studies. This work is motivated by better understanding of player behavior, knowledge that can ultimately be employed to provide player-adapted or personalized content. In this paper, we propose the use of active learning for player experience modeling. We use a dataset from hundreds of players playing Infinite Mario Bros. as a case study, and we employ the random forest method to learn models of player experience through the active learning approach. The results obtained suggest that only part of the dataset (up to half the size) … that the method can be used online during the content generation process, where the models can improve and better content can be presented as the game is being played.
Altintas, Tugba; Gunes, Ali; Sayan, Hamiyet
Peer learning or, as commonly expressed, peer-assisted learning (PAL) involves school students who actively assist others to learn and in turn benefit from an effective learning environment. This research was designed to support students in becoming more autonomous in their learning, help them enhance their confidence level in tackling computer…
Grys, Ben T; Lo, Dara S; Sahin, Nil; Kraus, Oren Z; Morris, Quaid; Boone, Charles; Andrews, Brenda J
With recent advances in high-throughput, automated microscopy, there has been an increased demand for effective computational strategies to analyze large-scale, image-based data. To this end, computer vision approaches have been applied to cell segmentation and feature extraction, whereas machine-learning approaches have been developed to aid in phenotypic classification and clustering of data acquired from biological images. Here, we provide an overview of the commonly used computer vision and machine-learning methods for generating and categorizing phenotypic profiles, highlighting the general biological utility of each approach.
Beard, Bettina L.; Ahumada, Albert J., Jr.; Trejo, Leonard (Technical Monitor)
Our learning model has memory templates representing the target-plus-noise and noise-alone stimulus sets. The best correlating template determines the response. The correlations and the feedback participate in the additive template updating rule. The model can predict the relative thresholds for detection in random, fixed and twin noise.
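The model described above lends itself to a short sketch: two stored templates are correlated with each stimulus, the better-correlating template determines the response, and feedback drives an additive template update. The paper's exact updating rule is not reproduced here; the leaky-averaging rule, class name, and parameters below are illustrative assumptions.

```python
import numpy as np

def correlation(a, b):
    """Pearson correlation between two equal-length vectors."""
    a = a - a.mean(); b = b - b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

class TemplateObserver:
    """Two memory templates (target-plus-noise and noise-alone); the
    better-correlating template determines the yes/no response, and feedback
    additively nudges the relevant template toward the observed stimulus."""

    def __init__(self, dim, lr=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.t_signal = rng.normal(size=dim)   # target-plus-noise template
        self.t_noise = rng.normal(size=dim)    # noise-alone template
        self.lr = lr

    def respond(self, stim):
        """Answer 'signal' if the signal template correlates better."""
        if correlation(self.t_signal, stim) > correlation(self.t_noise, stim):
            return "signal"
        return "noise"

    def update(self, stim, was_signal):
        """Additive update: move the true class's template toward the stimulus."""
        t = self.t_signal if was_signal else self.t_noise
        t += self.lr * (stim - t)
```

With repeated feedback the signal template converges toward the average target-plus-noise stimulus, which is how such a model predicts different detection thresholds in random, fixed, and twin noise.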
Ae, Tadashi; Araki, Hiroyuki; Sakai, Keiichi
We have proposed a two-level architecture for brain computing, in which two levels handle the processing of meta-symbols. At Level 1, conventional pattern recognition (including neural computation) is performed; its output is a meta-symbol, a symbol enlarged from a plain symbol toward a kind of pattern. At Level 2, algorithm acquisition is performed using a machine for abstract states. We are also developing VLSI chips for each level for SBC (Structured Brain Computer) Ver.1.0.
.... In the computer age, highly accurate models and simulations of the enemy can be created. However, including the effects of motivations, capabilities, and weaknesses of adversaries in current wars is still extremely difficult...
Waterfall, Heidi R.; Sandbank, Ben; Onnis, Luca; Edelman, Shimon
This paper reports progress in developing a computer model of language acquisition in the form of (1) a generative grammar that is (2) algorithmically learnable from realistic corpus data, (3) viable in its large-scale quantitative performance and (4) psychologically real. First, we describe new algorithmic methods for unsupervised learning of…
Hiremath, Shivayogi V; Chen, Weidong; Wang, Wei; Foldes, Stephen; Yang, Ying; Tyler-Kabara, Elizabeth C; Collinger, Jennifer L; Boninger, Michael L
A brain-computer interface (BCI) system transforms neural activity into control signals for external devices in real time. A BCI user needs to learn to generate specific cortical activity patterns to control external devices effectively. We call this process BCI learning, and it often requires significant effort and time. Therefore, it is important to study this process and develop novel and efficient approaches to accelerate BCI learning. This article reviews major approaches that have been used for BCI learning, including computer-assisted learning, co-adaptive learning, operant conditioning, and sensory feedback. We focus on BCIs based on electrocorticography and intracortical microelectrode arrays for restoring motor function. This article also explores the possibility of brain modulation techniques in promoting BCI learning, such as electrical cortical stimulation, transcranial magnetic stimulation, and optogenetics. Furthermore, as proposed by recent BCI studies, we suggest that BCI learning is in many ways analogous to motor and cognitive skill learning, and therefore skill learning should be a useful metaphor to model BCI learning.
Heift, Trude; Schulze, Mathias
"Sometimes maligned for its allegedly behaviorist connotations but critical for success in many fields from music to sport to mathematics and language learning, 'practice' is undergoing something of a revival in the applied linguistics literature" (Long & Richards 2007, p. xi). This research timeline provides a systematic overview of…
Sabti, Ahmed Abdulateef; Chaichan, Rasha Sami
This study examines the attitudes of Saudi Arabian high school students toward the use of computer technologies in learning English. The study also discusses the possible barriers that affect and limit the actual usage of computers. A quantitative approach was applied in this research, which involved 30 Saudi Arabian students of a high school in Kuala Lumpur, Malaysia. The respondents comprised 15 males and 15 females aged between 16 and 18 years. Two instruments, namely the Scale of Attitude toward Computer Technologies (SACT) and Barriers affecting Students' Attitudes and Use (BSAU), were used to collect data. The Technology Acceptance Model (TAM) of Davis (1989) was utilized. The analysis revealed gender differences in attitudes toward the use of computer technologies in learning English: female students showed more positive attitudes toward the use of computer technologies in learning English than males. Both male and female participants demonstrated high and positive perceived usefulness and perceived ease of use of computer technologies in learning English. Three barriers that affected and limited the use of computer technologies in learning English were identified by the participants: skill, equipment, and motivation. Among these barriers, skill had the highest effect, whereas motivation showed the least effect.
Deniz Mertkan GEZGIN
Mobile learning has started to play an increasingly significant role in improving learning outcomes in education. Successful and efficient implementation of m-learning in higher education, as at all educational levels, depends on users' acceptance of this technology. This study focuses on investigating the attitudes of undergraduate students of the Computer Engineering (CENG) and Computer Education and Instructional Technology (CEIT) departments in a Turkish public university towards m-learning from three perspectives: gender, area of study, and mobile device ownership. Using a correlational survey method, a Mobile Learning Attitude Scale (MLAS) was administered to 531 students, analysis of which revealed a positive attitude to m-learning in general. A further investigation of the aforementioned three variables showed a more positive attitude for female students in terms of usability, for CEIT students in terms of advantages, usability, and independence, and for those owning a mobile device in terms of usability. An important implication of the findings, among others, is supplementing the Computer Engineering curriculum with elective courses on the fundamentals of mobile learning and/or the design and development of m-learning software, so as to create, in the long run, more specialized and complementary teams comprised of trained CENG and CEIT graduates in the m-learning sector.
Many disciplines and scientific fields have undergone a computational turn in the past several decades. This paper analyzes this sort of turn by investigating the case of computational quantum chemistry. The main claim is that the transformation from quantum to computational quantum chemistry involved changes in three dimensions. First, on the side of instrumentation, small computers and a networked infrastructure took over the lead from centralized mainframe architecture. Second, a new conception of computational modeling became feasible and assumed a crucial role. And third, the field of computational quantum chemistry became organized in a market-like fashion, and this market is much bigger than the community of quantum theory experts. These claims will be substantiated by an investigation of the so-called density functional theory (DFT), the arguably pivotal theory in the turn to computational quantum chemistry around 1990.
Doyle, Barry; Wittek, Adam; Nielsen, Poul; Miller, Karol
The Computational Biomechanics for Medicine titles provide an opportunity for specialists in computational biomechanics to present their latest methodologies and advancements. This volume comprises eighteen of the newest approaches and applications of computational biomechanics, from researchers in Australia, New Zealand, USA, UK, Switzerland, Scotland, France and Russia. Some of the interesting topics discussed are: tailored computational models; traumatic brain injury; soft-tissue mechanics; medical image analysis; and clinically-relevant simulations. One of the greatest challenges facing the computational engineering community is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, the biomedical sciences, and medicine. We hope the research presented within this book series will contribute to overcoming this grand challenge.
Learning analytics (LA) has been applied to various learning environments, though it is quite new in the field of computer-assisted language learning (CALL). This article attempts to examine the application of learning analytics in the upcoming big data age. It starts with an introduction to and applications of learning analytics in other fields, followed by a retrospective review of the historical interaction between learning and media in CALL, and an analysis of why people would turn to learning analytics to increase the efficiency of foreign language education. As shown in previous research, new technology, including big data mining and analysis, will inevitably enhance the learning of foreign languages. Potential changes that learning analytics would bring to Chinese foreign language education and research are also presented in the article.
Joksimovic, Srecko; Hatala, Marek; Gaševic, Dragan
Teaching and learning in networked settings has attracted significant attention recently. The central topic of networked learning research is human-human and human-information interactions occurring within a networked learning environment. The nature of these interactions is highly complex and usually requires a multi-dimensional approach to…
Duong, Tuan; Duong, Vu; Suri, Ronald
A neural-network mathematical model that, relative to prior such models, places greater emphasis on some of the temporal aspects of real neural physical processes, has been proposed as a basis for massively parallel, distributed algorithms that learn dynamic models of possibly complex external processes by means of learning rules that are local in space and time. The algorithms could be made to perform such functions as recognition and prediction of words in speech and of objects depicted in video images. The approach embodied in this model is said to be "hardware-friendly" in the following sense: The algorithms would be amenable to execution by special-purpose computers implemented as very-large-scale integrated (VLSI) circuits that would operate at relatively high speeds and low power demands.
Political rhetoric about knowledge economies and learning societies really ought to be an educationalist's dream. Add the revolutionary power of electronic media and the public's insatiable hunger for archaeology, and archaeology teaching should be well placed to flourish. In England, the Department for Culture, Media and Sport has just closed its call for interest in the multi-million pound Culture Online programme. The New Opportunities Fund has already spent fifty million pounds on digitisation alone, while the Heritage Lottery Fund is encouraging heritage agencies to distribute their data sets online. The JISC continues to expand its information environment, investing most recently in virtual learning environments. In Europe, the 6th Framework promises a single European Research Area, and grid technologies allow high quantities of data to empower an information society - supported by the millions of euros already invested in 'e-content'. All of these programmes, and many more, provide openings for archaeologists to invest in training, teaching and learning. How can archaeology, in particular staff and students in our educational establishments, take best advantage of this information-rich, digitally-empowered learning society?
Tallerico, P.J.; Rankin, J.E.
A gyrocon computer model is discussed in which the electron beam is followed from the gun output to the collector region. The initial beam may be selected either as a uniform circular beam or may be taken from the output of an electron gun simulated by the program of William Herrmannsfeldt. The fully relativistic equations of motion are then integrated numerically to follow the beam successively through a drift tunnel, a cylindrical rf beam deflection cavity, a combination drift space and magnetic bender region, and an output rf cavity. The parameters for each region are variable input data from a control file. The program calculates power losses in the cavity wall, power required by beam loading, power transferred from the beam to the output cavity fields, and electronic and overall efficiency. Space-charge effects are approximated if selected. Graphical displays of beam motions are produced. We discuss the Los Alamos Scientific Laboratory (LASL) prototype design as an example of code usage. The design shows a gyrocon of about two-thirds megawatt output at 450 MHz with up to 86% overall efficiency
More than two decades of work in vision posits the existence of dual learning systems of category learning. The reflective system uses working memory to develop and test rules for classifying in an explicit fashion, while the reflexive system operates by implicitly associating perception with actions that lead to reinforcement. Dual-learning systems models hypothesize that in learning natural categories, learners initially use the reflective system and, with practice, transfer control to the reflexive system. The role of reflective and reflexive systems in auditory category learning, and more specifically in speech category learning, has not been systematically examined. In this article we describe a neurobiologically constrained dual-learning systems theoretical framework that is currently being developed for speech category learning, and review recent applications of this framework. Using behavioral and computational modeling approaches, we provide evidence that speech category learning is predominantly mediated by the reflexive learning system. In one application, we explore the effects of normal aging on non-speech and speech category learning. We find an age-related deficit in reflective-optimal but not reflexive-optimal auditory category learning. Prominently, we find a large age-related deficit in speech learning. The computational modeling suggests that older adults are less likely to transition from simple, reflective, uni-dimensional rules to more complex, reflexive, multi-dimensional rules. In a second application we summarize a recent study examining auditory category learning in individuals with elevated depressive symptoms. We find a deficit in reflective-optimal and an enhancement in reflexive-optimal auditory category learning. Interestingly, individuals with elevated depressive symptoms also show an advantage in learning speech categories. We end with a brief summary and a description of a number of future directions.
Al-Zoube, Mohammed; El-Seoud, Samir Abou; Wyne, Mudasser F.
Cloud computing technologies although in their early stages, have managed to change the way applications are going to be developed and accessed. These technologies are aimed at running applications as services over the internet on a flexible infrastructure. Microsoft office applications, such as word processing, excel spreadsheet, access database…
Klimova, Blanka; Kacet, Jaroslav
Information and communication technologies (ICT) have become an inseparable part of people's lives. For children the use of ICT is as natural as breathing and therefore they find the use of ICT in school education as normal as the use of textbooks. The purpose of this review study is to explore the efficacy of computer games on language learning…
Shaffer, David Williamson
This book looks at how particular video and computer games--such as "Digital Zoo", "The Pandora Project", "SodaConstructor", and more--can help teach children and students to think like doctors, lawyers, engineers, urban planners, journalists, and other professionals. In the process, new "smart games" will give them the knowledge and skills they…
Hsu, Jack Shih-Chieh; Huang, Hsieh-Hong; Linden, Lars P.
This study explores a de-bias function for a decision support systems (DSS) that is designed to help a user avoid confirmation bias by increasing the user's learning opportunities. Grounded upon the theory of mental models, the use of DSS is viewed as involving a learning process, whereby a user is directed to build mental models so as to reduce…
Aoki, Hirotaka; Hansen, John Paulin; Itoh, Kenji
that inefficient eye movements were dramatically reduced after only 15 to 25 sentences of typing, equal to approximately 3-4 hours of practice. The performance data fit a general learning model based on the power law of practice. The learning model can be used to estimate further improvements in gaze typing...
The goal of the current Central Computing Upgrade at Fermilab is to create a computing environment that maximizes total productivity, particularly for high energy physics analysis. The Computing Department and the Next Computer Acquisition Committee decided upon a model which includes five components: an interactive front-end, a Large-Scale Scientific Computer (LSSC, a mainframe computing engine), a microprocessor farm system, a file server, and workstations. With the exception of the file server, all segments of this model are currently in production: a VAX/VMS cluster interactive front-end, an Amdahl VM Computing engine, ACP farms, and (primarily) VMS workstations. This paper will discuss the implementation of the Fermilab Central Computing Facility Architectural Model. Implications for Code Management in such a heterogeneous environment, including issues such as modularity and centrality, will be considered. Special emphasis will be placed on connectivity and communications between the front-end, LSSC, and workstations, as practiced at Fermilab. (orig.)
The goal of the current Central Computing Upgrade at Fermilab is to create a computing environment that maximizes total productivity, particularly for high energy physics analysis. The Computing Department and the Next Computer Acquisition Committee decided upon a model which includes five components: an interactive front end, a Large-Scale Scientific Computer (LSSC, a mainframe computing engine), a microprocessor farm system, a file server, and workstations. With the exception of the file server, all segments of this model are currently in production: a VAX/VMS Cluster interactive front end, an Amdahl VM computing engine, ACP farms, and (primarily) VMS workstations. This presentation will discuss the implementation of the Fermilab Central Computing Facility Architectural Model. Implications for Code Management in such a heterogeneous environment, including issues such as modularity and centrality, will be considered. Special emphasis will be placed on connectivity and communications between the front-end, LSSC, and workstations, as practiced at Fermilab. 2 figs
Mohamd Hassan Hassan; Jehad Al-Sadi
This paper introduces a new model for m-Learning context adaptation, motivated by the need to utilize mobile technology in education. Mobile learning (m-Learning for short) is considered to be one of the hottest topics in the educational community, and much research has been done to conceptualize this new form of learning. We present a promising design for a model that adapts the learning content in mobile learning applications to match the learner's context, preferences and the educatio...
SUMI, Seijiro; TAKEUCHI, Osamu
The purposes of the study are (a) to put the "cyclic model of learning" into practice by means of an LMS (Learning Management System) for foreign language teaching/learning, and (b) to examine how the "cyclic model of learning" influences improvement of students' English ability in terms of both proficiency and achievement. The major concerns of current CALL (Computer-Assisted Language Learning) research have shifted from piecemeal, experimental tests of the use of technology in a single computer lab ...
Based on the facts that a computer can be infected by infected and exposed computers, and that some computers in the susceptible and exposed states can gain immunity through antivirus measures, a novel computer virus model is established. The dynamic behaviors of this model are investigated. First, the basic reproduction number R0, a threshold for the spread of computer viruses on the internet, is determined. Second, the model has a virus-free equilibrium P0, at which the infected part of the computer population disappears and the virus dies out; P0 is a globally asymptotically stable equilibrium if R0 ≤ 1. If R0 > 1, the model has a unique viral equilibrium P*, at which the infection persists at a constant endemic level, and P* is also globally asymptotically stable. Finally, some numerical examples are given to demonstrate the analytical results.
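A minimal numerical sketch of this kind of compartment model follows; the specific equations, parameter values, and compartment structure below are illustrative assumptions, not the paper's exact system.

```python
import numpy as np

# Illustrative SEIR-style computer virus model (assumed form, not the paper's equations):
# S: susceptible, E: exposed, I: infected, R: immune (protected by antivirus)
beta, sigma, gamma = 0.2, 0.5, 0.5   # infection, activation, and cure rates
a1, a2 = 0.05, 0.1                   # antivirus immunization rates for S and E

def simulate(T=200.0, dt=0.01):
    S, E, I, R = 0.99, 0.0, 0.01, 0.0
    for _ in range(int(T / dt)):
        new_inf = beta * S * I
        dS = -new_inf - a1 * S
        dE = new_inf - (sigma + a2) * E
        dI = sigma * E - gamma * I
        dR = a1 * S + a2 * E + gamma * I
        S, E, I, R = S + dt * dS, E + dt * dE, I + dt * dI, R + dt * dR
    return S, E, I, R

# Basic reproduction number for this illustrative model (next-generation method)
R0 = beta * sigma / ((sigma + a2) * gamma)
S, E, I, R = simulate()
print(R0, I)  # here R0 < 1, so the infection dies out
```

With these parameters R0 ≈ 0.33 < 1, so the simulated infected fraction decays toward the virus-free equilibrium, matching the threshold behavior the abstract describes.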
Schuetze, Hans G.
To answer the question "Financing what?" this article distinguishes several models of lifelong learning as well as a variety of lifelong learning activities. Several financing methods are briefly reviewed; however, the principal focus is on Individual Learning Accounts (ILAs), which were seen by some analysts as a promising model for…
Cooper, Terry L.; Sundeen, Richard
The urban studies learning model described in this article was found to increase students' self-esteem, imbue a more flexible and open perspective, contribute to the capacity for self-direction, produce increases on the feeling reactivity, spontaneity, and acceptance of aggression scales, and expand interpersonal competence. (Author/WI)
Zhang, Xiaoqian; Weng, Jian; Li, Xiaochun; Luo, Weiqi; Tan, Xiaoqing; Song, Tingting
Blind quantum computation (BQC) enables a client who has few quantum technologies to delegate her quantum computation to a server who has strong quantum computational power yet learns nothing about the client's quantum inputs, outputs and algorithms. In this article, we propose a single-server BQC protocol in the quantum circuit model, replacing each quantum gate with a combination of rotation operators. Trap quantum circuits are introduced, together with the combination of rotation operators, so that the server learns nothing about the quantum algorithms. The client only needs to perform the operations X and Z, while the server honestly performs rotation operators.
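As background for the gate-to-rotation replacement, a standard single-qubit identity (my illustration, not the paper's protocol) expresses the Hadamard gate as a product of rotation operators up to a global phase:

```python
import numpy as np

# Rotation operators about the y and z axes of the Bloch sphere
def Ry(t):
    return np.array([[np.cos(t / 2), -np.sin(t / 2)],
                     [np.sin(t / 2),  np.cos(t / 2)]], dtype=complex)

def Rz(t):
    return np.array([[np.exp(-1j * t / 2), 0],
                     [0, np.exp(1j * t / 2)]])

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# Hadamard as rotations, up to the global phase e^{i*pi/2} = i
reconstructed = 1j * Ry(np.pi / 2) @ Rz(np.pi)
print(np.allclose(reconstructed, H))  # True
```

Global phases are physically unobservable, which is why such rotation decompositions can stand in for arbitrary single-qubit gates in circuit-model protocols.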
Frost, Mary E; Derby, Dustin C; Haan, Andrea G
Objective: Changes in small business and insurance present challenges for newly graduated chiropractors. Technology that reaches identified, diverse learning styles may better assist chiropractic students in business classes to meet course outcomes. Thus, the purpose of our study was to determine whether the use of technology-based instructional aids enhances students' mastery of course learning outcomes. Methods: Using convenience sampling, 86 students completed a survey assessing course learning outcomes, learning style, and the helpfulness of lecture and computer-assisted learning for content mastery. Quantitative analyses were performed. Results: Although respondents did not find the computer-assisted learning as helpful as the lecture, significant relationships were found between pre- and post-assisted-learning measures of learning outcomes 1 and 2 for the visual and kinesthetic groups. Surprisingly, however, all learning style groups exhibited significant pre- and post-assisted-learning appraisal relationships with learning outcomes 3 and 4. Conclusion: While the current study provides evidence of a relationship between students' learning of the course content and the use of technological instructional aids, the exact nature of the relationship remains unclear.
Besides the big LHC experiments, a number of mid-size experiments are coming online that need to define new computing models to meet their processing and storage requirements. We present the hybrid computing model of IceCube, which combines GRID models with a more flexible direct user model, as an example of a possible solution. In IceCube, a central datacenter at UW-Madison serves as the Tier-0, with a single Tier-1 datacenter at DESY Zeuthen. We describe the setup of the IceCube computing infrastructure and report on our experience in successfully provisioning IceCube's computing needs.
Young, J Z
The memory mechanisms of cephalopods consist of a series of matrices of intersecting axes, which find associations between the signals of input events and their consequences. The tactile memory is distributed among eight such matrices, and there is also some suboesophageal learning capacity. The visual memory lies in the optic lobe and four matrices, with some re-exciting pathways. In both systems, damage to any part reduces proportionally the effectiveness of the whole memory. These matrices are somewhat like those in mammals, for instance those in the hippocampus. The first matrix in both visual and tactile systems receives signals of vision and taste, and its output serves to increase the tendency to attack or to take with the arms. The second matrix provides for the correlation of groups of signals on its neurons, which pass signals to the third matrix. Here large cells find clusters in the sets of signals. Their output re-excites those of the first lobe, unless pain occurs. In that case, this set of cells provides a record that ensures retreat. There is experimental evidence that these distributed memory systems allow for the identification of categories of visual and tactile inputs, for generalization, and for decision on appropriate behavior in the light of experience. The evidence suggests that learning in cephalopods is not localized to certain layers or "grandmother cells" but is distributed with high redundance in serial networks, with recurrent circuits.
Kourdioukova, Elena V.; Verstraete, Koenraad L.; Valcke, Martin
Objective: The aim of this research was to explore (1) clinical years students' perceptions about radiology case-based learning within a computer supported collaborative learning (CSCL) setting, (2) an analysis of the collaborative learning process, and (3) the learning impact of collaborative work on the radiology cases. Methods: The first part of this study focuses on a more detailed analysis of a survey study about CSCL based case-based learning, set up in the context of a broader radiology curriculum innovation. The second part centers on a qualitative and quantitative analysis of 52 online collaborative learning discussions from 5th year and nearly graduating medical students. The collaborative work was based on 26 radiology cases regarding musculoskeletal radiology. Results: The analysis of perceptions about collaborative learning on radiology cases reflects a rather neutral attitude that also does not differ significantly in students of different grade levels. Less advanced students are more positive about CSCL as compared to last year students. Outcome evaluation shows a significantly higher level of accuracy in identification of radiology key structures and in radiology diagnosis as well as in linking the radiological signs with available clinical information in nearly graduated students. No significant differences between different grade levels were found in accuracy of using medical terminology. Conclusion: Students appreciate computer supported collaborative learning settings when tackling radiology case-based learning. Scripted computer supported collaborative learning groups proved to be useful for both 5th and 7th year students in view of developing components of their radiology diagnostic approaches.
Lu, Jingyang; Li, Lun; Chen, Genshe; Shen, Dan; Pham, Khanh; Blasch, Erik
In this paper, a Cognitive Radio Network (CRN) based on artificial intelligence is proposed to distribute the limited radio spectrum resources more efficiently. The CRN framework can analyze the time-sensitive signal data close to the signal source using fog computing with different types of machine learning techniques. Depending on the computational capabilities of the fog nodes, different features and machine learning techniques are chosen to optimize spectrum allocation. Also, the computing nodes send the periodic signal summary which is much smaller than the original signal to the cloud so that the overall system spectrum source allocation strategies are dynamically updated. Applying fog computing, the system is more adaptive to the local environment and robust to spectrum changes. As most of the signal data is processed at the fog level, it further strengthens the system security by reducing the communication burden of the communications network.
Musa, Sarhan M
This reference offers tools for engineers, scientists, biologists, and others working with the computational techniques of nanophotonics. It introduces the key concepts of computational methods in a manner that is easily digestible for newcomers to the field. The book also examines future applications of nanophotonics in the technical industry and covers new developments and interdisciplinary research in engineering, science, and medicine. It provides an overview of the key computational nanophotonics methods and describes the technologies with an emphasis on how they work and their key benefits.
Caropreso, Edward J.; Haggerty, Mark
Describes an alternative approach to introductory economics based on a cooperative learning model, "Learning Together." Discussion of issues in economics education and cooperative learning in higher education leads to explanation of how to adapt the Learning Together Model to lesson planning in economics. A flow chart illustrates the process for a…
Mulder, Y.G.; Bollen, Lars; de Jong, Anthonius J.M.
Dynamic phenomena are common in science education. Students can learn about such system dynamics processes through model-based learning activities. This paper describes a study of the effects of a learning-from-erroneous-models approach using the learning environment SCYDynamics. The study compared
Hou, Chun-Ju; Chen, Yen-Ting; Hu, Ling-Chen; Chuang, Chih-Chieh; Chiu, Yu-Hsien; Tsai, Ming-Shih
Pulmonary auscultation is a physical assessment skill learned by nursing students for examining the respiratory system. Generally, a sound simulator equipped mannequin is used to group teach auscultation techniques via classroom demonstration. However, nursing students cannot readily duplicate this learning environment for self-study. The advancement of electronic and digital signal processing technologies facilitates simulating this learning environment. This study aims to develop a computer-aided auscultation learning system for assisting teachers and nursing students in auscultation teaching and learning. This system provides teachers with signal recording and processing of lung sounds and immediate playback of lung sounds for students. A graphical user interface allows teachers to control the measuring device, draw lung sound waveforms, highlight lung sound segments of interest, and include descriptive text. Effects on learning lung sound auscultation were evaluated for verifying the feasibility of the system. Fifteen nursing students voluntarily participated in the repeated experiment. The results of a paired t test showed that auscultative abilities of the students were significantly improved by using the computer-aided auscultation learning system.
Siebenhuener, Bernd; Barth, Volker
In a number of recent research projects, computer models have been included in participatory procedures to assess global environmental change. The intention was to support knowledge production and to help the involved non-scientists develop a deeper understanding of the interactions between natural and social systems. This paper analyses the experience gained in three projects with the use of computer models, from a participatory and a risk management perspective. Our cross-cutting analysis of the objectives, the employed project designs and moderation schemes, and the observed learning processes in participatory processes with model use shows that models play a mixed role in informing participants and stimulating discussions. However, no deeper reflection on values and belief systems could be achieved. In terms of the risk management phases, computer models best serve the purposes of problem definition and option assessment within participatory integrated assessment (PIA) processes.
Barbosa, Jorge; Barbosa, Debora; Rabello, Solon
Use of mobile devices and widespread adoption of wireless networks have enabled the emergence of Ubiquitous Computing. Application of this technology to improving education strategies gave rise to Ubiquitous e-Learning, also known as Ubiquitous Learning. There are several approaches to organizing ubiquitous learning environments, but most of them…
Michelsen, Anders Ib
This article treats the philosophical underpinnings of the notions of ubiquity and pervasive computing from a historical perspective. The current focus on these notions reflects the ever-increasing impact of new media and the underlying complexity of computed function in the broad sense of ICT that has spread vertiginously since Mark Weiser coined the term 'pervasive', e.g., digitalised sensoring, monitoring, effectuation, intelligence, and display. The article traces the emergence in the mid-20th century of a paradoxical distinction/complicity between the technical organisation of computed function and the human Being, in the sense of creative action upon such function. This paradoxical distinction/complicity promotes a chiastic (Merleau-Ponty) relationship of extension of one... Whereas Weiser's original perspective may seem fulfilled since computing is everywhere, in his and Seely Brown's (1997) terms, 'invisible...
Yu, Shengquan; Yang, Xianmin; Cheng, Gang; Wang, Minjuan
This paper presents a new model for organizing learning resources: Learning Cell. This model is open, evolving, cohesive, social, and context-aware. By introducing a time dimension into the organization of learning resources, Learning Cell supports the dynamic evolution of learning resources while they are being used. In addition, by introducing a…
Zsuga, Judit; Biro, Klara; Papp, Csaba; Tajti, Gabor; Gesztelyi, Rudolf
Reinforcement learning (RL) is a powerful concept underlying forms of associative learning governed by a scalar reward signal, with learning taking place when expectations are violated. RL may be assessed using model-based and model-free approaches. Model-based reinforcement learning involves the amygdala, the hippocampus, and the orbitofrontal cortex (OFC). The model-free system involves the pedunculopontine-tegmental nucleus (PPTgN), the ventral tegmental area (VTA) and the ventral striatum (VS). Based on the functional connectivity of the VS, both the model-free and model-based RL systems center on the VS, which computes value by integrating model-free signals (received as reward prediction errors) and model-based reward-related input. Using the concept of a reinforcement learning agent, we propose that the VS serves as the value-function component of the RL agent. Regarding the model utilized for model-based computations, we turn to the proactive brain concept, which offers a ubiquitous function for the default network based on its great functional overlap with contextual associative areas. By means of the default network, the brain continuously organizes its environment into context frames, enabling the formulation of analogy-based associations that are turned into predictions of what to expect. The OFC integrates reward-related information into context frames, computing reward expectation by compiling the stimulus-reward and context-reward information offered by the amygdala and hippocampus, respectively. Furthermore, we suggest that the integration of model-based reward expectations into the value signal is further supported by efferents of the OFC that reach structures canonical for model-free learning (e.g., the PPTgN, VTA, and VS). (c) 2016 APA, all rights reserved.
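The model-free reward-prediction-error signal described above is commonly formalized as temporal-difference learning; a minimal sketch follows, in which the chain task and parameter values are my illustration rather than anything from the paper.

```python
# Minimal TD(0) value learning on a 3-state chain: s0 -> s1 -> s2 -> terminal,
# with reward 1 delivered on the final transition (illustrative task).
alpha, gamma = 0.1, 0.9   # learning rate, discount factor
V = [0.0, 0.0, 0.0]       # value estimates for s0, s1, s2

for _ in range(2000):     # repeated traversals of the chain
    for s in range(3):
        r = 1.0 if s == 2 else 0.0           # reward on entering the terminal state
        v_next = V[s + 1] if s < 2 else 0.0  # terminal state has value 0
        delta = r + gamma * v_next - V[s]    # reward prediction error
        V[s] += alpha * delta                # value update driven by the error

print([round(v, 3) for v in V])  # converges toward [gamma**2, gamma, 1]
```

The prediction error `delta` plays the role the abstract assigns to the model-free dopaminergic signal: it is zero once expectations match outcomes, and learning occurs only when expectations are violated.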
Dillenbourg, Pierre, Ed.
Intended to illustrate the benefits of collaboration between scientists from psychology and computer science, namely machine learning, this book contains the following chapters, most of which are co-authored by scholars from both sides: (1) "Introduction: What Do You Mean by 'Collaborative Learning'?" (Pierre Dillenbourg); (2)…
This qualitative ethnographic study examines a collaborative leadership model focused on learning and socially just practices within a change context of a wide educational partnership. The study analyzes a range of perspectives of novice teachers, mentor teachers, teacher educators and district superintendents on leadership and learning. The findings reveal the emergence of a coalition of leaders crossing borders at all levels of the educational system: local school level, district level and teacher education level who were involved in coterminous collaborative learning. Four categories of learning were identified as critical to leading a change in the educational system: learning in professional communities, learning from practice, learning through theory and research and learning from and with leaders. The implications of the study for policy makers as well as for practitioners are to adopt a holistic approach to the educational environment and plan a collaborative learning continuum from initial pre-service programs through professional development learning at all levels.
This paper presents a computational model for simulating how people learn from serious games. While avoiding the combinatorial explosion of a game's micro-states, the model offers a meso-level pathfinding approach guided by cognitive flow theory and various concepts from the learning sciences. It extends a basic, existing model by exposing…
Yuan, Jiugen; Xing, Ruonan
With the development of educational computer software, digital educational games have become an important part of life, entertainment and education. How to make full use of digital games' teaching functions and educate through entertainment has therefore become the focus of current research. This thesis connects educational games with collaborative learning, a currently popular teaching model, and proposes a digital game-based collaborative learning model combined with teaching practice.
Kennedy, Marc C.; Anderson, Clive W.; Conti, Stefano; O'Hagan, Anthony
In this paper we present a number of recent applications in which an emulator of a computer code is created using a Gaussian process model. Tools are then applied to the emulator to perform sensitivity analysis and uncertainty analysis. Sensitivity analysis is used both as an aid to model improvement and as a guide to how much the output uncertainty might be reduced by learning about specific inputs. Uncertainty analysis allows us to reflect output uncertainty due to unknown input parameters, when the finished code is used for prediction. The computer codes themselves are currently being developed within the UK Centre for Terrestrial Carbon Dynamics
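The sensitivity-analysis idea in this abstract (apportioning output uncertainty to inputs, to see which inputs are worth learning about) can be sketched with a toy variance-based computation. The function `f` below is a hypothetical stand-in for the Gaussian-process emulator, not the actual carbon-dynamics code:

```python
import random
import statistics

def f(x1, x2):
    """Stand-in 'emulator': a cheap function playing the role of the
    Gaussian-process surrogate of an expensive simulator."""
    return 3.0 * x1 + 0.3 * x2 ** 2

random.seed(0)
N = 20000
xs1 = [random.uniform(0, 1) for _ in range(N)]
xs2 = [random.uniform(0, 1) for _ in range(N)]
total_var = statistics.pvariance([f(a, b) for a, b in zip(xs1, xs2)])

def main_effect_variance(which, grid=200, inner=200):
    """Var over x_i of E[f | x_i]: the first-order ('main effect')
    contribution of input i to the output variance."""
    means = []
    for k in range(grid):
        xi = (k + 0.5) / grid
        if which == 1:
            vals = [f(xi, random.uniform(0, 1)) for _ in range(inner)]
        else:
            vals = [f(random.uniform(0, 1), xi) for _ in range(inner)]
        means.append(statistics.fmean(vals))
    return statistics.pvariance(means)

S1 = main_effect_variance(1) / total_var  # share of variance from x1
S2 = main_effect_variance(2) / total_var
assert S1 > S2  # x1 dominates, so learning x1 reduces uncertainty most
```

Here the large first-order index for `x1` says that pinning down `x1` would remove most of the output uncertainty, which is exactly the "guide to how much the output uncertainty might be reduced by learning about specific inputs" role described in the abstract.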
A model is proposed to characterize the type of knowledge acquired in Artificial Grammar Learning (AGL). In particular, Shannon entropy is employed to compute the complexity of different test items in an AGL task, relative to the training items. According to this model, the more predictable a test item is from the training items, the more likely it is that the item should be selected as compatible with them. The predictions of the entropy model are explored in relation to results from several previous AGL datasets and compared to other AGL measures. This approach to AGL resonates well with similar models in categorization and reasoning, which also postulate that cognitive processing is geared towards the reduction of entropy.
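As a rough, hypothetical illustration of the entropy idea (not the authors' actual model), a test item's predictability from the training items can be scored as mean surprisal under smoothed bigram frequencies; the strings and the smoothing constant below are invented:

```python
import math
from collections import Counter

def bigram_counts(strings):
    """Count letter bigrams across a set of training strings."""
    counts = Counter()
    for s in strings:
        counts.update(s[i:i + 2] for i in range(len(s) - 1))
    return counts

def surprisal(test, counts, alpha=1.0):
    """Mean surprisal (bits per bigram) of a test string under
    add-alpha-smoothed bigram probabilities; lower = more predictable."""
    total = sum(counts.values())
    vocab = len(counts) + 1  # crude smoothing denominator
    bits, n = 0.0, 0
    for i in range(len(test) - 1):
        p = (counts[test[i:i + 2]] + alpha) / (total + alpha * vocab)
        bits += -math.log2(p)
        n += 1
    return bits / max(n, 1)

train = ["MXVX", "MXXV", "VXMX"]  # hypothetical grammar strings
counts = bigram_counts(train)
# A string built from familiar bigrams scores as more predictable
# (lower surprisal) than one made of unseen bigrams.
assert surprisal("MXVX", counts) < surprisal("QZQZ", counts)
```

Under the model, the lower-surprisal item would be the one endorsed as grammatical, since it reduces entropy relative to the training set.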
Wang, P.; Cheng, B. N.; Chao, Y.
Ocean modeling plays an important role in both understanding the current climatic conditions and predicting future climate change. However, modeling the ocean circulation at various spatial and temporal scales is a very challenging computational task.
This thesis applies computational intelligence to the field of actuarial (insurance) science. In particular, it deals with life insurance, where mortality modelling is important. Actuaries still use models (mortality laws) dating from the nineteenth century, for example Gompertz's and Makeham's
The dissertation explores the potential of rhetorical theories for understanding, analyzing, and planning communication and learning processes, and for integrating digitized contexts and human interaction and communication processes in a single theoretical framework. Based on Cicero's rhetoric applied to two empirical case studies of Master programs, the dissertation develops and presents a new theory of Computer Supported Collaborative Learning (CSCL).
Wu, Zhaohui; Zheng, Nenggan; Zhang, Shaowu; Zheng, Xiaoxiang; Gao, Liqiang; Su, Lijuan
The combination of biological and artificial intelligence is particularly driven by two major strands of research: one involves the control of mechanical, usually prosthetic, devices by conscious biological subjects, whereas the other involves the control of animal behaviour by stimulating nervous systems electrically or optically. However, to our knowledge, no study has demonstrated that spatial learning in a computer-based system can affect the learning and decision-making behaviour of the biological component, namely a rat, when these two types of intelligence are wired together to form a new intelligent entity. Here, we show that rule operations conducted by computing components contribute to a novel hybrid brain-computer system: ratbots that exhibit superior learning abilities in a maze learning task, even when their vision and whisker sensation are blocked. We anticipate that our study will encourage other researchers to investigate combinations of various rule operations and other artificial intelligence algorithms with the learning and memory processes of organic brains to develop more powerful cyborg intelligence systems. Our results potentially have profound implications for a variety of applications in intelligent systems and neural rehabilitation.
Provenzo, Eugene F.
Since the 1960s, the rapid evolution of technology has created a new cultural geography--a virtual geography. "The Difference Engine: Computing, Knowledge and the Transformation of Learning" offers a conscious critique of this change and its effects on contemporary culture and education. This engaging text assumes that we are at a critical…
Bayerlein, Leopold; Jeske, Debora
Purpose: The purpose of this paper is to provide a student learning outcome focussed assessment of the benefits and limitations of traditional internships, e-internships, and simulated internships to evaluate the potential of computer-mediated internships (CMIs) (e-internships and simulated internships) within higher education from a student…
The use of Conversation Analysis (CA) in the study of technology-mediated interactions is a recent methodological addition to qualitative research in the field of Computer-assisted Language Learning (CALL). The expansion of CA in Second Language Acquisition research, coupled with the need for qualitative techniques to explore how people interact…
Genc, Gulten; Aydin, Selami
The present article examined some factors affecting the motivation level of the preparatory school students in using a web-based computer-assisted language-learning course. The sample group of the study consisted of 126 English-as-a-foreign-language learners at a preparatory school of a state university. After performing statistical analyses…
The findings presented in this article were derived from a 3-year study aimed at examining issues associated with the use of computers for secondary mathematics learning in Victorian (Australia) schools. Gender and other equity factors were of particular interest. In this article, the focus is on the participating mathematics teachers. Data on…
The RAGE project (Realising an Applied Gaming Ecosystem, http://rageproject.eu/) is an ongoing initiative that aims to offer an ecosystem to support serious games’ development and use. Its two main objectives are to provide technologies for computer game-based pedagogy and learning and to establish
Denton, David W.
Cloud computing technologies, such as Google Docs and Microsoft Office Live, have the potential to enhance instructional methods predicated on constructivism and cooperative learning. Cloud-based application features like file sharing and online publishing are prompting departments of education across the nation to adopt these technologies.…
Janssen, J.J.H.M.; Bodermer, D.
Traditionally, research on awareness during online collaboration focused on topics such as the effects of spatial information about group members’ activities on the collaborative process. When the concept of awareness was introduced to computer-supported collaborative learning, this focus shifted to
McDougall, Anne; Lowe, Jenny; Hopkins, Josie
This paper outlines the establishment and running of an after-school Computer Clubhouse, describing aspects of the leadership, mentoring and learning activities undertaken there. Research data has been collected from examination of documents associated with the Clubhouse, interviews with its founders, Director, session leaders and mentors, and…
Hu, Jinyan; Liu, Zhongxia; Li, Yuanyuan; Lu, Jianheng; Zhang, Lili
Currently, the construction of information technology textbooks for primary and middle schools is an important part of the information technology curriculum reform. By analyzing and distilling the characteristics of the Hong Kong quality textbook series "I Learn: Elementary School Computer Cognitive Curriculum", the article aims to offer inspiration and reference for the construction and development of school information technology teaching materials in mainland China.
Describes the process for developing tutorials for computer-aided learning (CAL) using a programing language rather than an authoring system. The workstation used is described, the use of graphics is discussed, the role of a local area network (LAN) is explained, and future plans are discussed. (five references) (LRW)
McNeil, Linda M.
The Crosby Independent School District near Houston, Texas, planned to introduce microcomputer instruction to its nearly 3,000 students in slow stages that had teachers and students learning at the same time. The initial impetus for computers came from an administrator who found useful information at a Regional Service Center of the State…
Karasavvidis, I.; Karasavvidis, I.; Pieters, Julius Marie; Plomp, T.
Even though it has been established that the incorporation of computers into the teaching and learning process enhances student performance, the underlying mechanisms through which this is accomplished have been largely unexplored. The present study aims to shed light on this issue. Two groups of 10
Alakeel, Ali M.
Learning computer programming is one of the main requirements of many educational study plans in higher education. Research has shown that many students face difficulties acquiring reasonable programming skills during their first year of college. In Saudi Arabia, there are twenty-three state-owned universities scattered around the country that…
Pani, John R.; Chariker, Julia H.; Naaz, Farah; Mattingly, William; Roberts, Joshua; Sephton, Sandra E.
Instruction of neuroanatomy depends on graphical representation and extended self-study. As a consequence, computer-based learning environments that incorporate interactive graphics should facilitate instruction in this area. The present study evaluated such a system in the undergraduate neuroscience classroom. The system used the method of…
Progress achieved during this report period is presented on the following topics: development and application of gyrokinetic particle codes to tokamak transport; development of techniques to take advantage of parallel computers; modeling of dynamo and bootstrap current drive; and, in general, maintenance of our broad-based program in basic plasma physics and computer modeling.
Zlatev, Z.; Brandt, J.; Builtjes, P. J. H.
Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998.
Kristensen, N.R.; Madsen, Henrik; Jørgensen, Sten Bay
A grey-box approach to process modelling that combines deterministic and stochastic modelling is advocated for identification of models for model-based control of batch and semi-batch processes. A computer-aided tool designed for supporting decision-making within the corresponding modelling cycle...
Franciosi, Stephan J.
In theory, computer game-based learning can support several vocabulary learning affordances that have been identified in the foreign language learning research. In the observable evidence, learning with computer games has been shown to improve performance on vocabulary recall tests. However, while simple recall can be a sign of learning,…
Molenaar, Inge; Roda, Claudia; van Boxtel, Carla; Sleegers, Peter
The aim of this study is to test the effects of dynamically scaffolding social regulation of middle school students working in a computer-based learning environment. Dyads in the scaffolding condition (N=56) are supported with computer-generated scaffolds and students in the control condition (N=54) do not receive scaffolds. The scaffolds are…
Molenaar, I.; Roda, Claudia; van Boxtel, Carla A.M.; Sleegers, P.J.C.
The aim of this study is to test the effects of dynamically scaffolding social regulation of middle school students working in a computer-based learning environment. Dyads in the scaffolding condition (N = 56) are supported with computer-generated scaffolds and students in the control condition (N =
Falcone, Rossella; Brunamonti, Emiliano; Genovesio, Aldo
We examined whether monkeys can learn by observing a human model, through vicarious learning. Two monkeys observed a human model demonstrating an object-reward association and consuming food found underneath an object. The monkeys observed human models as they solved more than 30 learning problems. For each problem, the human models made a choice between two objects, one of which concealed a piece of apple. In the test phase afterwards, the monkeys made a choice of their own. Learning was apparent from the first trial of the test phase, confirming the ability of monkeys to learn by vicarious observation of human models.
Francisco Andrés Candelas
This paper introduces a new set of compact interactive simulations developed for the constructive learning of computer networks concepts. These simulations, which compose a virtual laboratory implemented as portable Java applets, have been created by combining EJS (Easy Java Simulations) with the KivaNS API. Furthermore, in this work, the skills and motivation level acquired by the students are evaluated and measured when these simulations are combined with Moodle and SCORM (Sharable Content Object Reference Model) documents. This study has been developed to improve and stimulate autonomous constructive learning, in addition to providing timetable flexibility for a Computer Networks subject.
This paper addresses an indispensable skill using a unique method to teach a critical component: helping children learn to read by using computer-assisted oral reading to help children learn vocabulary. We build on Project LISTEN's Reading Tutor, a computer program that adapts automatic speech recognition to listen to children read aloud and helps them learn to read (http://www.cs.cmu.edu/~listen). To learn a word from reading with the Reading Tutor, students must encounter the word and learn its meaning in context. We modified the Reading Tutor first to help students encounter new words and then to help them learn the meanings of new words. We then compared the Reading Tutor to classroom instruction and to human-assisted oral reading as part of a yearlong study with 144 second and third graders. The result: second graders did about the same on word comprehension in all three conditions. However, third graders who read with the 1999 Reading Tutor, modified as described in this paper, performed statistically significantly better than other third graders in a classroom control on word comprehension gains, and even comparably with third graders who read one-on-one with human tutors.
Marshall, Neil; Buteau, Chantal
As part of their undergraduate mathematics curriculum, students at Brock University learn to create and use computer-based tools with dynamic, visual interfaces, called Exploratory Objects, developed for the purpose of conducting pure or applied mathematical investigations. A student's Development Process Model of creating and using an Exploratory…
Gandole, Y. B.
Effective evaluation of educational software is a key issue for the successful introduction of advanced tools into the curriculum. This paper details the development and evaluation of a tool for computer-assisted learning in science laboratory courses. The process was based on the generic instructional system design model. Various categories of educational…
Choi, Sung-Kwon; Kwon, Oh-Woog; Kim, Young-Kil; Lee, Yunkeun
In order to use dialogue systems for computer assisted second-language learning systems, one of the difficult issues in such systems is how to construct large-scale dialogue knowledge that matches the dialogue modelling of a dialogue system. This paper describes how we have accomplished the short-term construction of large-scale and…
Fernandes, Emmanuel; Madhour, Hend; Wentland Forte, Maia; Miniaoui, Sami
In this paper we propose a domain model addressing both authors' and learners' needs concerning learning object reuse. First, we present four key criteria for an efficient authoring tool: adaptive level of granularity, flexibility, integration and interoperability. Second, we introduce and describe our six-level Semantic Learning Model (SLM), designed to facilitate multi-level reuse of learning materials and search, by defining a multi-layer model for metadata. Finally, after mapping ...
Morita, Junya; Miwa, Kazuhisa
In this paper, we propose a novel usage for computational cognitive models. In cognitive science, computational models have played a critical role as theories of human cognition, and many have successfully simulated the results of controlled psychological experiments. However, there have been only a few attempts to apply these models to complex, realistic phenomena; we call such a situation an "open-ended situation". In this study, MAC/FAC ("many are called, but few are chosen"), proposed by [Forbus 95], which models the two stages of analogical reasoning, was applied to our open-ended psychological experiment. In our experiment, subjects were presented with a cue story and retrieved cases that they had learned in their everyday lives. Following this, they rated the inferential soundness (goodness as analogy) of each retrieved case. For each retrieved case, we computed two kinds of similarity scores (content vectors and structural evaluation scores) using the algorithms of MAC/FAC. As a result, the computed content vectors explained the overall retrieval of cases well, whereas the structural evaluation scores were strongly related to the rated scores. These results support MAC/FAC's theoretical assumption that different similarities are involved at the two stages of analogical reasoning. Our study is an attempt to use a computational model as an analysis device for open-ended human cognition.
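The content-vector similarity used in the MAC ("many are called") stage can be pictured as a normalized dot product over predicate-occurrence counts; a minimal sketch with made-up predicate names (the actual MAC/FAC algorithms are richer than this):

```python
import math

def content_vector(predicates):
    """Bag-of-predicates vector, as in the MAC stage: structure is
    ignored, only which relations occur and how often."""
    vec = {}
    for p in predicates:
        vec[p] = vec.get(p, 0) + 1
    return vec

def dot_similarity(v, w):
    """Normalized dot product of two sparse count vectors."""
    num = sum(v[k] * w.get(k, 0) for k in v)
    norm = math.sqrt(sum(x * x for x in v.values())) * \
           math.sqrt(sum(x * x for x in w.values()))
    return num / norm if norm else 0.0

cue = content_vector(["cause", "flow", "greater"])       # hypothetical
memory_a = content_vector(["cause", "flow", "greater"])  # same relations
memory_b = content_vector(["eat", "sleep"])              # unrelated
assert dot_similarity(cue, memory_a) > dot_similarity(cue, memory_b)
```

The cheap dot product plays the wide-net retrieval role, while the FAC stage would then re-rank the survivors by expensive structural alignment, which is the division of labor the abstract's results support.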
Engel, Susan; Pallas, Josh; Lambert, Sarah
This article demonstrates that the purposeful subject design, incorporating a Model United Nations (MUN), facilitated deep learning and professional skills attainment in the field of International Relations. Deep learning was promoted in subject design by linking learning objectives to Anderson and Krathwohl's (2001) four levels of knowledge or…
Hansen, John Paulin; Nielsen, Finn Ravnsbjerg; Rasmussen, Jens
The paper presents some theoretical assumptions about the cognitive control mechanisms of subjects learning to play a computer game. A simulation model has been developed to investigate these assumptions. The model is an automaton, reacting to instruction-like cue-action rules. The prototypical performances of 23 experimental subjects at succeeding levels of training are compared to the performance of the model. The findings are interpreted in terms of a general taxonomy for cognitive task analysis.
Linda D. Grooms
The axiom of humanity's basic need to communicate provides the impetus to explore the nature and quality of computer-mediated communication as a vehicle for learning in higher education. This exploratory study examined the experiential communication perceptions of online doctoral students during the infancy of their program. Eighty-five students were electronically queried through a 32-item open-ended questionnaire within a 13-day time frame. Preliminary findings supported the experience of Seagren and Watwood (1996) at the Lincoln Campus of the University of Nebraska that "more information widens learning opportunities, but without interaction, learning is not enhanced" (p. 514). The overarching implications stress that faculty development and instructional planning are essential for the effective delivery of online courses, and even more so when collaborative learning is used. Facilitating group communication and interaction are areas beckoning attention as we continue to effectively organize the online classroom of this new millennium.
Tan, Kean Ming; London, Palma; Mohan, Karthik; Lee, Su-In; Fazel, Maryam; Witten, Daniela
We consider the problem of learning a high-dimensional graphical model in which there are a few hub nodes that are densely connected to many other nodes. Many authors have studied the use of an ℓ1 penalty in order to learn a sparse graph in the high-dimensional setting. However, the ℓ1 penalty implicitly assumes that each edge is equally likely and independent of all other edges. We propose a general framework to accommodate more realistic networks with hub nodes, using a convex formulation that involves a row-column overlap norm penalty. We apply this general framework to three widely used probabilistic graphical models: the Gaussian graphical model, the covariance graph model, and the binary Ising model. An alternating direction method of multipliers algorithm is used to solve the corresponding convex optimization problems. On synthetic data, we demonstrate that our proposed framework outperforms competitors that do not explicitly model hub nodes. We illustrate our proposal on a webpage data set and a gene expression data set.
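One ingredient of such a hub-aware penalty is a group (ℓ2) penalty on the columns of the hub component, whose proximal operator is block soft-thresholding. The sketch below shows the two proximal operators in isolation; it is an illustrative fragment, not the paper's full ADMM algorithm:

```python
import math

def soft_threshold(x, lam):
    """Elementwise prox of the l1 penalty, used for the sparse part
    of the decomposition."""
    return [math.copysign(max(abs(v) - lam, 0.0), v) for v in x]

def block_soft_threshold(col, lam):
    """Prox of the group (l2) penalty on one column of the hub
    component: shrinks the whole column toward zero, and zeroes
    non-hub columns entirely."""
    norm = math.sqrt(sum(v * v for v in col))
    if norm <= lam:
        return [0.0] * len(col)
    scale = 1.0 - lam / norm
    return [scale * v for v in col]

# A 'hub-like' column (many sizeable entries) survives shrinkage;
# a weak column is set exactly to zero.
hub_col = [0.9, -0.8, 0.7, 0.6]
weak_col = [0.05, -0.02, 0.01, 0.03]
assert any(v != 0.0 for v in block_soft_threshold(hub_col, 0.5))
assert all(v == 0.0 for v in block_soft_threshold(weak_col, 0.5))
```

The group penalty is what lets a few dense columns (hubs) survive while the rest of the hub component collapses to zero, leaving the ℓ1 part to model the remaining sparse edges.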
Heift, Trude; Schulze, Mathias
This book provides the first comprehensive overview of theoretical issues, historical developments and current trends in ICALL (Intelligent Computer-Assisted Language Learning). It assumes a basic familiarity with Second Language Acquisition (SLA) theory and teaching, CALL and linguistics. It is of interest to upper undergraduate and/or graduate…
Winarno, Sri; Muthu, Kalaiarasi Sonai; Ling, Lew Sook
This study presents students' feedback on, and the learning impact of, the design and development of multimedia learning in a Direct Problem-Based Learning approach (mDPBL) for Computer Networks at Dian Nuswantoro University, Indonesia. The study examined the usefulness, contents and navigation of the multimedia learning materials as well as learning impacts towards…
Schwing, James L.; Olariu, Stephen
The primary goal of this grant has been the design and implementation of software to be used in the conceptual design of aerospace vehicles, particularly focused on the elements of geometric design, graphical user interfaces, and the interaction of the multitude of software typically used in this engineering environment. This has resulted in the development of several analysis packages and design studies, including two major software systems currently used in the conceptual-level design of aerospace vehicles: SMART, the Solid Modeling Aerospace Research Tool, and EASIE, the Environment for Software Integration and Execution. Additional software tools were designed and implemented to address the needs of the engineer working in the conceptual design environment. SMART provides conceptual designers with a rapid prototyping capability and several engineering analysis capabilities. In addition, SMART has a carefully engineered user interface that makes it easy to learn and use. Finally, a number of specialty characteristics have been built into SMART that allow it to be used efficiently as a front-end geometry processor for other analysis packages. EASIE provides a set of interactive utilities that simplify the task of building and executing computer-aided design systems consisting of diverse, stand-alone analysis codes, streamlining the exchange of data between programs, reducing errors and improving efficiency. EASIE also provides both a methodology and a collection of software tools to ease the task of coordinating engineering design and analysis codes.
Cameron, Ian; Gani, Rafiqul
The development of models for a range of applications requires methods and tools. In many cases a reference model is required that allows the generation of application specific models that are fit for purpose. There are a range of computer aided modelling tools available that help to define the m...
Chang, Victor; Bacigalupo, David; Wills, Gary; De Roure, David
This paper reviews current cloud computing business models and presents proposals on how organisations can achieve sustainability by adopting appropriate models. We classify cloud computing business models into eight types: (1) Service Provider and Service Orientation; (2) Support and Services Contracts; (3) In-House Private Clouds; (4) All-In-One Enterprise Cloud; (5) One-Stop Resources and Services; (6) Government funding; (7) Venture Capitals; and (8) Entertainment and Social Networking. U...
McDowell, J J
Darwinian selection by consequences was instantiated in a computational model that consisted of a repertoire of behaviors undergoing selection, reproduction, and mutation over many generations. The model in effect created a digital organism that emitted behavior continuously. The behavior of this digital organism was studied in three series of computational experiments that arranged reinforcement according to random-interval (RI) schedules. The quantitative features of the model were varied o...
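The selection-reproduction-mutation cycle described above can be caricatured in a few lines; the population size, fitness rule, and mutation step below are arbitrary illustrative choices, not the model's actual parameters:

```python
import random

random.seed(1)
TARGET = 50  # behaviors near this value are 'reinforced' (hypothetical)

def fitness(b):
    """Closer to the reinforced behavior class is fitter."""
    return -abs(b - TARGET)

# Repertoire of behaviors, each coded as an integer 'phenotype'
population = [random.randrange(100) for _ in range(100)]

for generation in range(200):
    # Selection: reinforced (fitter) behaviors are likelier to parent
    parents = sorted(population, key=fitness, reverse=True)[:20]
    # Reproduction with small mutations
    population = [
        random.choice(parents) + random.choice([-1, 0, 1])
        for _ in range(100)
    ]

mean_b = sum(population) / len(population)
assert abs(mean_b - TARGET) < 5  # repertoire drifts toward reinforcement
```

Over generations the emitted-behavior distribution concentrates around the reinforced class, which is the qualitative signature of selection by consequences; the actual model additionally arranges reinforcement on random-interval schedules rather than a fixed target.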
Mazur, A.B.; Kotlyarov, V.T.; Ermolenko, A.I.; Podbereznyj, S.S.; Postil, S.D.; Shaptala, D.V.
A partial computer model of the 'Ukrytie' object was created with the use of geoinformation technologies. The computer model makes it possible to provide information support for the work related to stabilization of the 'Ukrytie' object and its conversion into an ecologically safe system, and for analyzing, forecasting and controlling the processes occurring in the object. Elements and structures of the 'Ukrytie' object were designed and input into the model.
The effectiveness of learning depends on four main elements: content, desired learning outcome, instructional method, and delivery media. The integration of these four elements can be manifested in a learning module, called multimedia learning, or learning by using multimedia. In learning with computer-based multimedia, two main things need attention for the learning process to run effectively: how the content is presented, and how the learner chooses to accept and process the information into meaningful knowledge. The first concerns the way the content is visualized and how people learn; the second concerns the learning style of the learner. This research investigates the effect of the type of visualization (static vs. animated) in multimedia computer-based learning, and of learning style (visual vs. verbal), on students' ability to apply the concepts, procedures, and principles of Java programming. Visualization type acts as the independent variable, and the students' learning style acts as a moderator variable. The instructional strategies followed Merrill's Component Display Theory, and the presentation format of the multimedia followed Mayer and Moreno's seven principles of multimedia learning. Learning with the multimedia computer-based materials was carried out in the classroom. The subjects were students of STMIK-STIKOM Bali in the odd semester of 2016-2017 who took the Java programming course. The experiment used multivariate analysis of variance (MANOVA, 2 x 2), with a sample of 138 students in 4 classes. Based on the results of the analysis, it can be concluded that animation in interactive multimedia learning has a positive effect on students' learning outcomes, particularly in applying the concepts, procedures, and principles of Java programming. …
Computational modeling of biomedical processes is gaining more and more weight in current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling makes it possible to reduce, refine and replace animal experimentation, as well as to translate findings obtained in these experiments to the human background. However, these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building and validation process. This book addresses four main issues related to the building and validation of computational models of biomedical processes: model establishment under uncertainty; model selection and parameter fitting; sensitivity analysis and model adaptation; and model predictions under uncertainty. In each of these areas, the book discusses a number of key techniques by means of a general theoretical description followed by one or more practical examples. This book is intended for graduate stude...
This study proposes a percussion learning device that combines tablet computers and robots. This device comprises two systems: a rhythm teaching system, in which users can compose and practice rhythms by using a tablet computer, and a robot performance system. First, teachers compose the rhythm training contents on the tablet computer. Then, the learners practice these percussion exercises by using the tablet computer and a small drum set. The teaching system provides a new and user-friendly score editing interface for composing a rhythm exercise. It also provides a rhythm rating function to facilitate percussion training for children and improve the stability of rhythmic beating. To encourage children to practice percussion exercises, a robotic performance system is used to interact with the children; this system can perform percussion exercises for students to listen to and then help them practice the exercise. This interaction enhances children’s interest and motivation to learn and practice rhythm exercises. The results of experimental course and field trials reveal that the proposed system not only increases students’ interest and efficiency in learning but also helps them in understanding musical rhythms through interaction and composing simple rhythms.
Bolander, Thomas; Gierasimczuk, Nina
In this article we study learnability of fully observable, universally applicable action models of dynamic epistemic logic. We introduce a framework for actions seen as sets of transitions between propositional states and we relate them to their dynamic epistemic logic representations as action...... in the limit (inconclusive convergence to the right action model). We show that deterministic actions are finitely identifiable, while arbitrary (non-deterministic) actions require more learning power—they are identifiable in the limit. We then move on to a particular learning method, i.e. learning via update......, which proceeds via restriction of a space of events within a learning-specific action model. We show how this method can be adapted to learn conditional and unconditional deterministic action models. We propose update learning mechanisms for the aforementioned classes of actions and analyse...
A multi-agent model is proposed in which learning styles and a word-analysis technique are used to create a learning object recommendation system. On the basis of a learning style-based design, a concept map combination model is proposed to filter out unsuitable learning concepts from a given course. Our learner model classifies learners into eight styles and implements compatible computational methods consisting of three recommendations: (i) non-personalised, (ii) preferred feature-based, and (iii) neighbour-based collaborative filtering. The analysis of preference error (PE) was performed by comparing the actual preferred learning object with the predicted one. In our experiments, the feature-based recommendation algorithm has the lowest PE.
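The abstract above does not give a formula for preference error; a minimal sketch, assuming PE is simply the fraction of cases where the predicted learning object differs from the learner's actual preference (the learning-object IDs are invented for the example):

```python
def preference_error(actual, predicted):
    """Fraction of items where the predicted learning object differs
    from the learner's actual preference (assumed definition of PE)."""
    if len(actual) != len(predicted):
        raise ValueError("sequences must be the same length")
    mismatches = sum(a != p for a, p in zip(actual, predicted))
    return mismatches / len(actual)

# Hypothetical learning-object IDs: one mismatch out of four
pe = preference_error(["LO1", "LO2", "LO3", "LO1"],
                      ["LO1", "LO2", "LO1", "LO1"])  # → 0.25
```

Under this reading, the recommendation algorithm with the lowest PE is the one whose predictions most often match the learner's actual choices.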
Henkel, Ron; Endler, Lukas; Peters, Andre; Le Novère, Nicolas; Waltemath, Dagmar
The study of biological systems demands computational support. If targeting a biological problem, the reuse of existing computational models can save time and effort. Deciding among potentially suitable models, however, becomes more challenging with the increasing number of computational models available, and even more so when considering the models' growing complexity. Firstly, among a set of potential model candidates it is difficult to choose the model that best suits one's needs. Secondly, it is hard to grasp the nature of an unknown model listed in a search result set, and to judge how well it fits the particular problem one has in mind. Here we present an improved search approach for computational models of biological processes. It is based on existing retrieval and ranking methods from Information Retrieval. The approach incorporates annotations suggested by MIRIAM, and additional meta-information. It is now part of the search engine of BioModels Database, a standard repository for computational models. The introduced concept and implementation are, to our knowledge, the first application of Information Retrieval techniques to model search in Computational Systems Biology. Using the example of BioModels Database, it was shown that the approach is feasible and extends the current possibilities to search for relevant models. The advantages of our system over existing solutions are that we incorporate a rich set of meta-information, and that we provide the user with a relevance ranking of the models found for a query. Better search capabilities in model databases are expected to have a positive effect on the reuse of existing models.
Computational modelling of expressive music performance has been widely studied in the past. While previous work in this area has mainly focused on classical piano music, there has been very little work on guitar music, and such work has focused on monophonic guitar playing. In this work, we present a machine learning approach to automatically generate expressive performances from non-expressive music scores for polyphonic guitar. We treated the guitar as a hexaphonic instrument, obtaining ...
Salakhutdinov, Ruslan; Tenenbaum, Joshua B; Torralba, Antonio
We introduce HD (or “Hierarchical-Deep”) models, a new compositional learning architecture that integrates deep learning models with structured hierarchical Bayesian (HB) models. Specifically, we show how we can learn a hierarchical Dirichlet process (HDP) prior over the activities of the top-level features in a deep Boltzmann machine (DBM). This compound HDP-DBM model learns to learn novel concepts from very few training examples by learning low-level generic features, high-level features that capture correlations among low-level features, and a category hierarchy for sharing priors over the high-level features that are typical of different kinds of concepts. We present efficient learning and inference algorithms for the HDP-DBM model and show that it is able to learn new concepts from very few examples on CIFAR-100 object recognition, handwritten character recognition, and human motion capture datasets.
Thiessen, Erik D
Statistical learning has been studied in a variety of different tasks, including word segmentation, object identification, category learning, artificial grammar learning and serial reaction time tasks (e.g. Saffran et al. 1996 Science 274: 1926-1928; Orban et al. 2008 Proceedings of the National Academy of Sciences 105: 2745-2750; Thiessen & Yee 2010 Child Development 81: 1287-1303; Saffran 2002 Journal of Memory and Language 47: 172-196; Misyak & Christiansen 2012 Language Learning 62: 302-331). The difference among these tasks raises questions about whether they all depend on the same kinds of underlying processes and computations, or whether they are tapping into different underlying mechanisms. Prior theoretical approaches to statistical learning have often tried to explain or model learning in a single task. However, in many cases these approaches appear inadequate to explain performance in multiple tasks. For example, explaining word segmentation via the computation of sequential statistics (such as transitional probability) provides little insight into the nature of sensitivity to regularities among simultaneously presented features. In this article, we will present a formal computational approach that we believe is a good candidate to provide a unifying framework to explore and explain learning in a wide variety of statistical learning tasks. This framework suggests that statistical learning arises from a set of processes that are inherent in memory systems, including activation, interference, integration of information and forgetting (e.g. Perruchet & Vinter 1998 Journal of Memory and Language 39: 246-263; Thiessen et al. 2013 Psychological Bulletin 139: 792-814). From this perspective, statistical learning does not involve explicit computation of statistics, but rather the extraction of elements of the input into memory traces, and subsequent integration across those memory traces that emphasize consistent information (Thiessen and Pavlik
Merriam, M. L.; Rasky, D.
Since the space shuttle began flying in 1981, NASA has made a number of attempts to advance the state of the art in space transportation. In spite of billions of dollars invested, and several concerted attempts, no replacement for the shuttle is expected before 2010. Furthermore, the cost of access to space has dropped very slowly over the last two decades. On the other hand, the same two decades have seen dramatic progress in the computer industry. Computational speeds have increased by about a factor of 1000, and available memory, disk space, and network bandwidth have seen similar increases. At the same time, the cost of computing has dropped by about a factor of 10000. Is the space transportation problem simply harder? Or is there something to be learned from the computer industry? In looking for the answers, this paper reviews the early history of NASA's experience with supercomputers and NASA's visionary course change in supercomputer procurement strategy.
With the development of the Chinese economy, overseas Chinese education has received more and more attention. However, overseas Chinese education resources are relatively scarce for historical reasons, which has hindered further development. How to better share Chinese education resources and provide intelligent, personalized information services for overseas students is a key problem to be solved. In recent years, the rise of cloud computing has provided an opportunity to realize an intelligent learning mode. Cloud computing offers advantages by allowing users to use infrastructure, platforms and software. In this paper we propose an intelligent cloud learning model based on cloud computing. The learning model can utilize network resources sufficiently to implement resource sharing according to the personal needs of students, and provides good practicability for online overseas Chinese education.
Kjeldsen, Tinne Hoff; Blomhøj, Morten
Ten years of experience with analyses of students’ learning in a modelling course for first-year university students led us to see modelling as a didactical activity with the dual goal of developing students’ modelling competency and enhancing their conceptual learning of mathematical concepts i...... create and help overcome hidden cognitive conflicts in students’ understanding; that reflections within modelling can play an important role for the students’ learning of mathematics. These findings are illustrated with a modelling project concerning the world population....
Othman, Mahfudzah; Othman, Muhaini
This paper discusses the proposed model of the collaborative virtual learning system for the introductory computer programming course which uses one of the collaborative learning techniques known as the "Think-Pair-Share". The main objective of this study is to design a model for an online learning system that facilitates the…
Pataskar, Abhijeet; Tiwari, Vijay K
Cellular transcriptional programs driven by genetic and epigenetic mechanisms could be better understood by integrating "omics" data and subsequently modeling the gene-regulatory events. Toward this end, computational biology should keep pace with evolving experimental procedures and data availability. This article gives an exemplified account of the current computational challenges in molecular biology.
Practical applications and examples highlight this treatment of computational modeling for handling complex flowfields. A reference for researchers and graduate students of many different backgrounds, it also functions as a text for learning essential computational elements. Drawing upon his own research, the author addresses both macroscopic and microscopic features. He begins his three-part treatment with a survey of the basic concepts of finite difference schemes for solving parabolic, elliptic, and hyperbolic partial differential equations. The second part concerns issues related to computati
education will be delivered to the current and future force. This thesis examined the salient areas proposed by the ALM and its impact on P2P learning... The Army Learning Model is the new educational model that develops adaptive leaders in an era of persistent conflict. Life-long, individual
In recent years, many universities have become involved in the development of Massive Open Online Courses (MOOCs). Unfortunately, an appropriate didactic model for cooperative networked learning is lacking. In this paper we introduce inquiry-based learning as a didactic model. Students are assumed to ask themselves
Waltemath, Dagmar; Henkel, Ron; Hoehndorf, Robert; Kacprowski, Tim; Knuepfer, Christian; Liebermeister, Wolfram
Computational models used in biology are rapidly increasing in complexity, size, and numbers. To build such large models, researchers need to rely on software tools for model retrieval, model combination, and version control. These tools need to be able to quantify the differences and similarities between computational models. However, depending on the specific application, the notion of similarity may greatly vary. A general notion of model similarity, applicable to various types of models, is still missing. Here, we introduce a general notion of quantitative model similarities, survey the use of existing model comparison methods in model building and management, and discuss potential applications of model comparison. To frame model comparison as a general problem, we describe a theoretical approach to defining and computing similarities based on different model aspects. Potentially relevant aspects of a model comprise its references to biological entities, network structure, mathematical equations and parameters, and dynamic behaviour. Future similarity measures could combine these model aspects in flexible, problem-specific ways in order to mimic users' intuition about model similarity, and to support complex model searches in databases.
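One way to make the aspect-based notion of model similarity above concrete is a weighted combination of per-aspect set similarities. The aspect names, identifiers, and weighting scheme below are invented for illustration; they are not the paper's actual measure:

```python
def jaccard(a, b):
    """Set overlap: |A ∩ B| / |A ∪ B| (1.0 for two empty sets)."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 1.0

def model_similarity(model_a, model_b, weights):
    """Weighted average of per-aspect Jaccard similarities between
    two models, each given as a dict of aspect -> collection."""
    total = sum(weights.values())
    return sum(w * jaccard(model_a.get(aspect, ()), model_b.get(aspect, ()))
               for aspect, w in weights.items()) / total

# Hypothetical annotation and species aspects of two models:
m1 = {"annotations": ["GO:0006096", "CHEBI:17234"], "species": ["glucose", "ATP"]}
m2 = {"annotations": ["GO:0006096"], "species": ["glucose", "ADP"]}
sim = model_similarity(m1, m2, {"annotations": 1.0, "species": 1.0})  # ≈ 0.42
```

A real measure would add further aspects (network structure, equations, dynamic behaviour) and tune the weights per application, as the abstract suggests.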
Anderson, Janice Lyn
In recent years educators have begun to explore how to purposively design computer/video games to support student learning. This interest in video games has arisen in part because educational video games appear to have the potential to improve student motivation and interest in technology, and engage students in learning through the use of a familiar medium (Squire, 2005; Shaffer, 2006; Gee, 2005). The purpose of this dissertation research is to specifically address the issue of student learning through the use of educational computer/video games. Using the Quest Atlantis computer game, this study involved a mixed model research strategy that allowed for both broad understandings of classroom practices and specific analysis of outcomes through the themes that emerged from the case studies of the gendered groups using the game. Specifically, this study examined how fifth-grade students' learning about science concepts, such as water quality and ecosystems, unfolds over time as they participate in the Quest Atlantis computer game. Data sources included classroom observations and video, pre- and post-written assessments, pre- and post-student content interviews, student field notebooks, field reports and the field notes of the researcher. To make sense of how students' learning unfolded, video was analyzed using a framework of interaction analysis and small group interactions (Jordan & Henderson, 1995; Webb, 1995). These coded units were then examined with respect to student artifacts and assessments, and patterns of learning trajectories were analyzed. The analysis revealed that overall, student learning outcomes improved from pre- to post-assessments for all students. While there were no observable gendered differences with respect to the test scores and content interviews, there were gendered differences with respect to game play. Implications for game design, use of external scaffolds, games as tools for learning and gendered findings are discussed.
Champion, Richard A.
Machine learning describes pattern-recognition algorithms - in this case, probabilistic neural networks (PNNs). These can be computationally intensive, in part because of the nonlinear optimizer, a numerical process that calibrates the PNN by minimizing a sum of squared errors. This report suggests efficiencies that are expressed as cost and benefit. The cost is computer time needed to calibrate the PNN, and the benefit is goodness-of-fit, how well the PNN learns the pattern in the data. There may be a point of diminishing returns where a further expenditure of computer resources does not produce additional benefits. Sampling is suggested as a cost-reduction strategy. One consideration is how many points to select for calibration and another is the geometric distribution of the points. The data points may be nonuniformly distributed across space, so that sampling at some locations provides additional benefit while sampling at other locations does not. A stratified sampling strategy can be designed to select more points in regions where they reduce the calibration error and fewer points in regions where they do not. Goodness-of-fit tests ensure that the sampling does not introduce bias. This approach is illustrated by statistical experiments for computing correlations between measures of roadless area and population density for the San Francisco Bay Area. The alternative to training efficiencies is to rely on high-performance computer systems. These may require specialized programming and algorithms that are optimized for parallel performance.
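The stratified sampling strategy described above can be sketched as follows; the grid-based strata and point counts are invented for illustration, not taken from the report:

```python
import random

def stratified_sample(points, stratum_of, n_per_stratum, seed=0):
    """Select up to n_per_stratum calibration points from each stratum,
    so the computing budget is spread across regions rather than
    concentrated where data happen to be dense."""
    rng = random.Random(seed)
    by_stratum = {}
    for p in points:
        by_stratum.setdefault(stratum_of(p), []).append(p)
    sample = []
    for _, members in sorted(by_stratum.items()):
        sample.extend(rng.sample(members, min(n_per_stratum, len(members))))
    return sample

# 100 points on a unit grid, stratified into a 2x2 coarse grid:
points = [(x / 10, y / 10) for x in range(10) for y in range(10)]
subset = stratified_sample(points, lambda p: (int(p[0] * 2), int(p[1] * 2)), 3)
# len(subset) == 12  (3 points from each of 4 strata)
```

In the report's terms, one would then allocate more points per stratum in regions where doing so reduces calibration error, and verify with goodness-of-fit tests that the sampling introduces no bias.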
Kourdioukova, Elena V; Verstraete, Koenraad L; Valcke, Martin
The aim of this research was to explore (1) clinical years students' perceptions about radiology case-based learning within a computer supported collaborative learning (CSCL) setting, (2) an analysis of the collaborative learning process, and (3) the learning impact of collaborative work on the radiology cases. The first part of this study focuses on a more detailed analysis of a survey study about CSCL based case-based learning, set up in the context of a broader radiology curriculum innovation. The second part centers on a qualitative and quantitative analysis of 52 online collaborative learning discussions from 5th year and nearly graduating medical students. The collaborative work was based on 26 radiology cases regarding musculoskeletal radiology. The analysis of perceptions about collaborative learning on radiology cases reflects a rather neutral attitude that also does not differ significantly in students of different grade levels. Less advanced students are more positive about CSCL as compared to last year students. Outcome evaluation shows a significantly higher level of accuracy in identification of radiology key structures and in radiology diagnosis as well as in linking the radiological signs with available clinical information in nearly graduated students. No significant differences between different grade levels were found in accuracy of using medical terminology. Students appreciate computer supported collaborative learning settings when tackling radiology case-based learning. Scripted computer supported collaborative learning groups proved to be useful for both 5th and 7th year students in view of developing components of their radiology diagnostic approaches.
EPA’s ‘virtual embryo’ project is building an integrative systems biology framework for predictive models of developmental toxicity. One schema involves a knowledge-driven adverse outcome pathway (AOP) framework utilizing information from public databases, standardized ontologies...
Kazimoglu, Cagin; Kiernan, Mary; Bacon, Liz; MacKinnon, Lachlan
This paper outlines an innovative game-based approach to learning introductory programming that is grounded in the development of computational thinking at an abstract conceptual level, but also provides a direct contextual relationship between game-play and learning traditional introductory programming. The paper proposes a possible model for,…
Baur, David G.; Edwards, Harold Carter; Cochran, William K.; Williams, Alan B.; Sjaardema, Gregory D.
The Sierra Toolkit computational mesh is a software library intended to support massively parallel multi-physics computations on dynamically changing unstructured meshes. This domain of intended use is inherently complex due to distributed memory parallelism, parallel scalability, heterogeneity of physics, heterogeneous discretization of an unstructured mesh, and runtime adaptation of the mesh. Management of this inherent complexity begins with a conceptual analysis and modeling of this domain of intended use; i.e., development of a domain model. The Sierra Toolkit computational mesh software library is designed and implemented based upon this domain model. Software developers using, maintaining, or extending the Sierra Toolkit computational mesh library must be familiar with the concepts/domain model presented in this report.
The key task in building useful e-learning repositories is to develop a system with an algorithm allowing users to retrieve information that corresponds to their specific requirements. To achieve this, products (or their verbal descriptions, i.e. as presented in metadata) need to be compared and structured according to the results of this comparison. Such structuring is crucial insofar as there are many search results that correspond to the entered keyword. The Hofmethode is an algorithm (based on psychological considerations) to compute semantic similarities between texts, and therefore offers a way to compare e-learning products. The computed similarity values are used to build semantic maps in which the products are visually arranged according to their similarities. The paper describes how the Hofmethode is implemented in the online database edulap, and how it contributes to helping the user explore the data in which he is interested.
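The Hofmethode itself is not specified in the abstract. As a generic stand-in, semantic similarity between short texts is often approximated as the cosine between term-frequency vectors; the example texts below are invented:

```python
import math
from collections import Counter

def cosine_similarity(text_a, text_b):
    """Cosine of term-frequency vectors -- a generic stand-in for
    text similarity, NOT the Hofmethode described in the paper."""
    va = Counter(text_a.lower().split())
    vb = Counter(text_b.lower().split())
    dot = sum(va[t] * vb[t] for t in va.keys() & vb.keys())
    norm = (math.sqrt(sum(c * c for c in va.values()))
            * math.sqrt(sum(c * c for c in vb.values())))
    return dot / norm if norm else 0.0

sim = cosine_similarity("introductory python programming course",
                        "advanced python programming seminar")  # → 0.5
```

Pairwise similarity values like this are exactly the kind of input a semantic map needs: products are placed so that visually close items have high mutual similarity.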
Schrøder, Thomas; Dyre, Jeppe
A brief review of experimental facts regarding ac electronic and ionic conduction in disordered solids is given followed by a discussion of what is perhaps the simplest realistic model, the random barrier model (symmetric hopping model). Results from large scale computer simulations are presented...
van Ginneken, Bram
Half a century ago, the term "computer-aided diagnosis" (CAD) was introduced in the scientific literature. Pulmonary imaging, with chest radiography and computed tomography, has always been one of the focus areas in this field. In this study, I describe how machine learning became the dominant technology for tackling CAD in the lungs, generally producing better results than do classical rule-based approaches, and how the field is now rapidly changing: in the last few years, we have seen how even better results can be obtained with deep learning. The key differences among rule-based processing, machine learning, and deep learning are summarized and illustrated for various applications of CAD in the chest.
This monograph presents the challenges, vision and context to design smart learning objects (SLOs) through Computer Science (CS) education modelling and feature model transformations. It presents the latest research on the meta-programming-based generative learning objects (the latter with advanced features are treated as SLOs) and the use of educational robots in teaching CS topics. The introduced methodology includes the overall processes to develop SLO and smart educational environment (SEE) and integrates both into the real education setting to provide teaching in CS using constructivist a
Wei, Guo; Joan, Lu
Abstract—Cloud computing infrastructure is increasingly used for distributed applications. Mobile learning applications deployed in the cloud are a new research direction. The applications require specific development approaches for effective and reliable communication. This paper proposes an interdisciplinary approach for the design and development of mobile applications in the cloud. The approach includes a front service toolkit and a backend service toolkit. The front servi...
Wahyu Dwi Kurniawan
This study aimed to develop learning media for pneumatic systems based on computer-assisted learning as an effort to improve the competence of students at the Department of Mechanical Engineering, Faculty of Engineering, UNESA. The development method followed the 4D model design of Thiagarajan, comprising the steps of define, design, develop, and disseminate. The results showed that the average expert validation score, 3.54, fell in the good category, indicating that the learning application is acceptable. A limited test showed effective results, namely: (a) analysis of the data placed learning in the good category (3.64), as indicated by students’ enthusiasm in the learning process; (b) teaching and learning activities were categorized as good, with the students actively involved in learning, and the most dominant activity being doing tasks while discussing; (c) learning objectives were achieved both individually and classically; (d) the students showed a positive response expressed by their interest, excitement, and motivation to follow the learning process.
Okopi, Fidel Onjefu; Odeyemi, Olajumoke Janet; Adesina, Adewale
The study has identified the areas of strengths and weaknesses in the current use of Computer Based Learning (CBL) tools in Open and Distance Learning (ODL) institutions in Nigeria. To achieve these objectives, the following research questions were proposed: (i) What are the computer-based learning tools (soft and hard ware) that are actually in…
This research investigates the extent to which creative processes can be fostered through computer gaming. It focuses on creative components in games that have been specifically designed for educational purposes: Digital Game Based Learning (DGBL). A behavior analysis for measuring the creative potential of computer game activities and learning outcomes is described. Creative components were measured by examining task motivation and domain-relevant and creativity-relevant skill factors. The research approach applied heuristic checklists in the field of gameplay to analyze the stage of player activities involved in the performance of the task, and to examine player experiences with the Player Experience of Need Satisfaction (PENS) survey. Player experiences were influenced by competency, autonomy, intuitive controls, relatedness and presence. This study examines the impact of these activities on the player experience for evaluating learning outcomes through school records. The study is designed to better understand the creative potential of people who are engaged in learning knowledge and skills during the course while playing video games. The findings show that game-play activities yielded levels of creative performance that support learning. The anticipated outcome is knowledge of how video games foster creative thinking, presented as an overview of the Creative Potential of Learning Model (CPLN). CPLN clearly describes the interrelationships between principles of learning and creative potential; interpretation of the results is indispensable.
Wang, Feihong; Lockee, Barbara B.; Burton, John K.
The purpose of this study was to investigate senior Chinese adults' potential acceptance of computer game-based learning (CGBL) by probing their perceptions of computer game play and their perceived impacts of game play on their learning of computer skills and life satisfaction. A total of 60 senior adults from a local senior adult learning center…
McAndrews, Gina M.; Mullen, Russell E.; Chadwick, Scott A.
Multi-media learning tools were developed to enhance student learning for an introductory agronomy course at Iowa State University. During fall 2002, the new interactive computer program, called Computer Interactive Multimedia Program for Learning Enhancement (CIMPLE) was incorporated into the teaching, learning, and assessment processes of the…
In the article, the acute problem of implementing pedagogical innovations and online technologies in the educational process is analyzed. The article explores the advantages of blended learning as a modern educational approach in comparison with traditional campus learning. Blended learning is regarded worldwide as the combination of classroom face-to-face sessions with interactive learning opportunities created online. The purpose of the article is to identify blended learning's transformational potential in impacting students and teachers by ensuring a more personalized learning experience. The concept of blended learning, as a means to enhance foreign language teaching and learning in the classroom during traditional face-to-face interaction between a teacher and a student, combined with computer-mediated activities, is examined. In the article, the main classification of blended learning models is established. There are four main blended learning models, each combining face-to-face instruction time and online learning: the Rotation Model, Flex Model, A La Carte Model, and Enriched Virtual Model. Once implemented successfully, a blended model can take advantage of both the brick-and-mortar and digital worlds, providing significant benefits for educational establishments and learners. To integrate any of the blended learning models, a teacher can create online activities that enable learners to explore the topic online at home, and then develop face-to-face interactions to dig deeper into the subject matter in the lesson. The use of blended learning models to expand educational opportunities for students during foreign language acquisition, by increasing the availability and flexibility of education, taking into account students' individual learning needs, with some element of student control over time, place and pace, is explored. The realization of blended learning models in regard to age and physiological peculiarities of
Zhou, Xuan; Dai, Genghui; Huang, Shuang; Sun, Xuemin; Hu, Feng; Hu, Hongzhi; Ivanović, Mirjana
Under the modern network environment, ubiquitous learning has been a popular way for people to study knowledge, exchange ideas, and share skills in the cyberspace. Existing research findings indicate that the learners' initiative and community cohesion play vital roles in the social communities of ubiquitous learning, and therefore how to stimulate the learners' interest and participation willingness so as to improve their enjoyable experiences in the learning process should be the primary consideration on this issue. This paper aims to explore an effective method to monitor the learners' psychological reactions based on their behavioral features in cyberspace and therefore provide useful references for adjusting the strategies in the learning process. In doing so, this paper firstly analyzes the psychological assessment of the learners' situations as well as their typical behavioral patterns and then discusses the relationship between the learners' psychological reactions and their observable features in cyberspace. Finally, this paper puts forward a CyberPsychological computation method to estimate the learners' psychological states online. Considering the diversity of learners' habitual behaviors in the reactions to their psychological changes, a BP-GA neural network is proposed for the computation based on their personalized behavioral patterns. PMID:26557846
Simen A. Sørby
In recent years, computational perspectives have become essential parts of several of the University of Oslo’s natural science studies. In this paper we discuss some main findings from a qualitative study of the computational perspectives’ impact on the students’ work with their first course in physics – mechanics – and their learning and meaning making of its contents. Discussions of the students’ learning of physics are based on sociocultural theory, which originates in Vygotsky and Bakhtin, and subsequent physics education research. Results imply that the greatest challenge for students when working with computational assignments is to combine knowledge from previously known, but separate, contexts. Integrating knowledge of informatics, numerical and analytical mathematics, and conceptual understanding of physics appears as a clear challenge for the students. We also observe a lack of awareness concerning the limitations of physical modelling. The students need help with identifying the appropriate knowledge system, or “tool set”, for the different tasks at hand; they need help to create a plan for their modelling and to become aware of its limits. In light of this, we propose that an instructive and dialogic text as the basis for the exercises, in which the emphasis is on specification, clarification and elaboration, would be of potentially great aid for students who are new to computational modelling.
Hofstede, G.J.; Jonker, C.M.; Verwaart, T.
This paper presents an approach to formalize the influence of culture on the decision functions of agents in social simulations. The key components are (a) a definition of the domain of study in the form of a decision model, (b) knowledge acquisition based on a dimensional theory of culture,
Fletcher, D.F. [Sydney Univ., NSW (Australia). Dept. of Chemical Engineering; Witt, P.J.
In the steam explosion research field there is currently considerable effort being devoted to the modelling of premixing. Practically all models are based on the multiphase flow equations which treat the mixture as an interpenetrating continuum. Solution of these equations is non-trivial and a wide range of solution procedures are in use. This paper addresses some numerical aspects of this problem. In particular, we examine the effect of the differencing scheme for the convective terms and show that use of hybrid differencing can cause qualitatively wrong solutions in some situations. Calculations are performed for the Oxford tests, the BNL tests, a MAGICO test and to investigate various sensitivities of the solution. In addition, we show that use of a staggered grid can result in a significant error which leads to poor predictions of `melt` front motion. A correction is given which leads to excellent convergence to the analytic solution. Finally, we discuss the issues facing premixing model developers and highlight the fact that model validation is hampered more by the complexity of the process than by numerical issues. (author)
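The hybrid-differencing issue the abstract raises can be illustrated with a small sketch (not taken from the paper): for 1D convection-diffusion, the central-difference east-neighbour coefficient turns negative once the cell Peclet number exceeds 2, which permits unbounded oscillatory solutions, while the hybrid scheme clamps it at zero at the cost of falling back to first-order upwinding.

```python
# Illustrative sketch (not from the paper): east-neighbour coefficients for
# the central and hybrid convection schemes in 1D convection-diffusion.
# F = rho*u is the convective flux, D = Gamma/dx the diffusion conductance,
# and Pe = F/D the cell Peclet number.

def a_east_central(F, D):
    """Central differencing: a_E = D - F/2 (negative when Pe > 2)."""
    return D - F / 2.0

def a_east_hybrid(F, D):
    """Hybrid (Spalding) scheme: a_E = max(-F, D - F/2, 0)."""
    return max(-F, D - F / 2.0, 0.0)

for Pe in (0.5, 2.0, 4.0):
    F, D = Pe * 1.0, 1.0
    print(Pe, a_east_central(F, D), a_east_hybrid(F, D))
```

At Pe = 4 the central coefficient is negative (boundedness lost), while hybrid returns 0, i.e. pure upwinding with cross-cell diffusion neglected; that first-order fallback is one way a convection scheme can yield qualitatively wrong solutions of the kind the paper examines.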
Roussel, Nicolas; Geiker, Mette Rica; Dufour, Frederic
particle flow, and numerical techniques allowing the modeling of particles suspended in a fluid. The general concept behind each family of techniques is described. Pros and cons for each technique are given along with examples and references to applications to fresh cementitious materials....
A computational approach to modeling direct metal laser sintering (DMLS) additive manufacturing process is presented. The primary application of the model is for determining the temperature history of parts fabricated using DMLS to evaluate residual stresses found in finished pieces and to assess manufacturing process strategies to reduce part slumping. The model utilizes MSC SINDA as a heat transfer solver with imbedded FORTRAN computer code to direct laser motion, apply laser heating as a boundary condition, and simulate the addition of metal powder layers during part fabrication. Model results are compared to available data collected during in situ DMLS part manufacture.
The paper analyses the Minority Game and focuses on the analysis and computational modelling of several of its variants (variable payoff, coalition-based, and ternary voting) using the UAREI (User-Action-Rule-Entities-Interface) model. UAREI is a model for formal specification of software gamification, and the UAREI visual modelling language is used for graphical representation of game mechanics. The UAREI model also provides an embedded executable modelling framework to evaluate how the rules of the game will work for the players in practice. We demonstrate the flexibility of the UAREI model for modelling different variants of Minority Game rules for game design.
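For readers unfamiliar with the underlying game, here is a minimal sketch of the standard Minority Game (the baseline only, not the UAREI-modelled variants); the agent count, memory length, and scoring below are illustrative assumptions:

```python
import random

def play_minority_game(n_agents=101, memory=3, n_strategies=2,
                       rounds=200, seed=0):
    """Standard Minority Game: an odd number of agents repeatedly choose
    side 0 or 1; those on the minority side win. Each agent holds a few
    fixed random strategies (lookup tables from the last `memory`
    outcomes to a choice) and plays the one with the best virtual score."""
    rng = random.Random(seed)
    n_hist = 2 ** memory
    # strategies[a][s] is a lookup table: history index -> choice
    strategies = [[[rng.randrange(2) for _ in range(n_hist)]
                   for _ in range(n_strategies)] for _ in range(n_agents)]
    scores = [[0] * n_strategies for _ in range(n_agents)]
    history = rng.randrange(n_hist)
    attendance = []  # number of agents choosing side 1 each round
    for _ in range(rounds):
        choices = []
        for a in range(n_agents):
            best = max(range(n_strategies), key=lambda s: scores[a][s])
            choices.append(strategies[a][best][history])
        ones = sum(choices)
        minority = 1 if ones < n_agents - ones else 0
        # virtual scoring: every strategy that would have won gains a point
        for a in range(n_agents):
            for s in range(n_strategies):
                if strategies[a][s][history] == minority:
                    scores[a][s] += 1
        history = ((history << 1) | minority) % n_hist
        attendance.append(ones)
    return attendance
```

The variants the paper models would change the payoff rule, add coalitions, or allow a third voting option at the marked scoring step.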
Sarirete, Akila; Noble, Elizabeth; Chikh, Azeddine
Contemporary students are characterized by having very applied learning styles and methods of acquiring knowledge. This behavior is consistent with the constructivist models where students are co-partners in the learning process. In the present work the authors developed a new model of learning based on the constructivist theory coupled with the cognitive development theory of Piaget. The model considers the level of learning based on several stages and the move from one stage to another requires learners' challenge. At each time a new concept is introduced creates a disequilibrium that needs to be worked out to return back to its equilibrium stage. This process of "disequilibrium/equilibrium" has been analyzed and validated using a course in computer networking as part of Cisco Networking Academy Program at Effat College, a women college in Saudi Arabia. The model provides a theoretical foundation for teaching especially in a complex knowledge domain such as engineering and can be used in a knowledge economy.
Zaikin, Oleg; Malinowska, Magdalena; Kofoed, Lise B.
The article presents the concept of a motivation model aimed at supporting the activity of both students and teachers in the process of implementing and using an open and distance learning system. The proposed motivation model is focused on the task of filling the knowledge repository with high-quality didactic material. An open and distance learning system provides a computer space for the teaching/learning process in an open environment. The structure of the motivation model and its formal assumptions are described. Additionally, a structure is presented for the linguistic database that helps the teacher assess student motivation, together with a basic simulation model for analyzing the constraints of the teaching/learning process. The proposed approach is based on game theory and simulation.
Aslin, Richard N; Newport, Elissa L
In the past 15 years, a substantial body of evidence has confirmed that a powerful distributional learning mechanism is present in infants, children, adults and (at least to some degree) in nonhuman animals as well. The present article briefly reviews this literature and then examines some of the fundamental questions that must be addressed for any distributional learning mechanism to operate effectively within the linguistic domain. In particular, how does a naive learner determine the number of categories that are present in a corpus of linguistic input and what distributional cues enable the learner to assign individual lexical items to those categories? Contrary to the hypothesis that distributional learning and category (or rule) learning are separate mechanisms, the present article argues that these two seemingly different processes---acquiring specific structure from linguistic input and generalizing beyond that input to novel exemplars---actually represent a single mechanism. Evidence in support of this single-mechanism hypothesis comes from a series of artificial grammar-learning studies that not only demonstrate that adults can learn grammatical categories from distributional information alone, but that the specific patterning of distributional information among attested utterances in the learning corpus enables adults to generalize to novel utterances or to restrict generalization when unattested utterances are consistently absent from the learning corpus. Finally, a computational model of distributional learning that accounts for the presence or absence of generalization is reviewed and the implications of this model for linguistic-category learning are summarized.
Hsu, Ching-Kun; Hwang, Gwo-Jen
Personal computer assembly courses have been recognized as being essential in helping students understand computer structure as well as the functionality of each computer component. In this study, a context-aware ubiquitous learning approach is proposed for providing instant assistance to individual students in the learning activity of a…
Kilfoil, W. R.
This article looks at the way in which people perceive learning and the impact of these perceptions on teaching methods within the context of learning development in distance education. The context could, in fact, be any type of teaching and learning environment. The point is to balance approaches to teaching and learning depending on student…
Our nervous system continuously combines new information from our senses with information it has acquired throughout life. Numerous studies have found that human subjects manage this by integrating their observations with their previous experience (priors in a way that is close to the statistical optimum. However, little is known about the way the nervous system acquires or learns priors. Here we present results from experiments where the underlying distribution of target locations in an estimation task was switched, manipulating the prior subjects should use. Our experimental design allowed us to measure a subject's evolving prior while they learned. We confirm that through extensive practice subjects learn the correct prior for the task. We found that subjects can rapidly learn the mean of a new prior while the variance is learned more slowly and with a variable learning rate. In addition, we found that a Bayesian inference model could predict the time course of the observed learning while offering an intuitive explanation for the findings. The evidence suggests the nervous system continuously updates its priors to enable efficient behavior.
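The kind of prior acquisition described above can be sketched with a toy Gaussian model (an illustrative assumption, not the authors' model): the optimal estimate precision-weights the learned prior against the current noisy observation, while the prior's mean and variance are themselves estimated online from past targets.

```python
def posterior_estimate(obs, prior_mean, prior_var, noise_var):
    """Bayes-optimal location estimate under a Gaussian prior and
    Gaussian observation noise: a precision-weighted blend of the two."""
    w = prior_var / (prior_var + noise_var)  # weight on the observation
    return prior_mean + w * (obs - prior_mean)

def learn_prior(samples):
    """Online estimate of the prior's mean and variance from observed
    targets (Welford's algorithm) -- a simple stand-in for gradual prior
    acquisition: the running mean tracks quickly, while the variance
    estimate needs more samples to stabilize."""
    mean, m2 = 0.0, 0.0
    for n, x in enumerate(samples, start=1):
        delta = x - mean
        mean += delta / n
        m2 += delta * (x - mean)
    var = m2 / (len(samples) - 1) if len(samples) > 1 else 0.0
    return mean, var
```

With equal prior and noise variances the estimate lands halfway between prior mean and observation; a tighter learned prior pulls estimates toward it, which is the signature behaviour such experiments measure.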
Krijnen, Thomas; Tamke, Martin
architects and engineers are able to deduce non-explicitly stated information, which is often the core of the transported architectural information. This paper investigates how machine learning approaches allow a computational system to deduce implicit knowledge from a set of BIM models....
Huda, C.; Hudha, M. N.; Ain, N.; Nandiyanto, A. B. D.; Abdullah, A. G.; Widiaty, I.
A computer programming course is largely theoretical, and sufficient practice is necessary to facilitate conceptual understanding and encourage creativity in designing computer programs/animations. The development of tutorial videos within Android-based blended learning is needed to guide students. Using Android-based instructional material, students can learn independently anywhere and anytime. The tutorial videos can facilitate students' understanding of the concepts, materials, and procedures of programming/animation making in detail. This study employed a Research and Development method adapting Thiagarajan's 4D model. The developed Android-based instructional material and tutorial videos were validated by experts in instructional media and experts in physics education. The expert validation showed that the Android-based material was comprehensive and very feasible, and the tutorial videos were deemed feasible with an average score of 92.9%. It was also revealed that students' conceptual understanding, skills, and creativity in designing computer programs/animations improved significantly.
Varner, Victor D; Nelson, Celeste M
The bronchial network of the mammalian lung consists of millions of dichotomous branches arranged in a highly complex, space-filling tree. Recent computational models of branching morphogenesis in the lung have helped uncover the biological mechanisms that construct this ramified architecture. In this review, we focus on three different theoretical approaches - geometric modeling, reaction-diffusion modeling, and continuum mechanical modeling - and discuss how, taken together, these models have identified the geometric principles necessary to build an efficient bronchial network, as well as the patterning mechanisms that specify airway geometry in the developing embryo. We emphasize models that are integrated with biological experiments and suggest how recent progress in computational modeling has advanced our understanding of airway branching morphogenesis. Copyright © 2016 Elsevier Ltd. All rights reserved.
Simonovski, Igor; Cizelj, Leon
A novel computational approach for simulation of intergranular cracks in a polycrystalline aggregate is proposed in this paper. The computational model includes a topological model of the experimentally determined microstructure of a 400 μm diameter stainless steel wire and automatic finite element discretization of the grains and grain boundaries. The microstructure was spatially characterized by X-ray diffraction contrast tomography and contains 362 grains and some 1600 grain boundaries. Available constitutive models currently include isotropic elasticity for the grain interior and cohesive behavior with damage for the grain boundaries. The experimentally determined lattice orientations are employed to distinguish between resistant low energy and susceptible high energy grain boundaries in the model. The feasibility and performance of the proposed computational approach is demonstrated by simulating the onset and propagation of intergranular cracking. The preliminary numerical results are outlined and discussed.
Obrenovic, Z.; Starcevic, D.
Incorporating the well-known Unified Modeling Language into a generic modeling framework makes research on multimodal human-computer interaction accessible to a wide range of software engineers. Multimodal interaction is part of everyday human discourse: We speak, move, gesture, and shift our gaze
McDowell, J. J.
Darwinian selection by consequences was instantiated in a computational model that consisted of a repertoire of behaviors undergoing selection, reproduction, and mutation over many generations. The model in effect created a digital organism that emitted behavior continuously. The behavior of this digital organism was studied in three series of…
Many serious games include computational models that simulate dynamic systems. These models promote enhanced interaction and responsiveness. Under the social web paradigm more and more usable game authoring tools become available that enable prosumers to create their own games, but the inclusion of
Anshari, Muhammad; Alas, Yabit; Guan, Lim Sei
Utilizing online learning resources (OLR) from multi channels in learning activities promise extended benefits from traditional based learning-centred to a collaborative based learning-centred that emphasises pervasive learning anywhere and anytime. While compiling big data, cloud computing, and semantic web into OLR offer a broader spectrum of…
Bravo, C.; van Joolingen, W.R.; de Jong, T.
Inquiry learning is a didactic approach in which students acquire knowledge and skills through processes of theory building and experimentation. Computer modeling and simulation can play a prominent role within this approach. Students construct representations of physical systems using modeling.
With their realistic animation, complex scenarios and impressive interactivity, computer simulation games might be able to provide context-rich, cognitively engaging virtual environments for language learning. However, simulation games designed for L2 learners are in short supply. As an alternative, could games designed for the mass-market be…
Melendez Rodriguez, J.C.; Ginneken, B. van; Maduskar, P.; Philipsen, R.H.H.M.; Ayles, H.; Sanchez, C.I.
The major advantage of multiple-instance learning (MIL) applied to a computer-aided detection (CAD) system is that it allows optimizing the latter with case-level labels instead of accurate lesion outlines as traditionally required for a supervised approach. As shown in previous work, a MIL-based
action block in a flowchart of the system being modeled. For instance, the process of capturing a facility for some length of time and then...because of the abundance of tutorial material available; whereas, far less complete tutorial material is available to beginners learning SIMSCRIPT
Universities around the world, especially those in developing countries, are faced with the problem of delivering the level of information and communications technology (ICT) needed to facilitate the teaching, learning, research, and development activities typical of a university, in line with advances in technology and the growing dependence on IT. This is mainly due to the high cost of providing and maintaining the needed hardware and software. A technology such as cloud computing, which delivers on-demand provisioning of IT resources on a pay-per-use basis, can be used to address this problem. Cloud computing promises better delivery of IT services as well as availability whenever and wherever needed, at reduced costs, with users paying only for what they consume through the services of cloud providers. The cloud technology reduces complexity while increasing the speed and quality of IT services; however, despite these benefits, the challenges that come with its adoption have left many sectors, especially higher education, skeptical about committing to it. This article identifies the reasons for the slow rate of adoption of cloud computing at university level, discusses the challenges faced, and proposes a cloud computing adoption model that contains strategic guidelines to overcome the major challenges identified and a roadmap for successful adoption of cloud computing by universities. The model was tested in one university and found to be both useful and appropriate for adopting cloud computing at university level.
Ahmadpanah, Seyed Hossein
In the cloud computing environment, the growing number of cloud virtual machines (VMs) poses ever greater security and management challenges. In order to address the security issues of virtualized cloud environments, this paper presents an efficient VM security management model based on dynamic deployment, state migration, and scheduling, and studies the corresponding virtual machine security architecture based on AHP (Analytic Hierarchy Process) virtual machine de...
Douglas, S.R.; Chaplin, K.R.
The computer program EWE simulates the propagation of elastic waves in solids and liquids. It has been applied to ultrasonic testing to study the echoes generated by cracks and other types of defects. A discussion of the elastic wave equations is given, including the first-order formulation, shear and compression waves, surface waves and boundaries, numerical method of solution, models for cracks and slot defects, input wave generation, returning echo construction, and general computer issues
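Since EWE itself is not reproduced here, a toy stand-in can convey the numerical idea: an explicit leapfrog finite-difference scheme for the 1D scalar wave equation with fixed ends, the scalar analogue of the compression-wave part of the elastic equations (grid sizes and the initial pulse are illustrative assumptions).

```python
def simulate_wave(n=200, steps=300, c=1.0, dx=1.0, dt=0.5):
    """Explicit leapfrog scheme for u_tt = c^2 * u_xx on a 1D grid with
    clamped (u = 0) ends -- a toy analogue of compression-wave
    propagation in a code like EWE. Stable for c*dt/dx <= 1 (CFL)."""
    r2 = (c * dt / dx) ** 2
    u_prev = [0.0] * n
    u = [0.0] * n
    # initial condition: a narrow displacement pulse near the left end,
    # with zero initial velocity (u_prev = u at t = 0)
    for i in range(5, 15):
        u[i] = u_prev[i] = 1.0
    for _ in range(steps):
        u_next = [0.0] * n
        for i in range(1, n - 1):
            u_next[i] = (2 * u[i] - u_prev[i]
                         + r2 * (u[i + 1] - 2 * u[i] + u[i - 1]))
        u_prev, u = u, u_next  # ends stay clamped at zero
    return u
```

Clamped ends reflect the pulse back, which is the mechanism by which echoes from cracks and slots are constructed in the full elastic code.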
Greenberg, D P
During the past 20 years, computer graphic techniques for simulating the reflection of light have progressed so that today images of photorealistic quality can be produced. Early algorithms considered direct lighting only, but global illumination phenomena with indirect lighting, surface interreflections, and shadows can now be modeled with ray tracing, radiosity, and Monte Carlo simulations. This article describes the historical development of computer graphic algorithms for light reflection and pictorially illustrates what will be commonly available in the near future.
Langtangen, Hans Petter
This text provides a very simple, initial introduction to the complete scientific computing pipeline: models, discretization, algorithms, programming, verification, and visualization. The pedagogical strategy is to use one case study – an ordinary differential equation describing exponential decay processes – to illustrate fundamental concepts in mathematics and computer science. The book is easy to read and only requires a command of one-variable calculus and some very basic knowledge of computer programming. Contrary to similar texts on numerical methods and programming, this text has a much stronger focus on implementation and teaches testing and software engineering in particular.
Hammond, John S., III
Concentrates on the managerial issues involved in computer planning models. Describes what computer planning models are and the process by which managers can increase the likelihood of computer planning models being successful in their organizations. (Author/DN)
Gasparinatou, Alexandra; Grigoriadou, Maria
Previous studies have shown that students with low knowledge understand and learn better from more cohesive texts, whereas high-knowledge students have been shown to learn better from texts of lower cohesion. This study examines whether high-knowledge readers in computer science benefit from a text of low cohesion. Undergraduate students (n = 65) read one of four versions of a text concerning Local Network Topologies, orthogonally varying local and global cohesion. Participants' comprehension was examined through free-recall measure, text-based, bridging-inference, elaborative-inference, problem-solving questions and a sorting task. The results indicated that high-knowledge readers benefited from the low-cohesion text. The interaction of text cohesion and knowledge was reliable for the sorting activity, for elaborative-inference and for problem-solving questions. Although high-knowledge readers performed better in text-based and in bridging-inference questions with the low-cohesion text, the interaction of text cohesion and knowledge was not reliable. The results suggest a more complex view of when and for whom textual cohesion affects comprehension and consequently learning in computer science.
This book is intended for use in advanced graduate courses in statistics / machine learning, as well as for all experimental neuroscientists seeking to understand statistical methods at a deeper level, and theoretical neuroscientists with a limited background in statistics. It reviews almost all areas of applied statistics, from basic statistical estimation and test theory, linear and nonlinear approaches for regression and classification, to model selection and methods for dimensionality reduction, density estimation and unsupervised clustering. Its focus, however, is linear and nonlinear time series analysis from a dynamical systems perspective, based on which it aims to convey an understanding also of the dynamical mechanisms that could have generated observed time series. Further, it integrates computational modeling of behavioral and neural dynamics with statistical estimation and hypothesis testing. This way computational models in neuroscience are not only explanatory frameworks, but become powerfu...
Mao, Hua; Chen, Yingke; Jaeger, Manfred
Constructing an accurate system model for formal model verification can be both resource demanding and time-consuming. To alleviate this shortcoming, algorithms have been proposed for automatically learning system models based on observed system behaviors. In this paper we extend the algorithm on learning probabilistic automata to reactive systems, where the observed system behavior is in the form of alternating sequences of inputs and outputs. We propose an algorithm for automatically learning a deterministic labeled Markov decision process model from the observed behavior of a reactive system. The proposed learning algorithm is adapted from algorithms for learning deterministic probabilistic finite automata, and extended to include both probabilistic and nondeterministic transitions. The algorithm is empirically analyzed and evaluated by learning system models of slot machines. The evaluation...
Phadung, Muneeroh; Wani, Najela; Tongmnee, Nur-aiynee
Educators need to provide alternative educational tools to foster students' learning outcomes. Using AR technology to create exciting edutainment experiences, this paper presents how augmented reality (AR) can be applied in education. This study aims to develop an AR book for tenth grade students (age 15-16) and evaluate its quality. The AR book was developed following the ADDIE framework to support learning about computer software. The content accorded with the current Thai education curriculum. The AR book had 10 pages covering three topics (the first was "Introduction," the second was "System Software," and the third was "Application Software"). Each page contained markers that placed virtual objects (2D animation and video clips). The obtained data were analyzed in terms of averages and standard deviations. The validity of the AR book's multimedia design was assessed by three experts in multimedia design using a five-point Likert scale, giving x̄ = 4.84, S.D. = 1.27, which was interpreted as very high. Moreover, three content experts, who specialize in computer teaching, evaluated the AR book's content validity; their ratings were x̄ = 4.69, S.D. = 0.29, also interpreted as very high. Implications for future study and education are discussed.
Zadahmad, Manouchehr; Yousefzadehfard, Parisa
Mobile Cloud Computing (MCC) aims to improve all mobile applications such as m-learning systems. This study presents an innovative method to use web technology and software engineering's best practices to provide m-learning functionalities hosted in a MCC-learning system as service. Components hosted by MCC are used to empower developers to create…
Tsai, Chia-Wen; Shen, Pei-Di; Tsai, Meng-Chuan; Chen, Wen-Yu
Much application software education in Taiwan can hardly be regarded as practical. The researchers in this study provided a flexible means of ubiquitous learning (u-learning) with a mobile app for students to access the learning material. In addition, the authors also adopted computational thinking (CT) to help students develop practical computing…
Purushotham, Sanjay; Meng, Chuizheng; Che, Zhengping; Liu, Yan
Deep learning models (aka deep neural networks) have revolutionized many fields including computer vision, natural language processing, and speech recognition, and are increasingly being used in clinical healthcare applications. However, few works have benchmarked the performance of deep learning models against state-of-the-art machine learning models and prognostic scoring systems on publicly available healthcare datasets. In this paper, we present benchmarking results for several clinical prediction tasks such as mortality prediction, length-of-stay prediction, and ICD-9 code group prediction using deep learning models, an ensemble of machine learning models (the Super Learner algorithm), and the SAPS II and SOFA scores. We used the publicly available Medical Information Mart for Intensive Care III (MIMIC-III) (v1.4) dataset, which includes all patients admitted to an ICU at the Beth Israel Deaconess Medical Center from 2001 to 2012, for the benchmarking tasks. Our results show that deep learning models consistently outperform all the other approaches, especially when the 'raw' clinical time series data are used as input features to the models. Copyright © 2018 Elsevier Inc. All rights reserved.
Markauskaite, Lina; Kelly, Nick; Jacobson, Michael J.
This paper gives a grounded cognition account of model-based learning of complex scientific knowledge related to socio-scientific issues, such as climate change. It draws on the results from a study of high school students learning about the carbon cycle through computational agent-based models and investigates two questions: First, how do students ground their understanding about the phenomenon when they learn and solve problems with computer models? Second, what are common sources of mistakes in students' reasoning with computer models? Results show that students ground their understanding in computer models in five ways: direct observation, straight abstraction, generalisation, conceptualisation, and extension. Students also incorporate into their reasoning their knowledge and experiences that extend beyond phenomena represented in the models, such as attitudes about unsustainable carbon emission rates, human agency, external events, and the nature of computational models. The most common difficulties of the students relate to seeing the modelled scientific phenomenon and connecting results from the observations with other experiences and understandings about the phenomenon in the outside world. An important contribution of this study is the constructed coding scheme for establishing different ways of grounding, which helps to understand some challenges that students encounter when they learn about complex phenomena with agent-based computer models.
Three adaptive single-neuron models based on neural analogies of behavior modification episodes are proposed, which attempt to bridge the gap between psychology and neurophysiology. The proposed models capture the predictive nature of Pavlovian conditioning, which is essential to the theory of adaptive/learning systems. The models learn to anticipate the occurrence of a conditioned response before the presence of a reinforcing stimulus when training is complete. Furthermore, each model can find the most nonredundant and earliest predictor of reinforcement. The behavior of the models accounts for several aspects of basic animal learning phenomena in Pavlovian conditioning beyond previous related models. Computer simulations show how well the models fit empirical data from various animal learning paradigms.
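A close relative of such models is the classic Rescorla-Wagner delta rule; the sketch below is an illustration of that rule, not the proposed single-neuron models, and shows how a shared prediction error lets a pretrained cue block learning about a redundant added cue, i.e. the model favours the least redundant predictor of reinforcement.

```python
def rescorla_wagner(trials, alpha=0.3, n_cues=2):
    """Delta-rule (Rescorla-Wagner) updating: on each trial, every
    present cue's associative strength moves toward the reward by a
    fraction alpha of the shared prediction error."""
    v = [0.0] * n_cues
    for present, reward in trials:
        error = reward - sum(v[i] for i in present)  # shared error
        for i in present:
            v[i] += alpha * error
    return v

# Blocking demo: cue 0 is trained alone first, then cues 0 and 1 together.
trials = [((0,), 1.0)] * 50 + [((0, 1), 1.0)] * 50
strengths = rescorla_wagner(trials)
```

Because cue 0 already predicts the reward by the time the compound trials begin, the prediction error is near zero and cue 1 acquires almost no associative strength.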
Gorsev, Gonca; Turkmen, Ugur; Askin, Cihat
In today's world, in order to obtain the information in education, various approaches, methods and devices have been developed. Like many developing countries, e-learning and distance learning (internet based learning) are used today in many areas of education in Turkey. This research aims to contribute to education systems and develop a…
Learning a Bayesian network from a numeric data set is a challenging task because of the dual nature of the learning process: the need first to learn the network structure, and then to find the probability distribution tables. In this paper, we propose a machine-learning algorithm based on hill climbing search combined with a Tabu list. The aim of the learning process is to discover the network that best represents the dependences between nodes. Another issue in the machine learning procedure is handling numeric attributes, which requires an attribute discretization pre-process. This discretization operation can influence the results of learning the network structure. Therefore, we conduct a comparative study to find the most suitable combination of discretization method and learning algorithm for a specific data set.
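The search strategy can be sketched generically (the paper's actual Bayesian-network scoring function and move set are not reproduced here; bit-flips over a vector of candidate edges stand in for edge additions and removals, and `score` stands in for a network-quality score such as BIC):

```python
from collections import deque

def tabu_hill_climb(score, n_bits, tabu_len=3, iters=50):
    """Greedy hill climbing over bit vectors with a Tabu list of
    recently flipped positions, which blocks immediate move reversal
    and so helps the search escape local optima."""
    state = [0] * n_bits
    best, best_score = state[:], score(state)
    tabu = deque(maxlen=tabu_len)  # most recently flipped positions
    for _ in range(iters):
        moves = [i for i in range(n_bits) if i not in tabu]
        if not moves:
            break
        def flipped(i, s=state):
            t = s[:]
            t[i] ^= 1
            return t
        i = max(moves, key=lambda j: score(flipped(j)))  # best admissible flip
        state = flipped(i)
        tabu.append(i)  # forbid re-flipping this bit for a while
        s = score(state)
        if s > best_score:
            best, best_score = state[:], s
    return best, best_score
```

Note the best-so-far state is tracked separately, since Tabu search deliberately accepts worsening moves when no improving admissible move exists.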
Seal, Kala Chand; Przasnyski, Zbigniew H.; Leon, Linda A.
Do students learn to model OR/MS problems better by using computer-based interactive tutorials and, if so, does increased interactivity in the tutorials lead to better learning? In order to determine the effect of different levels of interactivity on student learning, we used screen capture technology to design interactive support materials for…
Hromadka, J; Ibehej, T; Hrach, R
Our contribution is devoted to development of hybrid modelling techniques. We investigate sheath structures in the vicinity of solids immersed in low temperature argon plasma of different pressures by means of particle and fluid computer models. We discuss the differences in results obtained by these methods and try to propose a way to improve the results of fluid models in the low pressure area. There is a possibility to employ Chapman-Enskog method to find appropriate closure relations of fluid equations in a case when particle distribution function is not Maxwellian. We try to follow this way to enhance fluid model and to use it in hybrid plasma model further. (paper)
Ramadiani; Rodziah, A.; Hasan, S. M.; Rusli, A.; Noraini, C.
E-learning is not going to work if the system is not used in accordance with user needs. User Interface is very important to encourage using the application. Many theories had discuss about user interface usability evaluation and technology acceptance separately, actually why we do not make it correlation between interface usability evaluation and user acceptance to enhance e-learning process. Therefore, the evaluation model for e-learning interface acceptance is considered important to investigate. The aim of this study is to propose the integrated e-learning user interface acceptance evaluation model. This model was combined some theories of e-learning interface measurement such as, user learning style, usability evaluation, and the user benefit. We formulated in constructive questionnaires which were shared at 125 English Language School (ELS) students. This research statistics used Structural Equation Model using LISREL v8.80 and MANOVA analysis.
This book collects the state of the art and new trends in image analysis and biomechanics. It covers a wide field of scientific and cultural topics, ranging from remodeling of bone tissue under mechanical stimulus up to optimizing the performance of sports equipment, through patient-specific modeling in orthopedics, microtomography and its application in oral and implant research, computational modeling in the field of hip prostheses, image-based model development and analysis of the human knee joint, kinematics of the hip joint, micro-scale analysis of compositional and mechanical properties of dentin, automated techniques for cervical cell image analysis, and biomedical imaging and computational modeling in cardiovascular disease. The book will be of interest to researchers, Ph.D students, and graduate students with multidisciplinary interests related to image analysis and understanding, medical imaging, biomechanics, simulation and modeling, and experimental analysis.
Rodríguez Vega, Martín.
Computational algebraic geometry is applied to the analysis of various epidemic models for schistosomiasis and dengue, both for the case without control measures and for the case where control measures are applied. The models were analyzed using the mathematical software Maple. Explicitly, the analysis is performed using Groebner bases, Hilbert dimension, and Hilbert polynomials; these computational tools are built into Maple. Each model is represented by a system of ordinary differential equations, and for each model the basic reproductive number (R0) is calculated. The effects of the control measures are observed through changes in the algebraic structure of R0, in the Groebner basis, in the Hilbert dimension, and in the Hilbert polynomial. It is hoped that these results will prove useful for designing control measures against the epidemic diseases described. For future research, the use of algebraic epidemiology to analyze models for airborne and waterborne diseases is proposed.
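As a rough, hypothetical illustration of the kind of computation involved (not one of the paper's actual schistosomiasis or dengue models), the steady-state ideal of a toy SIS-type model can be handed to a Groebner-basis routine; SymPy is used here in place of Maple, and all model equations below are illustrative assumptions.

```python
from sympy import symbols, groebner

# Hypothetical toy model (NOT one of the paper's models): steady state of a
# normalized SIS-type system
#   dS/dt = mu - beta*S*I + gamma*I - mu*S
#   dI/dt = beta*S*I - gamma*I - mu*I
S, I, beta, gamma, mu = symbols('S I beta gamma mu')
steady_state = [mu - beta*S*I + gamma*I - mu*S,
                beta*S*I - gamma*I - mu*I]

# A lexicographic Groebner basis of the steady-state ideal exposes the
# equilibria: the disease-free state (S=1, I=0) and the endemic branch,
# whose existence hinges on R0 = beta/(gamma + mu).
G = groebner(steady_state, S, I, beta, gamma, mu, order='lex')
for g in G.exprs:
    print(g)
```

The same workflow (generators of the steady-state ideal, then a basis in a chosen monomial order) is what Maple's `Groebner` package automates.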
Nicoulin, C.V.; Jacobs, P.C.; Tory, S.
The use of computer models to simulate the energy performance of large commercial refrigeration systems, typically found in food processing facilities, is an area of engineering practice that has seen little development to date. Current techniques for predicting energy consumption by such systems have focused on temperature-bin methods of analysis. Existing simulation tools such as DOE2 are designed to model commercial buildings and grocery store refrigeration systems; their HVAC and refrigeration performance models cover equipment common to commercial buildings and groceries, and respond to energy-efficiency measures likely to be applied to those building types. The applicability of traditional building energy simulation tools to modeling refrigerated warehouse performance and analyzing energy-saving options is therefore limited. This paper presents the results of modeling work undertaken to evaluate energy savings resulting from incentives offered by a California utility to its Refrigerated Warehouse Program participants. The TRNSYS general-purpose transient simulation model was used to predict facility performance and estimate program savings. Custom TRNSYS components were developed to address modeling issues specific to refrigerated warehouse systems, including warehouse loading door infiltration calculations, an evaporator model, single-stage and multi-stage compressor models, evaporative condenser models, and defrost energy requirements. The main focus of the paper is the modeling approach. The results from the computer simulations, along with overall program impact evaluation results, are also presented.
Taras P. Kobylnyk
The article describes the general characteristics of the most popular computer mathematics systems, both commercial (Maple, Mathematica, Matlab) and open source (Scilab, Maxima, GRAN, Sage), as well as the conditions for using these systems as a means of fundamentalizing the educational process for bachelor of informatics students. The role of computer mathematics systems (CMS) in bachelor of informatics training is considered, and approaches to the pedagogical use of CMS in learning informatics and physics-and-mathematics disciplines are identified. Some tasks are presented in which the «responses» received from a CMS must be used with care. Promising directions for the development of computer mathematics systems in a high-tech environment are also identified.
This paper provides an overview and a critical analysis of the technological learning concept and its incorporation in energy-environment-economy models. Special emphasis is placed on surveying and discussing, through the so-called learning curve, both studies estimating learning rates in the energy field and studies incorporating endogenous technological learning in bottom-up and top-down models. The survey of learning rate estimations gives special attention to interpreting and explaining the sources of variability in estimated rates, which is shown to stem mainly from R&D expenditures, omitted variable bias, the endogeneity relationship, and the role of spillovers. The survey of large-scale models shows that, despite some methodological and computational complexity related to the non-linearity and non-convexity introduced by incorporating the learning curve, the numerous modelling experiments yield several new insights: regarding the prospects of specific technological options and their cost-decrease potential (bottom-up models), and regarding strategic considerations inherent in the innovation and energy diffusion process, in particular the energy sector's endogenous responses to environmental policy instruments (top-down models).
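As a minimal sketch of the learning curve the survey is built around: the single-factor form C(x) = C0·x^(−b) ties the learning exponent b to the learning rate LR = 1 − 2^(−b), the fractional cost reduction per doubling of cumulative capacity. The function names and the 20% figure below are illustrative, not values from the paper.

```python
import math

def unit_cost(c0, x, b):
    """Single-factor learning curve: unit cost after cumulative production x."""
    return c0 * x ** (-b)

def learning_rate(b):
    """Fractional cost reduction per doubling of cumulative capacity."""
    return 1.0 - 2.0 ** (-b)

# Illustrative numbers: a 20% learning rate corresponds to a learning
# exponent b = -log2(0.8) ~ 0.322.
b = -math.log2(0.80)
print(round(learning_rate(b), 2))            # → 0.2
print(round(unit_cost(100.0, 2.0, b), 6))    # → 80.0  (one doubling: 20% off)
```

Estimating b from observed cost and deployment data is exactly where the omitted-variable and endogeneity issues the survey discusses arise.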
Kon, Haruka; Kobayashi, Hiroshi; Sakurai, Naoki; Watanabe, Kiyoshi; Yamaga, Yoshiro; Ono, Takahiro
The aim of the present study was to clarify differences between personal computer (PC)/mobile device combination and PC-only user patterns. We analyzed access frequency and time spent on a complete denture preclinical website in order to maximize website effectiveness. Fourth-year undergraduate students (N=41) in the preclinical complete denture laboratory course were invited to participate in this survey during the final week of the course to track login data. Students accessed video demonstrations and quizzes via our e-learning site/course program, and were instructed to view online demonstrations before classes. When the course concluded, participating students filled out a questionnaire about the program, their opinions, and devices they had used to access the site. Combination user access was significantly more frequent than PC-only during supplementary learning time, indicating that students with mobile devices studied during lunch breaks and before morning classes. Most students had favorable opinions of the e-learning site, but a few combination users commented that some videos were too long and that descriptive answers were difficult on smartphones. These results imply that mobile devices' increased accessibility encouraged learning by enabling more efficient time use between classes. They also suggest that e-learning system improvements should cater to mobile device users by reducing video length and including more short-answer questions. © 2016 John Wiley & Sons Australia, Ltd.
Watanabe, Takashi; Sugi, Yoshihiro
A feedforward controller would be useful for hybrid Functional Electrical Stimulation (FES) systems using powered orthotic devices. In this paper, a Feedback Error Learning (FEL) controller for FES (FEL-FES controller) was examined using an inverse statics model (ISM) together with an inverse dynamics model (IDM) to realize a feedforward FES controller. For the FES application, the ISM was trained offline using data obtained by PID control of very slow movements. Computer simulation tests ...
Teaching projects which make use of new technology are becoming of interest to all academic institutions in the UK due to economic pressure to increase student numbers. CMC (Computer-Mediated Communication), such as computer conferencing, appears an attractive solution to higher education's 'numbers' problem, with the added benefit that it is free from time and place constraints. Researchers have discussed CMC from a number of different perspectives: for example, Mason and Kaye (1989) describe a CMC system as a system of interactivity between tutors, students, resources, and organizational structure. Steeples et al (1993) consider CMC in terms of group cohesion, modes of discourse, and intervention strategies to stimulate and structure participation. Goodyear et al (1994) discuss the Just in Time Open Learning (JITOL) model in terms of a set of educational beliefs, role definitions, working methods, and learning resources, together with a definition of infrastructure requirements for CMC. Shedletsky (1993) suggests that CMC should be viewed in terms of an 'intrapersonal communication' model, while Mayes et al (1994) identify three types of learning mediated by telematics: learning by conceptualization, construction, and dialogue. Other researchers, such as Velayo (1994), describe the teacher as 'an active agent' and present a model for computer conferencing which neglects the social aspect of CMC, while Berge (1995) mentions the importance of social activity between students and the role of the moderator. From these accounts, there appear to be a number of dimensions which can be used to evaluate CMC. Not all researchers emphasize the same dimensions; however, this paper proposes that computer conferencing systems should be designed to encourage students to participate in all three of the following dimensions. These can be summarized as: (a) a knowledge dimension (including domain and meta-knowledge); (b) a social
This work discusses the development of information technology service management using a cloud computing approach to improve the performance of the administration system and online learning at STMIK IBBI Medan, Indonesia. The network topology is modeled and simulated for system administration and online learning. The same network topology is developed in cloud computing using the Amazon AWS architecture. The model is designed and modeled using Riverbed Academic Edition Modeler to obtain values of the parameters delay, load, CPU utilization, and throughput. The simulation results are as follows. For network topology 1 without cloud computing, the average delay is 54 ms, load 110 000 bits/s, CPU utilization 1.1%, and throughput 440 bits/s. With cloud computing, the average delay is 45 ms, load 2 800 bits/s, CPU utilization 0.03%, and throughput 540 bits/s. For network topology 2 without cloud computing, the average delay is 39 ms, load 3 500 bits/s, CPU utilization 0.02%, and database server throughput 1 400 bits/s. With cloud computing, the average delay is 26 ms, load 5 400 bits/s, CPU utilization 0.0001% (email server), 0.001% (FTP server), and 0.0002% (HTTP server), and throughput 85 bits/s (email server), 100 bits/s (FTP server), and 95 bits/s (HTTP server). Thus, the delay, the load, and the CPU utilization decrease, but the throughput increases. Information technology service management with the cloud computing approach has better performance.
Love, Bradley C; Medin, Douglas L; Gureckis, Todd M
SUSTAIN (Supervised and Unsupervised STratified Adaptive Incremental Network) is a model of how humans learn categories from examples. SUSTAIN initially assumes a simple category structure. If simple solutions prove inadequate and SUSTAIN is confronted with a surprising event (e.g., it is told that a bat is a mammal instead of a bird), SUSTAIN recruits an additional cluster to represent the surprising event. Newly recruited clusters are available to explain future events and can themselves evolve into prototypes, attractors, or rules. SUSTAIN's discovery of category substructure is affected not only by the structure of the world but by the nature of the learning task and the learner's goals. SUSTAIN successfully extends category learning models to studies of inference learning, unsupervised learning, category construction, and contexts in which identification learning is faster than classification learning.
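A heavily simplified sketch of the recruit-on-surprise principle described above (this is not the full SUSTAIN model: it omits attentional tuning, cluster competition, and SUSTAIN's response rule, and all names and parameters are illustrative):

```python
# Toy surprise-driven cluster recruitment, inspired by SUSTAIN's recruitment
# principle. Illustrative sketch only, not SUSTAIN's actual equations.

def distance(a, b):
    """City-block distance between two feature vectors."""
    return sum(abs(x - y) for x, y in zip(a, b))

class ClusterLearner:
    def __init__(self, lr=0.2):
        self.clusters = []      # list of (centroid, label) pairs
        self.lr = lr

    def predict(self, item):
        if not self.clusters:
            return None
        centroid, label = min(self.clusters, key=lambda c: distance(c[0], item))
        return label

    def learn(self, item, label):
        if self.predict(item) != label:
            # Surprising event: recruit a new cluster centered on the item.
            self.clusters.append(([float(v) for v in item], label))
        else:
            # Unsurprising: nudge the winning centroid toward the item.
            centroid, _ = min(self.clusters, key=lambda c: distance(c[0], item))
            for i, v in enumerate(item):
                centroid[i] += self.lr * (v - centroid[i])

learner = ClusterLearner()
# Features: (has_wings, flies, nurses_young); labels: 'bird' vs 'mammal'.
learner.learn((1, 1, 0), 'bird')
learner.learn((0, 0, 1), 'mammal')
learner.learn((1, 1, 1), 'mammal')   # a bat: surprising, recruits a cluster
print(len(learner.clusters))          # → 3
print(learner.predict((1, 1, 1)))     # → mammal
```

The bat example mirrors the abstract: the winged, flying item is first explained by the bird cluster, and the surprising mammal label triggers recruitment of a dedicated cluster.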
Michael Alexander Nugent
Modern computing architecture based on the separation of memory and processing leads to a well-known problem called the von Neumann bottleneck, a restrictive limit on the data bandwidth between CPU and RAM. This paper introduces a new approach to computing we call AHaH computing, where memory and processing are combined. The idea is based on the attractor dynamics of volatile dissipative electronics inspired by biological systems, presenting an attractive alternative architecture that is able to adapt, self-repair, and learn from interactions with the environment. We envision that both von Neumann and AHaH computing architectures will operate together on the same machine, but that the AHaH computing processor may reduce the power consumption and processing time for certain adaptive learning tasks by orders of magnitude. The paper begins by drawing a connection between the properties of volatility, thermodynamics, and Anti-Hebbian and Hebbian (AHaH) plasticity. We show how AHaH synaptic plasticity leads to attractor states that extract the independent components of applied data streams and how they form a computationally complete set of logic functions. After introducing a general memristive device model based on collections of metastable switches, we show how adaptive synaptic weights can be formed from differential pairs of incremental memristors. We also disclose how arrays of synaptic weights can be used to build a neural node circuit operating AHaH plasticity. By configuring the attractor states of the AHaH node in different ways, high-level machine learning functions are demonstrated. These include unsupervised clustering, supervised and unsupervised classification, complex signal prediction, unsupervised robotic actuation, and combinatorial optimization of procedures: all key capabilities of biological nervous systems and modern machine learning algorithms with real-world application.
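As a minimal illustration of one idea mentioned above, forming a signed synaptic weight from a differential pair of incremental memristors, the toy sketch below uses an invented update rule and constants; it is not the paper's metastable-switch device model.

```python
# Toy differential-pair synapse: a signed weight built from two unipolar
# conductances, loosely following the idea described above. The update rule
# and constants are illustrative assumptions, not the paper's device model.

class ToyMemristor:
    def __init__(self, g=0.5):
        self.g = g                          # normalized conductance in [0, 1]

    def pulse(self, delta):
        # Conductance only moves incrementally and saturates at the rails.
        self.g = min(1.0, max(0.0, self.g + delta))

class DifferentialSynapse:
    """Signed weight w = Ga - Gb from two unipolar memristor conductances."""
    def __init__(self):
        self.a = ToyMemristor()
        self.b = ToyMemristor()

    @property
    def weight(self):
        return self.a.g - self.b.g

    def update(self, delta):
        # Increase the weight by strengthening Ga; decrease it via Gb.
        if delta >= 0:
            self.a.pulse(delta)
        else:
            self.b.pulse(-delta)

syn = DifferentialSynapse()
syn.update(0.3)
syn.update(-0.1)
print(round(syn.weight, 2))   # → 0.2
```

The differential pair is what lets two devices that can only conduct more or less represent both excitatory and inhibitory weights.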
Ren, Yong; Wang, Yining; Zhu, Jun
Supervised topic models simultaneously model the latent topic structure of large collections of documents and a response variable associated with each document. Existing inference methods are based on variational approximation or Monte Carlo sampling, which often suffers from the local minimum defect. Spectral methods have been applied to learn unsupervised topic models, such as latent Dirichlet allocation (LDA), with provable guarantees. This paper investigates the possibility of applying spectral methods to recover the parameters of supervised LDA (sLDA). We first present a two-stage spectral method, which recovers the parameters of LDA followed by a power update method to recover the regression model parameters. Then, we further present a single-phase spectral algorithm to jointly recover the topic distribution matrix as well as the regression weights. Our spectral algorithms are provably correct and computationally efficient. We prove a sample complexity bound for each algorithm and subsequently derive a sufficient condition for the identifiability of sLDA. Thorough experiments on synthetic and real-world datasets verify the theory and demonstrate the practical effectiveness of the spectral algorithms. In fact, our results on a large-scale review rating dataset demonstrate that our single-phase spectral algorithm alone gets comparable or even better performance than state-of-the-art methods, while previous work on spectral methods has rarely reported such promising performance.
Freifeld, Oren; Hauberg, Søren; Black, Michael J.
We consider the intersection of two research fields: transfer learning and statistics on manifolds. In particular, we consider, for manifold-valued data, transfer learning of tangent-space models such as Gaussian distributions, PCA, regression, or classifiers. Though one would hope to simply use … ordinary Rn-transfer learning ideas, the manifold structure prevents it. We overcome this by basing our method on inner-product-preserving parallel transport, a well-known tool widely used in other problems of statistics on manifolds in computer vision. At first, this straightforward idea seems to suffer … "commutes" with learning. Consequently, our compact framework, applicable to a large class of manifolds, is not restricted by the size of either the training or test sets. We demonstrate the approach by transferring PCA and logistic-regression models of real-world data involving 3D shapes and image …
da Silva, Ketia Kellen A.; Behar, Patricia A.
This article presents the development of a digital competency model of Distance Learning (DL) students in Brazil called CompDigAl_EAD. The following topics were addressed in this study: Educational Competences, Digital Competences, and Distance Learning students. The model was developed between 2015 and 2016 and is being validated in 2017. It was…
Bossche, van den P.; Gijselaers, W.; Segers, M.; Woltjer, G.B.; Kirschner, P.
To gain insight into the social processes that underlie knowledge sharing in teams, this article examines which team learning behaviors lead to the construction of a shared mental model. Additionally, it explores how the development of shared mental models mediates the relation between team learning
Manfredotti, Cristina; Pedersen, Kim Steenstrup; Hamilton, Howard J.
We propose the LEMAIO multi-layer framework, which makes use of hierarchical abstraction to learn models for activities involving multiple interacting objects from time sequences of data concerning the individual objects. Experiments in the sea navigation domain yielded learned models that were t...
Kotsireas, Ilias; Makarov, Roman; Melnik, Roderick; Shodiev, Hasan
The Applied Mathematics, Modelling, and Computational Science (AMMCS) conference aims to promote interdisciplinary research and collaboration. The contributions in this volume cover the latest research in mathematical and computational sciences, modeling, and simulation as well as their applications in natural and social sciences, engineering and technology, industry, and finance. The 2013 conference, the second in a series of AMMCS meetings, was held August 26–30 and organized in cooperation with AIMS and SIAM, with support from the Fields Institute in Toronto, and Wilfrid Laurier University. There were many young scientists at AMMCS-2013, both as presenters and as organizers. This proceedings contains refereed papers contributed by the participants of the AMMCS-2013 after the conference. This volume is suitable for researchers and graduate students, mathematicians and engineers, industrialists, and anyone who would like to delve into the interdisciplinary research of applied and computational mathematics ...
The paper gives a description of mathematical models and computer programs for analysing possible strategies for spent fuel management, with emphasis on economic analysis. The computer programs developed describe the material flows, facility construction schedules, capital investment schedules, and operating costs for the facilities used in managing the spent fuel. The computer programs use a combination of simulation and optimization procedures for the economic analyses. Many of the fuel cycle steps (such as spent fuel discharges, storage at the reactor, and transport to the RFCC) are described in physical and economic terms through simulation modeling, while others (such as reprocessing plant size and commissioning schedules, interim storage facility commissioning schedules, etc.) are subjected to economic optimization procedures to determine the approximately lowest-cost plans from among the available feasible alternatives.
Hayati; Retno Dwi Suyanti
The objectives of this research were: (1) to determine which learning model better improves students' physics learning outcomes, the multimedia-based Inquiry Training learning model or the standard Inquiry Training learning model; (2) to determine how the level of learning motivation affects students' physics learning outcomes; and (3) to identify interactions between the learning model and motivation in influencing student learning outcomes. This research is a quasi-experiment. The population in this research was all s...
Martin, Amy L.; Reeves, Todd D.; Smith, Thomas J.; Walker, David A.
Online learning is variously employed in K-12 education, including for teacher professional development. However, the use of computer-based technologies for learning purposes assumes learner computer proficiency, making this construct an important domain of procedural knowledge in formal and informal online learning contexts. Addressing this…
Li, Lung-Yu; Lee, Long-Yuan
The purpose of this study was to explore graduate students' competencies in computer use and their attitudes toward online learning in asynchronous online courses of distance learning programs in a Graduate School of Education (GSOE) in Taiwan. The research examined the relationship between computer literacy and the online learning attitudes of…
Mayer, Richard E.
Computer games for learning (also called video games or digital games) have potential to improve education. This is the intriguing idea that motivates this special issue of the "Educational Psychologist" on "Psychological Perspectives on Digital Games and Learning." Computer games for learning are games delivered via computer…
Ngai, E. W. T.; Lam, S. S.; Poon, J. K. L.
This paper describes the successful application of a computer-supported collaborative learning system in teaching e-commerce. The authors created a teaching and learning environment for 39 local secondary schools to introduce e-commerce using a computer-supported collaborative learning system. This system is designed to equip students with…