WorldWideScience

Sample records for learned codes current

  1. An Eye-Tracking Study of How Color Coding Affects Multimedia Learning

    Science.gov (United States)

    Ozcelik, Erol; Karakus, Turkan; Kursun, Engin; Cagiltay, Kursat

    2009-01-01

    Color coding has been proposed to promote more effective learning. However, insufficient evidence currently exists to show how color coding leads to better learning. The goal of this study was to investigate the underlying cause of the color coding effect by utilizing eye movement data. Fifty-two participants studied either a color-coded or…

  2. Code-Mixing and Code-Switching in the Process of Learning

    Directory of Open Access Journals (Sweden)

    Diyah Atiek Mustikawati

    2016-09-01

    This study aimed to describe the forms of code switching and code mixing found in classroom teaching and learning activities, and to determine the factors influencing them. The research is a descriptive qualitative case study conducted at the Al Mawaddah Boarding School, Ponorogo. The analysis showed that code mixing and code switching in learning activities at Al Mawaddah involve Javanese, Arabic, English, and Indonesian, through the insertion of words, phrases, idioms, nouns, adjectives, clauses, and sentences. The factors determining code mixing in the learning process include identification of roles, the desire to explain and interpret, and sources in the original language and its variations or in a foreign language. The factors determining code switching include the speaker (O1), the interlocutor (O2), the presence of a third person (O3), the topic of conversation, humour, and prestige. The significance of this study is to let readers see how language is used in a multilingual society, in particular the rules and characteristics of language variation in classroom teaching and learning at the Al Mawaddah boarding school. Furthermore, the results provide input to the ustadz/ustadzah and students for developing oral communication skills and more effective teaching and learning strategies in boarding schools.

  3. Current lead thermal analysis code 'CURRENT'

    International Nuclear Information System (INIS)

    Yamaguchi, Masahito; Tada, Eisuke; Shimamoto, Susumu; Hata, Kenichiro.

    1985-08-01

    A large gas-cooled current lead with a capacity of more than 30 kA and 22 kV is required for the superconducting toroidal and poloidal coils of fusion devices. The current lead carries electrical current from the power supply system at room temperature to the superconducting coil at 4 K. The thermal performance of the current lead is therefore critical in determining the heat load requirements of the coil system at 4 K. The Japan Atomic Energy Research Institute (JAERI) has been developing large gas-cooled current leads with an optimum operating condition in which the heat load is around 1 W per 1 kA at 4 K. In order to design current leads with optimum thermal performance, JAERI developed a thermal analysis code named ''CURRENT'' which theoretically calculates the optimum geometric shape and cooling conditions of the current lead. The basic equations and the instruction manual of the analysis code are described in this report. (author)

  4. When sparse coding meets ranking: a joint framework for learning sparse codes and ranking scores

    KAUST Repository

    Wang, Jim Jing-Yan

    2017-06-28

    Sparse coding, which represents a data point as a sparse reconstruction code with regard to a dictionary, has been a popular data representation method. Meanwhile, in database retrieval problems, learning the ranking scores from data points plays an important role. Up to now, these two problems have always been considered separately, assuming that data coding and ranking are two independent and unrelated problems. However, is there any internal relationship between sparse coding and ranking score learning? If so, how can this relationship be explored and exploited? In this paper, we try to answer these questions by developing the first joint sparse coding and ranking score learning algorithm. To explore the local distribution in the sparse code space, and also to bridge the coding and ranking problems, we assume that in the neighborhood of each data point, the ranking scores can be approximated from the corresponding sparse codes by a local linear function. By considering the local approximation error of the ranking scores, the reconstruction error and sparsity of the sparse coding, and the query information provided by the user, we construct a unified objective function for learning the sparse codes, the dictionary, and the ranking scores. We further develop an iterative algorithm to solve this optimization problem.
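    The combined objective described above can be sketched numerically. The following minimal illustration is not the authors' implementation; the array shapes, weights, and neighborhood structure are invented to show how the three terms (reconstruction error, sparsity, and local linear ranking approximation) might be added up:

```python
import numpy as np

def joint_objective(X, D, S, f, W, b, neighbors, alpha=1.0, beta=0.1):
    """Illustrative value of a joint sparse-coding / ranking objective:
    reconstruction error + sparsity penalty + local ranking approximation."""
    # Reconstruction error: how well the dictionary D reconstructs each point
    recon = np.sum((X - D @ S) ** 2)
    # Sparsity penalty on the codes (l1 norm)
    sparsity = beta * np.sum(np.abs(S))
    # Local approximation: in each point's neighborhood, ranking scores
    # should follow a local linear function of the sparse codes
    rank_err = 0.0
    for i, nbrs in enumerate(neighbors):
        for j in nbrs:
            rank_err += (f[j] - W[:, i] @ S[:, j] - b[i]) ** 2
    return recon + sparsity + alpha * rank_err
```

    Minimizing such an objective alternately over the codes, the dictionary, and the ranking scores is one plausible shape for the iterative algorithm the abstract mentions.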

  5. Learning and coding in biological neural networks

    Science.gov (United States)

    Fiete, Ila Rani

    How can large groups of neurons that locally modify their activities learn to collectively perform a desired task? Do studies of learning in small networks tell us anything about learning in the fantastically large collection of neurons that make up a vertebrate brain? What factors do neurons optimize by encoding sensory inputs or motor commands in the way they do? In this thesis I present a collection of four theoretical works: each of the projects was motivated by specific constraints and complexities of biological neural networks, as revealed by experimental studies; together, they aim to partially address some of the central questions of neuroscience posed above. We first study the role of sparse neural activity, as seen in the coding of sequential commands in a premotor area responsible for birdsong. We show that the sparse coding of temporal sequences in the songbird brain can, in a network where the feedforward plastic weights must translate the sparse sequential code into a time-varying muscle code, facilitate learning by minimizing synaptic interference. Next, we propose a biologically plausible synaptic plasticity rule that can perform goal-directed learning in recurrent networks of voltage-based spiking neurons that interact through conductances. Learning is based on the correlation of noisy local activity with a global reward signal; we prove that this rule performs stochastic gradient ascent on the reward. Thus, if the reward signal quantifies network performance on some desired task, the plasticity rule provably drives goal-directed learning in the network. To assess the convergence properties of the learning rule, we compare it with a known example of learning in the brain. Song-learning in finches is a clear example of a learned behavior, with detailed neurophysiological data available. With our learning rule, we train an anatomically accurate model birdsong network that drives a sound source to mimic an actual zebra finch song. Simulation and…

  6. An analytical demonstration of coupling schemes between magnetohydrodynamic codes and eddy current codes

    International Nuclear Information System (INIS)

    Liu Yueqiang; Albanese, R.; Rubinacci, G.; Portone, A.; Villone, F.

    2008-01-01

    In order to model a magnetohydrodynamic (MHD) instability that strongly couples to external conducting structures (walls and/or coils) in a fusion device, it is often necessary to combine an MHD code solving for the plasma response with an eddy current code computing the fields and currents of conductors. We present a rigorous proof of the coupling schemes between these two types of codes. One of the coupling schemes has been introduced and implemented in the CARMA code [R. Albanese, Y. Q. Liu, A. Portone, G. Rubinacci, and F. Villone, IEEE Trans. Magn. 44, 1654 (2008); A. Portone, F. Villone, Y. Q. Liu, R. Albanese, and G. Rubinacci, Plasma Phys. Controlled Fusion 50, 085004 (2008)], which couples the MHD code MARS-F [Y. Q. Liu, A. Bondeson, C. M. Fransson, B. Lennartson, and C. Breitholtz, Phys. Plasmas 7, 3681 (2000)] and the eddy current code CARIDDI [R. Albanese and G. Rubinacci, Adv. Imaging Electron Phys. 102, 1 (1998)]. While the coupling schemes are described for a general toroidal geometry, we give the analytical proof for a cylindrical plasma.

  7. A Fast Optimization Method for General Binary Code Learning.

    Science.gov (United States)

    Shen, Fumin; Zhou, Xiang; Yang, Yang; Song, Jingkuan; Shen, Heng; Tao, Dacheng

    2016-09-22

    Hashing or binary code learning has been recognized to accomplish efficient near neighbor search, and has thus attracted broad interest in recent retrieval, vision and learning studies. One main challenge of learning to hash arises from the involvement of discrete variables in binary code optimization. While the widely-used continuous relaxation may achieve high learning efficiency, the pursued codes are typically less effective due to accumulated quantization error. In this work, we propose a novel binary code optimization method, dubbed Discrete Proximal Linearized Minimization (DPLM), which directly handles the discrete constraints during the learning process. Specifically, the discrete (thus nonsmooth nonconvex) problem is reformulated as minimizing the sum of a smooth loss term and a nonsmooth indicator function. The resulting problem is then efficiently solved by an iterative procedure with each iteration admitting an analytical discrete solution; the procedure is shown to converge very quickly. In addition, the proposed method supports a large family of empirical loss functions, which is particularly instantiated in this work by both supervised and unsupervised hashing losses, together with bit uncorrelation and balance constraints. In particular, the proposed DPLM with a supervised ℓ2 loss encodes the whole NUS-WIDE database into 64-bit binary codes within 10 seconds on a standard desktop computer. The proposed approach is extensively evaluated on several large-scale datasets and the generated binary codes are shown to achieve very promising results on both retrieval and classification tasks.
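    The core DPLM iteration (a gradient step on the smooth loss followed by projection back onto the binary set via the sign function) can be sketched on a toy least-squares loss. This is a hedged illustration, not the paper's hashing formulation; the loss and step-size choice are invented for the example:

```python
import numpy as np

def dplm_binary(A, y, r, mu=None, iters=50, seed=0):
    """Toy sketch of discrete proximal linearized minimization:
    minimize ||A b - y||^2 subject to b in {-1, +1}^r.
    Each iteration takes a gradient step on the smooth loss and
    projects back to binary values with the sign function."""
    rng = np.random.default_rng(seed)
    b = rng.choice([-1.0, 1.0], size=r)
    if mu is None:
        # Step size tied to the Lipschitz constant of the gradient
        mu = 2.0 * np.linalg.norm(A.T @ A, 2)
    for _ in range(iters):
        grad = 2.0 * A.T @ (A @ b - y)
        z = b - grad / mu
        new_b = np.sign(z)
        new_b[new_b == 0] = b[new_b == 0]  # keep the previous bit on ties
        b = new_b
    return b
```

    Because each step has this closed-form discrete solution, no continuous relaxation (and hence no quantization error) is involved, which is the point the abstract emphasizes.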

  8. 2-D skin-current toroidal-MHD-equilibrium code

    International Nuclear Information System (INIS)

    Feinberg, B.; Niland, R.A.; Coonrod, J.; Levine, M.A.

    1982-09-01

    A two-dimensional, toroidal, ideal MHD skin-current equilibrium computer code is described. The code is suitable for interactive implementation on a minicomputer. Some examples of the use of the code for the design and interpretation of toroidal cusp experiments are presented.

  9. Code-specific learning rules improve action selection by populations of spiking neurons.

    Science.gov (United States)

    Friedrich, Johannes; Urbanczik, Robert; Senn, Walter

    2014-08-01

    Population coding is widely regarded as a key mechanism for achieving reliable behavioral decisions. We previously introduced reinforcement learning for population-based decision making by spiking neurons. Here we generalize population reinforcement learning to spike-based plasticity rules that take account of the postsynaptic neural code. We consider spike/no-spike, spike count and spike latency codes. The multi-valued and continuous-valued features in the postsynaptic code allow for a generalization of binary decision making to multi-valued decision making and continuous-valued action selection. We show that code-specific learning rules speed up learning both for the discrete classification and the continuous regression tasks. The suggested learning rules also speed up with increasing population size as opposed to standard reinforcement learning rules. Continuous action selection is further shown to explain realistic learning speeds in the Morris water maze. Finally, we introduce the concept of action perturbation as opposed to the classical weight- or node-perturbation as an exploration mechanism underlying reinforcement learning. Exploration in the action space greatly increases the speed of learning as compared to exploration in the neuron or weight space.

  10. Towards Automatic Learning of Heuristics for Mechanical Transformations of Procedural Code

    Directory of Open Access Journals (Sweden)

    Guillermo Vigueras

    2017-01-01

    The current trend in next-generation exascale systems is toward integrating a wide range of specialized (co-)processors into traditional supercomputers. Given the efficiency of heterogeneous systems in terms of Watts and FLOPS per surface unit, opening access to heterogeneous platforms for a wider range of users is an important problem to be tackled. However, heterogeneous platforms limit the portability of applications and increase development complexity due to the programming skills required. Program transformation can help make programming heterogeneous systems easier by defining a step-wise transformation process that translates a given initial code into a semantically equivalent final code adapted to a specific platform. Program transformation systems require efficient transformation strategies to tackle the combinatorial problem that emerges from the large set of transformations applicable at each step of the process. In this paper we propose a machine-learning-based approach to learn heuristics for defining program transformation strategies. Our approach proposes a novel combination of reinforcement learning and classification methods to efficiently tackle the problems inherent to this type of system. Preliminary results demonstrate the suitability of the approach.
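    The reinforcement-learning component of such a system can be sketched as tabular Q-learning over an abstract transformation graph. The states, actions, and reward below are invented for illustration; the paper's actual system combines reinforcement learning with classification methods:

```python
import random

def q_learn_strategy(transitions, rewards, episodes=500,
                     alpha=0.5, gamma=0.9, eps=0.2, seed=0):
    """Tabular Q-learning over an abstract graph of code transformations.
    States are code versions, actions are transformation rules, and the
    reward scores how well a resulting version suits the target platform."""
    rng = random.Random(seed)
    q = {}
    for _ in range(episodes):
        s = "start"
        while s in transitions:          # states without actions are terminal
            actions = list(transitions[s])
            if rng.random() < eps:       # epsilon-greedy exploration
                a = rng.choice(actions)
            else:
                a = max(actions, key=lambda x: q.get((s, x), 0.0))
            s2 = transitions[s][a]
            # Bootstrap from the best action available in the next state
            nxt = max((q.get((s2, x), 0.0) for x in transitions.get(s2, {})),
                      default=0.0)
            q[(s, a)] = ((1 - alpha) * q.get((s, a), 0.0)
                         + alpha * (rewards.get(s2, 0.0) + gamma * nxt))
            s = s2
    # Greedy policy: the learned transformation strategy
    return {s: max(acts, key=lambda x: q.get((s, x), 0.0))
            for s, acts in transitions.items()}
```

    The learned greedy policy plays the role of the heuristic: at each step of the transformation process it prunes the combinatorial set of applicable rules down to one choice.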

  11. Current and anticipated uses of thermal-hydraulic codes in NFI

    Energy Technology Data Exchange (ETDEWEB)

    Tsuda, K. [Nuclear Fuel Industries, Ltd., Tokyo (Japan); Takayasu, M. [Nuclear Fuel Industries, Ltd., Sennan-gun (Japan)

    1997-07-01

    This paper presents the thermal-hydraulic codes currently used in NFI for LWR fuel development and licensing applications, including transient and design-basis-accident analyses of LWR plants. The current status of the codes is described in terms of code capability, modeling features, and experience with code application to fuel development and licensing. Finally, the anticipated use of future thermal-hydraulic codes in NFI is briefly discussed.

  12. Learning of spatio-temporal codes in a coupled oscillator system.

    Science.gov (United States)

    Orosz, Gábor; Ashwin, Peter; Townley, Stuart

    2009-07-01

    In this paper, we consider a learning strategy that allows one to transmit information between two coupled phase oscillator systems (called teaching and learning systems) via frequency adaptation. The dynamics of these systems can be modeled with reference to a number of partially synchronized cluster states and transitions between them. Forcing the teaching system by steady but spatially nonhomogeneous inputs produces cyclic sequences of transitions between the cluster states, that is, information about inputs is encoded via a "winnerless competition" process into spatio-temporal codes. The large variety of codes can be learned by the learning system that adapts its frequencies to those of the teaching system. We visualize the dynamics using "weighted order parameters (WOPs)" that are analogous to "local field potentials" in neural systems. Since spatio-temporal coding is a mechanism that appears in olfactory systems, the developed learning rules may help to extract information from these neural ensembles.
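    The frequency-adaptation idea can be sketched with a minimal pair of phase-oscillator populations, where each learner oscillator is driven by its teacher counterpart and slowly adjusts its natural frequency based on the phase error. The coupling form and constants below are illustrative only, not the paper's cluster-state model:

```python
import numpy as np

def adapt_frequencies(omega_teacher, steps=30000, dt=0.001,
                      k=3.0, eps=1.0, seed=0):
    """Minimal sketch of frequency adaptation between two phase-oscillator
    systems. Each learner oscillator is phase-coupled to one teacher
    oscillator and integrates the phase error into its own frequency."""
    rng = np.random.default_rng(seed)
    n = len(omega_teacher)
    theta_t = rng.uniform(0, 2 * np.pi, n)   # teacher phases
    theta_l = rng.uniform(0, 2 * np.pi, n)   # learner phases
    omega_l = np.zeros(n)                    # learner frequencies, adapted
    for _ in range(steps):
        drive = np.sin(theta_t - theta_l)    # phase error signal
        theta_t += dt * omega_teacher
        theta_l += dt * (omega_l + k * drive)
        omega_l += dt * eps * drive          # frequency adaptation rule
    return omega_l
```

    Once the learner locks to the teacher, the phase error decays and the learner frequencies settle at the teacher's values, which is the sense in which information is transmitted via frequency adaptation.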

  13. When sparse coding meets ranking: a joint framework for learning sparse codes and ranking scores

    KAUST Repository

    Wang, Jim Jing-Yan; Cui, Xuefeng; Yu, Ge; Guo, Lili; Gao, Xin

    2017-01-01

    Sparse coding, which represents a data point as a sparse reconstruction code with regard to a dictionary, has been a popular data representation method. Meanwhile, in database retrieval problems, learning the ranking scores from data points plays…

  14. A numerical similarity approach for using retired Current Procedural Terminology (CPT) codes for electronic phenotyping in the Scalable Collaborative Infrastructure for a Learning Health System (SCILHS).

    Science.gov (United States)

    Klann, Jeffrey G; Phillips, Lori C; Turchin, Alexander; Weiler, Sarah; Mandl, Kenneth D; Murphy, Shawn N

    2015-12-11

    Interoperable phenotyping algorithms, needed to identify patient cohorts meeting eligibility criteria for observational studies or clinical trials, require medical data in a consistent structured, coded format. Data heterogeneity limits such algorithms' applicability. Existing approaches are often: not widely interoperable; or, have low sensitivity due to reliance on the lowest common denominator (ICD-9 diagnoses). In the Scalable Collaborative Infrastructure for a Learning Healthcare System (SCILHS) we endeavor to use the widely-available Current Procedural Terminology (CPT) procedure codes with ICD-9. Unfortunately, CPT changes drastically year-to-year - codes are retired/replaced. Longitudinal analysis requires grouping retired and current codes. BioPortal provides a navigable CPT hierarchy, which we imported into the Informatics for Integrating Biology and the Bedside (i2b2) data warehouse and analytics platform. However, this hierarchy does not include retired codes. We compared BioPortal's 2014AA CPT hierarchy with Partners Healthcare's SCILHS datamart, comprising three-million patients' data over 15 years. 573 CPT codes were not present in 2014AA (6.5 million occurrences). No existing terminology provided hierarchical linkages for these missing codes, so we developed a method that automatically places missing codes in the most specific "grouper" category, using the numerical similarity of CPT codes. Two informaticians reviewed the results. We incorporated the final table into our i2b2 SCILHS/PCORnet ontology, deployed it at seven sites, and performed a gap analysis and an evaluation against several phenotyping algorithms. The reviewers found the method placed the code correctly with 97 % precision when considering only miscategorizations ("correctness precision") and 52 % precision using a gold-standard of optimal placement ("optimality precision"). High correctness precision meant that codes were placed in a reasonable hierarchical position that a reviewer…
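    The numerical-similarity placement can be sketched as follows. The grouper ranges and the containment-plus-nearest-boundary rule are invented for illustration and are far simpler than the paper's method (real CPT codes can also contain letters, which this sketch ignores):

```python
def place_retired_code(code, groupers):
    """Hypothetical sketch: assign a retired CPT code to the most specific
    'grouper' range that contains it numerically, falling back to the range
    with the nearest numeric boundary. Groupers are (low, high, name)."""
    value = int(code)
    containing = [(lo, hi, name) for lo, hi, name in groupers
                  if lo <= value <= hi]
    if containing:
        # Most specific grouper = narrowest containing range
        return min(containing, key=lambda g: g[1] - g[0])[2]
    # Otherwise, pick the range whose boundary is numerically closest
    return min(groupers,
               key=lambda g: min(abs(value - g[0]), abs(value - g[1])))[2]
```

    The example grouper names below are likewise invented; they only stand in for nodes of a CPT hierarchy.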

  15. Real-time Color Codes for Assessing Learning Process

    OpenAIRE

    Dzelzkalēja, L; Kapenieks, J

    2016-01-01

    Effective assessment is an important way for improving the learning process. There are existing guidelines for assessing the learning process, but they lack holistic digital knowledge society considerations. In this paper the authors propose a method for real-time evaluation of students’ learning process and, consequently, for quality evaluation of teaching materials both in the classroom and in the distance learning environment. The main idea of the proposed Color code method (CCM) is to use...

  16. On A Nonlinear Generalization of Sparse Coding and Dictionary Learning.

    Science.gov (United States)

    Xie, Yuchen; Ho, Jeffrey; Vemuri, Baba

    2013-01-01

    Existing dictionary learning algorithms are based on the assumption that the data are vectors in a Euclidean vector space ℝ d , and the dictionary is learned from the training data using the vector space structure of ℝ d and its Euclidean L 2 -metric. However, in many applications, features and data often originate from a Riemannian manifold that does not support a global linear (vector space) structure. Furthermore, the extrinsic viewpoint of existing dictionary learning algorithms becomes inappropriate for modeling and incorporating the intrinsic geometry of the manifold, which is potentially important and critical to the application. This paper proposes a novel framework for sparse coding and dictionary learning for data on a Riemannian manifold, and it shows that the existing sparse coding and dictionary learning methods can be considered as special (Euclidean) cases of the more general framework proposed here. We show that both the dictionary and sparse coding can be effectively computed for several important classes of Riemannian manifolds, and we validate the proposed method using two well-known classification problems in computer vision and medical imaging analysis.

  17. Blending Classroom Teaching and Learning with QR Codes

    Science.gov (United States)

    Rikala, Jenni; Kankaanranta, Marja

    2014-01-01

    The aim of this case study was to explore the feasibility of the Quick Response (QR) codes and mobile devices in the context of Finnish basic education. The interest was especially to explore how mobile devices and QR codes can enhance and blend teaching and learning. The data were collected with a teacher interview and pupil surveys. The learning…

  18. Direct calculation of current drive efficiency in FISIC code

    International Nuclear Information System (INIS)

    Wright, J.C.; Phillips, C.K.; Bonoli, P.T.

    1996-01-01

    Two-dimensional RF modeling codes use a parameterization (1) of current drive efficiencies to calculate fast wave driven currents. This parameterization assumes a uniform quasi-linear diffusion coefficient and requires a priori knowledge of the wave polarizations. These difficulties may be avoided by a direct calculation of the quasilinear diffusion coefficient from the Kennel-Englemann form with the field polarizations calculated by the full wave code, FISIC (2). Current profiles are calculated using the adjoint formulation (3). Comparisons between the two formulations are presented. copyright 1996 American Institute of Physics

  19. A computational study on altered theta-gamma coupling during learning and phase coding.

    Directory of Open Access Journals (Sweden)

    Xuejuan Zhang

    There is considerable interest in the role of coupling between theta and gamma oscillations in the brain in the context of learning and memory. Here we have used a neural network model capable of producing coupling of theta phase to gamma amplitude, firstly to explore its ability to reproduce reported learning-related changes and secondly to examine memory-span and phase-coding effects. The spiking neural network incorporates two kinetically different GABA(A) receptor-mediated currents to generate both theta and gamma rhythms, and we have found that by selective alteration of both NMDA receptors and GABA(A,slow) receptors it can reproduce learning-related changes in the strength of coupling between theta and gamma, either with or without coincident changes in theta amplitude. When the model was used to explore the relationship between theta and gamma oscillations, working memory capacity, and phase coding, it showed that the potential storage capacity of short-term memories, in terms of nested gamma subcycles, coincides with the maximal theta power. Increasing theta power is also related to the precision of theta phase, which functions as a potential timing clock for neuronal firing in the cortex or hippocampus.

  20. Using QR Codes to Differentiate Learning for Gifted and Talented Students

    Science.gov (United States)

    Siegle, Del

    2015-01-01

    QR codes are two-dimensional square patterns capable of coding information that ranges from web addresses to YouTube video links. The codes save typing time and eliminate errors from incorrectly entered addresses. These codes make learning with technology easier for students and motivationally engage them in new ways.

  1. Current status of high energy nucleon-meson transport code

    Energy Technology Data Exchange (ETDEWEB)

    Takada, Hiroshi; Sasa, Toshinobu [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1998-03-01

    The current status of the accelerator design code NMTC/JAERI, an outline of its physical model, and an evaluation of the code's accuracy are reported. To evaluate the nuclear performance of an accelerator and of an intense spallation neutron source, the nuclear reactions between high-energy protons and the target nuclide, and the behavior of the various produced particles, must be known. The nuclear design of spallation neutron systems uses a calculation code system that couples the high-energy nucleon-meson transport code with the neutron-photon transport code. NMTC/JAERI describes the particle evaporation process, in competition with fission, following the intranuclear cascade. Particle transport calculations are carried out for protons, neutrons, π-mesons and μ-mesons. To verify and improve the accuracy of the high-energy nucleon-meson transport code, data on spallation and spallation-neutron fragments from integral experiments were collected. (S.Y.)

  2. A Test of Two Alternative Cognitive Processing Models: Learning Styles and Dual Coding

    Science.gov (United States)

    Cuevas, Joshua; Dawson, Bryan L.

    2018-01-01

    This study tested two cognitive models, learning styles and dual coding, which make contradictory predictions about how learners process and retain visual and auditory information. Learning styles-based instructional practices are common in educational environments despite a questionable research base, while the use of dual coding is less…

  3. Linear calculations of edge current driven kink modes with BOUT++ code

    Energy Technology Data Exchange (ETDEWEB)

    Li, G. Q., E-mail: ligq@ipp.ac.cn; Xia, T. Y. [Institute of Plasma Physics, CAS, Hefei, Anhui 230031 (China); Lawrence Livermore National Laboratory, Livermore, California 94550 (United States); Xu, X. Q. [Lawrence Livermore National Laboratory, Livermore, California 94550 (United States); Snyder, P. B.; Turnbull, A. D. [General Atomics, San Diego, California 92186 (United States); Ma, C. H.; Xi, P. W. [Lawrence Livermore National Laboratory, Livermore, California 94550 (United States); FSC, School of Physics, Peking University, Beijing 100871 (China)

    2014-10-15

    This work extends previous BOUT++ work to systematically study the impact of edge current density on edge localized modes, and to benchmark with the GATO and ELITE codes. Using the CORSICA code, a set of equilibria was generated with different edge current densities by keeping total current and pressure profile fixed. Based on these equilibria, the effects of the edge current density on the MHD instabilities were studied with the 3-field BOUT++ code. For the linear calculations, with increasing edge current density, the dominant modes are changed from intermediate-n and high-n ballooning modes to low-n kink modes, and the linear growth rate becomes smaller. The edge current provides stabilizing effects on ballooning modes due to the increase of local shear at the outer mid-plane with the edge current. For edge kink modes, however, the edge current does not always provide a destabilizing effect; with increasing edge current, the linear growth rate first increases, and then decreases. In benchmark calculations for BOUT++ against the linear results with the GATO and ELITE codes, the vacuum model has important effects on the edge kink mode calculations. By setting a realistic density profile and Spitzer resistivity profile in the vacuum region, the resistivity was found to have a destabilizing effect on both the kink mode and on the ballooning mode. With diamagnetic effects included, the intermediate-n and high-n ballooning modes can be totally stabilized for finite edge current density.

  4. Linear calculations of edge current driven kink modes with BOUT++ code

    International Nuclear Information System (INIS)

    Li, G. Q.; Xia, T. Y.; Xu, X. Q.; Snyder, P. B.; Turnbull, A. D.; Ma, C. H.; Xi, P. W.

    2014-01-01

    This work extends previous BOUT++ work to systematically study the impact of edge current density on edge localized modes, and to benchmark with the GATO and ELITE codes. Using the CORSICA code, a set of equilibria was generated with different edge current densities by keeping total current and pressure profile fixed. Based on these equilibria, the effects of the edge current density on the MHD instabilities were studied with the 3-field BOUT++ code. For the linear calculations, with increasing edge current density, the dominant modes are changed from intermediate-n and high-n ballooning modes to low-n kink modes, and the linear growth rate becomes smaller. The edge current provides stabilizing effects on ballooning modes due to the increase of local shear at the outer mid-plane with the edge current. For edge kink modes, however, the edge current does not always provide a destabilizing effect; with increasing edge current, the linear growth rate first increases, and then decreases. In benchmark calculations for BOUT++ against the linear results with the GATO and ELITE codes, the vacuum model has important effects on the edge kink mode calculations. By setting a realistic density profile and Spitzer resistivity profile in the vacuum region, the resistivity was found to have a destabilizing effect on both the kink mode and on the ballooning mode. With diamagnetic effects included, the intermediate-n and high-n ballooning modes can be totally stabilized for finite edge current density

  5. Abstract feature codes: The building blocks of the implicit learning system.

    Science.gov (United States)

    Eberhardt, Katharina; Esser, Sarah; Haider, Hilde

    2017-07-01

    According to the Theory of Event Coding (TEC; Hommel, Müsseler, Aschersleben, & Prinz, 2001), action and perception are represented in a shared format in the cognitive system by means of feature codes. In implicit sequence learning research, it is still common to make a conceptual difference between independent motor and perceptual sequences. This supposedly independent learning takes place in encapsulated modules (Keele, Ivry, Mayr, Hazeltine, & Heuer 2003) that process information along single dimensions. These dimensions have remained underspecified so far. It is especially not clear whether stimulus and response characteristics are processed in separate modules. Here, we suggest that feature dimensions as they are described in the TEC should be viewed as the basic content of modules of implicit learning. This means that the modules process all stimulus and response information related to certain feature dimensions of the perceptual environment. In 3 experiments, we investigated by means of a serial reaction time task the nature of the basic units of implicit learning. As a test case, we used stimulus location sequence learning. The results show that a stimulus location sequence and a response location sequence cannot be learned without interference (Experiment 2) unless one of the sequences can be coded via an alternative, nonspatial dimension (Experiment 3). These results support the notion that spatial location is one module of the implicit learning system and, consequently, that there are no separate processing units for stimulus versus response locations. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  6. Software Quality and Security in Teachers' and Students' Codes When Learning a New Programming Language

    Directory of Open Access Journals (Sweden)

    Arnon Hershkovitz

    2015-09-01

    In recent years, schools (as well as universities) have added cyber security to their computer science curricula. The topic is still new for most current teachers, who typically have a standard computer science background; teachers are therefore trained and then teach their students what they have just learned. In order to explore differences in how the two populations learn, we compared measures of software quality and security between high-school teachers and students. We collected 109 source files, written in Python by 18 teachers and 31 students, and engineered 32 features based on common standards for software quality (PEP 8) and security (derived from the CERT Secure Coding Standards). We use a multi-view, data-driven approach: (a) using hierarchical clustering to partition the population bottom-up into groups based on their code-related features, and (b) building a decision tree model that predicts whether a student or a teacher wrote a given piece of code (resulting in a LOOCV kappa of 0.751). Overall, our findings suggest that the teachers' code is of better quality than the students' (with a sub-group of the teachers, mostly males, demonstrating better coding than both their peers and the students), and that the students' code is slightly better secured than the teachers' code (although both populations show very low security levels). The findings imply that teachers may benefit from their prior knowledge and experience, but they also underline the lack of continuous involvement of some teachers with code-writing. The findings thus shed light on computer science teachers as lifelong learners, and highlight the difference between quality and security in today's programming paradigms. Implications of these findings are discussed.
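    Feature engineering of the kind described can be illustrated with a few toy PEP 8-style measures computed directly from source text. The study's actual 32 features are not listed in the abstract; the three below are invented stand-ins:

```python
def code_features(source):
    """Toy code-quality features in the spirit of PEP 8 checks: counts of
    over-long lines, lines with trailing whitespace, and tab-indented lines."""
    lines = source.splitlines()
    return {
        "long_lines": sum(1 for l in lines if len(l) > 79),
        "trailing_whitespace": sum(1 for l in lines if l != l.rstrip()),
        "tab_indentation": sum(1 for l in lines if l.startswith("\t")),
    }
```

    A feature vector of such counts per source file is the kind of input that hierarchical clustering and a decision tree classifier could then operate on.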

  7. Teaching Qualitative Research: Experiential Learning in Group-Based Interviews and Coding Assignments

    Science.gov (United States)

    DeLyser, Dydia; Potter, Amy E.

    2013-01-01

    This article describes experiential-learning approaches to conveying the work and rewards involved in qualitative research. Seminar students interviewed one another, transcribed or took notes on those interviews, shared those materials to create a set of empirical materials for coding, developed coding schemes, and coded the materials using those…

  8. A new method for species identification via protein-coding and non-coding DNA barcodes by combining machine learning with bioinformatic methods.

    Science.gov (United States)

    Zhang, Ai-bing; Feng, Jie; Ward, Robert D; Wan, Ping; Gao, Qiang; Wu, Jun; Zhao, Wei-zhong

    2012-01-01

    Species identification via DNA barcodes is contributing greatly to current bioinventory efforts. The initial, and widely accepted, proposal was to use the protein-coding cytochrome c oxidase subunit I (COI) region as the standard barcode for animals, but recently non-coding internal transcribed spacer (ITS) genes have been proposed as candidate barcodes for both animals and plants. However, achieving a robust alignment for non-coding regions can be problematic. Here we propose two new methods (DV-RBF and FJ-RBF) to address this issue for species assignment by both coding and non-coding sequences that take advantage of the power of machine learning and bioinformatics. We demonstrate the value of the new methods with four empirical datasets, two representing typical protein-coding COI barcode datasets (neotropical bats and marine fish) and two representing non-coding ITS barcodes (rust fungi and brown algae). Using two random sub-sampling approaches, we demonstrate that the new methods significantly outperformed existing Neighbor-joining (NJ) and Maximum likelihood (ML) methods for both coding and non-coding barcodes when there was complete species coverage in the reference dataset. The new methods also out-performed NJ and ML methods for non-coding sequences in circumstances of potentially incomplete species coverage, although then the NJ and ML methods performed slightly better than the new methods for protein-coding barcodes. A 100% success rate of species identification was achieved with the two new methods for 4,122 bat queries and 5,134 fish queries using COI barcodes, with 95% confidence intervals (CI) of 99.75-100%. The new methods also obtained a 96.29% success rate (95%CI: 91.62-98.40%) for 484 rust fungi queries and a 98.50% success rate (95%CI: 96.60-99.37%) for 1094 brown algae queries, both using ITS barcodes.
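    An alignment-free flavor of barcode-based species assignment can be sketched with a toy k-mer-profile nearest-centroid classifier (a simplification for illustration; the paper's DV-RBF and FJ-RBF methods use RBF-kernel machine learning rather than the plain Euclidean distance used here):

    ```python
    from collections import Counter
    from math import sqrt

    def kmer_profile(seq: str, k: int = 3) -> Counter:
        """Frequency profile of overlapping k-mers; alignment-free, so it
        also suits non-coding regions (e.g. ITS) that align poorly."""
        counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
        total = sum(counts.values())
        return Counter({kmer: c / total for kmer, c in counts.items()})

    def distance(p: Counter, q: Counter) -> float:
        """Euclidean distance between two profiles (missing k-mers count as 0)."""
        keys = set(p) | set(q)
        return sqrt(sum((p[key] - q[key]) ** 2 for key in keys))

    def classify(query: str, references: dict) -> str:
        """Assign the query to the reference species with the nearest profile."""
        qp = kmer_profile(query)
        return min(references,
                   key=lambda sp: distance(qp, kmer_profile(references[sp])))

    refs = {"species_A": "ACGTACGTACGTACGT", "species_B": "TTGGCCTTGGCCTTGG"}
    print(classify("ACGTACGTACGT", refs))  # → species_A
    ```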

  9. A new method for species identification via protein-coding and non-coding DNA barcodes by combining machine learning with bioinformatic methods.

    Directory of Open Access Journals (Sweden)

    Ai-bing Zhang

    Full Text Available Species identification via DNA barcodes is contributing greatly to current bioinventory efforts. The initial, and widely accepted, proposal was to use the protein-coding cytochrome c oxidase subunit I (COI) region as the standard barcode for animals, but recently non-coding internal transcribed spacer (ITS) genes have been proposed as candidate barcodes for both animals and plants. However, achieving a robust alignment for non-coding regions can be problematic. Here we propose two new methods (DV-RBF and FJ-RBF) to address this issue for species assignment by both coding and non-coding sequences that take advantage of the power of machine learning and bioinformatics. We demonstrate the value of the new methods with four empirical datasets, two representing typical protein-coding COI barcode datasets (neotropical bats and marine fish) and two representing non-coding ITS barcodes (rust fungi and brown algae). Using two random sub-sampling approaches, we demonstrate that the new methods significantly outperformed existing Neighbor-joining (NJ) and Maximum likelihood (ML) methods for both coding and non-coding barcodes when there was complete species coverage in the reference dataset. The new methods also out-performed NJ and ML methods for non-coding sequences in circumstances of potentially incomplete species coverage, although then the NJ and ML methods performed slightly better than the new methods for protein-coding barcodes. A 100% success rate of species identification was achieved with the two new methods for 4,122 bat queries and 5,134 fish queries using COI barcodes, with 95% confidence intervals (CI) of 99.75-100%. The new methods also obtained a 96.29% success rate (95% CI: 91.62-98.40%) for 484 rust fungi queries and a 98.50% success rate (95% CI: 96.60-99.37%) for 1094 brown algae queries, both using ITS barcodes.

  10. Evaluating QR Code Case Studies Using a Mobile Learning Framework

    Science.gov (United States)

    Rikala, Jenni

    2014-01-01

    The aim of this study was to evaluate the feasibility of Quick Response (QR) codes and mobile devices in the context of Finnish basic education. The feasibility was analyzed through a mobile learning framework, which includes the core characteristics of mobile learning. The study is part of a larger research project whose aim is to develop a…

  11. Development of Learning Management in Moral Ethics and Code of Ethics of the Teaching Profession Course

    Science.gov (United States)

    Boonsong, S.; Siharak, S.; Srikanok, V.

    2018-02-01

    The purposes of this research were to develop learning management aimed at enhancing students' moral ethics and code of ethics at Rajamangala University of Technology Thanyaburi (RMUTT). The contextual study and the ideas for learning management development were conducted through document study, the focus group method, and content analysis of documents about moral ethics and the code of ethics of the teaching profession concerning the Graduate Diploma for Teaching Profession Program. The main research instruments were summary and analysis papers. The results showed that the learning management developed for moral ethics and the code of ethics of the teaching profession could promote the desired moral and ethical character in Graduate Diploma for Teaching Profession students through integrated learning techniques consisting of Service Learning, the Contract System, Value Clarification, Role Playing, and Concept Mapping. The learning management was presented in 3 steps.

  12. Learning to Estimate Dynamical State with Probabilistic Population Codes.

    Directory of Open Access Journals (Sweden)

    Joseph G Makin

    2015-11-01

    Full Text Available Tracking moving objects, including one's own body, is a fundamental ability of higher organisms, playing a central role in many perceptual and motor tasks. While it is unknown how the brain learns to follow and predict the dynamics of objects, it is known that this process of state estimation can be learned purely from the statistics of noisy observations. When the dynamics are simply linear with additive Gaussian noise, the optimal solution is the well-known Kalman filter (KF), the parameters of which can be learned via latent-variable density estimation (the EM algorithm). The brain does not, however, directly manipulate matrices and vectors, but instead appears to represent probability distributions with the firing rates of populations of neurons: "probabilistic population codes." We show that a recurrent neural network, a modified form of an exponential family harmonium (EFH), that takes a linear probabilistic population code as input can learn, without supervision, to estimate the state of a linear dynamical system. After observing a series of population responses (spike counts) to the position of a moving object, the network learns to represent the velocity of the object and forms nearly optimal predictions about the position at the next time-step. This result builds on our previous work showing that a similar network can learn to perform multisensory integration and coordinate transformations for static stimuli. The receptive fields of the trained network also make qualitative predictions about the developing and learning brain: tuning gradually emerges for higher-order dynamical states not explicitly present in the inputs, appearing as delayed tuning for the lower-order states.
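    The Kalman filter baseline referred to above can be sketched in one dimension (the dynamics and noise parameters below are assumed for illustration, not taken from the paper):

    ```python
    def kalman_1d(observations, a=1.0, q=0.01, r=1.0):
        """Minimal 1-D Kalman filter: state x_t = a*x_{t-1} + process noise
        (variance q); observation y_t = x_t + measurement noise (variance r).
        Returns the filtered state estimates."""
        x, p = 0.0, 1.0          # initial state estimate and its variance
        estimates = []
        for y in observations:
            # Predict step: propagate the estimate through the linear dynamics.
            x_pred, p_pred = a * x, a * a * p + q
            # Update step: blend prediction and observation via the Kalman gain.
            k = p_pred / (p_pred + r)
            x = x_pred + k * (y - x_pred)
            p = (1 - k) * p_pred
            estimates.append(x)
        return estimates

    est = kalman_1d([1.0, 1.2, 0.9, 1.1])
    print([round(e, 3) for e in est])  # estimates converge toward the noisy track
    ```

    In the paper, the same estimation problem is solved not with these explicit equations but by a recurrent network operating on population codes.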

  13. Analysis of Memory Codes and Cumulative Rehearsal in Observational Learning

    Science.gov (United States)

    Bandura, Albert; And Others

    1974-01-01

    The present study examined the influence of memory codes varying in meaningfulness and retrievability and cumulative rehearsal on retention of observationally learned responses over increasing temporal intervals. (Editor)

  14. Quick Response (QR) Codes for Audio Support in Foreign Language Learning

    Science.gov (United States)

    Vigil, Kathleen Murray

    2017-01-01

    This study explored the potential benefits and barriers of using quick response (QR) codes as a means by which to provide audio materials to middle-school students learning Spanish as a foreign language. Eleven teachers of Spanish to middle-school students created transmedia materials containing QR codes linking to audio resources. Students…

  15. Optimal interference code based on machine learning

    Science.gov (United States)

    Qian, Ye; Chen, Qian; Hu, Xiaobo; Cao, Ercong; Qian, Weixian; Gu, Guohua

    2016-10-01

    In this paper, we analyze the characteristics of pseudo-random codes, taking the m sequence as a case study. Building on coding theory, we introduce jamming methods and simulate the interference effect with a probability model in MATLAB. Based on the length of decoding time the adversary spends, we find the optimal formula and optimal coefficients using machine learning, and thereby obtain a new optimal interference code. First, in the recognition phase, the study judges the effect of interference by simulating the decoding time of the laser seeker. Next, laser active deception jamming is used to simulate the interference process in the tracking phase. To improve interference performance, the model is simulated in MATLAB to find the smallest number of pulse intervals that must be received, from which the precise interval number of the laser pointer for m-sequence encoding can be determined. To find the shortest interval, the greatest-common-divisor method is chosen. Then, combining this with the coding regularity found earlier, the pulse intervals of the received pseudo-random code are restored. Finally, the time period of the laser interference can be controlled, the optimal interference code obtained, and the probability of successful interference increased.
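    The m sequence at the center of this analysis is conventionally generated with a linear-feedback shift register (LFSR); a minimal sketch follows (the 4-bit register length and tap positions are illustrative assumptions, not the paper's parameters):

    ```python
    def m_sequence(taps=(4, 3), nbits=4):
        """Generate one period of an m-sequence from a Fibonacci LFSR with the
        given feedback taps (1-indexed from the left). For a maximal-length
        feedback polynomial the period is 2**nbits - 1."""
        state = [1] * nbits          # any nonzero seed works
        out = []
        for _ in range(2 ** nbits - 1):
            out.append(state[-1])    # output the rightmost bit
            fb = 0
            for t in taps:           # XOR the tapped bits for the feedback
                fb ^= state[t - 1]
            state = [fb] + state[:-1]  # shift right, feedback enters on the left
        return out

    seq = m_sequence()
    # One full period of a maximal-length sequence has 2**(n-1) ones.
    print(len(seq), sum(seq))
    ```

    The pulse-interval regularities exploited in the paper (e.g. via the greatest common divisor of observed intervals) derive from exactly this kind of deterministic structure.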

  16. Current and anticipated uses of thermal hydraulic codes in Korea

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kyung-Doo; Chang, Won-Pyo [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1997-07-01

    In Korea, the current uses of thermal hydraulic codes fall into three areas. The first is the design of both nuclear fuel and the NSSS; these codes have usually been introduced through technology transfer programs agreed between KAERI and foreign vendors. The second is the utility's support of plant operations and licensing. The third is research, where assessments and applications to safety issue resolutions are the major activities, using best estimate thermal hydraulic codes such as RELAP5/MOD3 and CATHARE2. Recently, KEPCO has planned to couple thermal hydraulic codes with a neutronics code for the design of an evolutionary-type reactor by 2004. KAERI also plans to develop its own best estimate thermal hydraulic code, although its application range differs from that of the code KEPCO is developing. Considering these activities, it is anticipated that a best estimate hydraulic analysis code developed in Korea may be used for safety evaluation within 10 years.

  17. Supporting Situated Learning Based on QR Codes with Etiquetar App: A Pilot Study

    Science.gov (United States)

    Camacho, Miguel Olmedo; Pérez-Sanagustín, Mar; Alario-Hoyos, Carlos; Soldani, Xavier; Kloos, Carlos Delgado; Sayago, Sergio

    2014-01-01

    EtiquetAR is an authoring tool for supporting the design and enactment of situated learning experiences based on QR tags. Practitioners use etiquetAR for creating, managing and personalizing collections of QR codes with special properties: (1) codes can have more than one link pointing at different multimedia resources, (2) codes can be updated…

  18. Predictive codes of familiarity and context during the perceptual learning of facial identities

    Science.gov (United States)

    Apps, Matthew A. J.; Tsakiris, Manos

    2013-11-01

    Face recognition is a key component of successful social behaviour. However, the computational processes that underpin perceptual learning and recognition as faces transition from unfamiliar to familiar are poorly understood. In predictive coding, learning occurs through prediction errors that update stimulus familiarity, but recognition is a function of both stimulus and contextual familiarity. Here we show that behavioural responses on a two-option face recognition task can be predicted by the level of contextual and facial familiarity in a computational model derived from predictive-coding principles. Using fMRI, we show that activity in the superior temporal sulcus varies with the contextual familiarity in the model, whereas activity in the fusiform face area covaries with the prediction error parameter that updated facial familiarity. Our results characterize the key computations underpinning the perceptual learning of faces, highlighting that the functional properties of face-processing areas conform to the principles of predictive coding.
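    The prediction-error update of familiarity described here can be sketched as a simple delta rule (the learning rate and the 0-1 coding of outcomes are assumptions for illustration, not the paper's fitted model):

    ```python
    def update_familiarity(familiarity: float, observed: float,
                           lr: float = 0.2) -> float:
        """Delta-rule update: familiarity moves toward the observed outcome
        in proportion to the prediction error (observed - familiarity)."""
        prediction_error = observed - familiarity
        return familiarity + lr * prediction_error

    # Repeated exposures to the same face (observed = 1.0) drive familiarity up,
    # while the prediction error shrinks on each encounter.
    f = 0.0
    for _ in range(10):
        f = update_familiarity(f, 1.0)
    print(round(f, 3))  # approaches 1.0 as prediction errors vanish
    ```

    In the paper's model, an analogous error term (tracked in fusiform face area activity) updates facial familiarity, while a separate contextual-familiarity term shapes recognition.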

  19. Report on the Current Technical Issues on ASME Nuclear Code and Standard

    International Nuclear Information System (INIS)

    Koo, Gyeong Hoi; Lee, B. S.; Yoo, S. H.

    2008-11-01

    This report describes the analysis of the current revision activities related to the mechanical design issues of the U.S. ASME nuclear code and standard. ASME nuclear mechanical design in this report covers nuclear materials, the primary system, the secondary system, and high-temperature reactors. The report includes countermeasures, based on the ASME Code meetings, for the current issues in each major field. The KAMC (ASME Mirror Committee) of this project aims to reflect the standpoint of the domestic nuclear industry on ASME nuclear mechanical design and to play a technical bridge role for the domestic nuclear industry in the application of ASME Codes.

  20. [Transposition errors during learning to reproduce a sequence by the right- and the left-hand movements: simulation of positional and movement coding].

    Science.gov (United States)

    Liakhovetskiĭ, V A; Bobrova, E V; Skopin, G N

    2012-01-01

    Transposition errors during the reproduction of a hand movement sequence make it possible to obtain important information on the internal representation of this sequence in motor working memory. Analysis of such errors showed that learning to reproduce sequences of left-hand movements improves the system of positional coding (coding of positions), while learning of right-hand movements improves the system of vector coding (coding of movements). Learning of right-hand movements after left-hand performance involved the system of positional coding "imposed" by the left hand. Learning of left-hand movements after right-hand performance activated the system of vector coding. Transposition errors during learning to reproduce movement sequences can be explained by a neural network using either vector coding or both vector and positional coding.

  1. Current and anticipated uses of the CATHARE code at EDF and FRAMATOME

    Energy Technology Data Exchange (ETDEWEB)

    Gandrille, J.L.; Vacher, J.L.; Poizat, F.

    1997-07-01

    This paper presents current industrial applications of the CATHARE code in the fields of Safety Studies and Simulators where the code is intensively used by FRAMATOME, EDF and CEA, the development partners of CATHARE. Future needs in these fields are also recapitulated.

  2. Current and anticipated uses of the CATHARE code at EDF and FRAMATOME

    International Nuclear Information System (INIS)

    Gandrille, J.L.; Vacher, J.L.; Poizat, F.

    1997-01-01

    This paper presents current industrial applications of the CATHARE code in the fields of Safety Studies and Simulators where the code is intensively used by FRAMATOME, EDF and CEA, the development partners of CATHARE. Future needs in these fields are also recapitulated

  3. Current and anticipated use of thermal-hydraulic codes for BWR transient and accident analyses in Japan

    Energy Technology Data Exchange (ETDEWEB)

    Arai, Kenji; Ebata, Shigeo [Toshiba Corp., Yokohama (Japan)

    1997-07-01

    This paper summarizes the current and anticipated use of thermal-hydraulic and neutronic codes for BWR transient and accident analyses in Japan. The codes may be categorized into licensing codes and best estimate codes. Most of the licensing codes were originally developed by General Electric; some have been updated based on the technical knowledge obtained in thermal-hydraulic studies in Japan and according to BWR design changes. The best estimate codes have been used to support the licensing calculations and to obtain a phenomenological understanding of the thermal-hydraulic phenomena during a BWR transient or accident. The best estimate codes can also be applied to design studies for a next-generation BWR, to which the current licensing models may not be directly applicable. In order to rationalize the margin included in the current BWR design and develop a next-generation reactor with an appropriate design margin, it will be necessary to improve the accuracy of the thermal-hydraulic and neutronic models. In addition, the current best estimate codes will need improvements in their user interfaces and numerics.

  4. Code-Switching Functions in Modern Hebrew Teaching and Learning

    Science.gov (United States)

    Gilead, Yona

    2016-01-01

    The teaching and learning of Modern Hebrew outside of Israel is essential to Jewish education and identity. One of the most contested issues in Modern Hebrew pedagogy is the use of code-switching between Modern Hebrew and learners' first language. Moreover, this is one of the longest running disputes in the broader field of second language…

  5. Machine-learning-assisted correction of correlated qubit errors in a topological code

    Directory of Open Access Journals (Sweden)

    Paul Baireuther

    2018-01-01

    Full Text Available A fault-tolerant quantum computation requires an efficient means to detect and correct errors that accumulate in encoded quantum information. In the context of machine learning, neural networks are a promising new approach to quantum error correction. Here we show that a recurrent neural network can be trained, using only experimentally accessible data, to detect errors in a widely used topological code, the surface code, with a performance above that of the established minimum-weight perfect matching (or blossom) decoder. The performance gain is achieved because the neural network decoder can detect correlations between bit-flip (X) and phase-flip (Z) errors. The machine learning algorithm adapts to the physical system, hence no noise model is needed. The long short-term memory layers of the recurrent neural network maintain their performance over a large number of quantum error correction cycles, making it a practical decoder for forthcoming experimental realizations of the surface code.
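    The syndrome-based error correction that the neural decoder learns can be illustrated in miniature with a classical repetition code (a drastic simplification of the surface code, for illustration only):

    ```python
    def repetition_syndrome(bits):
        """Syndrome of the n-bit repetition code: parity of each adjacent pair.
        An all-zero syndrome means no detectable error; a nonzero entry
        localizes a bit flip, much as surface-code stabilizers do."""
        return [bits[i] ^ bits[i + 1] for i in range(len(bits) - 1)]

    def majority_decode(bits):
        """Correct bit-flip errors by majority vote: the repetition-code
        analogue of a minimum-weight decoder."""
        return int(sum(bits) > len(bits) // 2)

    received = [1, 0, 1, 1, 1]            # logical 1 with one bit flipped
    print(repetition_syndrome(received))  # nonzero entries flag the flip
    print(majority_decode(received))      # → 1
    ```

    A surface-code decoder faces the much harder task of matching many such syndrome events in two dimensions, which is where minimum-weight perfect matching and the neural approach come in.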

  6. Imitation Learning Based on an Intrinsic Motivation Mechanism for Efficient Coding

    Directory of Open Access Journals (Sweden)

    Jochen Triesch

    2013-11-01

    Full Text Available A hypothesis regarding the development of imitation learning is presented that is rooted in intrinsic motivations. It is derived from a recently proposed form of intrinsically motivated learning (IML) for efficient coding in active perception, wherein an agent learns to perform actions with its sense organs to facilitate efficient encoding of the sensory data. To this end, actions of the sense organs that improve the encoding of the sensory data trigger an internally generated reinforcement signal. Here it is argued that the same IML mechanism might also support the development of imitation when general actions beyond those of the sense organs are considered: the learner first observes a tutor performing a behavior and learns a model of the behavior's sensory consequences. The learner then acts itself and receives an internally generated reinforcement signal reflecting how well the sensory consequences of its own behavior are encoded by the sensory model. Actions that are more similar to those of the tutor will lead to sensory signals that are easier to encode and produce a higher reinforcement signal. Through this, the learner's behavior is progressively tuned to make the sensory consequences of its actions match the learned sensory model. I discuss this mechanism in the context of human language acquisition and bird song learning, where similar ideas have been proposed. The suggested mechanism also offers an account for the development of mirror neurons and makes a number of predictions. Overall, it establishes a connection between principles of efficient coding, intrinsic motivations, and imitation.

  7. Deep Learning Methods for Improved Decoding of Linear Codes

    Science.gov (United States)

    Nachmani, Eliya; Marciano, Elad; Lugosch, Loren; Gross, Warren J.; Burshtein, David; Be'ery, Yair

    2018-02-01

    The problem of low-complexity, close to optimal channel decoding of linear codes with short to moderate block length is considered. It is shown that deep learning methods can be used to improve a standard belief propagation decoder, despite the large example space. Similar improvements are obtained for the min-sum algorithm. It is also shown that tying the parameters of the decoders across iterations, so as to form a recurrent neural network architecture, can be implemented with comparable results; the advantage is that significantly fewer parameters are required. We also introduce a recurrent neural decoder architecture based on the method of successive relaxation. Improvements over standard belief propagation are also observed on sparser Tanner graph representations of the codes. Furthermore, we demonstrate that the neural belief propagation decoder can be used to improve the performance, or alternatively reduce the computational complexity, of a close to optimal decoder of short BCH codes.
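    As a point of reference for what these learned decoders improve upon, classical hard-decision syndrome decoding of a short linear code can be sketched as follows (the (7,4) Hamming code here is an illustrative stand-in, not one of the paper's BCH codes):

    ```python
    # Parity-check matrix of the (7,4) Hamming code; column i is the binary
    # representation of i+1, so the syndrome directly names the error position.
    H = [[0, 0, 0, 1, 1, 1, 1],
         [0, 1, 1, 0, 0, 1, 1],
         [1, 0, 1, 0, 1, 0, 1]]

    def decode_hamming(received):
        """Hard-decision syndrome decoding: compute s = H·r (mod 2); a nonzero
        syndrome gives the 1-based position of a single bit error."""
        s = [sum(h * r for h, r in zip(row, received)) % 2 for row in H]
        pos = s[0] * 4 + s[1] * 2 + s[2]     # read the syndrome as binary
        corrected = received[:]
        if pos:
            corrected[pos - 1] ^= 1          # flip the indicated bit
        return corrected

    codeword = [0, 0, 0, 0, 0, 0, 0]
    noisy = codeword[:]
    noisy[4] ^= 1                            # flip bit 5
    print(decode_hamming(noisy) == codeword) # → True
    ```

    Belief propagation and its neural variants instead pass soft (probabilistic) messages over the Tanner graph of H, which is what allows gains over this hard-decision baseline.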

  8. Current status of the transient integral fuel element performance code URANUS

    International Nuclear Information System (INIS)

    Preusser, T.; Lassmann, K.

    1983-01-01

    To investigate the behavior of fuel pins during normal and off-normal operation, the integral fuel rod code URANUS has been extended to include a transient version. The paper describes the current status of the program system, including a presentation of newly developed models for hypothetical accident investigation. The main objective of current development work is to improve the modelling of fuel and clad material behavior during fast transients. URANUS allows detailed analysis of experiments until the onset of strong material transport phenomena. Transient fission gas analysis is carried out through coupling with a special version of the LANGZEIT-KURZZEIT code (KfK). Fuel restructuring and grain growth kinetics models have been improved recently to better characterize pre-experimental steady-state operation; transient models are under development. Extensive verification of the new version has been carried out by comparison with analytical solutions, experimental evidence, and code-to-code evaluation studies. URANUS, with all these improvements, has been successfully applied to difficult fast breeder fuel rod analyses including TOP, LOF, TUCOP, local coolant blockage, and specific carbide fuel experiments. The objective of further studies is the description of transient PCMI. It is expected that the results of these developments will contribute significantly to the understanding of fuel element structural behavior during severe transients. (orig.)

  9. Representing high-dimensional data to intelligent prostheses and other wearable assistive robots: A first comparison of tile coding and selective Kanerva coding.

    Science.gov (United States)

    Travnik, Jaden B; Pilarski, Patrick M

    2017-07-01

    Prosthetic devices have advanced in their capabilities and in the number and type of sensors included in their design. As the space of sensorimotor data available to a conventional or machine learning prosthetic control system increases in dimensionality and complexity, it becomes increasingly important that this data be represented in a useful and computationally efficient way. Well-structured sensory data allows prosthetic control systems to make informed, appropriate control decisions. In this study, we explore the impact that increased sensorimotor information has on current machine learning prosthetic control approaches. Specifically, we examine the effect that high-dimensional sensory data has on the computation time and prediction performance of a true-online temporal-difference learning prediction method as embedded within a resource-limited upper-limb prosthesis control system. We present results comparing tile coding, the dominant linear representation for real-time prosthetic machine learning, with a newly proposed modification to Kanerva coding that we call selective Kanerva coding. In addition to showing promising results for selective Kanerva coding, our results confirm potential limitations to tile coding as the number of sensory input dimensions increases. To our knowledge, this study is the first to explicitly examine representations for real-time machine learning prosthetic devices in general terms. This work therefore provides an important step towards forming an efficient prosthesis-eye view of the world, wherein prompt and accurate representations of high-dimensional data may be provided to machine learning control systems within artificial limbs and other assistive rehabilitation technologies.
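    Tile coding, the baseline representation compared here, can be sketched in a few lines for a scalar input (the tiling counts, tile sizes, and offsets below are illustrative assumptions):

    ```python
    def tile_code(x, low=0.0, high=1.0, n_tilings=4, n_tiles=8):
        """Tile coding of a scalar input: each tiling partitions [low, high)
        into n_tiles intervals, with each successive tiling offset by a
        fraction of a tile width. Returns the indices of the active tiles
        (exactly one per tiling), suitable as sparse binary features."""
        width = (high - low) / n_tiles
        active = []
        for t in range(n_tilings):
            offset = t * width / n_tilings          # stagger the tilings
            idx = int((x - low + offset) / width)
            idx = min(idx, n_tiles)                 # clamp the offset overflow
            active.append(t * (n_tiles + 1) + idx)  # unique index per tiling
        return active

    print(tile_code(0.5))  # one active tile per tiling; nearby x share tiles
    ```

    The memory cost grows exponentially with input dimension when tilings span all dimensions jointly, which is the limitation that motivates Kanerva-style alternatives such as the paper's selective Kanerva coding.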

  10. ASME nuclear codes and standards: Scope of coverage and current initiatives

    International Nuclear Information System (INIS)

    Eisenberg, G. M.

    1995-01-01

    The objective of this paper is to address the broad scope of coverage of nuclear codes, standards and guides produced and administered by the American Society of Mechanical Engineers (ASME). Background information is provided regarding the evolution of the present activities. Details are provided on current initiatives intended to permit ASME to meet the needs of a changing nuclear industry on a worldwide scale. During the early years of commercial nuclear power, ASME produced a code for the construction of nuclear vessels used in the reactor coolant pressure boundary, containment and auxiliary systems. In response to industry growth, ASME Code coverage soon broadened to include rules for construction of other nuclear components, and inservice inspection of nuclear reactor coolant systems. In the years following this, the scope of ASME nuclear codes, standards and guides has been broadened significantly to include air cleaning activities for nuclear power reactors, operation and maintenance of nuclear power plants, quality assurance programs, cranes for nuclear facilities, qualification of mechanical equipment, and concrete reactor vessels and containments. ASME focuses on globalization of its codes, standards and guides by encouraging and promoting their use in the international community and by actively seeking participation of international members on its technical and supervisory committees and in accreditation activities. Details are provided on current international representation. Initiatives are underway to separate the technical requirements from administrative and enforcement requirements, to convert to hard metric units, to provide for non-U. S. materials, and to provide for translations into non-English languages. ASME activity as an accredited ISO 9000 registrar for suppliers of mechanical equipment is described. Rules are being developed for construction of containment systems for nuclear spent fuel and high-level waste transport packagings. 

  11. Current and anticipated uses of thermal-hydraulic codes in Spain

    Energy Technology Data Exchange (ETDEWEB)

    Pelayo, F.; Reventos, F. [Consejo de Seguridad Nuclear, Barcelona (Spain)

    1997-07-01

    Spanish activities in the field of applied thermal-hydraulics are steadily increasing as the codes become practicable enough to efficiently sustain engineering decisions in the nuclear power industry; a great deal of effort has been devoted to reaching this point. This paper briefly describes this process, points to the current applications, and draws conclusions on their limitations. Finally, it identifies the applications where the use of T-H codes would be worthwhile in the future, which in turn implies further development of the codes to widen their scope of application and improve their general performance. Reflecting the different uses of the codes, the applications mainly come from the regulatory authority, industry, universities, and research institutions. The main conclusion of this paper is that further code development is justified if the following requisites are considered: (1) the safety relevance of scenarios not presently covered is established; (2) a substantial gain in margins, or the capability to use realistic assumptions, is obtained; (3) a general consensus on licensability and the methodology for application is reached. The role of the Regulatory Body is stressed, as the most relevant outcome of the project may be related to the evolution of the licensing frame.

  12. Remembering to learn: independent place and journey coding mechanisms contribute to memory transfer.

    Science.gov (United States)

    Bahar, Amir S; Shapiro, Matthew L

    2012-02-08

    The neural mechanisms that integrate new episodes with established memories are unknown. When rats explore an environment, CA1 cells fire in place fields that indicate locations. In goal-directed spatial memory tasks, some place fields differentiate behavioral histories ("journey-dependent" place fields) while others do not ("journey-independent" place fields). To investigate how these signals inform learning and memory for new and familiar episodes, we recorded CA1 and CA3 activity in rats trained to perform a "standard" spatial memory task in a plus maze and in two new task variants. A "switch" task exchanged the start and goal locations in the same environment; an "altered environment" task contained unfamiliar local and distal cues. In the switch task, performance was mildly impaired, new firing maps were stable, but the proportion and stability of journey-dependent place fields declined. In the altered environment, overall performance was strongly impaired, new firing maps were unstable, and stable proportions of journey-dependent place fields were maintained. In both tasks, memory errors were accompanied by a decline in journey codes. The different dynamics of place and journey coding suggest that they reflect separate mechanisms and contribute to distinct memory computations. Stable place fields may represent familiar relationships among environmental features that are required for consistent memory performance. Journey-dependent activity may correspond with goal-directed behavioral sequences that reflect expectancies that generalize across environments. The complementary signals could help link current events with established memories, so that familiarity with either a behavioral strategy or an environment can inform goal-directed learning.

  13. The use of QR Code as a learning technology: an exploratory study

    Directory of Open Access Journals (Sweden)

    Stefano Besana

    2010-12-01

    Full Text Available This paper discusses a pilot study on the potential benefits of QR (Quick Response) codes as a tool for facilitating and enhancing learning processes. An analysis is given of the strengths and added value of QR technologies applied to museum visits, together with caveats regarding the design of learning environments like the one presented. Some possible future scenarios are identified for implementing these technologies in contexts more strictly related to teaching and education.

  14. International pressure vessels and piping codes and standards. Volume 2: Current perspectives; PVP-Volume 313-2

    International Nuclear Information System (INIS)

    Rao, K.R.; Asada, Yasuhide; Adams, T.M.

    1995-01-01

    The topics in this volume include: (1) Recent or imminent changes to Section 3 design sections; (2) Select perspectives of ASME Codes -- Section 3; (3) Select perspectives of Boiler and Pressure Vessel Codes -- an international outlook; (4) Select perspectives of Boiler and Pressure Vessel Codes -- ASME Code Sections 3, 8 and 11; (5) Codes and Standards perspectives for analysis; (6) Selected design perspectives on flow-accelerated corrosion and pressure vessel design and qualification; (7) Select Codes and Standards perspectives for design and operability; (8) Codes and Standards perspectives for operability; (9) What's new in the ASME Boiler and Pressure Vessel Code?; (10) A look at ongoing activities of ASME Sections 2 and 3; (11) A look at current activities of ASME Section 11; (12) A look at current activities of ASME Codes and Standards; (13) Simplified design methodology and design allowable stresses -- 1 and 2; (14) Introduction to Power Boilers, Section 1 of the ASME Code -- Parts 1 and 2. Separate abstracts were prepared for most of the individual papers.

  15. Current Trends in Higher Education Learning and Teaching

    Science.gov (United States)

    Singh, R. J.

    2012-01-01

    Current trends in higher education learning and teaching focus on the use of technology, integrated learning through "blended learning" and writing for academic purposes. This introductory article initiates the debate around the context of South African higher education teaching and learning. It does so by contextualizing the South…

  16. Current status of the reactor physics code WIMS and recent developments

    International Nuclear Information System (INIS)

    Lindley, B.A.; Hosking, J.G.; Smith, P.J.; Powney, D.J.; Tollit, B.S.; Newton, T.D.; Perry, R.; Ware, T.C.; Smith, P.N.

    2017-01-01

    Highlights: • The current status of the WIMS reactor physics code is presented. • Applications range from 2D lattice calculations up to 3D whole core geometries. • Gamma transport and thermal-hydraulic feedback models added. • Calculation methodologies described for several Gen II, III and IV reactor types. - Abstract: The WIMS modular reactor physics code has been under continuous development for over fifty years. This paper discusses the current status of WIMS and recent developments, in particular developments to the resonance shielding methodology and 3D transport solvers. Traditionally, WIMS is used to perform 2D lattice calculations, typically to generate homogenized reactor physics parameters for a whole core code such as PANTHER. However, with increasing computational resources there has been a growing trend for performing transport calculations on larger problems, up to and including 3D full core models. To this end, a number of the WIMS modules have been parallelised to allow efficient performance for whole core calculations, and WIMS includes a 3D method of characteristics solver with reflective and once-through tracking methods, which can be used to analyse problems of varying size and complexity. A time-dependent flux solver has been incorporated and thermal-hydraulic modelling capability is also being added to allow steady-state and transient coupled calculations to be performed. WIMS has been validated against a range of experimental data and other codes, in particular for water and graphite moderated thermal reactors. Future developments will include improved parallelisation, enhancing the thermal-hydraulic feedback models and validating the WIMS/PANTHER code system for BWRs and fast reactors.

  17. High Order Tensor Formulation for Convolutional Sparse Coding

    KAUST Repository

    Bibi, Adel Aamer; Ghanem, Bernard

    2017-01-01

    Convolutional sparse coding (CSC) has gained attention for its successful role as a reconstruction and a classification tool in the computer vision and machine learning community. Current CSC methods can only reconstruct single-feature 2D images

  18. Stitching Codeable Circuits: High School Students' Learning About Circuitry and Coding with Electronic Textiles

    Science.gov (United States)

    Litts, Breanne K.; Kafai, Yasmin B.; Lui, Debora A.; Walker, Justice T.; Widman, Sari A.

    2017-10-01

    Learning about circuitry by connecting a battery, light bulb, and wires is a common activity in many science classrooms. In this paper, we expand students' learning about circuitry with electronic textiles, which use conductive thread instead of wires and sewable LEDs instead of lightbulbs, by integrating programming sensor inputs and light outputs and examining how the two domains interact. We implemented an electronic textiles unit with 23 high school students ages 16-17 years who learned how to craft and code circuits with the LilyPad Arduino, an electronic textile construction kit. Our analyses not only confirm significant increases in students' understanding of functional circuits but also showcase students' ability in designing and remixing program code for controlling circuits. In our discussion, we address opportunities and challenges of introducing codeable circuit design for integrating maker activities that include engineering and computing into classrooms.

  19. Current Status of the Elevated Temperature Structure Design Codes for VHTR

    International Nuclear Information System (INIS)

    Kim, Jong-Bum; Kim, Seok-Hoon; Park, Keun-Bae; Lee, Won-Jae

    2006-01-01

    An elevated temperature structure design and analysis is one of the key issues in the VHTR (Very High Temperature Reactor) project to achieve an economic production of hydrogen, which will be an essential energy source for the near future. Since the operating temperature of a VHTR is above 850 °C, the existing codes and standards are insufficient for a high temperature structure design. Thus the issues concerning material selection and behavior are being studied for the main structural components of a VHTR in leading countries such as the US, France, UK, and Japan. In this study, the current status of the ASME Code, the French RCC-MR, the UK R5, and the Japanese code was investigated and the necessary R&D items were discussed.

  20. The current status of cyanobacterial nomenclature under the "prokaryotic" and the "botanical" code.

    Science.gov (United States)

    Oren, Aharon; Ventura, Stefano

    2017-10-01

    Cyanobacterial taxonomy developed in the botanical world because Cyanobacteria/Cyanophyta have traditionally been identified as algae. However, they possess a prokaryotic cell structure, and phylogenetically they belong to the Bacteria. This caused nomenclature problems as the provisions of the International Code of Nomenclature for algae, fungi, and plants (ICN; the "Botanical Code") differ from those of the International Code of Nomenclature of Prokaryotes (ICNP; the "Prokaryotic Code"). While the ICN recognises names validly published under the ICNP, Article 45(1) of the ICN has not yet been reciprocated in the ICNP. Different solutions have been proposed to solve the current problems. In 2012 a Special Committee on the harmonisation of the nomenclature of Cyanobacteria was appointed, but its activity has been minimal. Two opposing proposals to regulate cyanobacterial nomenclature were recently submitted, one calling for deletion of the cyanobacteria from the groups of organisms whose nomenclature is regulated by the ICNP, the second to consistently apply the rules of the ICNP to all cyanobacteria. Following a general overview of the current status of cyanobacterial nomenclature under the two codes we present five case studies of genera for which nomenclatural aspects have been discussed in recent years: Microcystis, Planktothrix, Halothece, Gloeobacter and Nostoc.

  1. A commentary on the current status and the future role of the European accident code

    International Nuclear Information System (INIS)

    Butland, A.T.D.

    1990-01-01

    This paper describes the history of the project to produce the European Accident code (EAC), leading to the planned release of a version of EAC-2 at the end of 1989. The requirements of a computer code to model the initiation phase of Hypothetical Core Disruptive Accidents (HCDAs) are discussed, paying particular attention to the lessons learnt in the CABRI project. The current status and content of the EAC-2 code are examined in relation to these requirements, noting how the sophisticated modelling plans for EAC-2 make it a benchmark code. The validation status of EAC-2 and future plans are discussed, noting that currently it consists solely of stand-alone validation of the modules used in EAC-2, rather than validation of the combined code. The future role of EAC-2 is briefly discussed in relation to the fast reactor plans in the EEC countries. (author)

  2. Semi-supervised sparse coding

    KAUST Repository

    Wang, Jim Jing-Yan; Gao, Xin

    2014-01-01

    Sparse coding approximates the data sample as a sparse linear combination of some basic codewords and uses the sparse codes as new representations. In this paper, we investigate learning discriminative sparse codes by sparse coding in a semi-supervised manner, where only a few training samples are labeled. By using the manifold structure spanned by the data set of both labeled and unlabeled samples and the constraints provided by the labels of the labeled samples, we learn the variable class labels for all the samples. Furthermore, to improve the discriminative ability of the learned sparse codes, we assume that the class labels could be predicted from the sparse codes directly using a linear classifier. By solving the codebook, sparse codes, class labels and classifier parameters simultaneously in a unified objective function, we develop a semi-supervised sparse coding algorithm. Experiments on two real-world pattern recognition problems demonstrate the advantage of the proposed methods over supervised sparse coding methods on partially labeled data sets.

  3. Semi-supervised sparse coding

    KAUST Repository

    Wang, Jim Jing-Yan

    2014-07-06

    Sparse coding approximates the data sample as a sparse linear combination of some basic codewords and uses the sparse codes as new representations. In this paper, we investigate learning discriminative sparse codes by sparse coding in a semi-supervised manner, where only a few training samples are labeled. By using the manifold structure spanned by the data set of both labeled and unlabeled samples and the constraints provided by the labels of the labeled samples, we learn the variable class labels for all the samples. Furthermore, to improve the discriminative ability of the learned sparse codes, we assume that the class labels could be predicted from the sparse codes directly using a linear classifier. By solving the codebook, sparse codes, class labels and classifier parameters simultaneously in a unified objective function, we develop a semi-supervised sparse coding algorithm. Experiments on two real-world pattern recognition problems demonstrate the advantage of the proposed methods over supervised sparse coding methods on partially labeled data sets.
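    The core step the two records above build on is plain sparse coding: learn a codebook and approximate each sample as a sparse linear combination of its atoms. A minimal sketch (not the authors' semi-supervised algorithm) using scikit-learn's `DictionaryLearning` on synthetic data:

```python
import numpy as np
from sklearn.decomposition import DictionaryLearning

rng = np.random.default_rng(0)
X = rng.normal(size=(80, 16))  # 80 synthetic samples, 16 features

# Learn an overcomplete codebook of 24 atoms; the lasso-based transform
# yields sparse codes that serve as the new representation of each sample.
dl = DictionaryLearning(n_components=24, alpha=1.0,
                        transform_algorithm="lasso_lars", random_state=0)
codes = dl.fit_transform(X)

print(codes.shape)  # one 24-dimensional sparse code per sample
sparsity = float((codes == 0).mean())
print(f"fraction of zero coefficients: {sparsity:.2f}")
```

    The semi-supervised variant in the record additionally couples this objective with label propagation and a linear classifier; the sketch shows only the shared codebook/sparse-code step.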

  4. Accuracy comparison among different machine learning techniques for detecting malicious codes

    Science.gov (United States)

    Narang, Komal

    2016-03-01

    In this paper, a machine learning based model for malware detection is proposed. It can detect newly released malware, i.e., zero-day attacks, by analyzing operation codes on the Android operating system. The accuracies of Naïve Bayes, Support Vector Machine (SVM) and Neural Network classifiers for detecting malicious code have been compared for the proposed model. In the experiment, 400 benign files, 100 system files and 500 malicious files were used to construct the model. The model yields the best accuracy, 88.9%, when a neural network is used as the classifier, and achieves 95% sensitivity and 82.8% specificity.
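    The comparison described above can be illustrated with scikit-learn. This is only a sketch: the synthetic features below stand in for the opcode-frequency features of the study, so the accuracies will not match the reported 88.9%.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

# Synthetic stand-in for opcode-frequency features (benign vs. malicious).
X, y = make_classification(n_samples=1000, n_features=30, n_informative=10,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Fit the three classifier families compared in the record and report
# held-out accuracy for each.
results = {}
for name, clf in [("Naive Bayes", GaussianNB()),
                  ("SVM", SVC()),
                  ("Neural Network", MLPClassifier(max_iter=500, random_state=0))]:
    clf.fit(X_tr, y_tr)
    results[name] = accuracy_score(y_te, clf.predict(X_te))
    print(f"{name}: {results[name]:.3f}")
```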

  5. Current issues in billing and coding in interventional pain medicine.

    Science.gov (United States)

    Manchikanti, L

    2000-10-01

    Interventional pain management is a dynamic field with changes occurring on a daily basis, not only in technology but also in regulations that have a substantial financial impact on practices. Regulations are imposed not only by the federal government and other regulatory agencies, but also by a multitude of other payors, state governments and medical boards. Documentation of medical necessity with coding that correlates with multiple components of the patient's medical record, operative report, and billing statement is extremely important. Numerous changes which have occurred in the practice of interventional pain management in the new millennium continue to impact the financial viability of interventional pain practices along with patient access to these services. Thus, while complying with regulations of billing, coding and proper, effective, and ethical practice of pain management, it is also essential for physicians to understand financial aspects and the impact of various practice patterns. This article provides guidelines which are meant to offer practical considerations for billing and coding of interventional techniques in the management of chronic pain based on the current state of the art and science of interventional pain management. Hence, these guidelines do not constitute inflexible treatment, coding, billing or documentation recommendations. It is expected that a provider will establish a plan of care on a case-by-case basis taking into account an individual patient's medical condition, personal needs, and preferences, along with the physician's experience, and that billing and coding practices will be developed in a similar manner. Based on an individual patient's needs, treatment, billing and coding different from what is outlined here are not only warranted but essential.

  6. Code-switching as a communication, learning, and social negotiation strategy in first-year learners of Danish

    DEFF Research Database (Denmark)

    Arnfast, Juni Søderberg; Jørgensen, Jens Normann

    2003-01-01

    The term code-switching is used in two related, yet different fields of linguistics: Second Language Acquisition and bilingual studies. In the former, code-switching is analyzed in terms of learning strategies, whereas the latter applies the competence view. The present paper intends to detect the...

  7. Workshops and problems for benchmarking eddy current codes

    International Nuclear Information System (INIS)

    Turner, L.R.; Davey, K.; Ida, N.; Rodger, D.; Kameari, A.; Bossavit, A.; Emson, C.R.I.

    1988-02-01

    A series of six workshops was held to compare eddy current codes, using six benchmark problems. The problems include transient and steady-state ac magnetic fields, close and far boundary conditions, magnetic and non-magnetic materials. All the problems are based either on experiments or on geometries that can be solved analytically. The workshops and solutions to the problems are described. Results show that many different methods and formulations give satisfactory solutions, and that in many cases reduced dimensionality or coarse discretization can give acceptable results while reducing the computer time required. 13 refs., 1 tab

  8. U.S. Sodium Fast Reactor Codes and Methods: Current Capabilities and Path Forward

    Energy Technology Data Exchange (ETDEWEB)

    Brunett, A. J.; Fanning, T. H.

    2017-06-26

    The United States has extensive experience with the design, construction, and operation of sodium cooled fast reactors (SFRs) over the last six decades. Despite the closure of various facilities, the U.S. continues to dedicate research and development (R&D) efforts to the design of innovative experimental, prototype, and commercial facilities. Accordingly, in support of the rich operating history and ongoing design efforts, the U.S. has been developing and maintaining a series of tools with capabilities that envelope all facets of SFR design and safety analyses. This paper provides an overview of the current U.S. SFR analysis toolset, including codes such as SAS4A/SASSYS-1, MC2-3, SE2-ANL, PERSENT, NUBOW-3D, and LIFE-METAL, as well as the higher-fidelity tools (e.g. PROTEUS) being integrated into the toolset. Current capabilities of the codes are described and key ongoing development efforts are highlighted for some codes.

  9. Analysis of the Current Technical Issues on ASME Code and Standard for Nuclear Mechanical Design (2009)

    International Nuclear Information System (INIS)

    Koo, Gyeong Hoi; Lee, B. S.; Yoo, S. H.

    2009-11-01

    This report describes the analysis of current revision activities related to the mechanical design issues of the U.S. ASME nuclear codes and standards. ASME nuclear mechanical design in this report comprises nuclear materials, the primary system, the secondary system and high temperature reactors. The report includes countermeasures, based on ASME Code meetings, for the current issues in each major field. KAMC (the ASME Mirror Committee) of this project intends to reflect the standpoint of the domestic nuclear industry on ASME nuclear mechanical design and to play a technical bridging role for the domestic nuclear industry in the application of ASME Codes.

  10. Analysis of the Current Technical Issues on ASME Code and Standard for Nuclear Mechanical Design (2009)

    Energy Technology Data Exchange (ETDEWEB)

    Koo, Gyeong Hoi; Lee, B. S.; Yoo, S. H.

    2009-11-15

    This report describes the analysis of current revision activities related to the mechanical design issues of the U.S. ASME nuclear codes and standards. ASME nuclear mechanical design in this report comprises nuclear materials, the primary system, the secondary system and high temperature reactors. The report includes countermeasures, based on ASME Code meetings, for the current issues in each major field. KAMC (the ASME Mirror Committee) of this project intends to reflect the standpoint of the domestic nuclear industry on ASME nuclear mechanical design and to play a technical bridging role for the domestic nuclear industry in the application of ASME Codes.

  11. Simulations of vertical disruptions with VDE code: Hiro and Evans currents

    Science.gov (United States)

    Li, Xujing; Di Hu Team; Leonid Zakharov Team; Galkin Team

    2014-10-01

    The recently created numerical code VDE for simulations of vertical instability in tokamaks is presented. The code uses the Tokamak MHD model, where the plasma inertia is replaced by a friction force, together with an adaptive-grid numerical scheme. The code reproduces well the surface currents generated at the plasma boundary by the instability. Five regimes of the vertical instability are presented: (1) vertical instability in a given plasma shaping field without a wall; (2) the same with a wall and magnetic flux ΔΨ_X^pl < ΔΨ_X^wall (where X corresponds to the X-point of a separatrix); (3) the same with a wall and magnetic flux ΔΨ_X^pl > ΔΨ_X^wall; (4) vertical instability without a wall with a tile surface in the plasma path; (5) the same in the presence of a wall and a tile surface. The generation of negative Hiro currents along the tile surface, predicted earlier by theory and measured on EAST in 2012, is well reproduced by the simulations. In addition, the instability generates force-free Evans currents at the free plasma surface. A new pattern of reconnection of the plasma with the vacuum magnetic field is discovered. This work is supported by US DoE Contract No. DE-AC02-09-CH11466.

  12. DeepNet: An Ultrafast Neural Learning Code for Seismic Imaging

    International Nuclear Information System (INIS)

    Barhen, J.; Protopopescu, V.; Reister, D.

    1999-01-01

    A feed-forward multilayer neural net is trained to learn the correspondence between seismic data and well logs. The introduction of a virtual input layer, connected to the nominal input layer through a special nonlinear transfer function, enables ultrafast (single iteration), near-optimal training of the net using numerical algebraic techniques. A unique computer code, named DeepNet, has been developed that has achieved, in actual field demonstrations, results unattainable to date with industry-standard tools.

  13. Benchmarking of codes for electron cyclotron heating and electron cyclotron current drive under ITER conditions

    NARCIS (Netherlands)

    Prater, R.; Farina, D.; Gribov, Y.; Harvey, R. W.; Ram, A. K.; Lin-Liu, Y. R.; Poli, E.; Smirnov, A. P.; Volpe, F.; Westerhof, E.; Zvonkovo, A.

    2008-01-01

    Optimal design and use of electron cyclotron heating requires that accurate and relatively quick computer codes be available for prediction of wave coupling, propagation, damping and current drive at realistic levels of EC power. To this end, a number of codes have been developed in laboratories

  14. The Role of Architectural and Learning Constraints in Neural Network Models: A Case Study on Visual Space Coding.

    Science.gov (United States)

    Testolin, Alberto; De Filippo De Grazia, Michele; Zorzi, Marco

    2017-01-01

    The recent "deep learning revolution" in artificial neural networks had strong impact and widespread deployment for engineering applications, but the use of deep learning for neurocomputational modeling has been so far limited. In this article we argue that unsupervised deep learning represents an important step forward for improving neurocomputational models of perception and cognition, because it emphasizes the role of generative learning as opposed to discriminative (supervised) learning. As a case study, we present a series of simulations investigating the emergence of neural coding of visual space for sensorimotor transformations. We compare different network architectures commonly used as building blocks for unsupervised deep learning by systematically testing the type of receptive fields and gain modulation developed by the hidden neurons. In particular, we compare Restricted Boltzmann Machines (RBMs), which are stochastic, generative networks with bidirectional connections trained using contrastive divergence, with autoencoders, which are deterministic networks trained using error backpropagation. For both learning architectures we also explore the role of sparse coding, which has been identified as a fundamental principle of neural computation. The unsupervised models are then compared with supervised, feed-forward networks that learn an explicit mapping between different spatial reference frames. Our simulations show that both architectural and learning constraints strongly influenced the emergent coding of visual space in terms of distribution of tuning functions at the level of single neurons. Unsupervised models, and particularly RBMs, were found to more closely adhere to neurophysiological data from single-cell recordings in the primate parietal cortex. These results provide new insights into how basic properties of artificial neural networks might be relevant for modeling neural information processing in biological systems.
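    The autoencoder building block discussed in the record above can be sketched in a minimal form. This is an illustration only, not the authors' simulations: scikit-learn's `MLPRegressor` is used as a shallow deterministic autoencoder (input reproduced through a narrow hidden layer, trained with backpropagation) on synthetic data rather than visual-space stimuli.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))  # synthetic 10-dimensional inputs

# Shallow autoencoder: the 4-unit hidden layer is the learned code,
# and the network is trained to reconstruct its own input.
ae = MLPRegressor(hidden_layer_sizes=(4,), max_iter=2000, random_state=0)
ae.fit(X, X)

recon = ae.predict(X)
err = float(np.mean((recon - X) ** 2))
print(f"reconstruction MSE: {err:.3f}")
```

    An RBM counterpart (stochastic, trained with contrastive divergence) would replace the reconstruction objective with a generative one; the record compares the receptive fields that emerge under each choice.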

  15. High Order Tensor Formulation for Convolutional Sparse Coding

    KAUST Repository

    Bibi, Adel Aamer

    2017-12-25

    Convolutional sparse coding (CSC) has gained attention for its successful role as a reconstruction and a classification tool in the computer vision and machine learning community. Current CSC methods can only reconstruct single-feature 2D images independently. However, learning multidimensional dictionaries and sparse codes for the reconstruction of multi-dimensional data is very important, as it examines correlations among all the data jointly. This provides more capacity for the learned dictionaries to better reconstruct data. In this paper, we propose a generic and novel formulation for the CSC problem that can handle an arbitrary order tensor of data. Backed by experimental results, our proposed formulation can not only tackle applications that are not possible with standard CSC solvers, including colored video reconstruction (5D tensors), but it also performs favorably in reconstruction with much fewer parameters as compared to naive extensions of standard CSC to multiple features/channels.

  16. Using supervised machine learning to code policy issues: Can classifiers generalize across contexts?

    NARCIS (Netherlands)

    Burscher, B.; Vliegenthart, R.; de Vreese, C.H.

    2015-01-01

    Content analysis of political communication usually covers large amounts of material and makes the study of dynamics in issue salience a costly enterprise. In this article, we present a supervised machine learning approach for the automatic coding of policy issues, which we apply to news articles
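    The supervised approach described above, classifying news text into policy-issue categories, can be sketched with a standard bag-of-words pipeline. The headlines and issue labels below are invented for illustration; the study's actual corpus and category scheme differ.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

# Toy training set: headlines hand-coded with a policy issue.
texts = ["parliament debates new healthcare budget",
         "hospital funding bill passes senate",
         "new curriculum announced for primary schools",
         "teachers strike over school funding",
         "central bank raises interest rates",
         "inflation hits economy as rates climb"]
labels = ["health", "health", "education", "education", "economy", "economy"]

# TF-IDF features feeding a linear SVM, a common text-classification baseline.
clf = make_pipeline(TfidfVectorizer(), LinearSVC())
clf.fit(texts, labels)

pred = clf.predict(["minister proposes hospital reform"])[0]
print(pred)
```

    The generalization question the article raises is whether such a classifier, trained on one context (outlet, period, or country), still codes issues accurately when applied to another.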

  17. Grammar Coding in the "Oxford Advanced Learner's Dictionary of Current English."

    Science.gov (United States)

    Wekker, Herman

    1992-01-01

    Focuses on the revised system of grammar coding for verbs in the fourth edition of the "Oxford Advanced Learner's Dictionary of Current English" (OALD4), comparing it with two other similar dictionaries. It is shown that the OALD4 is found to be more favorable on many criteria than the other comparable dictionaries. (16 references) (VWL)

  18. Side Information and Noise Learning for Distributed Video Coding using Optical Flow and Clustering

    DEFF Research Database (Denmark)

    Luong, Huynh Van; Rakêt, Lars Lau; Huang, Xin

    2012-01-01

    Distributed video coding (DVC) is a coding paradigm which exploits the source statistics at the decoder side to reduce the complexity at the encoder. The coding efficiency of DVC critically depends on the quality of side information generation and the accuracy of noise modeling. This paper considers Transform Domain Wyner-Ziv (TDWZ) coding and proposes using optical flow to improve side information generation and clustering to improve noise modeling. The optical flow technique is exploited at the decoder side to compensate weaknesses of block based methods, when using motion-compensation to generate side information frames. Clustering is introduced to capture cross band correlation and increase local adaptivity in the noise modeling. This paper also proposes techniques to learn from previously decoded (WZ) frames. Different techniques are combined by calculating a number of candidate soft side...

  19. Code to Learn: Where Does It Belong in the K-12 Curriculum?

    Science.gov (United States)

    Moreno-León, Jesús; Robles, Gregorio; Román-González, Marcos

    2016-01-01

    The introduction of computer programming in K-12 has become mainstream in the last years, as countries around the world are making coding part of their curriculum. Nevertheless, there is a lack of empirical studies that investigate how learning to program at an early age affects other school subjects. In this regard, this paper compares three…

  20. Predictive coding accelerates word recognition and learning in the early stages of language development.

    Science.gov (United States)

    Ylinen, Sari; Bosseler, Alexis; Junttila, Katja; Huotilainen, Minna

    2017-11-01

    The ability to predict future events in the environment and learn from them is a fundamental component of adaptive behavior across species. Here we propose that inferring predictions facilitates speech processing and word learning in the early stages of language development. Twelve- and 24-month-olds' electrophysiological brain responses to heard syllables are faster and more robust when the preceding word context predicts the ending of a familiar word. For unfamiliar, novel word forms, however, word-expectancy violation generates a prediction error response, the strength of which significantly correlates with children's vocabulary scores at 12 months. These results suggest that predictive coding may accelerate word recognition and support early learning of novel words, including not only the learning of heard word forms but also their mapping to meanings. Prediction error may mediate learning via attention, since infants' attention allocation to the entire learning situation in natural environments could account for the link between prediction error and the understanding of word meanings. On the whole, the present results on predictive coding support the view that principles of brain function reported across domains in humans and non-human animals apply to language and its development in the infant brain. A video abstract of this article can be viewed at: http://hy.fi/unitube/video/e1cbb495-41d8-462e-8660-0864a1abd02c. [Correction added on 27 January 2017, after first online publication: The video abstract link was added.]. © 2016 John Wiley & Sons Ltd.

  1. Structural Learning Theory: Current Status and New Perspectives.

    Science.gov (United States)

    Scandura, Joseph M.

    2001-01-01

    Presents the current status of and new perspectives on the Structural Learning Theory (SLT), with special consideration given to how SLT has been influenced by recent research in software engineering. Topics include theoretical constructs; content domains; structural analysis; cognition; assessing behavior potential; and teaching and learning issues,…

  2. Programming Entity Framework Code First

    CERN Document Server

    Lerman, Julia

    2011-01-01

    Take advantage of the Code First data modeling approach in ADO.NET Entity Framework, and learn how to build and configure a model based on existing classes in your business domain. With this concise book, you'll work hands-on with examples to learn how Code First can create an in-memory model and database by default, and how you can exert more control over the model through further configuration. Code First provides an alternative to the database first and model first approaches to the Entity Data Model. Learn the benefits of defining your model with code, whether you're working with an exis

  3. OBJECT OF THE CONTRACT FROM THE PERSPECTIVE OF THE CURRENT CIVIL CODE

    Directory of Open Access Journals (Sweden)

    Raluca Antoanetta TOMESCU

    2017-07-01

    Full Text Available An indispensable element of social relations, the contract governs our existence first and foremost. Virtually everything in our lives is governed by contracts: every move we make, schooling, work performed, marriage, holidays, a house or a new car leads to the acceptance of a contract or is a consequence of one. In the light of the codifications set forth in the current Civil Code, which largely follows modern proposals for contract rules, the legislator gives us a clear perspective on the essential conditions of a contract's validity. Thus, along with the capacity to contract and the consent of the parties, the cause and the object of the contract also arise as essential conditions of validity. The purpose of this study is therefore to reflect upon the meaning of terms such as "the object of the contract", "the object of the obligation" or "the object of the benefit" in agreement with the regulations contained in the current Civil Code, especially because in practice, and sometimes also in legal doctrine, insufficient attention is given to the legal sense of each of them, the current rules bringing clarifying provisions.

  4. Extracorporeal membrane oxygenation: current clinical practice, coding, and reimbursement.

    Science.gov (United States)

    Schuerer, Douglas J E; Kolovos, Nikoleta S; Boyd, Kayla V; Coopersmith, Craig M

    2008-07-01

    Extracorporeal membrane oxygenation (ECMO) is a technique for providing life support for patients experiencing both pulmonary and cardiac failure by maintaining oxygenation and perfusion until native organ function is restored. ECMO is used routinely at many specialized hospitals for infants and less commonly for children with respiratory or cardiac failure from a variety of causes. Its usage is more controversial in adults, but select medical centers have reported favorable findings in patients with ARDS and other causes of severe pulmonary failure. ECMO is also rarely used as a rescue therapy in a small subset of adult patients with cardiac failure. This article will review the current uses and techniques of ECMO in the critical care setting as well as the evidence supporting its usage. In addition, current practice management related to coding and reimbursement for this intensive therapy will be discussed.

  5. Two dimensional code for modeling of high ion cyclotron harmonic fast wave heating and current drive

    International Nuclear Information System (INIS)

    Grekov, D.; Kasilov, S.; Kernbichler, W.

    2016-01-01

    A two-dimensional numerical code for computation of the electromagnetic field of a fast magnetosonic wave in a tokamak at high harmonics of the ion cyclotron frequency has been developed. The code computes the finite-difference solution of Maxwell equations for separate toroidal harmonics, making use of the toroidal symmetry of tokamak plasmas. Proper boundary conditions are prescribed on a realistic tokamak vessel. The currents in the RF antenna are specified externally and then used in Ampère's law. The main poloidal tokamak magnetic field and the "kinetic" part of the dielectric permeability tensor are treated iteratively. The code has been verified against known analytical solutions, and first calculations of current drive in a spherical torus are presented.
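As an illustration of the finite-difference approach this record describes, the sketch below solves a much simpler stand-in problem: a 1D scalar Helmholtz equation with a prescribed source and conducting-wall (Dirichlet) boundaries. The equation, grid size and source are assumptions for illustration only; they are not taken from the paper.

```python
import numpy as np

# Hypothetical 1D analogue: discretise u'' + k^2 u = -f on [0, 1]
# with u(0) = u(1) = 0, mirroring the finite-difference treatment
# of one toroidal harmonic with an externally specified source.
n, k = 200, 5.0
h = 1.0 / (n + 1)
x = np.linspace(h, 1 - h, n)
f = np.sin(np.pi * x)                      # prescribed "antenna" source

# Tridiagonal finite-difference operator for u'' + k^2 u
A = (np.diag(np.full(n, -2.0 / h**2 + k**2))
     + np.diag(np.full(n - 1, 1.0 / h**2), 1)
     + np.diag(np.full(n - 1, 1.0 / h**2), -1))
u = np.linalg.solve(A, -f)

# For this source the analytic solution is sin(pi x) / (pi^2 - k^2),
# which is how a code like this can be verified, as the record notes.
u_exact = np.sin(np.pi * x) / (np.pi**2 - k**2)
assert np.max(np.abs(u - u_exact)) < 1e-3
```

The real code replaces this toy operator with the 2D discretisation of Maxwell's equations per toroidal harmonic, but the verification-against-analytic-solutions step is the same in spirit.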

  6. Evolving a Dynamic Predictive Coding Mechanism for Novelty Detection

    OpenAIRE

    Haggett, Simon J.; Chu, Dominique; Marshall, Ian W.

    2007-01-01

    Novelty detection is a machine learning technique which identifies new or unknown information in data sets. We present our current work on the construction of a new novelty detector based on a dynamical version of predictive coding. We compare three evolutionary algorithms, a simple genetic algorithm, NEAT and FS-NEAT, for the task of optimising the structure of an illustrative dynamic predictive coding neural network to improve its performance over stimuli from a number of artificially gener...

  7. Effects of Mode of Target Task Selection on Learning about Plants in a Mobile Learning Environment: Effortful Manual Selection versus Effortless QR-Code Selection

    Science.gov (United States)

    Gao, Yuan; Liu, Tzu-Chien; Paas, Fred

    2016-01-01

    This study compared the effects of effortless selection of target plants using quick response (QR) code technology to effortful manual search and selection of target plants on learning about plants in a mobile device supported learning environment. In addition, it was investigated whether the effectiveness of the 2 selection methods was…

  8. Impact of School Uniforms on Student Discipline and the Learning Climate: A Comparative Case Study of Two Middle Schools with Uniform Dress Codes and Two Middle Schools without Uniform Dress Codes

    Science.gov (United States)

    Dulin, Charles Dewitt

    2016-01-01

    The purpose of this research is to evaluate the impact of uniform dress codes on a school's climate for student behavior and learning in four middle schools in North Carolina. The research will compare the perceptions of parents, teachers, and administrators in schools with uniform dress codes against schools without uniform dress codes. This…

  9. Reframing the Principle of Specialisation in Legitimation Code Theory: A Blended Learning Perspective

    Science.gov (United States)

    Owusu-Agyeman, Yaw; Larbi-Siaw, Otu

    2017-01-01

    This study argues that in developing a robust framework for students in a blended learning environment, Structural Alignment (SA) becomes the third principle of specialisation in addition to Epistemic Relation (ER) and Social Relation (SR). We provide an extended code: (ER+/-, SR+/-, SA+/-) that present strong classification and framing to the…

  10. The International Code of Marketing of Breast-milk Substitutes: lessons learned and implications for the regulation of marketing of foods and beverages to children.

    Science.gov (United States)

    Lutter, Chessa K

    2013-10-01

    To identify lessons learned from 30 years of implementing the International Code of Marketing of Breast-milk Substitutes (‘the Code’) and identify lessons learned for the regulation of marketing foods and beverages to children. Historical analysis of 30 years of implementing the Code. Latin America and the Caribbean. None. Legislation to restrict marketing of breast-milk substitutes is necessary but not sufficient; equally important are the promulgation of implementing regulations, effective enforcement and public monitoring of compliance. A system of funding for regular monitoring of compliance with legislation should be explicitly developed and funded from the beginning. Economic sanctions, while important, are likely to be less effective than reports that affect a company’s public image negatively. Non-governmental organizations play a critical role in leveraging public opinion and galvanizing consumer pressure to ensure that governments adopt regulations and companies adhere to them. Continual clinical, epidemiological and policy research showing the link between marketing and health outcomes and between policy and better health is essential. Implementation of the Code has not come easily as it places the interests of underfinanced national governments and international and non-governmental organizations promoting breast-feeding against those of multinational corporations that make hundreds of millions of dollars annually marketing infant formulas. Efforts to protect, promote and support breast-feeding have been successful with indicators of breast-feeding practices increasing globally. The lessons learned can inform current efforts to regulate the marketing of foods and beverages to children.

  11. Using Coding Apps to Support Literacy Instruction and Develop Coding Literacy

    Science.gov (United States)

    Hutchison, Amy; Nadolny, Larysa; Estapa, Anne

    2016-01-01

    In this article the authors present the concept of Coding Literacy and describe the ways in which coding apps can support the development of Coding Literacy and disciplinary and digital literacy skills. Through detailed examples, we describe how coding apps can be integrated into literacy instruction to support learning of the Common Core English…

  12. Teacher Candidates Implementing Universal Design for Learning: Enhancing Picture Books with QR Codes

    Science.gov (United States)

    Grande, Marya; Pontrello, Camille

    2016-01-01

    The purpose of this study was to investigate if teacher candidates could gain knowledge of the principles of Universal Design for Learning by enhancing traditional picture books with Quick Response (QR) codes and to determine if the process of making these enhancements would impact teacher candidates' comfort levels with using technology on both…

  13. Joint Machine Learning and Game Theory for Rate Control in High Efficiency Video Coding.

    Science.gov (United States)

    Gao, Wei; Kwong, Sam; Jia, Yuheng

    2017-08-25

    In this paper, a joint machine learning and game theory modeling (MLGT) framework is proposed for inter-frame coding tree unit (CTU) level bit allocation and rate control (RC) optimization in High Efficiency Video Coding (HEVC). First, a support vector machine (SVM) based multi-classification scheme is proposed to improve the prediction accuracy of the CTU-level rate-distortion (R-D) model; the learning-based R-D model also overcomes the legacy "chicken-and-egg" dilemma in video coding. Second, a mixed R-D model based cooperative bargaining game theory is proposed for bit allocation optimization, where the convexity of the mixed R-D model based utility function is proved, and the Nash bargaining solution (NBS) is achieved by the proposed iterative solution search method. The minimum utility is adjusted by the reference coding distortion and the frame-level quantization parameter (QP) change. Lastly, intra-frame QP and inter-frame adaptive bit ratios are adjusted so that inter frames have more bit resources to maintain smooth quality and bit consumption in the bargaining game optimization. Experimental results demonstrate that the proposed MLGT-based RC method can achieve much better R-D performance, quality smoothness, bit rate accuracy, buffer control results and subjective visual quality than other state-of-the-art one-pass RC methods, and the achieved R-D performance is very close to the performance limit of the FixedQP method.
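To make the first step concrete, here is a hedged sketch of the SVM multi-classification idea: classify each CTU into one of several R-D model classes from per-block features, then look up per-class R-D parameters for bit allocation. The features, class definitions and parameter values below are purely illustrative assumptions, not those of the paper, and scikit-learn is assumed to be available.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic CTU features: [spatial variance, temporal SAD, previous QP]
X = rng.normal(size=(300, 3))
# Synthetic ground-truth R-D class (0 = flat, 1 = textured, 2 = motion)
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int) + (X[:, 2] > 1).astype(int)

# Multi-class SVM (one-vs-one under the hood in scikit-learn)
clf = SVC(kernel="rbf", C=1.0).fit(X, y)

# Per-class R-D model parameters (lambda, alpha) -- illustrative only
rd_params = {0: (0.5, 1.2), 1: (1.0, 1.6), 2: (2.0, 2.1)}
ctu_class = clf.predict(rng.normal(size=(5, 3)))
allocations = [rd_params[int(c)] for c in ctu_class]
```

The point of the classifier is only to choose which R-D model to trust for a CTU; the bargaining-game step of the paper then optimizes the bit split given those models.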

  14. Workshops and problems for benchmarking eddy current codes

    Energy Technology Data Exchange (ETDEWEB)

    Turner, L.R.; Davey, K.; Ida, N.; Rodger, D.; Kameari, A.; Bossavit, A.; Emson, C.R.I.

    1988-08-01

    A series of six workshops was held in 1986 and 1987 to compare eddy current codes, using six benchmark problems. The problems included transient and steady-state ac magnetic fields, close and far boundary conditions, magnetic and non-magnetic materials. All the problems were based either on experiments or on geometries that can be solved analytically. The workshops and solutions to the problems are described. Results show that many different methods and formulations give satisfactory solutions, and that in many cases reduced dimensionality or coarse discretization can give acceptable results while reducing the computer time required. A second two-year series of TEAM (Testing Electromagnetic Analysis Methods) workshops, using six more problems, is underway. 12 refs., 15 figs., 4 tabs.

  15. Workshops and problems for benchmarking eddy current codes

    International Nuclear Information System (INIS)

    Turner, L.R.; Davey, K.; Ida, N.; Rodger, D.; Kameari, A.; Bossavit, A.; Emson, C.R.I.

    1988-08-01

    A series of six workshops was held in 1986 and 1987 to compare eddy current codes, using six benchmark problems. The problems included transient and steady-state ac magnetic fields, close and far boundary conditions, magnetic and non-magnetic materials. All the problems were based either on experiments or on geometries that can be solved analytically. The workshops and solutions to the problems are described. Results show that many different methods and formulations give satisfactory solutions, and that in many cases reduced dimensionality or coarse discretization can give acceptable results while reducing the computer time required. A second two-year series of TEAM (Testing Electromagnetic Analysis Methods) workshops, using six more problems, is underway. 12 refs., 15 figs., 4 tabs

  16. Learning-Based Just-Noticeable-Quantization- Distortion Modeling for Perceptual Video Coding.

    Science.gov (United States)

    Ki, Sehwan; Bae, Sung-Ho; Kim, Munchurl; Ko, Hyunsuk

    2018-07-01

    Conventional predictive video coding approaches are reaching the limit of their potential coding efficiency improvements because of severely increasing computation complexity. As an alternative approach, perceptual video coding (PVC) has attempted to achieve high coding efficiency by eliminating perceptual redundancy, using just-noticeable-distortion (JND) directed PVC. Previous JNDs were modeled by adding white Gaussian noise or specific signal patterns into the original images, which is not appropriate for finding JND thresholds for coding distortions that reduce signal energy. In this paper, we present a novel discrete cosine transform-based energy-reduced JND model, called ERJND, that is more suitable for JND-based PVC schemes. The proposed ERJND model is then extended to two learning-based just-noticeable-quantization-distortion (JNQD) models that can be applied as preprocessing for perceptual video coding. The two JNQD models can automatically adjust JND levels based on given quantization step sizes. The first, called LR-JNQD, is based on linear regression and determines the model parameters for JNQD from extracted handcrafted features. The second is based on a convolutional neural network (CNN) and is called CNN-JNQD. To the best of our knowledge, this is the first approach to automatically adjust JND levels according to quantization step sizes for preprocessing the input to video encoders. In experiments, both the LR-JNQD and CNN-JNQD models were applied to High Efficiency Video Coding (HEVC) and yielded maximum (average) bitrate reductions of 38.51% (10.38%) and 67.88% (24.91%), respectively, with little subjective video quality degradation, compared with the input without preprocessing applied.
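The LR-JNQD idea, a linear regression from handcrafted block features plus the quantization step size to a JND level, can be sketched in a few lines. The features, the synthetic target and the linear form below are illustrative assumptions; the paper's actual features and training data differ.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic training data: [luma mean, local contrast, quantization step]
feats = rng.uniform(0, 1, size=(500, 3))
# Assume the "true" JND grows with contrast and with the quantization step
jnd = 0.2 + 0.5 * feats[:, 1] + 0.8 * feats[:, 2] + rng.normal(0, 0.01, 500)

# Ordinary least-squares fit with a bias term
A = np.hstack([feats, np.ones((500, 1))])
w, *_ = np.linalg.lstsq(A, jnd, rcond=None)

def predict_jnd(luma, contrast, q_step):
    """Predict a JND level for one block at a given quantization step."""
    return np.array([luma, contrast, q_step, 1.0]) @ w

# Larger quantization steps yield larger predicted JND thresholds,
# which is the "automatic adjustment" property the abstract describes.
assert predict_jnd(0.5, 0.3, 0.9) > predict_jnd(0.5, 0.3, 0.1)
```

CNN-JNQD replaces this linear map with a learned convolutional network, but the input/output contract, quantization step in, JND level out, is the same.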

  17. Segmentation of MR images via discriminative dictionary learning and sparse coding: Application to hippocampus labeling

    OpenAIRE

    Tong, Tong; Wolz, Robin; Coupe, Pierrick; Hajnal, Joseph V.; Rueckert, Daniel

    2013-01-01

    We propose a novel method for the automatic segmentation of brain MRI images by using discriminative dictionary learning and sparse coding techniques. In the proposed method, dictionaries and classifiers are learned simultaneously from a set of brain atlases, which can then be used for the reconstruction and segmentation of an unseen target image. The proposed segmentation strategy is based on image reconstruction, which is in contrast to most existing atlas-based labe...

  18. [Learning virtual routes: what does verbal coding do in working memory?].

    Science.gov (United States)

    Gyselinck, Valérie; Grison, Élise; Gras, Doriane

    2015-03-01

    Two experiments were run to complete our understanding of the role of verbal and visuospatial encoding in the construction of a spatial model from visual input. In experiment 1, a dual-task paradigm was applied to young adults who learned a route in a virtual environment and then performed a series of nonverbal tasks to assess spatial knowledge. Results indicated that landmark knowledge, as assessed by the visual recognition of landmarks, was not impaired by any of the concurrent tasks. Route knowledge, assessed by recognition of directions, was impaired both by a tapping task and by a concurrent articulation task. Interestingly, the pattern was modulated when no landmarks were available to perform the direction task. A second experiment was designed to explore the role of verbal coding in the construction of landmark and route knowledge. A lexical-decision task was used as a verbal-semantic dual task, and a tone-decision task as a nonsemantic auditory task. Results show that these new concurrent tasks impaired landmark knowledge and route knowledge differently. The results can be interpreted as showing that the coding of route knowledge could be grounded both in a coding of the sequence of events and in a semantic coding of information. These findings also point to some limits of Baddeley's working memory model. (PsycINFO Database Record (c) 2015 APA, all rights reserved).

  19. QR CODES IN EDUCATION AND COMMUNICATION

    Directory of Open Access Journals (Sweden)

    Gurhan DURAK

    2016-04-01

    Full Text Available Technological advances have brought innovative applications to education. Conventional education increasingly flourishes with new technologies accompanied by more learner-active environments. In this continuum, there are learners who prefer self-learning. Traditional learning materials are giving way to attractive, motivating and technologically enhanced learning materials. The QR (Quick Response) Codes are one of these innovations. The aim of this study is to redesign a lesson unit supported with QR Codes and to get the learners’ views about the redesigned material. For this purpose, the redesigned lesson unit was delivered to 15 learners at Balıkesir University in the academic year of 2013-2014, and the learners were asked to study the material. Learners who had smart phones and Internet access were chosen for the study. To provide sectional diversity, three groups were created, with learners from the Faculty of Education, the Faculty of Science and Literature and the Faculty of Engineering. After the semi-structured interviews were held, the learners were asked about their prior knowledge of QR Codes, QR Codes’ contribution to learning, difficulties in using QR Codes, and design issues. Descriptive data analysis was used in the study. The findings were interpreted on the basis of the Theory of Diffusion of Innovations and the Theory of Uses and Gratifications. After the research, the themes found were awareness of QR Codes, types of QR Codes and applications, contributions to learning, and proliferation of QR Codes. Generally, the learners participating in the study reported that they were aware of QR Codes; that they could use them; and that using QR Codes in education was useful. They also expressed that such features as visual elements, attractiveness and direct routing had a positive impact on learning. In addition, they generally mentioned that they did not have any difficulty using QR Codes; that they liked the design; and that the content should

  20. Discriminative sparse coding on multi-manifolds

    KAUST Repository

    Wang, J.J.-Y.; Bensmail, H.; Yao, N.; Gao, Xin

    2013-01-01

    Sparse coding has been popularly used as an effective data representation method in various applications, such as computer vision, medical imaging and bioinformatics. However, the conventional sparse coding algorithms and their manifold-regularized variants (graph sparse coding and Laplacian sparse coding), learn codebooks and codes in an unsupervised manner and neglect class information that is available in the training set. To address this problem, we propose a novel discriminative sparse coding method based on multi-manifolds, that learns discriminative class-conditioned codebooks and sparse codes from both data feature spaces and class labels. First, the entire training set is partitioned into multiple manifolds according to the class labels. Then, we formulate the sparse coding as a manifold-manifold matching problem and learn class-conditioned codebooks and codes to maximize the manifold margins of different classes. Lastly, we present a data sample-manifold matching-based strategy to classify the unlabeled data samples. Experimental results on somatic mutations identification and breast tumor classification based on ultrasonic images demonstrate the efficacy of the proposed data representation and classification approach. 2013 The Authors. All rights reserved.
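A toy version of the class-conditioned codebook idea can be sketched as follows: keep one codebook per class, sparse-code a sample against each, and assign the class whose codebook reconstructs it best. This is a simplified sparse-representation classifier, not the paper's manifold-margin formulation; the dictionaries and data are synthetic assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def ista(D, x, lam=0.1, steps=100):
    """Sparse-code x over dictionary D by iterative soft-thresholding (ISTA)."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the smooth part
    z = np.zeros(D.shape[1])
    for _ in range(steps):
        g = z - D.T @ (D @ z - x) / L      # gradient step on 0.5*||Dz - x||^2
        z = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft threshold
    return z

# Two synthetic classes whose samples lie near different directions;
# each class-conditioned codebook is 3 x 20 (atoms as columns).
D0 = rng.normal([1, 0, 0], 0.1, size=(20, 3)).T
D1 = rng.normal([0, 1, 0], 0.1, size=(20, 3)).T

def classify(x):
    """Assign x to the class with the smallest sparse reconstruction error."""
    errs = [np.linalg.norm(x - D @ ista(D, x)) for D in (D0, D1)]
    return int(np.argmin(errs))

assert classify(np.array([0.9, 0.1, 0.0])) == 0
assert classify(np.array([0.0, 1.1, 0.1])) == 1
```

The paper goes further by learning the codebooks to maximize margins between class manifolds rather than taking them directly from the data, but the classify-by-reconstruction step above is the same in outline.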

  2. QR Codes as Mobile Learning Tools for Labor Room Nurses at the San Pablo Colleges Medical Center

    Science.gov (United States)

    Del Rosario-Raymundo, Maria Rowena

    2017-01-01

    Purpose: The purpose of this paper is to explore the use of QR codes as mobile learning tools and examine factors that impact on their usefulness, acceptability and feasibility in assisting the nurses' learning. Design/Methodology/Approach: Study participants consisted of 14 regular, full-time, board-certified LR nurses. Over a two-week period,…

  3. Learning about Probability from Text and Tables: Do Color Coding and Labeling through an Interactive-User Interface Help?

    Science.gov (United States)

    Clinton, Virginia; Morsanyi, Kinga; Alibali, Martha W.; Nathan, Mitchell J.

    2016-01-01

    Learning from visual representations is enhanced when learners appropriately integrate corresponding visual and verbal information. This study examined the effects of two methods of promoting integration, color coding and labeling, on learning about probabilistic reasoning from a table and text. Undergraduate students (N = 98) were randomly…

  4. Case studies in Gaussian process modelling of computer codes

    International Nuclear Information System (INIS)

    Kennedy, Marc C.; Anderson, Clive W.; Conti, Stefano; O'Hagan, Anthony

    2006-01-01

    In this paper we present a number of recent applications in which an emulator of a computer code is created using a Gaussian process model. Tools are then applied to the emulator to perform sensitivity analysis and uncertainty analysis. Sensitivity analysis is used both as an aid to model improvement and as a guide to how much the output uncertainty might be reduced by learning about specific inputs. Uncertainty analysis allows us to reflect output uncertainty due to unknown input parameters, when the finished code is used for prediction. The computer codes themselves are currently being developed within the UK Centre for Terrestrial Carbon Dynamics
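The emulator idea can be sketched with a trivial stand-in "simulator": fit a Gaussian process to a handful of code runs, then query the emulator cheaply where the real code would be expensive. The simulator function, kernel and run count below are illustrative assumptions, and scikit-learn is assumed to be available.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def simulator(x):                      # stand-in for an expensive computer code
    return np.sin(3 * x) + 0.5 * x

# A small design of 12 "code runs" over the input range
X_train = np.linspace(0, 2, 12).reshape(-1, 1)
y_train = simulator(X_train).ravel()

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5)).fit(X_train, y_train)

# The emulator predicts unseen inputs with an uncertainty estimate,
# which is what the sensitivity and uncertainty analyses operate on.
X_test = np.linspace(0, 2, 50).reshape(-1, 1)
mean, std = gp.predict(X_test, return_std=True)
assert np.max(np.abs(mean - simulator(X_test).ravel())) < 0.1
```

In the applications the record describes, the emulator replaces thousands of expensive code runs; sensitivity analysis then varies the emulator's inputs, and the predictive standard deviation quantifies how much is lost by emulating rather than running the code.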

  5. Coding for dummies

    CERN Document Server

    Abraham, Nikhil

    2015-01-01

    Hands-on exercises help you learn to code like a pro. No coding experience is required for Coding For Dummies, your one-stop guide to building a foundation of knowledge in writing computer code for web, application, and software development. It doesn't matter if you've dabbled in coding or never written a line of code, this book guides you through the basics. Using foundational web development languages like HTML, CSS, and JavaScript, it explains in plain English how coding works and why it's needed. Online exercises developed by Codecademy, a leading online code training site, help hone coding skill

  6. Speaking Code

    DEFF Research Database (Denmark)

    Cox, Geoff

    Speaking Code begins by invoking the “Hello World” convention used by programmers when learning a new language, helping to establish the interplay of text and code that runs through the book. Interweaving the voice of critical writing from the humanities with the tradition of computing and software...

  7. The current status and future plans for the Monte Carlo codes MONK and MCBEND

    International Nuclear Information System (INIS)

    Smith, N.; Shuttleworth, T.; Grimstone, M.; Hutton, L.; Armishaw, M.; Bird, A.; France, N.; Connolly, S.

    2001-01-01

    MONK and MCBEND are software tools used for criticality/reactor physics and shielding/dosimetry/general radiation transport applications respectively. The codes are developed by a collaboration comprising AEA Technology and BNFL, and AEA Technology's ANSWERS Software Service supplies them to customers throughout the world on a commercial basis. In keeping with this commercial nature it is vital that the codes' development programmes evolve to meet the diverse expectations of the current and potential customer base. This involves striving to maintain an acceptable balance in the development of the various components that comprise a modern software package. This paper summarises the current status of MONK and MCBEND by indicating how the task of trying to achieve this difficult balance has been addressed in the recent past and is being addressed for the future. (orig.)

  8. The Effects of Single and Dual Coded Multimedia Instructional Methods on Chinese Character Learning

    Science.gov (United States)

    Wang, Ling

    2013-01-01

    Learning Chinese characters is a difficult task for adult English native speakers due to the significant differences between the Chinese and English writing system. The visuospatial properties of Chinese characters have inspired the development of instructional methods using both verbal and visual information based on the Dual Coding Theory. This…

  9. Advanced thermal-hydraulic and neutronic codes: current and future applications. Summary and conclusions

    International Nuclear Information System (INIS)

    2001-05-01

    An OECD Workshop on Advanced Thermal-Hydraulic and Neutronic Codes Applications was held from 10 to 13 April 2000 in Barcelona, Spain, sponsored by the Committee on the Safety of Nuclear Installations (CSNI) of the OECD Nuclear Energy Agency (NEA). It was organised in collaboration with the Spanish Nuclear Safety Council (CSN) and hosted by CSN and the Polytechnic University of Catalonia (UPC) in collaboration with the Spanish Electricity Association (UNESA). The objectives of the Workshop were to review the developments since the previous CSNI Workshop held in Annapolis [NEA/CSNI/R(97)4; NUREG/CP-0159], to analyse the present status of maturity and remaining needs of thermal-hydraulic (TH) and neutronic system codes and methods, and finally to evaluate the role of these tools in the evolving regulatory environment. The Technical Sessions and Discussion Sessions covered the following topics: - Regulatory requirements for Best-Estimate (BE) code assessment; - Application of TH and neutronic codes for current safety issues; - Uncertainty analysis; - Needs for integral plant transient and accident analysis; - Simulators and fast running codes; - Advances in next generation TH and neutronic codes; - Future trends in physical modeling; - Long term plans for development of advanced codes. The focus of the Workshop was on system codes. An incursion was made, however, into the new field of applying Computational Fluid Dynamics (CFD) codes to nuclear safety analysis. As a general conclusion, the Barcelona Workshop can be considered representative of the progress towards the targets marked at Annapolis almost four years ago. The Annapolis Workshop had identified areas where further development and specific improvements were needed, among them: multi-field models, transport of interfacial area, 2D and 3D thermal-hydraulics, and 3-D neutronics consistent with the level of detail of the thermal-hydraulics. Recommendations issued at Annapolis included: developing small pilot/test codes for

  10. Vectorization of the KENO V.a criticality safety code

    International Nuclear Information System (INIS)

    Hollenbach, D.F.; Dodds, H.L.; Petrie, L.M.

    1991-01-01

    The development of the vector processor, which is used in the current generation of supercomputers and is beginning to be used in workstations, provides the potential for dramatic speed-ups for codes that are able to process data as vectors. Unfortunately, the stochastic nature of Monte Carlo codes prevents the old scalar versions of these codes from taking advantage of the vector processors. New Monte Carlo algorithms that process all the histories undergoing the same event as a batch are required. Recently, new vectorized Monte Carlo codes have been developed that show significant speed-ups when compared to their scalar versions or equivalent codes. This paper discusses the vectorization of an already existing and widely used criticality safety code, KENO V.a. All the changes made to KENO V.a are transparent to the user, making it possible to upgrade from the standard scalar version of KENO V.a to the vectorized version without learning a new code
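The event-batching idea behind vectorized Monte Carlo can be illustrated with a toy transport loop: instead of following one history at a time, all histories undergoing the same event are advanced together as arrays. The physics below (absorb-or-survive collisions with a fixed probability) is a deliberately simplistic assumption, nothing like KENO V.a's actual tracking.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
x = np.zeros(n)                     # particle positions along a 1D track
alive = np.ones(n, dtype=bool)      # histories still in flight
absorbed = 0

for _ in range(50):                 # fixed number of event batches
    m = int(alive.sum())
    if m == 0:
        break
    # All surviving histories sample a flight distance in one array op...
    x[alive] += rng.exponential(1.0, size=m)
    # ...and the same collision event is then processed as one batch.
    hit = rng.random(m) < 0.3       # 30% absorption per collision
    idx = np.flatnonzero(alive)
    absorbed += int(hit.sum())
    alive[idx[hit]] = False

# Survival after k collisions is 0.7**k, so nearly all histories terminate.
assert absorbed > 0.9 * n
```

Each pass touches every live history with a handful of array operations, which is exactly the shape of work a vector processor accelerates; a scalar history-by-history loop offers it nothing to batch.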

  11. Code Development and Validation Towards Modeling and Diagnosing Current Redistribution in an ITER-Type Superconducting Cable Subject to Current Imbalance

    International Nuclear Information System (INIS)

    Zani, L.; Gille, P.E.; Gonzales, C.; Kuppel, S.; Torre, A.

    2009-01-01

    In the framework of ITER magnet R and D activities, a significant number of conductor short samples or inserts were tested throughout the past decades, either for development of cable layouts or for industrial qualification. On a certain number of them, degradations of critical properties were encountered, some of which were identified as being caused by current imbalance between the different strand bundles twisted inside the cable. In order to analyse those samples as reliably as possible, CEA developed a dedicated in-house code named Coupled Algorithm Resistive Modelling Electrical Network (CARMEN), with two specific functionalities: a first routine computes strand-bundle trajectories, down to the individual strand scale, which allows a realistic E(J) law to be obtained over the full conductor length; a second routine models inter-bundle current redistribution, taking into account the magnetic field map, using a discrete electrical network whose sections incorporate the E(J) law obtained from the first routine. As a result, the E-J or E-T curves can be calculated and compared to the experimental data, provided adapted inputs on sample features are considered, such as strand contact resistances in joints, inter-bundle resistances or cable geometry. In a first part, the paper describes the different hypotheses underlying the code structure; in a second part, the application to the ITER TFCI insert coil is presented, focusing particularly on validating the potential use of the code as a diagnostic tool for probing current imbalance.

  12. LIFELONG LEARNING THROUGH SECOND LIFE: CURRENT TRENDS, POTENTIALS AND LIMITATIONS

    Directory of Open Access Journals (Sweden)

    Nil GOKSEL-CANBEK

    2011-08-01

    Full Text Available Lifelong Learning (LLL has been a remarkable response to the people-centered educational demand of the 21st century. In order to provide effective formal, non-formal, and informal learning, immersive educational activities undertaken throughout life should aim to create a learning society in which people can experience individual and collective learning with no constraints of time or location. The concept of lifelong learning within the context of distance immersive education encompasses diverse 3D activities. The three-dimensional, Web-based structured activities supported by distance learning technologies can be viewed as interactive tools that foster LLL. In this perspective, Second Life (SL can be regarded as one of the learning simulation milieus that allow learners to participate in various educational LLL activities in individual or group forms. The following paper examines how SL, taking advantage of its simulative nature and the possibility for creative interaction among participants, which are also common in games, allows learners to participate in immersive constructivist learning activities. The article will also touch on the current uses of SL as a tool for LLL, as well as its potential for further development according to the current trends in adult education. Further, the authors will discuss its limitations and will make suggestions towards a more complete pedagogical use.

  13. Neural coding of basic reward terms of animal learning theory, game theory, microeconomics and behavioural ecology.

    Science.gov (United States)

    Schultz, Wolfram

    2004-04-01

    Neurons in a small number of brain structures detect rewards and reward-predicting stimuli and are active during the expectation of predictable food and liquid rewards. These neurons code the reward information according to basic terms of various behavioural theories that seek to explain reward-directed learning, approach behaviour and decision-making. The involved brain structures include groups of dopamine neurons, the striatum including the nucleus accumbens, the orbitofrontal cortex and the amygdala. The reward information is fed to brain structures involved in decision-making and organisation of behaviour, such as the dorsolateral prefrontal cortex and possibly the parietal cortex. The neural coding of basic reward terms derived from formal theories puts the neurophysiological investigation of reward mechanisms on firm conceptual grounds and provides neural correlates for the function of rewards in learning, approach behaviour and decision-making.

  14. A Dual Coding View of Vocabulary Learning

    Science.gov (United States)

    Sadoski, Mark

    2005-01-01

    A theoretical perspective on acquiring sight vocabulary and developing meaningful vocabulary is presented. Dual Coding Theory assumes that cognition occurs in two independent but connected codes: a verbal code for language and a nonverbal code for mental imagery. The mixed research literature on using pictures in teaching sight vocabulary is…

  15. Supervised Convolutional Sparse Coding

    KAUST Repository

    Affara, Lama Ahmed

    2018-04-08

    Convolutional Sparse Coding (CSC) is a well-established image representation model especially suited for image restoration tasks. In this work, we extend the applicability of this model by proposing a supervised approach to convolutional sparse coding, which aims at learning discriminative dictionaries instead of purely reconstructive ones. We incorporate a supervised regularization term into the traditional unsupervised CSC objective to encourage the final dictionary elements to be discriminative. Experimental results show that using supervised convolutional learning results in two key advantages. First, we learn more semantically relevant filters in the dictionary and second, we achieve improved image reconstruction on unseen data.
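    The idea of folding a supervised regularization term into an unsupervised sparse coding objective can be sketched in a simplified, patch-based (non-convolutional) form. In the toy example below, all dimensions, step sizes, and data are invented, and the paper's actual model is convolutional; the sketch alternates a proximal-gradient step on the codes with gradient steps on the dictionary and a logistic classifier, so the combined reconstruction + sparsity + supervised objective decreases.

```python
import numpy as np

rng = np.random.default_rng(0)

def soft(v, t):
    """Soft-thresholding: proximal operator of the l1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def objective(D, W, X, Y, Z, lam, gamma):
    """Unsupervised sparse-coding objective plus a supervised logistic term."""
    logits = W @ Z
    return (0.5 * np.sum((X - D @ Z) ** 2)
            + lam * np.abs(Z).sum()
            + gamma * np.sum(np.log1p(np.exp(-Y * logits))))

# toy data: two classes of d-dimensional signals
d, K, N = 8, 12, 40
X = rng.normal(size=(d, N))
Y = np.where(np.arange(N) < N // 2, -1.0, 1.0)
X[:, Y > 0] += 2.0                        # class-dependent shift so labels are learnable
D = rng.normal(size=(d, K)) / np.sqrt(d)  # dictionary
W = np.zeros(K)                           # linear classifier on the codes
Z = np.zeros((K, N))                      # sparse codes
lam, gamma = 0.1, 1.0

hist = []
for _ in range(50):
    # 1) codes: proximal-gradient step on reconstruction + supervised terms
    s = 1.0 / (np.linalg.norm(D, 2) ** 2 + 0.25 * gamma * W @ W + 1e-9)
    p = 1.0 / (1.0 + np.exp(Y * (W @ Z)))            # sigmoid(-y * logit)
    grad = D.T @ (D @ Z - X) + gamma * np.outer(W, -Y * p)
    Z = soft(Z - s * grad, s * lam)
    # 2) dictionary: gradient step on the reconstruction term
    sD = 1.0 / (np.linalg.norm(Z @ Z.T, 2) + 1e-9)
    D -= sD * (D @ Z - X) @ Z.T
    # 3) classifier: gradient step on the supervised term
    p = 1.0 / (1.0 + np.exp(Y * (W @ Z)))
    sW = 1.0 / (0.25 * gamma * np.linalg.norm(Z, 2) ** 2 + 1e-9)
    W -= sW * gamma * (Z @ (-Y * p))
    hist.append(objective(D, W, X, Y, Z, lam, gamma))

print(f"objective: {hist[0]:.1f} -> {hist[-1]:.1f}")
```

Each block update uses a Lipschitz-bounded step, so the combined objective is non-increasing from cycle to cycle; the supervised term pushes the codes (and, through them, the dictionary) toward discriminative structure.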

  16. Neural Decoder for Topological Codes

    Science.gov (United States)

    Torlai, Giacomo; Melko, Roger G.

    2017-07-01

    We present an algorithm for error correction in topological codes that exploits modern machine learning techniques. Our decoder is constructed from a stochastic neural network called a Boltzmann machine, of the type extensively used in deep learning. We provide a general prescription for the training of the network and a decoding strategy that is applicable to a wide variety of stabilizer codes with very little specialization. We demonstrate the neural decoder numerically on the well-known two-dimensional toric code with phase-flip errors.
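    The decoder's input is a syndrome measurement. As a minimal illustration of the data such a neural decoder is trained on (this is only the syndrome-extraction step, not the Boltzmann-machine decoder itself), the sketch below computes vertex-stabilizer syndromes for phase-flip errors on an L x L toric code:

```python
import numpy as np

L = 4  # linear size; 2*L*L qubits live on the edges of an L x L torus

def syndrome(z_err):
    """Vertex (X-type stabilizer) syndrome of a phase-flip error pattern.
    z_err: binary array of shape (2, L, L); z_err[0] holds horizontal edges,
    z_err[1] vertical edges. Vertex (i, j) checks its four incident edges."""
    h, v = z_err
    return h ^ np.roll(h, 1, axis=1) ^ v ^ np.roll(v, 1, axis=0)

rng = np.random.default_rng(1)
z = (rng.random((2, L, L)) < 0.1).astype(int)  # i.i.d. phase flips with p = 0.1
s = syndrome(z)
# each flip toggles exactly two vertex checks, so defects come in pairs
print(int(s.sum()) % 2)  # 0
```

A decoder, neural or otherwise, learns to map such syndrome patterns back to a likely equivalence class of errors; training data are simply (error, syndrome) pairs generated as above.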

  17. Uniform physical theory of diffraction equivalent edge currents for implementation in general computer codes

    DEFF Research Database (Denmark)

    Johansen, Peter Meincke

    1996-01-01

    New uniform closed-form expressions for physical theory of diffraction equivalent edge currents are derived for truncated incremental wedge strips. In contrast to previously reported expressions, the new expressions are well-behaved for all directions of incidence and observation and take a finite value for zero strip length. Consequently, the new equivalent edge currents are, to the knowledge of the author, the first that are well-suited for implementation in general computer codes.

  18. Code to Learn: Where Does It Belong in the K-12 Curriculum?

    OpenAIRE

    Jesús Moreno León; Gregorio Robles; Marcos Román-González

    2016-01-01

    The introduction of computer programming in K-12 has become mainstream in the last years, as countries around the world are making coding part of their curriculum. Nevertheless, there is a lack of empirical studies that investigate how learning to program at an early age affects other school subjects. In this regard, this paper compares three quasi-experimental research designs conducted in three different schools (n=129 students from 2nd and 6th grade), in order to assess the impact of intro...

  19. Visualizing code and coverage changes for code review

    NARCIS (Netherlands)

    Oosterwaal, Sebastiaan; van Deursen, A.; De Souza Coelho, R.; Sawant, A.A.; Bacchelli, A.

    2016-01-01

    One of the tasks of reviewers is to verify that code modifications are well tested. However, current tools offer little support in understanding precisely how changes to the code relate to changes to the tests. In particular, it is hard to see whether (modified) test code covers the changed code.

  20. Assessment of Programming Language Learning Based on Peer Code Review Model: Implementation and Experience Report

    Science.gov (United States)

    Wang, Yanqing; Li, Hang; Feng, Yuqiang; Jiang, Yu; Liu, Ying

    2012-01-01

    The traditional assessment approach, in which one single written examination counts toward a student's total score, no longer meets new demands of programming language education. Based on a peer code review process model, we developed an online assessment system called "EduPCR" and used a novel approach to assess the learning of computer…

  1. Understanding teachers’ professional learning goals from their current professional concerns

    NARCIS (Netherlands)

    Louws, Monika L.; Meirink, Jacobiene A.; van Veen, Klaas; van Driel, Jan H.

    In the day-to-day workplace teachers direct their own learning, but little is known about what drives their decisions about what they would like to learn. These decisions are assumed to be influenced by teachers’ current professional concerns. Also, teachers in different professional life phases

  2. Ethical and educational considerations in coding hand surgeries.

    Science.gov (United States)

    Lifchez, Scott D; Leinberry, Charles F; Rivlin, Michael; Blazar, Philip E

    2014-07-01

    To assess treatment coding knowledge and practices among residents, fellows, and attending hand surgeons. Through the use of 6 hypothetical cases, we developed a coding survey to assess coding knowledge and practices. We e-mailed this survey to residents, fellows, and attending hand surgeons. In addition, we asked 2 professional coders to code these cases. A total of 71 participants completed the survey out of 134 people to whom the survey was sent (response rate = 53%). We observed marked disparity in the codes chosen among surgeons and among professional coders. Results of this study indicate that coding knowledge, not just its ethical application, had a major role in coding procedures accurately. Surgical coding is an essential part of a hand surgeon's practice and is not well learned during residency or fellowship. Whereas ethical issues such as deliberate unbundling and upcoding may have a role in inaccurate coding, lack of knowledge among surgeons and coders has a major role as well. Coding has a critical role in every hand surgery practice. Inconsistencies among those polled in this study reveal that an increase in education on coding during training and improvement in the clarity and consistency of the Current Procedural Terminology coding rules themselves are needed. Copyright © 2014 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.

  3. A Coding Scheme to Analyse the Online Asynchronous Discussion Forums of University Students

    Science.gov (United States)

    Biasutti, Michele

    2017-01-01

    The current study describes the development of a content analysis coding scheme to examine transcripts of online asynchronous discussion groups in higher education. The theoretical framework comprises theories regarding knowledge construction in computer-supported collaborative learning (CSCL) based on a sociocultural perspective. The coding…

  4. Optical coding theory with Prime

    CERN Document Server

    Kwong, Wing C

    2013-01-01

    Although several books cover the coding theory of wireless communications and the hardware technologies and coding techniques of optical CDMA, no book has been specifically dedicated to optical coding theory until now. Written by renowned authorities in the field, Optical Coding Theory with Prime gathers together in one volume the fundamentals and developments of optical coding theory, with a focus on families of prime codes, supplemented with several families of non-prime codes. The book also explores potential applications to coding-based optical systems and networks. Learn How to Construct

  5. Identifying and acting on potentially inappropriate care? Inadequacy of current hospital coding for this task.

    Science.gov (United States)

    Cooper, P David; Smart, David R

    2017-06-01

    Recent Australian attempts to facilitate disinvestment in healthcare, by identifying instances of 'inappropriate' care from large Government datasets, are subject to significant methodological flaws. Amongst other criticisms has been the fact that the Government datasets utilized for this purpose correlate poorly with datasets collected by relevant professional bodies. Government data derive from official hospital coding, collected retrospectively by clerical personnel, whilst professional body data derive from unit-specific databases, collected contemporaneously with care by clinical personnel. Assessment of accuracy of official hospital coding data for hyperbaric services in a tertiary referral hospital. All official hyperbaric-relevant coding data submitted to the relevant Australian Government agencies by the Royal Hobart Hospital, Tasmania, Australia for financial year 2010-2011 were reviewed and compared against actual hyperbaric unit activity as determined by reference to original source documents. Hospital coding data contained one or more errors in diagnoses and/or procedures in 70% of patients treated with hyperbaric oxygen that year. Multiple discrete error types were identified, including (but not limited to): missing patients; missing treatments; 'additional' treatments; 'additional' patients; incorrect procedure codes and incorrect diagnostic codes. Incidental observations of errors in surgical, anaesthetic and intensive care coding within this cohort suggest that the problems are not restricted to the specialty of hyperbaric medicine alone. Publications from other centres indicate that these problems are not unique to this institution or State. Current Government datasets are irretrievably compromised and not fit for purpose. Attempting to inform the healthcare policy debate by reference to these datasets is inappropriate. Urgent clinical engagement with hospital coding departments is warranted.

  6. Code to Learn: Where Does It Belong in the K-12 Curriculum?

    Directory of Open Access Journals (Sweden)

    Jesús Moreno León

    2016-06-01

    Full Text Available The introduction of computer programming in K-12 has become mainstream in the last years, as countries around the world are making coding part of their curriculum. Nevertheless, there is a lack of empirical studies that investigate how learning to program at an early age affects other school subjects. In this regard, this paper compares three quasi-experimental research designs conducted in three different schools (n=129 students from 2nd and 6th grade), in order to assess the impact of introducing programming with Scratch at different stages and in several subjects. While both 6th grade experimental groups working with coding activities showed a statistically significant improvement in terms of academic performance, this was not the case in the 2nd grade classroom. Notable disparity was also found regarding the subject in which the programming activities were included, as in social studies the effect size was double that in mathematics.

  7. Corporate Blended Learning in Portugal: Current Status and Future Directions

    Science.gov (United States)

    Marcal, Julia; Caetano, Antonio

    2010-01-01

    The aim of this study is to characterize the current status of blended learning in Portugal, given that b-learning has grown exponentially in the Portuguese market over recent years. 38 organizations (representing 68% of all institutions certified to provide distance training by the Government Labour Office--DGERT-) participated in this study. The…

  8. Libraries as Facilitators of Coding for All

    Science.gov (United States)

    Martin, Crystle

    2017-01-01

    Learning to code has been an increasingly frequent topic of conversation both in academic circles and popular media. Learning to code recently received renewed attention with the announcement of the White House's Computer Science for All initiative (Smith 2016). This initiative intends "to empower all American students from kindergarten…

  9. Proceedings of the workshop on advanced thermal-hydraulic and neutronic codes: current and future applications

    International Nuclear Information System (INIS)

    2001-01-01

    An OECD Workshop on Advanced Thermal-Hydraulic and Neutronic Codes Applications was held from 10 to 13 April 2000, in Barcelona, Spain, sponsored by the Committee on the Safety of Nuclear Installations (CSNI) of the OECD Nuclear Energy Agency (NEA). It was organised in collaboration with the Spanish Nuclear Safety Council (CSN) and hosted by CSN and the Polytechnic University of Catalonia (UPC) in collaboration with the Spanish Electricity Association (UNESA). The objectives of the Workshop were to review the developments since the previous CSNI Workshop held in Annapolis [NEA/CSNI/R(97)4; NUREG/CP-0159], to analyse the present status of maturity and remnant needs of thermal-hydraulic (TH) and neutronic system codes and methods, and finally to evaluate the role of these tools in the evolving regulatory environment. The Technical Sessions and Discussion Sessions covered the following topics: - Regulatory requirements for Best-Estimate (BE) code assessment; - Application of TH and neutronic codes for current safety issues; - Uncertainty analysis; - Needs for integral plant transient and accident analysis; - Simulators and fast running codes; - Advances in next generation TH and neutronic codes; - Future trends in physical modeling; - Long term plans for development of advanced codes. The focus of the Workshop was on system codes. An incursion was made, however, into the new field of applying Computational Fluid Dynamic (CFD) codes to nuclear safety analysis. As a general conclusion, the Barcelona Workshop can be considered representative of the progress towards the targets marked at Annapolis almost four years earlier. The Annapolis Workshop had identified areas where further development and specific improvements were needed, among them: multi-field models, transport of interfacial area, 2D and 3D thermal-hydraulics, and 3-D neutronics consistent with the level of detail of the thermal-hydraulics. Recommendations issued at Annapolis included: developing small pilot/test codes for

  10. Learning through reactions

    DEFF Research Database (Denmark)

    Hasse, Cathrine

    2007-01-01

    Universities can, from the student's point of view, be seen as places of learning an explicit curriculum of a particular discipline. Drawing on fieldwork among physics students at the Niels Bohr Institute in Denmark, I argue that the learning of cultural code-curricula in higher educational institutions is designated in ambiguous ways. I claim that students also have to learn institutional cultural codes, which are not the explicit curricula presented in textbooks, but a socially designated cultural code-curriculum learned through everyday interactions at the university institutes. I further argue that this code-curriculum is learned through what I shall term indefinite learning processes, which are mainly pre-discursive to the newcomer...

  11. Combined convective and diffusive modeling of the ring current and radiation belt electron dynamics using the VERB-4D code

    Science.gov (United States)

    Aseev, N.; Shprits, Y.; Drozdov, A.; Kellerman, A. C.; Wang, D.

    2017-12-01

    Ring current and radiation belts are key elements in the global dynamics of the Earth's magnetosphere. Comprehensive mathematical models are useful tools that allow us to understand the multiscale dynamics of these charged particle populations. In this work, we present results of simulations of combined ring current - radiation belt electron dynamics using the four-dimensional Versatile Electron Radiation Belt (VERB-4D) code. The VERB-4D code solves the modified Fokker-Planck equation including convective terms and simultaneously models ring current (1 - 100 keV) and radiation belt (100 keV - several MeV) electron dynamics. We apply the code to a number of geomagnetic storms that occurred in the past, compare the results with different satellite observations, and show how low-energy particles can affect the high-energy populations. In particular, we use data from the Polar Operational Environmental Satellite (POES) mission, which provides very good MLT coverage with 1.5-hour time resolution. The POES data allow us to validate the approach of the VERB-4D code for modeling MLT-dependent processes such as electron drift, wave-particle interactions, and magnetopause shadowing. We also show how different simulation parameters and empirical models can affect the results, placing particular emphasis on the electric and magnetic field models. This work will help us reveal the advantages and disadvantages of the approach behind the code and determine its prediction efficiency.
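    The modified Fokker-Planck equation solved by such a code combines convective (drift) and diffusive terms. A minimal 1-D analogue, an illustrative toy on a periodic domain with invented coefficients rather than the VERB-4D scheme itself, is an operator-split advection-diffusion update:

```python
import numpy as np

def step(f, u, D, dx, dt):
    """One operator-split step of df/dt + u*df/dx = d/dx(D*df/dx) on a periodic grid:
    first-order upwind advection (u > 0), then explicit central-difference diffusion."""
    f = f - u * dt / dx * (f - np.roll(f, 1))
    return f + D * dt / dx ** 2 * (np.roll(f, -1) - 2.0 * f + np.roll(f, 1))

N = 100
x = np.linspace(0.0, 1.0, N, endpoint=False)
dx = 1.0 / N
f = np.exp(-((x - 0.3) / 0.05) ** 2)         # initial phase-space density blob
mass0 = f.sum()
u, D = 0.5, 1e-3                             # drift speed and diffusion coefficient
dt = 0.4 * min(dx / u, dx * dx / (2.0 * D))  # CFL-limited time step
for _ in range(50):
    f = step(f, u, D, dx, dt)
print(x[np.argmax(f)])  # the blob has drifted from x = 0.3 toward x = 0.5
```

Both sub-steps conserve the total density exactly on the periodic grid, which is a basic sanity check for any convective-diffusive solver of this kind.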

  12. Mobile Learning in Nursing Undergraduates in China: Current Status, Attitudes and Barriers.

    Science.gov (United States)

    Xiao, Qian; Zhang, Qiannan; Wang, Lanlan; Wang, Yanling; Sun, Liu; Wu, Ying

    2017-01-01

    To explore the current status of, attitudes toward, and barriers to mobile learning among nursing undergraduates, 157 nursing students were investigated. More than half of them had used mobile learning frequently in the past half year. The mean score of students' intention towards mobile learning was 10.5 (range 6 to 15), and it was related to students' gender, expected effect, ease of operation, influence of other students, self-learning management and perceived interest. Some barriers affected students' mobile learning. Therefore, students had a positive attitude toward and perception of mobile learning, and we should create enough conditions to promote students' mobile learning.

  13. Code quality issues in student programs

    NARCIS (Netherlands)

    Keuning, H.W.; Heeren, B.J.; Jeuring, J.T.

    2017-01-01

    Because low quality code can cause serious problems in software systems, students learning to program should pay attention to code quality early. Although many studies have investigated mistakes that students make during programming, we do not know much about the quality of their code. This study

  14. Plasma simulation by macroscale, electromagnetic particle code and its application to current-drive by relativistic electron beam injection

    International Nuclear Information System (INIS)

    Tanaka, M.; Sato, T.

    1985-01-01

    A new implicit macroscale electromagnetic particle simulation code (MARC) which allows a large scale length and time step in multi-dimensions is described. Finite-mass electrons and ions are used with a relativistic version of the equation of motion. The electromagnetic fields are solved using a complete set of Maxwell equations. For time integration of the field equations, a decentered (backward) finite differencing scheme is employed with the predictor-corrector method for small noise and super-stability. It is shown both analytically and numerically that the present scheme efficiently suppresses high frequency electrostatic and electromagnetic waves in a plasma, and that it accurately reproduces low frequency waves such as ion acoustic waves, Alfven waves and fast magnetosonic waves. The numerical scheme has been coded in three dimensions for application to a new tokamak current-drive method by means of relativistic electron beam injection. Some remarks on proper macroscale code application are presented in this paper.
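    The noise-suppression property of a decentered (backward) time scheme can be demonstrated on a single harmonic oscillator: a backward-Euler step damps the amplitude by a factor 1/sqrt(1 + (omega*dt)^2) per step, so unresolved high-frequency waves (omega*dt >> 1) are strongly suppressed while well-resolved low-frequency waves are only weakly damped. A minimal sketch, illustrative only and not the MARC scheme itself:

```python
import numpy as np

def backward_euler_amplitude(omega, dt, steps):
    """Integrate x' = v, v' = -omega^2 * x with a fully implicit (backward)
    Euler step, solving the 2x2 linear system exactly; returns the final
    oscillation amplitude sqrt(x^2 + (v/omega)^2)."""
    x, v = 1.0, 0.0
    a = omega * dt
    for _ in range(steps):
        # x_new = x + dt*v_new and v_new = v - dt*omega^2*x_new  =>
        x_new = (x + dt * v) / (1.0 + a * a)
        v_new = v - dt * omega ** 2 * x_new
        x, v = x_new, v_new
    return np.hypot(x, v / omega)

dt, steps = 0.1, 100
low = backward_euler_amplitude(omega=1.0, dt=dt, steps=steps)    # omega*dt = 0.1
high = backward_euler_amplitude(omega=100.0, dt=dt, steps=steps) # omega*dt = 10
print(low, high)  # resolved mode only weakly damped; unresolved mode wiped out
```

This selective damping is exactly why an implicit, decentered scheme tolerates time steps far larger than the plasma-frequency limit of explicit particle codes: the unresolvable fast waves are filtered out instead of going unstable.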

  15. Learning analytics fundaments, applications, and trends : a view of the current state of the art to enhance e-learning

    CERN Document Server

    2017-01-01

    This book provides a conceptual and empirical perspective on learning analytics, its goal being to disseminate the core concepts, research, and outcomes of this emergent field. Divided into nine chapters, it offers reviews oriented on selected topics, recent advances, and innovative applications. It presents the broad learning analytics landscape and in-depth studies on higher education, adaptive assessment, teaching and learning. In addition, it discusses valuable approaches to coping with personalization and huge data, as well as conceptual topics and specialized applications that have shaped the current state of the art. By identifying fundamentals, highlighting applications, and pointing out current trends, the book offers an essential overview of learning analytics to enhance learning achievement in diverse educational settings. As such, it represents a valuable resource for researchers, practitioners, and students interested in updating their knowledge and finding inspirations for their future work.

  16. LeARN: a platform for detecting, clustering and annotating non-coding RNAs

    Directory of Open Access Journals (Sweden)

    Schiex Thomas

    2008-01-01

    Full Text Available Abstract Background In the last decade, sequencing projects have led to the development of a number of annotation systems dedicated to the structural and functional annotation of protein-coding genes. These annotation systems manage the annotation of non-protein-coding genes (ncRNAs) in a very crude way, allowing neither the editing of secondary structures nor the clustering of ncRNA genes into families, both of which are crucial for appropriate annotation of these molecules. Results LeARN is a flexible software package which handles the complete process of ncRNA annotation by integrating the layers of automatic detection and human curation. Conclusion This software provides the infrastructure to deal properly with ncRNAs in the framework of any annotation project. It fills the gap between existing prediction software, which detects independent ncRNA occurrences, and public ncRNA repositories, which do not offer the flexibility and interactivity required for annotation projects. The software is freely available from the download section of the website http://bioinfo.genopole-toulouse.prd.fr/LeARN

  17. [Birth and succession of a current of learning in Korean medicine: the supporting yang current of learning].

    Science.gov (United States)

    Oh, Chaekun

    2014-04-01

    In this study, I aim to reveal how Lee Gyoojoon's medicine gave birth to a current of learning, the supporting yang current of learning, and to describe its historical significance. Before anything, I would like to raise the question of whether there were any currents within traditional Korean medicine. Until now, medical currents have not been widely discussed in the medical history of Korea; the current of Lee Jema's sasang medicine is the most noticeable one. Among the contemporaries of Lee Jema, during the late Chosun, there was also another famed medical practitioner called Lee Gyoojoon. Lee Gyoojoon mainly practiced his medicine in the Pohang, Gyeongsangbuk-do area, and his apprentices formed a group and succeeded to his medical practice. Based on analyses of Lee Gyoojoon's apprentices and the Somun Oriental Medical Society, which is known today as a successor group to Lee Gyoojoon's medicine, they fully satisfy the five requirements for establishing a medical current: first, they held Lee Gyoojoon as the first and foremost, representative practitioner of their current; second, they advocate the supporting yang theory suggested by Lee Gyoojoon, which originates from his theory of Mind; third, books such as the Major Essentials of Huangdi's Internal Classic Plain Questions and the Double Grinded Medical Mirror were used as the main textbooks to educate their students or to practice medicine; fourth, Lee Gyoojoon's medical ideas were transmitted quite clearly within his group of apprentices, including Seo Byungoh, Lee Wonse, and the Somun Oriental Medical Society; fifth, Lee Gyoojoon's apprentices were first produced through the Sukgok School, whereas nowadays they are produced through medical groups formed by Lee Wonse and the Somun Oriental Medical Society, as regards the propagation of medical theories, compilation of textbooks, publication of academic journals, etc. Then, what do the existence of the

  18. Understanding Teachers' Professional Learning Goals from Their Current Professional Concerns

    Science.gov (United States)

    Louws, Monika L.; Meirink, Jacobiene A.; van Veen, Klaas; van Driel, Jan H.

    2018-01-01

    In the day-to-day workplace teachers direct their own learning, but little is known about what drives their decisions about what they would like to learn. These decisions are assumed to be influenced by teachers' current professional concerns. Also, teachers in different professional life phases have different reasons for engaging in professional…

  19. Shared acoustic codes underlie emotional communication in music and speech-Evidence from deep transfer learning.

    Science.gov (United States)

    Coutinho, Eduardo; Schuller, Björn

    2017-01-01

    Music and speech exhibit striking similarities in the communication of emotions in the acoustic domain, in such a way that the communication of specific emotions is achieved, at least to a certain extent, by means of shared acoustic patterns. From an Affective Sciences point of view, determining the degree of overlap between both domains is fundamental to understanding the shared mechanisms underlying such a phenomenon. From a machine learning perspective, the overlap between acoustic codes for emotional expression in music and speech opens new possibilities to enlarge the amount of data available to develop music and speech emotion recognition systems. In this article, we investigate time-continuous predictions of emotion (Arousal and Valence) in music and speech, and the Transfer Learning between these domains. We establish a comparative framework including intra-domain (i.e., models trained and tested on the same modality, either music or speech) and cross-domain experiments (i.e., models trained in one modality and tested on the other). In the cross-domain context, we evaluated two strategies: the direct transfer between domains, and the contribution of Transfer Learning techniques (feature-representation-transfer based on Denoising Auto Encoders) for reducing the gap in the feature space distributions. Our results demonstrate excellent cross-domain generalisation performance with and without feature representation transfer in both directions. In the case of music, cross-domain approaches outperformed intra-domain models for Valence estimation, whereas for speech, intra-domain models achieve the best performance. This is the first demonstration of shared acoustic codes for emotional expression in music and speech in the time-continuous domain.
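    The feature-representation-transfer idea can be sketched with the closed-form marginalized linear variant of a denoising autoencoder. This is an illustrative simplification: the study uses deep denoising autoencoders, and the toy "music" and "speech" feature sets below are invented, sharing one latent-to-feature mapping by construction (a stand-in for the shared-acoustic-codes assumption).

```python
import numpy as np

rng = np.random.default_rng(0)

# toy stand-ins for music and speech feature sets that share a single
# latent-to-feature mapping M (the "shared code" assumption, by construction)
M = rng.normal(size=(3, 10))
music = rng.normal(size=(400, 3)) @ M + 0.1 * rng.normal(size=(400, 10))
speech = rng.normal(size=(400, 3)) @ M + 0.1 * rng.normal(size=(400, 10))

sigma = 0.5  # corruption level
S = music.T @ music / len(music)
# marginalized linear denoising autoencoder: closed-form reconstruction map
# minimizing E||x - W x_noisy||^2 under additive Gaussian corruption
W = np.linalg.solve(S + sigma ** 2 * np.eye(10), S)

# transfer: the map learned on music features denoises unseen speech features too
noisy = speech + sigma * rng.normal(size=speech.shape)
err_raw = np.mean((noisy - speech) ** 2)
err_dae = np.mean((noisy @ W - speech) ** 2)
print(err_raw, err_dae)  # denoised error is clearly lower than the raw noise level
```

Because both domains occupy the same low-dimensional feature subspace, the reconstruction map learned on one domain transfers to the other, which is the essence of feature-representation-transfer across music and speech.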

  20. Current and anticipated uses of thermal-hydraulic codes in Germany

    Energy Technology Data Exchange (ETDEWEB)

    Teschendorff, V.; Sommer, F.; Depisch, F.

    1997-07-01

    In Germany, one third of the electrical power is generated by nuclear plants. ATHLET and S-RELAP5 are successfully applied for safety analyses of the existing PWR and BWR reactors and possible future reactors, e.g. EPR. Continuous development and assessment of thermal-hydraulic codes are necessary in order to meet present and future needs of licensing organizations, utilities, and vendors. Desired improvements include thermal-hydraulic models, multi-dimensional simulation, computational speed, interfaces to coupled codes, and code architecture. Real-time capability will be essential for application in full-scope simulators. Comprehensive code validation and quantification of uncertainties are prerequisites for future best-estimate analyses.

  1. Current and anticipated uses of thermal-hydraulic codes in Germany

    International Nuclear Information System (INIS)

    Teschendorff, V.; Sommer, F.; Depisch, F.

    1997-01-01

    In Germany, one third of the electrical power is generated by nuclear plants. ATHLET and S-RELAP5 are successfully applied for safety analyses of the existing PWR and BWR reactors and possible future reactors, e.g. EPR. Continuous development and assessment of thermal-hydraulic codes are necessary in order to meet present and future needs of licensing organizations, utilities, and vendors. Desired improvements include thermal-hydraulic models, multi-dimensional simulation, computational speed, interfaces to coupled codes, and code architecture. Real-time capability will be essential for application in full-scope simulators. Comprehensive code validation and quantification of uncertainties are prerequisites for future best-estimate analyses

  2. Expressing Youth Voice through Video Games and Coding

    Science.gov (United States)

    Martin, Crystle

    2017-01-01

    A growing body of research focuses on the impact of video games and coding on learning. The research often elevates learning the technical skills associated with video games and coding or the importance of problem solving and computational thinking, which are, of course, necessary and relevant. However, the literature less often explores how young…

  3. Current and anticipated uses of thermalhydraulic and neutronic codes at PSI

    Energy Technology Data Exchange (ETDEWEB)

    Aksan, S.N.; Zimmermann, M.A.; Yadigaroglu, G. [Paul Scherrer Institut, Villigen (Switzerland)

    1997-07-01

    The thermalhydraulic and/or neutronic codes in use at PSI mainly provide the capability to perform deterministic safety analysis for Swiss NPPs and also serve as analysis tools for experimental facilities for LWR and ALWR simulations. In relation to these applications, physical model development and improvement, and assessment of the codes, are also essential components of the activities. In this paper, a brief overview is provided of the thermalhydraulic and/or neutronic codes used at PSI for safety analysis of LWRs, and of some experiences and applications with these codes. Based on these experiences, additional assessment needs are indicated, together with some model improvement needs. Future needs, which could guide both the development of a new code and the improvement of available codes, are summarized.

  4. Current and anticipated uses of thermalhydraulic and neutronic codes at PSI

    International Nuclear Information System (INIS)

    Aksan, S.N.; Zimmermann, M.A.; Yadigaroglu, G.

    1997-01-01

    The thermalhydraulic and/or neutronic codes in use at PSI mainly provide the capability to perform deterministic safety analysis for Swiss NPPs and also serve as analysis tools for experimental facilities for LWR and ALWR simulations. In relation to these applications, physical model development and improvement, and assessment of the codes, are also essential components of the activities. In this paper, a brief overview is provided of the thermalhydraulic and/or neutronic codes used at PSI for safety analysis of LWRs, and of some experiences and applications with these codes. Based on these experiences, additional assessment needs are indicated, together with some model improvement needs. Future needs, which could guide both the development of a new code and the improvement of available codes, are summarized.

  5. Distributed Video Coding: Iterative Improvements

    DEFF Research Database (Denmark)

    Luong, Huynh Van

    Nowadays, emerging applications such as wireless visual sensor networks and wireless video surveillance require lightweight video encoding with high coding efficiency and error-resilience. Distributed Video Coding (DVC) is a new coding paradigm which exploits the source statistics […] and noise modeling and also learn from the previous decoded Wyner-Ziv (WZ) frames, side information and noise learning (SING) is proposed. The SING scheme introduces an optical flow technique to compensate for the weaknesses of block-based SI generation and also utilizes clustering of DCT blocks to capture […] cross-band correlation and increase local adaptivity in noise modeling. During decoding, the updated information is used to iteratively re-estimate the motion and reconstruction in the proposed motion and reconstruction re-estimation (MORE) scheme. The MORE scheme not only re-estimates the motion vectors […]

  6. Current Status of the LIFE Fast Reactors Fuel Performance Codes

    International Nuclear Information System (INIS)

    Yacout, A.M.; Billone, M.C.

    2013-01-01

    The LIFE-4 (Rev. 1) code was calibrated and validated using data from (U,Pu)O2 mixed-oxide fuel pins and UO2 blanket rods which were irradiation tested under steady-state and transient conditions. It integrates a broad material and fuel-pin irradiation database into a consistent framework for use and extrapolation of the database to reactor design applications. The code is available and running on different computer platforms (UNIX and PC), and detailed documentation of the code's models, routines, and calibration and validation data sets is available. The LIFE-METAL code is based on LIFE-4 with modifications to include key phenomena applicable to metallic fuel, as well as metallic fuel properties; it was calibrated with a large database from irradiations in EBR-II, and further effort is needed for calibration and detailed documentation. Recent activities with the codes are related to reactor design studies and support of licensing efforts for the 4S and KAERI SFR designs. Future activities are related to re-assessment of the codes' calibration and validation and inclusion of models for advanced fuels (transmutation fuels)

  7. LiveCode mobile development

    CERN Document Server

    Lavieri, Edward D

    2013-01-01

    A practical guide written in a tutorial style, "LiveCode Mobile Development Hotshot" walks you step-by-step through 10 individual projects. Every project is divided into subtasks to make learning more organized and easy to follow, with explanations, diagrams, screenshots, and downloadable material. This book is great for anyone who wants to develop mobile applications using LiveCode. You should be familiar with LiveCode and have access to a smartphone. You are not expected to know how to create graphics or audio clips.

  8. A Machine Learning Perspective on Predictive Coding with PAQ

    OpenAIRE

    Knoll, Byron; de Freitas, Nando

    2011-01-01

    PAQ8 is an open source lossless data compression algorithm that currently achieves the best compression rates on many benchmarks. This report presents a detailed description of PAQ8 from a statistical machine learning perspective. It shows that it is possible to understand some of the modules of PAQ8 and use this understanding to improve the method. However, intuitive statistical explanations of the behavior of other modules remain elusive. We hope the description in this report will be a sta...
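    PAQ8's central machine-learning component is context mixing: many models each emit a probability for the next bit, and those predictions are combined in the logistic domain with weights learned online. The sketch below is a deliberately simplified illustration of that idea, not PAQ8's actual mixer (the class and names are invented for the example):

```python
import math

def stretch(p):
    """Map a probability to the logistic domain."""
    return math.log(p / (1 - p))

def squash(x):
    """Inverse of stretch: map a logit back to a probability."""
    return 1 / (1 + math.exp(-x))

class LogisticMixer:
    """Online logistic mixing of several bit predictions (simplified PAQ-style).

    predict() combines the models' probabilities using the current weights;
    update() nudges the weights toward whichever models were right.
    """
    def __init__(self, n_models, lr=0.02):
        self.w = [0.0] * n_models
        self.x = [0.0] * n_models
        self.lr = lr

    def predict(self, probs):
        self.x = [stretch(p) for p in probs]
        return squash(sum(w * xi for w, xi in zip(self.w, self.x)))

    def update(self, p, bit):
        err = bit - p                    # gradient of the coding cost
        self.w = [w + self.lr * err * xi for w, xi in zip(self.w, self.x)]
```

With all weights at zero the mixer starts at 0.5; as it observes bits, weight mass shifts to the models whose stretched predictions correlate with the actual bits.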

  9. Summary of papers on current and anticipated uses of thermal-hydraulic codes

    Energy Technology Data Exchange (ETDEWEB)

    Caruso, R.

    1997-07-01

    The author reviews a range of recent papers which discuss possible uses and future development needs for thermal/hydraulic codes in the nuclear industry. From this review, eight common recommendations are extracted. They are: improve the user interface so that more people can use the code, so that models are easier and less expensive to prepare and maintain, and so that the results are scrutable; design the code so that it can easily be coupled to other codes, such as core physics, containment, and fission product behaviour during severe accidents; improve the numerical methods to make the code more robust and especially faster running, particularly for low pressure transients; ensure that future code development includes assessment of code uncertainties as an integral part of code verification and validation; provide extensive user guidelines or structure the code so that the 'user effect' is minimized; include the capability to model multiple fluids (gas and liquid phase); design the code in a modular fashion so that new models can be added easily; provide the ability to include detailed or simplified component models; build on work previously done with other codes (RETRAN, RELAP, TRAC, CATHARE) and other code validation efforts (CSAU, CSNI SET and IET matrices).

  10. Summary of papers on current and anticipated uses of thermal-hydraulic codes

    International Nuclear Information System (INIS)

    Caruso, R.

    1997-01-01

    The author reviews a range of recent papers which discuss possible uses and future development needs for thermal/hydraulic codes in the nuclear industry. From this review, eight common recommendations are extracted. They are: improve the user interface so that more people can use the code, so that models are easier and less expensive to prepare and maintain, and so that the results are scrutable; design the code so that it can easily be coupled to other codes, such as core physics, containment, and fission product behaviour during severe accidents; improve the numerical methods to make the code more robust and especially faster running, particularly for low pressure transients; ensure that future code development includes assessment of code uncertainties as an integral part of code verification and validation; provide extensive user guidelines or structure the code so that the 'user effect' is minimized; include the capability to model multiple fluids (gas and liquid phase); design the code in a modular fashion so that new models can be added easily; provide the ability to include detailed or simplified component models; build on work previously done with other codes (RETRAN, RELAP, TRAC, CATHARE) and other code validation efforts (CSAU, CSNI SET and IET matrices)

  11. Learning Styles of Pilots Currently Qualified in United States Air Force Aircraft

    Science.gov (United States)

    Kanske, Craig A.

    2001-01-01

    Kolb's Learning Style Inventory was used to identify the predominant learning styles of pilots currently qualified in United States Air Force aircraft. The results indicate that these pilots show a significant preference for facts and things over people and feelings. By understanding the preferred learning styles of the target population, course material can be developed that takes advantage of the strengths of these learning styles. This information can be especially useful in the future design of cockpit resource management training. The training program can be developed to demonstrate both that there are different learning styles and that it is possible to take advantage of the relative strengths of each of these learning styles.

  12. QR codes for dummies

    CERN Document Server

    Waters, Joe

    2012-01-01

    Find out how to effectively create, use, and track QR codes QR (Quick Response) codes are popping up everywhere, and businesses are reaping the rewards. Get in on the action with the no-nonsense advice in this streamlined, portable guide. You'll find out how to get started, plan your strategy, and actually create the codes. Then you'll learn to link codes to mobile-friendly content, track your results, and develop ways to give your customers value that will keep them coming back. It's all presented in the straightforward style you've come to know and love, with a dash of humor thrown

  13. Predictions of the spontaneous symmetry-breaking theory for visual code completeness and spatial scaling in single-cell learning rules.

    Science.gov (United States)

    Webber, C J

    2001-05-01

    This article shows analytically that single-cell learning rules that give rise to oriented and localized receptive fields, when their synaptic weights are randomly and independently initialized according to a plausible assumption of zero prior information, will generate visual codes that are invariant under two-dimensional translations, rotations, and scale magnifications, provided that the statistics of their training images are sufficiently invariant under these transformations. Such codes span different image locations, orientations, and size scales with equal economy. Thus, single-cell rules could account for the spatial scaling property of the cortical simple-cell code. This prediction is tested computationally by training with natural scenes; it is demonstrated that a single-cell learning rule can give rise to simple-cell receptive fields spanning the full range of orientations, image locations, and spatial frequencies (except at the extreme high and low frequencies at which the scale invariance of the statistics of digitally sampled images must ultimately break down, because of the image boundary and the finite pixel resolution). Thus, no constraint on completeness, or any other coupling between cells, is necessary to induce the visual code to span wide ranges of locations, orientations, and size scales. This prediction is made using the theory of spontaneous symmetry breaking, which we have previously shown can also explain the data-driven self-organization of a wide variety of transformation invariances in neurons' responses, such as the translation invariance of complex cell response.

  14. Shared acoustic codes underlie emotional communication in music and speech-Evidence from deep transfer learning.

    Directory of Open Access Journals (Sweden)

    Eduardo Coutinho

    Full Text Available Music and speech exhibit striking similarities in the communication of emotions in the acoustic domain, in such a way that the communication of specific emotions is achieved, at least to a certain extent, by means of shared acoustic patterns. From an Affective Sciences point of view, determining the degree of overlap between both domains is fundamental to understand the shared mechanisms underlying such a phenomenon. From a Machine Learning perspective, the overlap between acoustic codes for emotional expression in music and speech opens new possibilities to enlarge the amount of data available to develop music and speech emotion recognition systems. In this article, we investigate time-continuous predictions of emotion (Arousal and Valence) in music and speech, and the Transfer Learning between these domains. We establish a comparative framework including intra-domain (i.e., models trained and tested on the same modality, either music or speech) and cross-domain experiments (i.e., models trained in one modality and tested on the other). In the cross-domain context, we evaluated two strategies: the direct transfer between domains, and the contribution of Transfer Learning techniques (feature-representation-transfer based on Denoising Auto Encoders) for reducing the gap in the feature space distributions. Our results demonstrate an excellent cross-domain generalisation performance with and without feature representation transfer in both directions. In the case of music, cross-domain approaches outperformed intra-domain models for Valence estimation, whereas for speech intra-domain models achieve the best performance. This is the first demonstration of shared acoustic codes for emotional expression in music and speech in the time-continuous domain.

  15. Shared acoustic codes underlie emotional communication in music and speech—Evidence from deep transfer learning

    Science.gov (United States)

    Schuller, Björn

    2017-01-01

    Music and speech exhibit striking similarities in the communication of emotions in the acoustic domain, in such a way that the communication of specific emotions is achieved, at least to a certain extent, by means of shared acoustic patterns. From an Affective Sciences point of view, determining the degree of overlap between both domains is fundamental to understand the shared mechanisms underlying such a phenomenon. From a Machine Learning perspective, the overlap between acoustic codes for emotional expression in music and speech opens new possibilities to enlarge the amount of data available to develop music and speech emotion recognition systems. In this article, we investigate time-continuous predictions of emotion (Arousal and Valence) in music and speech, and the Transfer Learning between these domains. We establish a comparative framework including intra-domain (i.e., models trained and tested on the same modality, either music or speech) and cross-domain experiments (i.e., models trained in one modality and tested on the other). In the cross-domain context, we evaluated two strategies—the direct transfer between domains, and the contribution of Transfer Learning techniques (feature-representation-transfer based on Denoising Auto Encoders) for reducing the gap in the feature space distributions. Our results demonstrate an excellent cross-domain generalisation performance with and without feature representation transfer in both directions. In the case of music, cross-domain approaches outperformed intra-domain models for Valence estimation, whereas for speech intra-domain models achieve the best performance. This is the first demonstration of shared acoustic codes for emotional expression in music and speech in the time-continuous domain. PMID:28658285
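    The feature-representation-transfer strategy described above rests on the denoising auto-encoder: a network trained to reconstruct clean inputs from corrupted ones, whose hidden layer then serves as a shared feature space across domains. A minimal illustrative sketch on toy data (invented dimensions and training setup, not the authors' architecture):

```python
import numpy as np

rng = np.random.default_rng(0)

def train_dae(X, hidden=8, noise=0.2, lr=0.05, epochs=300):
    """Train a one-hidden-layer denoising auto-encoder on rows of X.

    The input is corrupted with Gaussian noise, encoded with tanh, and
    decoded linearly; the loss is reconstruction of the *clean* input.
    Returns the weights and the per-epoch reconstruction losses.
    """
    n, d = X.shape
    W = rng.normal(0.0, 0.3, (d, hidden))     # encoder weights
    V = rng.normal(0.0, 0.3, (hidden, d))     # linear decoder weights
    losses = []
    for _ in range(epochs):
        Xc = X + rng.normal(0.0, noise, X.shape)   # corrupt the input
        H = np.tanh(Xc @ W)                        # encode
        R = H @ V                                  # decode
        err = R - X                                # compare to clean data
        losses.append(float(np.mean(err ** 2)))
        V -= lr * H.T @ err / n                    # backprop, decoder
        W -= lr * Xc.T @ ((err @ V.T) * (1 - H ** 2)) / n  # backprop, encoder
    return W, V, losses
```

After training, `np.tanh(X_new @ W)` yields the learned representation into which features from either domain could be projected before transfer.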

  16. Online Learning Integrity Approaches: Current Practices and Future Solutions

    Science.gov (United States)

    Lee-Post, Anita; Hapke, Holly

    2017-01-01

    The primary objective of this paper is to help institutions respond to the stipulation of the Higher Education Opportunity Act of 2008 by adopting cost-effective academic integrity solutions without compromising the convenience and flexibility of online learning. Current user authentication solutions such as user ID and password, security…

  17. Neutrons and gamma transport in atmosphere by Tripoli-2 code. Energy deposit and electron current time function

    International Nuclear Information System (INIS)

    Vergnaud, T.; Nimal, J.C.; Ulpat, J.P.; Faucheux, G.

    1988-01-01

    The Tripoli-2 computer code has been adapted to calculate, in addition to the energy deposited in matter by neutrons (kerma), the energy deposited by gammas produced in neutron interactions and the induced recoil electron current. The energy deposit leads to air ionization and hence to an electrical conductivity; this knowledge, combined with that of the electron current, permits solving the Maxwell equations of the electromagnetic field. The study is carried out for an atmospheric explosion at a height of 100 meters. The calculations of energy deposit and electron current have been carried out to a distance of 2.5 km [fr]

  18. pix2code: Generating Code from a Graphical User Interface Screenshot

    OpenAIRE

    Beltramelli, Tony

    2017-01-01

    Transforming a graphical user interface screenshot created by a designer into computer code is a typical task conducted by a developer in order to build customized software, websites, and mobile applications. In this paper, we show that deep learning methods can be leveraged to train a model end-to-end to automatically generate code from a single input image with over 77% of accuracy for three different platforms (i.e. iOS, Android and web-based technologies).
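    At inference time, pix2code's decoder emits DSL tokens one at a time, each conditioned on what has been generated so far. Stripped of the neural networks, that loop is a greedy next-token search; this toy stand-in uses bigram counts over a tiny invented GUI-token corpus in place of the trained model (all names here are illustrative, not pix2code's actual DSL):

```python
from collections import Counter, defaultdict

def learn_bigrams(corpus):
    """Count which token follows which, including start/end markers."""
    big = defaultdict(Counter)
    for seq in corpus:
        for a, b in zip(["<s>"] + seq, seq + ["</s>"]):
            big[a][b] += 1
    return big

def greedy_decode(big, max_len=20):
    """Emit the most likely next token at each step until </s>."""
    tok, out = "<s>", []
    while len(out) < max_len:
        nxt = big[tok].most_common(1)[0][0]
        if nxt == "</s>":
            break
        out.append(nxt)
        tok = nxt
    return out

# Toy "training set" of GUI layout token sequences
corpus = [["header", "row", "btn", "row", "btn"],
          ["header", "row", "btn"]]
```

In the real system, the conditional distribution comes from a CNN-encoded screenshot plus an LSTM over the token history rather than from bigram counts, but the decoding loop has the same shape.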

  19. Enhanced motor learning with bilateral transcranial direct current stimulation: Impact of polarity or current flow direction?

    Science.gov (United States)

    Naros, Georgios; Geyer, Marc; Koch, Susanne; Mayr, Lena; Ellinger, Tabea; Grimm, Florian; Gharabaghi, Alireza

    2016-04-01

    Bilateral transcranial direct current stimulation (TDCS) is superior to unilateral TDCS when targeting motor learning. This effect could be related to either the current flow direction or additive polarity-specific effects on each hemisphere. This sham-controlled randomized study included fifty right-handed healthy subjects in a parallel-group design who performed an exoskeleton-based motor task of the proximal left arm on three consecutive days. Prior to training, we applied either sham, right anodal (a-TDCS), left cathodal (c-TDCS), concurrent a-TDCS and c-TDCS with two independent current sources and return electrodes (double source (ds)-TDCS) or classical bilateral stimulation (bi-TDCS). Motor performance improved over time for both unilateral (a-TDCS, c-TDCS) and bilateral (bi-TDCS, ds-TDCS) TDCS montages. However, only the two bilateral paradigms led to an improvement of the final motor performance at the end of the training period as compared to the sham condition. There was no difference between the two bilateral stimulation conditions (bi-TDCS, ds-TDCS). Bilateral TDCS is more effective than unilateral stimulation due to its polarity-specific effects on each hemisphere rather than due to its current flow direction. This study is the first systematic evaluation of stimulation polarity and current flow direction of bi-hemispheric motor cortex TDCS on motor learning of proximal upper limb muscles. Copyright © 2016 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  20. User's guide for SLWDN9, a code for calculating flux-surfaced-averaging of alpha densities, currents, and heating in non-circular tokamaks

    International Nuclear Information System (INIS)

    Hively, L.M.; Miley, G.M.

    1980-03-01

    The code calculates flux-surfaced-averaged values of alpha density, current, and electron/ion heating profiles in realistic, non-circular tokamak plasmas. The code is written in FORTRAN and execute on the CRAY-1 machine at the Magnetic Fusion Energy Computer Center

  1. Investigating students' view on STEM in learning about electrical current through STS approach

    Science.gov (United States)

    Tupsai, Jiraporn; Yuenyong, Chokchai

    2018-01-01

    This study aims to investigate Grade 11 students' views on Science, Technology, Engineering, and Mathematics (STEM) with the integration of learning about electrical current based on the Science Technology Society (STS) approach [8]. The participants were 60 Grade 11 students in the Demonstration Secondary School, Khon Kaen University, Khon Kaen Province, Thailand. The methodology follows the interpretive paradigm. The teaching and learning about electrical current through the STS approach was carried out over 6 weeks. The Electrical Current unit was developed based on the framework [8], which consists of five stages: (1) identification of social issues, (2) identification of potential solutions, (3) need for knowledge, (4) decision making, and (5) socialization. To start, the question "what if this world is lack of electricity" was posed in class in order to move students toward the problem of how to design electricity generation from clean energy. Students were expected to apply scientific and other knowledge to the design of electricity generation. Students' views on STEM were collected during their learning through participant observation and students' tasks, and were categorized according to how they applied their knowledge in designing the electricity generation. The findings indicated that students worked cooperatively to solve the problem, applying knowledge of the content of science and mathematics and the process skills of technology and engineering. This shows that students integrated science, technology, engineering and mathematics to design possible solutions while learning about electrical current. The paper also discusses implications for science teaching and learning through STS in Thailand.

  2. The assertive communication: a current need of the learning process

    Directory of Open Access Journals (Sweden)

    Georgina Amayuela Mora

    2015-09-01

    Full Text Available The fundamental purpose of this article is to characterize assertiveness as a component of communicative competence. The study of the communicative process is a current need, since the student's formation depends to a great extent on the quality of communication. The learning process in the university context requires an assertive communicative process. In this paper, assertiveness is defined as a communicative skill, and its importance is assessed through the positive impact of assertive behavior on the learning process.

  3. Polarity-Specific Transcranial Direct Current Stimulation Disrupts Auditory Pitch Learning

    Directory of Open Access Journals (Sweden)

    Reiko eMatsushita

    2015-05-01

    Full Text Available Transcranial direct current stimulation (tDCS) is attracting increasing interest because of its potential for therapeutic use. While its effects have been investigated mainly with motor and visual tasks, less is known in the auditory domain. Past tDCS studies with auditory tasks demonstrated various behavioural outcomes, possibly due to differences in the stimulation parameters or task measurements used in each study. Further research using well-validated tasks is therefore required to clarify the behavioural effects of tDCS on the auditory system. Here, we took advantage of findings from a prior functional magnetic resonance imaging study, which demonstrated that the right auditory cortex is modulated during fine-grained pitch learning of microtonal melodic patterns. Targeting the right auditory cortex with tDCS using this same task thus allowed us to test the hypothesis that this region is causally involved in pitch learning. Participants in the current study were trained for three days while we measured pitch discrimination thresholds for microtonal melodies each day using a psychophysical staircase procedure. We administered anodal, cathodal, or sham tDCS to three groups of participants over the right auditory cortex on the second day of training during performance of the task. Both the sham and the cathodal groups showed the expected significant learning effect (decreased pitch thresholds over the three days of training); in contrast, we observed a blocking effect of anodal tDCS on auditory pitch learning, such that this group showed no significant change in thresholds over the three days. The results support a causal role for the right auditory cortex in pitch discrimination learning.

  4. Applying machine learning to predict patient-specific current CD4 ...

    African Journals Online (AJOL)

    Apple apple

    This work shows the application of machine learning to predict current CD4 cell count of an HIV- […] Pre-processing […] remaining data elements of the PR and RT datasets. […] technique based on the structure of the human brain's neuron.
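    The "technique based on the structure of the human brain's neuron" is an artificial neural network, whose building block is a single logistic unit trained by gradient descent. A minimal sketch of that building block on synthetic stand-in data (the features and labels below are invented for illustration, not the PR/RT datasets):

```python
import math
import random

random.seed(1)

def train_neuron(data, lr=0.5, epochs=200):
    """Train one logistic unit by stochastic gradient descent on (x, y) pairs."""
    w = [0.0] * len(data[0][0])
    b = 0.0
    for _ in range(epochs):
        for x, y in data:
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))          # sigmoid activation
            g = p - y                               # gradient of the log-loss
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def predict(w, b, x):
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))

# Synthetic, linearly separable data: label 1 when the two features sum above 1
data = [(x, 1 if x[0] + x[1] > 1.0 else 0)
        for x in [(random.random(), random.random()) for _ in range(200)]]
```

A full network stacks many such units in layers; the single-unit version already shows the weight-update loop that training amounts to.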

  5. Light-water reactor safety analysis codes

    International Nuclear Information System (INIS)

    Jackson, J.F.; Ransom, V.H.; Ybarrondo, L.J.; Liles, D.R.

    1980-01-01

    A brief review of the evolution of light-water reactor safety analysis codes is presented. Included is a summary comparison of the technical capabilities of major system codes. Three recent codes are described in more detail to serve as examples of currently used techniques. Example comparisons between calculated results using these codes and experimental data are given. Finally, a brief evaluation of current code capability and future development trends is presented

  6. Supervised Convolutional Sparse Coding

    KAUST Repository

    Affara, Lama Ahmed; Ghanem, Bernard; Wonka, Peter

    2018-01-01

    […] coding, which aims at learning discriminative dictionaries instead of purely reconstructive ones. We incorporate a supervised regularization term into the traditional unsupervised CSC objective to encourage the final dictionary elements […]

  7. Phonetic radicals, not phonological coding systems, support orthographic learning via self-teaching in Chinese.

    Science.gov (United States)

    Li, Luan; Wang, Hua-Chen; Castles, Anne; Hsieh, Miao-Ling; Marinus, Eva

    2018-07-01

    According to the self-teaching hypothesis (Share, 1995), phonological decoding is fundamental to acquiring orthographic representations of novel written words. However, phonological decoding is not straightforward in non-alphabetic scripts such as Chinese, where words are presented as characters. Here, we present the first study investigating the role of phonological decoding in orthographic learning in Chinese. We examined two possible types of phonological decoding: the use of phonetic radicals, an internal phonological aid, and the use of Zhuyin, an external phonological coding system. Seventy-three Grade 2 children were taught the pronunciations and meanings of twelve novel compound characters over four days. They were then exposed to the written characters in short stories, and were assessed on their reading accuracy and on their subsequent orthographic learning via orthographic choice and spelling tasks. The novel characters were assigned three different types of pronunciation in relation to their phonetic radicals: (1) a pronunciation identical to the phonetic radical in isolation; (2) a common alternative pronunciation associated with the phonetic radical when it appears in other characters; and (3) a pronunciation unrelated to the phonetic radical. The presence of Zhuyin was also manipulated. The children read the novel characters more accurately when phonological cues from the phonetic radicals were available and in the presence of Zhuyin. However, only the phonetic radicals facilitated orthographic learning. The findings provide the first empirical evidence of orthographic learning via self-teaching in Chinese, and reveal how phonological decoding functions to support learning in non-alphabetic writing systems. Copyright © 2018 Elsevier B.V. All rights reserved.

  8. Effect of Inter-sentential vs Intra-sentential Code-Switching: With a Focus on Past Tense

    Directory of Open Access Journals (Sweden)

    Hanieh Kashi

    2018-03-01

    Full Text Available The current study aimed to compare the effects of inter-sentential and intra-sentential code-switching on learning the past tense. Initially, through non-random convenience sampling, the researcher chose 90 female EFL learners at the elementary level. Next, the Key English Test (KET) was administered to the 90 learners and the results were used to select 60 participants for the purpose of this study. The participants were then divided into two groups, each consisting of 30 learners. Afterwards, a grammar pretest of 30 items focusing on the past simple tense was given to both groups. Following that, grammatical explanations were provided to the two groups for ten sessions using code-switching. The first experimental group received inter-sentential code-switching, defined in line with Reyes (2004) as a switch between two languages where a sentence in one of the languages is completed and the next sentence starts with the other language. In the second experimental group, in line with Reyes (2004), the switching occurred within a sentence. The results of statistical analysis indicated that inter-sentential code-switching proved more effective than intra-sentential code-switching for the learning of the past tense by EFL learners. Based on the findings of the present study, EFL teachers are encouraged to use inter-sentential rather than intra-sentential code-switching when teaching grammar.

  9. Towards a universal code formatter through machine learning

    NARCIS (Netherlands)

    Parr, T. (Terence); J.J. Vinju (Jurgen)

    2016-01-01

    There are many declarative frameworks that allow us to implement code formatters relatively easily for any specific language, but constructing them is cumbersome. The first problem is that "everybody" wants to format their code differently, leading to either many formatter variants or a
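    The approach learns formatting decisions from example code rather than from hand-written rules. As a toy stand-in for that idea (not the paper's actual feature set or classifier), one can predict a single decision, whether to put a space before a token, by majority vote over how that token was formatted in the training examples:

```python
from collections import Counter, defaultdict

def learn(examples):
    """examples: list of (token, had_space_before) pairs from formatted code."""
    votes = defaultdict(Counter)
    for tok, spaced in examples:
        votes[tok][spaced] += 1
    return votes

def format_tokens(tokens, votes):
    """Re-emit tokens, inserting a space wherever training data voted for one."""
    out = ""
    for t in tokens:
        spaced = votes[t].most_common(1)[0][0] if votes[t] else False
        out += (" " if spaced and out else "") + t
    return out

# Tiny invented training set: how each token appeared in formatted code
examples = [("=", True), ("=", True), (";", False), ("x", True), ("1", True)]
```

The published system conditions on far richer context (surrounding tokens, parse-tree position) with a k-nearest-neighbor classifier, but the decision being learned is of this kind.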

  10. Office of Codes and Standards resource book. Section 1, Building energy codes and standards

    Energy Technology Data Exchange (ETDEWEB)

    Hattrup, M.P.

    1995-01-01

    The US Department of Energy's (DOE's) Office of Codes and Standards has developed this Resource Book to provide: a discussion of DOE involvement in building codes and standards; a current and accurate set of descriptions of residential, commercial, and Federal building codes and standards; information on State contacts, State code status, State building construction unit volume, and State needs; and a list of stakeholders in the building energy codes and standards arena. The Resource Book is considered an evolving document and will be updated occasionally. Users are requested to submit additional data (e.g., more current, widely accepted, and/or documented data) and suggested changes to the address listed below. Please provide sources for all data provided.

  11. TOOKUIL: A case study in user interface development for safety code application

    International Nuclear Information System (INIS)

    Gray, D.L.; Harkins, C.K.; Hoole, J.G.

    1997-01-01

    Traditionally, there has been a very high learning curve associated with using nuclear power plant (NPP) analysis codes. Even for seasoned plant analysts and engineers, the process of building or modifying an input model for present day NPP analysis codes is tedious, error prone, and time consuming. Current cost constraints and performance demands place an additional burden on today's safety analysis community. Advances in graphical user interface (GUI) technology have been applied to obtain significant productivity and quality assurance improvements for the Transient Reactor Analysis Code (TRAC) input model development. KAPL Inc. has developed an X Windows-based graphical user interface named TOOKUIL which supports the design and analysis process, acting as a preprocessor, runtime editor, help system, and post processor for TRAC. This paper summarizes the objectives of the project, the GUI development process and experiences, and the resulting end product, TOOKUIL

  12. TOOKUIL: A case study in user interface development for safety code application

    International Nuclear Information System (INIS)

    Gray, D.L.; Harkins, C.K.; Hoole, J.G.; Peebles, R.C.; Smith, R.J.

    1996-11-01

    Traditionally, there has been a very high learning curve associated with using nuclear power plant (NPP) analysis codes. Even for seasoned plant analysts and engineers, the process of building or modifying an input model for present day NPP analysis codes is tedious, error prone, and time consuming. Current cost constraints and performance demands place an additional burden on today's safety analysis community. Advances in graphical user interface (GUI) technology have been applied to obtain significant productivity and quality assurance improvements for the Transient Reactor Analysis Code (TRAC) input model development. KAPL Inc. has developed an X Windows-based graphical user interface named TOOKUIL which supports the design and analysis process, acting as a preprocessor, runtime editor, help system, and post processor for TRAC. This paper summarizes the objectives of the project, the GUI development process and experiences, and the resulting end product, TOOKUIL

  13. Lessons learned from new construction utility demand side management programs and their implications for implementing building energy codes

    Energy Technology Data Exchange (ETDEWEB)

    Wise, B.K.; Hughes, K.R.; Danko, S.L.; Gilbride, T.L.

    1994-07-01

    This report was prepared for the US Department of Energy (DOE) Office of Codes and Standards by the Pacific Northwest Laboratory (PNL) through its Building Energy Standards Program (BESP). The purpose of this task was to identify demand-side management (DSM) strategies for new construction that utilities have adopted or developed to promote energy-efficient design and construction. PNL conducted a survey of utilities and used the information gathered to extrapolate lessons learned and to identify evolving trends in utility new-construction DSM programs. The ultimate goal of the task is to identify opportunities where states might work collaboratively with utilities to promote the adoption, implementation, and enforcement of energy-efficient building energy codes.

  14. Fast Convolutional Sparse Coding in the Dual Domain

    KAUST Repository

    Affara, Lama Ahmed; Ghanem, Bernard; Wonka, Peter

    2017-01-01

    Convolutional sparse coding (CSC) is an important building block of many computer vision applications ranging from image and video compression to deep learning. We present two contributions to the state of the art in CSC. First, we significantly speed up the computation by proposing a new optimization framework that tackles the problem in the dual domain. Second, we extend the original formulation to higher dimensions in order to process a wider range of inputs, such as color inputs, or HOG features. Our results show a significant speedup compared to the current state of the art in CSC.
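    The abstract does not state the optimization problem itself; for reference, the standard CSC objective (a common formulation, not necessarily the authors' exact notation) that a dual-domain solver would tackle is:

    ```latex
    \min_{\{d_k\},\{z_k\}} \;
    \frac{1}{2}\Big\lVert x - \sum_{k=1}^{K} d_k * z_k \Big\rVert_2^2
    + \lambda \sum_{k=1}^{K} \lVert z_k \rVert_1
    \quad \text{s.t.}\; \lVert d_k \rVert_2^2 \le 1,
    ```

    where $*$ denotes convolution, the $d_k$ are the learned filters, the $z_k$ are the sparse coefficient maps, and $\lambda$ controls sparsity.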

  15. Fast Convolutional Sparse Coding in the Dual Domain

    KAUST Repository

    Affara, Lama Ahmed

    2017-09-27

    Convolutional sparse coding (CSC) is an important building block of many computer vision applications ranging from image and video compression to deep learning. We present two contributions to the state of the art in CSC. First, we significantly speed up the computation by proposing a new optimization framework that tackles the problem in the dual domain. Second, we extend the original formulation to higher dimensions in order to process a wider range of inputs, such as color inputs, or HOG features. Our results show a significant speedup compared to the current state of the art in CSC.

  16. Effects of the 2013 Psychiatric Current Procedural Terminology Codes Revision on Psychotherapy in Psychiatric Billing.

    Science.gov (United States)

    Mark, Tami L; Olesiuk, William J; Sherman, Laura J; Ali, Mir M; Mutter, Ryan; Teich, Judith L

    2017-11-01

    The purpose of this study was to determine whether the changes to the psychiatric Current Procedural Terminology (CPT) codes implemented in 2013 were associated with changes in types of services for which psychiatrists billed. Analyses were conducted using paid private insurance claims from a large commercial database. The participant cohort comprised psychiatrists with at least one psychiatry visit reported in the database in each calendar year studied: 2012 (N of visits=778,445), 2013 (N=748,317), and 2014 (N=754,760). The percentage of visits in which psychiatrists billed for psychotherapy declined from 51.4% in 2012 to 42.1% in 2014. The decline held after the analyses adjusted for patient characteristics, plan type, and region. The update to CPT codes resulted in a decrease in visits for which psychiatrists billed for psychotherapy. Further research should explore whether the change in billing corresponds to changes in service delivery.

  17. Learning Concepts, Language, and Literacy in Hybrid Linguistic Codes: The Multilingual Maze of Urban Grade 1 Classrooms in South Africa

    Science.gov (United States)

    Henning, Elizabeth

    2012-01-01

    From the field of developmental psycholinguistics and from conceptual development theory there is evidence that excessive linguistic "code-switching" in early school education may pose some hazards for the learning of young multilingual children. In this article the author addresses the issue, invoking post-Piagetian and neo-Vygotskian…

  18. Current and anticipated uses of the thermal hydraulics codes at the NRC

    Energy Technology Data Exchange (ETDEWEB)

    Caruso, R.

    1997-07-01

    The focus of Thermal-Hydraulic computer code usage in nuclear regulatory organizations has undergone a considerable shift since the codes were originally conceived. Less work is being done in the area of "Design Basis Accidents," and much more emphasis is being placed on analysis of operational events, probabilistic risk/safety assessment, and maintenance practices. All of these areas need support from Thermal-Hydraulic computer codes to model the behavior of plant fluid systems, and they all need the ability to perform large numbers of analyses quickly. It is therefore important for the T/H codes of the future to be able to support these needs by providing robust, easy-to-use tools that produce easy-to-understand results for a wider community of nuclear professionals. These tools need to take advantage of the great advances that have occurred recently in computer software by providing users with graphical user interfaces for both input and output. In addition, reduced costs of computer memory and other hardware have removed the need for excessively complex data structures and numerical schemes, which make the codes more difficult and expensive to modify, maintain, and debug, and which increase problem run-times. Future versions of the T/H codes should also be structured in a modular fashion, to allow for the easy incorporation of new correlations, models, or features, and to simplify maintenance and testing. Finally, it is important that future T/H code developers work closely with the code user community, to ensure that the codes meet the needs of those users.

  19. Current and anticipated uses of the thermal hydraulics codes at the NRC

    International Nuclear Information System (INIS)

    Caruso, R.

    1997-01-01

    The focus of Thermal-Hydraulic computer code usage in nuclear regulatory organizations has undergone a considerable shift since the codes were originally conceived. Less work is being done in the area of Design Basis Accidents, and much more emphasis is being placed on analysis of operational events, probabilistic risk/safety assessment, and maintenance practices. All of these areas need support from Thermal-Hydraulic computer codes to model the behavior of plant fluid systems, and they all need the ability to perform large numbers of analyses quickly. It is therefore important for the T/H codes of the future to be able to support these needs by providing robust, easy-to-use tools that produce easy-to-understand results for a wider community of nuclear professionals. These tools need to take advantage of the great advances that have occurred recently in computer software by providing users with graphical user interfaces for both input and output. In addition, reduced costs of computer memory and other hardware have removed the need for excessively complex data structures and numerical schemes, which make the codes more difficult and expensive to modify, maintain, and debug, and which increase problem run-times. Future versions of the T/H codes should also be structured in a modular fashion, to allow for the easy incorporation of new correlations, models, or features, and to simplify maintenance and testing. Finally, it is important that future T/H code developers work closely with the code user community, to ensure that the codes meet the needs of those users.

  20. NAGRADATA. Code key. Geology

    International Nuclear Information System (INIS)

    Mueller, W.H.; Schneider, B.; Staeuble, J.

    1984-01-01

    This reference manual provides users of the NAGRADATA system with comprehensive keys to the coding/decoding of geological and technical information to be stored in or retrieved from the databank. Emphasis has been placed on input data coding. When data is retrieved, the translation of stored coded information into plain language is done automatically by computer. Three keys list the complete set of currently defined codes for the NAGRADATA system, with appropriate definitions, arranged: 1. according to subject matter (thematically), 2. with the codes listed alphabetically, and 3. with the definitions listed alphabetically. Additional explanation is provided for the proper application of the codes and the logic behind the creation of new codes to be used within the NAGRADATA system. NAGRADATA makes use of codes instead of plain language for data storage; this offers the following advantages: speed of data processing (mainly data retrieval), economy of storage memory requirements, and standardisation of terminology. The thesaurus-like nature of this 'key to codes' makes it impossible either to establish a final form or to cover the entire spectrum of requirements. Therefore, this first issue of codes for NAGRADATA must be considered to represent the current state of progress of a living system, and future editions will be issued in a loose-leaf ring binder which can be updated by an organised (updating) service. (author)
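    The code-key scheme described above amounts to a bidirectional lookup: coded storage, automatic plain-language decoding on retrieval. A minimal sketch (the codes and terms below are hypothetical; NAGRADATA's actual key tables are not reproduced in this record):

    ```python
    # Hypothetical code key mapping storage codes to plain-language definitions.
    GEOLOGY_KEY = {
        "LST": "limestone",
        "SST": "sandstone",
        "GRN": "granite",
    }

    def encode(term: str) -> str:
        """Translate plain language into its storage code (input data coding)."""
        inverse = {definition: code for code, definition in GEOLOGY_KEY.items()}
        return inverse[term]

    def decode(code: str) -> str:
        """Translate a stored code back into plain language, as the databank
        does automatically on retrieval."""
        return GEOLOGY_KEY[code]
    ```

    Storing short codes rather than free text gives the speed, storage, and terminology-standardisation advantages the abstract lists.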

  1. CodeRAnts: A recommendation method based on collaborative searching and ant colonies, applied to reusing of open source code

    Directory of Open Access Journals (Sweden)

    Isaac Caicedo-Castro

    2014-01-01

    Full Text Available This paper presents CodeRAnts, a new recommendation method based on a collaborative searching technique and inspired by the ant colony metaphor. This method aims to fill a gap in the current state of the art regarding recommender systems for software reuse, in which prior works present two problems. The first is that recommender systems based on these works cannot learn from the collaboration of programmers; the second is that assessments of these systems show low precision and recall, and in some systems these metrics have not been evaluated at all. The work presented in this paper contributes a recommendation method which solves these problems.

  2. TRAC code development status and plans

    International Nuclear Information System (INIS)

    Spore, J.W.; Liles, D.R.; Nelson, R.A.

    1986-01-01

    This report summarizes the characteristics and current status of the TRAC-PF1/MOD1 computer code. Recent error corrections and user-convenience features are described, and several user enhancements are identified. Current plans for the release of the TRAC-PF1/MOD2 computer code and some preliminary MOD2 results are presented. This new version of the TRAC code implements stability-enhancing two-step numerics in the 3-D vessel, using partial vectorization to obtain a code that runs 400% faster than the MOD1 code.

  3. Artificial Intelligence Learning Semantics via External Resources for Classifying Diagnosis Codes in Discharge Notes.

    Science.gov (United States)

    Lin, Chin; Hsu, Chia-Jung; Lou, Yu-Sheng; Yeh, Shih-Jen; Lee, Chia-Cheng; Su, Sui-Lung; Chen, Hsiang-Cheng

    2017-11-06

    Automated disease code classification using free-text medical information is important for public health surveillance. However, traditional natural language processing (NLP) pipelines are limited, so we propose a method combining word embedding with a convolutional neural network (CNN). Our objective was to compare the performance of traditional pipelines (NLP plus supervised machine learning models) with that of word embedding combined with a CNN in conducting a classification task identifying International Classification of Diseases, Tenth Revision, Clinical Modification (ICD-10-CM) diagnosis codes in discharge notes. We used 2 classification methods: (1) extracting from discharge notes some features (terms, n-gram phrases, and SNOMED CT categories) that we used to train a set of supervised machine learning models (support vector machine, random forests, and gradient boosting machine), and (2) building a feature matrix, by a pretrained word embedding model, that we used to train a CNN. We used these methods to identify the chapter-level ICD-10-CM diagnosis codes in a set of discharge notes. We conducted the evaluation using 103,390 discharge notes covering patients hospitalized from June 1, 2015 to January 31, 2017 in the Tri-Service General Hospital in Taipei, Taiwan. We used the receiver operating characteristic curve as an evaluation measure, and calculated the area under the curve (AUC) and F-measure as the global measure of effectiveness. In 5-fold cross-validation tests, our method had a higher testing accuracy (mean AUC 0.9696; mean F-measure 0.9086) than traditional NLP-based approaches (mean AUC range 0.8183-0.9571; mean F-measure range 0.5050-0.8739). A real-world simulation that split the training sample and the testing sample by date verified this result (mean AUC 0.9645; mean F-measure 0.9003 using the proposed method). 
Further analysis showed that the convolutional layers of the CNN effectively identified a large number of keywords and automatically
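    The embedding-plus-CNN pipeline described in the abstract can be sketched in a few lines. The sketch below uses NumPy with randomly initialised weights standing in for trained ones, and the vocabulary size, filter count, and token IDs are illustrative assumptions, not values from the study; it shows only the forward pass (embedding lookup, 1-D convolution over token positions, max-over-time pooling, softmax over the 22 chapter-level ICD-10-CM codes):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    VOCAB, EMB, FILTERS, WIDTH, CLASSES = 100, 16, 8, 3, 22  # 22 ICD-10-CM chapters

    # Randomly initialised parameters stand in for trained weights.
    embedding = rng.normal(size=(VOCAB, EMB))        # word-embedding table
    conv_w = rng.normal(size=(FILTERS, WIDTH, EMB))  # 1-D convolution filters
    fc_w = rng.normal(size=(FILTERS, CLASSES))       # final classification layer

    def classify(token_ids):
        """Map a tokenised note to a probability distribution over chapter codes."""
        x = embedding[token_ids]                     # (T, EMB) feature matrix
        T = len(token_ids)
        # Slide each filter over token positions (valid 1-D convolution).
        conv = np.array([[np.sum(x[t:t + WIDTH] * conv_w[f])
                          for t in range(T - WIDTH + 1)]
                         for f in range(FILTERS)])   # (FILTERS, T-WIDTH+1)
        pooled = np.maximum(conv, 0.0).max(axis=1)   # ReLU + max-over-time pooling
        logits = pooled @ fc_w                       # (CLASSES,)
        e = np.exp(logits - logits.max())
        return e / e.sum()                           # softmax over chapter codes

    probs = classify([3, 17, 42, 8, 99, 5])          # hypothetical token IDs
    ```

    In the study's setting the embedding table would come from a pretrained word-embedding model and the remaining weights from supervised training on labelled discharge notes.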

  4. Investigating the use of quick response codes in the gross anatomy laboratory.

    Science.gov (United States)

    Traser, Courtney J; Hoffman, Leslie A; Seifert, Mark F; Wilson, Adam B

    2015-01-01

    The use of quick response (QR) codes within undergraduate university courses is on the rise, yet literature concerning their use in medical education is scant. This study examined student perceptions of the usefulness of QR codes as learning aids in a medical gross anatomy course, statistically analyzed whether this learning aid impacted student performance, and evaluated whether performance could be explained by the frequency of QR code usage. Question prompts and QR codes tagged on cadaveric specimens and models were available for four weeks as learning aids to medical (n = 155) and doctor of physical therapy (n = 39) students. Each QR code provided answers to posed questions in the form of embedded text or hyperlinked web pages. Students' perceptions were gathered using a formative questionnaire, and practical examination scores were used to assess potential gains in student achievement. Overall, students responded positively to the use of QR codes in the gross anatomy laboratory, as 89% (57/64) agreed the codes augmented their learning of anatomy. The users' most noticeable objection to using QR codes was a reluctance to bring their smartphones into the gross anatomy laboratory. A comparison of performance between QR code users and non-users found no significant difference (P = 0.113), and no significant gains in performance (P = 0.302) were observed after the intervention. Learners welcomed the implementation of QR code technology in the gross anatomy laboratory, yet this intervention had no apparent effect on practical examination performance. © 2014 American Association of Anatomists.

  5. Suture Coding: A Novel Educational Guide for Suture Patterns.

    Science.gov (United States)

    Gaber, Mohamed; Abdel-Wahed, Ramadan

    2015-01-01

    This study aims to provide a helpful guide to performing tissue suturing successfully using suture coding, a method for identifying suture patterns and techniques by giving full information about the method of application of each pattern using numbers and symbols. Suture coding helps construct an infrastructure for surgical suture science. It facilitates easy understanding and learning of suturing techniques and patterns and reveals the relationships between different patterns. Guide points are fixed on both edges of the wound to act as a guideline for practicing suture pattern techniques. The arrangement is fixed as 1-3-5-7 and a-c-e-g on one side (whether right or left) and as 2-4-6-8 and b-d-f-h on the other side. Needle placement must start from number 1 or letter "a" and continue to follow the code till the end of the stitching. Some rules are created to be adopted for the application of suture coding. A suture trainer containing guide points that simulate the coding process is used to facilitate learning of the coding method. (120) is the code of the simple interrupted suture pattern; (ab210) is the code of the vertical mattress suture pattern; and (013465)²/3 is the code of the Cushing suture pattern. (0A1) is suggested as a surgical suture language that gives the name and type of the suture pattern used to facilitate its identification. All suture patterns known in the world should start with (0), (A), or (1). There is a relationship between two or more surgical patterns according to their codes. It can be concluded that every suture pattern has its own code that helps in identifying its type, structure, and method of application. Combining numbers and symbols helps in understanding suture techniques easily without complication. There are specific relationships that can be identified between different suture patterns. Coding methods facilitate the suture pattern learning process. The use of suture coding can be a good
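    The identification step the abstract describes is essentially a lookup from code string to pattern name. A minimal sketch built only from the example codes quoted above (the full coding rules are defined in the study itself; the superscript in the Cushing code is rendered here as `^2`):

    ```python
    # Lookup table built from the example codes quoted in the abstract.
    SUTURE_KEY = {
        "120": "simple interrupted",
        "ab210": "vertical mattress",
        "(013465)^2/3": "Cushing",
    }

    def pattern_name(code: str) -> str:
        """Return the suture pattern a code identifies, or flag it as unknown."""
        return SUTURE_KEY.get(code, "unknown pattern")
    ```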

  6. Cost reducing code implementation strategies

    International Nuclear Information System (INIS)

    Kurtz, Randall L.; Griswold, Michael E.; Jones, Gary C.; Daley, Thomas J.

    1995-01-01

    Sargent and Lundy's Code consulting experience reveals a wide variety of approaches toward implementing the requirements of various nuclear Codes and Standards. This paper will describe various Code implementation strategies which assure that Code requirements are fully met in a practical and cost-effective manner. Applications to be discussed include the following: new construction; repair, replacement, and modifications; and assessments and life extensions. Lessons learned and illustrative examples will be included. Preferred strategies and specific recommendations will also be addressed. Sargent and Lundy appreciates the opportunity provided by the Korea Atomic Industrial Forum and the Korean Nuclear Society to share our ideas and enhance global cooperation through the exchange of information and views on relevant topics.

  7. Tokamak Systems Code

    International Nuclear Information System (INIS)

    Reid, R.L.; Barrett, R.J.; Brown, T.G.

    1985-03-01

    The FEDC Tokamak Systems Code calculates tokamak performance, cost, and configuration as a function of plasma engineering parameters. This version of the code models experimental tokamaks. It does not currently consider tokamak configurations that generate electrical power or incorporate breeding blankets. The code has a modular (or subroutine) structure to allow independent modeling of each major tokamak component or system. A primary benefit of modularization is that a component module may be updated without disturbing the remainder of the systems code, as long as the input to and output from the module remain unchanged.

  8. TOOKUIL: A case study in user interface development for safety code application

    Energy Technology Data Exchange (ETDEWEB)

    Gray, D.L.; Harkins, C.K.; Hoole, J.G. [and others]

    1997-07-01

    Traditionally, nuclear power plant (NPP) analysis codes have had a steep learning curve. Even for seasoned plant analysts and engineers, the process of building or modifying an input model for present day NPP analysis codes is tedious, error-prone, and time consuming. Current cost constraints and performance demands place an additional burden on today's safety analysis community. Advances in graphical user interface (GUI) technology have been applied to obtain significant productivity and quality assurance improvements for Transient Reactor Analysis Code (TRAC) input model development. KAPL Inc. has developed an X Windows-based graphical user interface named TOOKUIL which supports the design and analysis process, acting as a preprocessor, runtime editor, help system, and post processor for TRAC. This paper summarizes the objectives of the project, the GUI development process and experiences, and the resulting end product, TOOKUIL.

  9. Deep Learning in Nuclear Medicine and Molecular Imaging: Current Perspectives and Future Directions.

    Science.gov (United States)

    Choi, Hongyoon

    2018-04-01

    Recent advances in deep learning have impacted various scientific and industrial fields. Due to the rapid application of deep learning to biomedical data, molecular imaging has also started to adopt this technique. In this regard, it is expected that deep learning will potentially affect the roles of molecular imaging experts as well as clinical decision making. This review first offers a basic overview of deep learning, particularly for image data analysis, to provide background knowledge for nuclear medicine physicians and researchers. Because of the unique characteristics and distinctive aims of the various types of molecular imaging, deep learning applications can differ from those in other fields. In this context, the review deals with current perspectives of deep learning in molecular imaging, particularly in terms of the development of biomarkers. Finally, future challenges of deep learning application to molecular imaging and the future roles of experts in molecular imaging will be discussed.

  10. Theory of Digital Natives in the Light of Current and Future E-Learning Concepts

    Directory of Open Access Journals (Sweden)

    Bodo von der Heiden

    2011-06-01

    Full Text Available The digital generation has many names: Net Generation, Generation@, or Digital Natives. The meaning behind these terms is that the current generation of students is digitally and media literate, technology-savvy, and able to use learning approaches different from those of former generations. But these topics are discussed controversially, and even the cause-effect relationship is not as clear as it seems. Does the digital generation really have different learning approaches, or does it merely have the opportunity to practice other learning modes? Against this background, this article tries to shed some light on this debate. We use current and future projects carried out at RWTH Aachen University to illustrate the relevance, value, and significance of the theory of the digital natives.

  11. Temperature and current dependent electroluminescence measurements on colour-coded multiple quantum well light emitting diodes

    Energy Technology Data Exchange (ETDEWEB)

    Bergbauer, Werner [OSRAM Opto Semiconductors GmbH, Regensburg (Germany); FH Deggendorf (Germany); Laubsch, Ansgar; Peter, Matthias; Mayer, Tobias; Bader, Stefan; Oberschmid, Raimund; Hahn, Berthold [OSRAM Opto Semiconductors GmbH, Regensburg (Germany); Benstetter, Guenther [FH Deggendorf (Germany)

    2008-07-01

    As the efficiency and the luminous flux have increased enormously in the last few years, Light Emitting Diodes (LEDs) are today being pushed even to applications like general lighting and home cinema projection. Still, InGaN/GaN heterostructure-based LEDs suffer from loss mechanisms like non-radiative defect and Auger recombination, carrier leakage, and piezo-field-induced carrier separation. To optimize high-current efficiency, we evaluated the benefit of Multiple Quantum Well (MQW) compared to Single Quantum Well (SQW) LEDs. Temperature-dependent electroluminescence of colour-coded structures with different indium content in certain quantum wells was measured. The experiments demonstrated a strong temperature and current dependence of the MQW operation. The comparison between different LED structures effectively showed the increased LED performance of those structures which operate with a well-adjusted MQW active area. Due to the enhanced carrier distribution in the high-current range, these LEDs show a higher light output and additionally a reduced wavelength shift.

  12. Temperature and current dependent electroluminescence measurements on colour-coded multiple quantum well light emitting diodes

    International Nuclear Information System (INIS)

    Bergbauer, Werner; Laubsch, Ansgar; Peter, Matthias; Mayer, Tobias; Bader, Stefan; Oberschmid, Raimund; Hahn, Berthold; Benstetter, Guenther

    2008-01-01

    As the efficiency and the luminous flux have increased enormously in the last few years, Light Emitting Diodes (LEDs) are today being pushed even to applications like general lighting and home cinema projection. Still, InGaN/GaN heterostructure-based LEDs suffer from loss mechanisms like non-radiative defect and Auger recombination, carrier leakage, and piezo-field-induced carrier separation. To optimize high-current efficiency, we evaluated the benefit of Multiple Quantum Well (MQW) compared to Single Quantum Well (SQW) LEDs. Temperature-dependent electroluminescence of colour-coded structures with different indium content in certain quantum wells was measured. The experiments demonstrated a strong temperature and current dependence of the MQW operation. The comparison between different LED structures effectively showed the increased LED performance of those structures which operate with a well-adjusted MQW active area. Due to the enhanced carrier distribution in the high-current range, these LEDs show a higher light output and additionally a reduced wavelength shift.

  13. Spallation neutron production and the current intra-nuclear cascade and transport codes

    International Nuclear Information System (INIS)

    Filges, D.; Goldenbaum, F.

    2001-01-01

    A recent renascent interest in energetic proton-induced production of neutrons originates largely from the inception of projects for target stations of intense spallation neutron sources, like the planned European Spallation Source (ESS), accelerator-driven nuclear reactors, nuclear waste transmutation, and also from applications for radioactive beams. In the framework of such neutron production, of major importance is the search for the most efficient conversion of the primary beam energy into neutron production. Although the issue has been addressed quite successfully experimentally, by varying the incident proton energy for various target materials and by covering a huge collection of different target geometries (providing an exhaustive matrix of benchmark data), the ultimate challenge is to increase the predictive power of the transport codes currently on the market. To scrutinize these codes, calculations of reaction cross-sections, hadronic interaction lengths, average neutron multiplicities, neutron multiplicity and energy distributions, and the development of hadronic showers are confronted with recent experimental data of the NESSI collaboration. Program packages like HERMES, LCS, and MCNPX predict reaction cross-sections, hadronic interaction lengths, average neutron multiplicities, and neutron multiplicity distributions in thick and thin targets for a wide spectrum of incident proton energies, geometrical shapes, and target materials, generally to within less than 10% deviation, while production cross-section measurements for light charged particles on thin targets point out that appreciable distinctions exist between these models. (orig.)

  14. Non-coding RNAs and plant male sterility: current knowledge and future prospects.

    Science.gov (United States)

    Mishra, Ankita; Bohra, Abhishek

    2018-02-01

    Recent findings assign a functional role to non-coding (nc) RNA molecules in regulatory networks that confer male sterility on plants. Male sterility in plants offers a great opportunity for improving crop performance through application of hybrid technology. In this respect, cytoplasmic male sterility (CMS) and sterility induced by photoperiod (PGMS) or temperature (TGMS) have greatly facilitated the development of high-yielding hybrids in crops. The participation of non-coding (nc) RNA molecules in plant reproductive development is becoming increasingly evident. Recent breakthroughs in rice definitively associate ncRNAs with PGMS and TGMS. In the case of CMS, the exact mechanism through which the mitochondrial ORFs exert influence on the development of the male gametophyte remains obscure in several crops. High-throughput sequencing has enabled genome-wide discovery and validation of these regulatory molecules and their target genes, describing their potential roles in relation to CMS. The discovery of ncRNA localized in plant mtDNA, with its possible implication in CMS induction, is intriguing in this respect. Still, conclusive evidence linking ncRNA with CMS phenotypes is currently unavailable, demanding complementary genetic approaches like transgenics to substantiate the preliminary findings. Here, we review the recent literature on the contribution of ncRNAs to conferring male sterility on plants, with an emphasis on microRNAs. We also present a perspective on the improved understanding of ncRNA-mediated regulatory pathways that control male sterility in plants. A refined understanding of plant male sterility would strengthen the crop hybrid industry's ability to deliver hybrids with improved performance.

  15. Spallation neutron production and the current intra-nuclear cascade and transport codes

    Science.gov (United States)

    Filges, D.; Goldenbaum, F.; Enke, M.; Galin, J.; Herbach, C.-M.; Hilscher, D.; Jahnke, U.; Letourneau, A.; Lott, B.; Neef, R.-D.; Nünighoff, K.; Paul, N.; Péghaire, A.; Pienkowski, L.; Schaal, H.; Schröder, U.; Sterzenbach, G.; Tietze, A.; Tishchenko, V.; Toke, J.; Wohlmuther, M.

    A recent renascent interest in energetic proton-induced production of neutrons originates largely from the inception of projects for target stations of intense spallation neutron sources, like the planned European Spallation Source (ESS), accelerator-driven nuclear reactors, nuclear waste transmutation, and also from applications for radioactive beams. In the framework of such neutron production, of major importance is the search for the most efficient conversion of the primary beam energy into neutron production. Although the issue has been addressed quite successfully experimentally, by varying the incident proton energy for various target materials and by covering a huge collection of different target geometries (providing an exhaustive matrix of benchmark data), the ultimate challenge is to increase the predictive power of the transport codes currently on the market. To scrutinize these codes, calculations of reaction cross-sections, hadronic interaction lengths, average neutron multiplicities, neutron multiplicity and energy distributions, and the development of hadronic showers are confronted with recent experimental data of the NESSI collaboration. Program packages like HERMES, LCS, and MCNPX predict reaction cross-sections, hadronic interaction lengths, average neutron multiplicities, and neutron multiplicity distributions in thick and thin targets for a wide spectrum of incident proton energies, geometrical shapes, and target materials, generally to within less than 10% deviation, while production cross-section measurements for light charged particles on thin targets point out that appreciable distinctions exist between these models.

  16. Efficient Coding and Energy Efficiency Are Promoted by Balanced Excitatory and Inhibitory Synaptic Currents in Neuronal Network.

    Science.gov (United States)

    Yu, Lianchun; Shen, Zhou; Wang, Chen; Yu, Yuguo

    2018-01-01

    Selective pressure may drive neural systems to process as much information as possible at the lowest energy cost. Recent experimental evidence revealed that the ratio between synaptic excitation and inhibition (E/I) in local cortex is generally maintained at a certain value, which may influence the efficiency of energy consumption and information transmission in neural networks. To understand this issue more deeply, we constructed a typical recurrent Hodgkin-Huxley network model and studied the general principles that govern the relationship among the E/I synaptic current ratio, the energy cost, and the total amount of information transmission. We observed that there exists an optimal E/I synaptic current ratio in such a network at which information transmission achieves its maximum with relatively low energy cost. The coding energy efficiency, defined as the mutual information divided by the energy cost, achieved its maximum with balanced synaptic currents. Although background noise degrades information transmission and imposes an additional energy cost, we find an optimal noise intensity that yields the largest information transmission and energy efficiency at this optimal E/I synaptic transmission ratio. The maximization of energy efficiency also requires a certain part of the energy cost to be associated with spontaneous spiking and synaptic activities. We further proved this finding with an analytical solution based on the response function of bistable neurons, and demonstrated that optimal net synaptic currents are capable of maximizing both the mutual information and the energy efficiency. These results revealed that the development of E/I synaptic current balance could lead a cortical network to operate at a highly efficient information transmission rate at a relatively low energy cost. The generality of the neuronal models and the recurrent network configuration used here suggest that the existence of an optimal E/I cell ratio for highly efficient energy
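    The energy-efficiency measure defined verbally in the abstract can be written explicitly (the notation here is assumed, not taken from the paper):

    ```latex
    \eta = \frac{I(X;Y)}{E},
    ```

    where $I(X;Y)$ is the mutual information between the network's input and output spike trains and $E$ is the metabolic energy cost; the abstract reports that $\eta$ peaks at an optimal E/I synaptic current ratio and an optimal background noise intensity.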

  17. Study of counter current flow limitation model of MARS-KS and SPACE codes under Dukler's air/water flooding test conditions

    International Nuclear Information System (INIS)

    Lee, Won Woong; Kim, Min Gil; Lee, Jeong Ik; Bang, Young Seok

    2015-01-01

    In particular, CCFL (counter current flow limitation) occurs during a LOCA in components such as the hot leg, downcomer annulus and steam generator inlet plenum, where flows in two opposite directions are possible. Therefore, CCFL is one of the thermal-hydraulic models that has a significant effect on reactor safety analysis code performance. In this study, the CCFL model will be evaluated with the MARS-KS code, based on two-phase two-field governing equations, and the SPACE code, based on two-phase three-field governing equations. The study compares MARS-KS, which is being used for evaluating the safety of Korean nuclear power plants, with SPACE, which is currently under assessment for evaluating the safety of newly designed nuclear power plants. We present comparisons of the liquid upflow and liquid downflow rates computed by the two codes for different gas flow rates against the well-known Dukler CCFL experimental data. This study will be helpful for understanding the differences between system analysis codes with different governing equations, models and correlations, and for further improving the accuracy of system analysis codes. In the nuclear reactor system, CCFL is an important phenomenon for evaluating the safety of nuclear reactors, because CCFL can limit the injection of ECCS water when it occurs in components such as the hot leg, downcomer annulus or steam generator inlet plenum during a LOCA, where flow in two opposite directions is possible. Therefore, CCFL is one of the thermal-hydraulic models that has a significant effect on reactor safety analysis code performance. In this study, the CCFL model was evaluated with the MARS-KS and SPACE codes to study the differences between system analysis codes with different governing equations, models and correlations. This was done by comparing MARS-KS and SPACE results for liquid upflow and liquid downflow rates at different gas flow rates against the famous Dukler
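    Flooding limits of the kind measured in Dukler's air/water tests are commonly correlated in the Wallis form sqrt(jg*) + m*sqrt(jf*) = C, which is also the general shape of the CCFL model selectable in RELAP5-descended codes such as MARS-KS. The sketch below computes the maximum liquid downflow permitted by a given gas upflow; the tube diameter, fluid properties and the constants m and C are illustrative assumptions, not Dukler's fitted values.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def j_star(j, rho, rho_f, rho_g, diameter):
    """Wallis dimensionless superficial velocity for phase density rho."""
    return j * math.sqrt(rho / (G * diameter * (rho_f - rho_g)))

def max_liquid_downflow(j_gas, diameter=0.051, rho_f=1000.0, rho_g=1.2,
                        m=1.0, c=0.88):
    """Liquid superficial velocity (m/s) at the Wallis flooding limit."""
    jg_star = j_star(j_gas, rho_g, rho_f, rho_g, diameter)
    rhs = c - math.sqrt(jg_star)
    if rhs <= 0.0:
        return 0.0  # gas flux alone already exceeds the flooding limit
    jf_star = (rhs / m) ** 2
    return jf_star / math.sqrt(rho_f / (G * diameter * (rho_f - rho_g)))
```

As the gas superficial velocity rises, the allowed liquid downflow falls monotonically and reaches zero at complete flooding, which is the qualitative trend the code-to-data comparisons in this study examine.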

  18. Error Resilience in Current Distributed Video Coding Architectures

    Directory of Open Access Journals (Sweden)

    Tonoli Claudia

    2009-01-01

    Full Text Available In distributed video coding the signal prediction is shifted to the decoder side, therefore placing most of the computational complexity burden on the receiver. Moreover, since no prediction loop exists before transmission, an intrinsic robustness to transmission errors has been claimed. This work evaluates and compares the error resilience performance of two distributed video coding architectures. In particular, we have considered a video codec based on the Stanford architecture (the DISCOVER codec) and a video codec based on the PRISM architecture. Specifically, an accurate temporal and rate/distortion based evaluation of the effects of transmission errors on both of the considered DVC architectures has been performed and discussed. These approaches have also been compared with H.264/AVC, both with no error protection and with simple FEC error protection. Our evaluations have highlighted in all cases a strong dependence of the behavior of the various codecs on the content of the considered video sequence. In particular, PRISM seems to be particularly well suited for low-motion sequences, whereas DISCOVER provides better performance in the other cases.

  19. Applying machine learning to predict patient-specific current CD 4 ...

    African Journals Online (AJOL)

    This work shows the application of machine learning to predict the current CD4 cell count of an HIV-positive patient using genome sequences, viral load and time. A regression model predicting actual CD4 cell counts and a classification model predicting whether a patient's CD4 cell count is less than 200 were built using a support ...

  20. Performance evaluation based on data from code reviews

    OpenAIRE

    Andrej, Sekáč

    2016-01-01

    Context. Modern code review tools such as Gerrit have made available great amounts of code review data from different open source projects as well as other commercial projects. Code reviews are used to keep the quality of produced source code under control but the stored data could also be used for evaluation of the software development process. Objectives. This thesis uses machine learning methods for an approximation of review expert’s performance evaluation function. Due to limitations in ...

  1. ABCXYZ: vector potential (A) and magnetic field (B) code (C) for Cartesian (XYZ) geometry using general current elements

    International Nuclear Information System (INIS)

    Anderson, D.V.; Breazeal, J.; Finan, C.H.; Johnston, B.M.

    1976-01-01

    ABCXYZ is a computer code for obtaining the Cartesian components of the vector potential and the magnetic field on an observation grid from an arrangement of current-carrying wires. Arbitrary combinations of straight line segments, arcs, and loops are allowed in the specification of the currents. Arbitrary positions and orientations of the current-carrying elements are also allowed. Specification of the wire diameter permits the computation of well-defined fields, even in the interiors of the conductors. An optional feature generates magnetic field lines. Extensive graphical and printed output is available to the user, including contour, grid-line, and field-line plots. 12 figures, 1 table
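    The core computation in a code like ABCXYZ is the Biot-Savart field of each current element, summed over the whole arrangement. A minimal sketch for a single straight filamentary segment (ignoring ABCXYZ's finite wire diameter and its vector potential output) might look like this:

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability, T*m/A

def b_segment(p, a, b, current):
    """Biot-Savart field at point p from a straight filament a -> b."""
    p, a, b = map(np.asarray, (p, a, b))
    l_hat = (b - a) / np.linalg.norm(b - a)
    r1, r2 = p - a, p - b
    d_vec = np.cross(l_hat, r1)      # perpendicular offset, rotated 90 degrees
    d = np.linalg.norm(d_vec)        # perpendicular distance to the wire line
    if d < 1e-12:
        return np.zeros(3)           # field point on the filament axis
    cos1 = l_hat @ r1 / np.linalg.norm(r1)
    cos2 = l_hat @ r2 / np.linalg.norm(r2)
    return MU0 * current / (4 * np.pi * d) * (cos1 - cos2) * d_vec / d

# a very long segment approximates an infinite wire: B = mu0*I / (2*pi*d)
B = b_segment([1.0, 0.0, 0.0], [0, 0, -1e4], [0, 0, 1e4], 1.0)
```

Summing `b_segment` over all segments (plus analogous routines for arcs and loops) yields the total field at each grid point, which is the kind of superposition a field-line tracer then integrates.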

  2. Kinetic instabilities of thin current sheets: Results of two-and-one-half-dimensional Vlasov code simulations

    International Nuclear Information System (INIS)

    Silin, I.; Buechner, J.

    2003-01-01

    Nonlinear triggering of the instability of thin current sheets is investigated by two-and-one-half-dimensional Vlasov code simulations. A global drift-resonant instability (DRI) is found, which results from the lower-hybrid-drift waves penetrating from the current sheet edges to the center, where they resonantly interact with unmagnetized ions. This resonant nonlinear instability grows faster than the Kelvin-Helmholtz instability obtained in previous studies. The DRI is either an asymmetric mode, a symmetric mode, or a combination of the two, depending on the relative phase of the lower-hybrid-drift waves at the edges of the current sheet. With increasing particle mass ratio the wavenumber of the fastest-growing mode increases as kL_z ∼ (m_i/m_e)^(1/2)/2 and the growth rate of the DRI saturates at a finite level
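    The quoted scaling for the fastest-growing mode, kL_z ∼ (m_i/m_e)^(1/2)/2, gives a quick numerical feel for why simulation mass ratios matter; compare a typical reduced ratio with the physical proton-electron ratio of about 1836:

```python
import math

def fastest_mode_klz(mass_ratio):
    """Fastest-growing DRI wavenumber (normalized): kL_z ~ sqrt(m_i/m_e)/2."""
    return math.sqrt(mass_ratio) / 2

k_sim = fastest_mode_klz(64)     # a typical reduced simulation mass ratio
k_real = fastest_mode_klz(1836)  # physical proton/electron mass ratio
```

With the reduced ratio the dominant wavelength is several times longer than at the physical ratio, so the saturated growth rate is the quantity that transfers between the two regimes, not the wavenumber itself.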

  3. Noise Residual Learning for Noise Modeling in Distributed Video Coding

    DEFF Research Database (Denmark)

    Luong, Huynh Van; Forchhammer, Søren

    2012-01-01

    Distributed video coding (DVC) is a coding paradigm which exploits the source statistics at the decoder side to reduce the complexity at the encoder. The noise model is one of the inherently difficult challenges in DVC. This paper considers Transform Domain Wyner-Ziv (TDWZ) coding and proposes...

  4. Simulation of lower hybrid current drive in enhanced reversed shear plasmas in the tokamak fusion test reactor using the lower hybrid simulation code

    International Nuclear Information System (INIS)

    Kaita, R.; Bernabei, S.; Budny, R.

    1996-01-01

    The Enhanced Reversed Shear (ERS) mode has already shown great potential for improving the performance of the Tokamak Fusion Test Reactor (TFTR) and other devices. Sustaining the ERS, however, remains an outstanding problem. Lower hybrid (LH) current drive is a possible method for modifying the current profile and controlling its time evolution. To predict its effectiveness in TFTR, the Lower Hybrid Simulation Code (LSC) model is used in the TRANSP code and the Tokamak Simulation Code (TSC). Among the results from the simulations are the following. (1) Single-pass absorption is expected in TFTR ERS plasmas. The simulations show that the LH current follows isotherms of the electron temperature. The ability to control the location of the minimum in the q profile (q_min) has been demonstrated by varying the phase velocity of the launched LH waves and observing the change in the damping location. (2) LH current drive can be used to sustain the q_min location. The tendency of q_min to drift inward, as the inductive current diffuses during the formation phase of the reversed shear discharge, is prevented by the LH current driven at a fixed radial location. If this results in an expanded plasma volume with improved confinement as high power neutral beam injection is applied, the high bootstrap currents induced during this phase can then maintain the larger q_min radius. (3) There should be no LH wave damping on energetic beam particles. The values of the perpendicular index of refraction in the calculations never exceed about 20, while ions at TFTR injection energies are resonant with waves having values closer to 100. Other issues being addressed in the study include the LH current drive efficiency in the presence of high bootstrap currents, and the effect of fast electron diffusion on LH current localization

  5. Modeling cross-field drifts and current with the B2 code for the CIT divertor

    International Nuclear Information System (INIS)

    Rognlien, T.D.; Milovich, J.L.; Rensink, M.E.

    1990-01-01

    We have modified the B2 edge-plasma code to include the effects of classical fluid drifts across the magnetic field lines and plasma currents. This report presents preliminary results of these effects for the CIT parameter regime. The basic plasma model described by Braams involves solving the continuity equation, the parallel momentum balance equation, and separate energy balance equations for the ions and the electrons. If multiple ion species are present, they are all assumed to have a common temperature, but their densities and parallel velocities are solved for using additional continuity and parallel momentum balance equations for each species. Momentum and heat transport parallel to the magnetic field, B, are given by the classical collisional theory. On the other hand, transport perpendicular to B is represented by anomalous diffusion coefficients which are adjusted to agree with experimental measurements. These transport coefficients are generally taken to be constant in radius and poloidal angle, although this is not necessary. The goal of our work has been to include both the classical cross-field drift terms and the effects of parallel currents in the equations used in the B2 code. The motivation for including the cross-field terms comes from simple model calculations which indicate that the classical flows can contribute an important asymmetry which may help explain the transition from L-mode to H-mode confinement. Radial electric fields which arise near the separatrix cause E x B poloidal rotation which may also be related to the L-to-H mode transition through its effect on edge turbulence. Including the parallel currents is done to provide a tool for understanding the biased divertor experiments on DIII-D at General Atomics. Such biasing may provide an effective means of controlling the asymmetry of the power flow to different divertor plates
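    The E x B poloidal rotation mentioned above follows from the drift velocity v = (E x B)/|B|^2, which a fluid edge code evaluates pointwise from the local fields. A minimal pure-Python sketch (the field values in the example are illustrative, not CIT parameters):

```python
def cross(a, b):
    """Cross product of two 3-vectors given as tuples."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def exb_drift(e_field, b_field):
    """E x B drift velocity, v = (E x B) / |B|^2, in m/s for SI inputs."""
    b_sq = sum(c * c for c in b_field)
    return tuple(c / b_sq for c in cross(e_field, b_field))

# e.g. a 1 kV/m radial field near the separatrix in a 2 T field
v = exb_drift((0.0, 1.0e3, 0.0), (0.0, 0.0, 2.0))
```

A 1 kV/m field in 2 T gives a 500 m/s drift perpendicular to both E and B, which is the order of poloidal rotation the text links to the L-to-H transition via its effect on edge turbulence.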

  6. Hebbian learning in a model with dynamic rate-coded neurons: an alternative to the generative model approach for learning receptive fields from natural scenes.

    Science.gov (United States)

    Hamker, Fred H; Wiltschut, Jan

    2007-09-01

    Most computational models of coding are based on a generative model according to which the feedback signal aims to reconstruct the visual scene as closely as possible. We here explore an alternative model of feedback. It is derived from studies of attention and is thus probably more flexible with respect to attentive processing in higher brain areas. According to this model, feedback implements a gain increase of the feedforward signal. We use a dynamic model with presynaptic inhibition and Hebbian learning to simultaneously learn feedforward and feedback weights. The weights converge to localized, oriented, and bandpass filters similar to those found in V1. Due to presynaptic inhibition, the model predicts the organization of receptive fields within the feedforward pathway, whereas feedback primarily serves to tune early visual processing according to the needs of the task.

  7. Efficient coding of spectrotemporal binaural sounds leads to emergence of the auditory space representation

    Science.gov (United States)

    Młynarski, Wiktor

    2014-01-01

    To date a number of studies have shown that receptive field shapes of early sensory neurons can be reproduced by optimizing the coding efficiency of natural stimulus ensembles. A still unresolved question is whether the efficient coding hypothesis explains the formation of neurons which explicitly represent environmental features of different functional importance. This paper proposes that the spatial selectivity of higher auditory neurons emerges as a direct consequence of learning efficient codes for natural binaural sounds. Firstly, it is demonstrated that a linear efficient coding transform, Independent Component Analysis (ICA), trained on spectrograms of naturalistic simulated binaural sounds extracts spatial information present in the signal. A simple hierarchical ICA extension allowing for decoding of sound position is proposed. Furthermore, it is shown that units revealing spatial selectivity can be learned from a binaural recording of a natural auditory scene. In both cases a relatively small subpopulation of learned spectrogram features suffices to perform accurate sound localization. Representation of the auditory space is therefore learned in a purely unsupervised way by maximizing the coding efficiency and without any task-specific constraints. These results imply that efficient coding is a useful strategy for learning structures which allow for making behaviorally vital inferences about the environment. PMID:24639644

  8. Linguistic coding deficits in foreign language learners.

    Science.gov (United States)

    Sparks, R; Ganschow, L; Pohlman, J

    1989-01-01

    As increasing numbers of colleges and universities require a foreign language for graduation in at least one of their degree programs, reports of students with difficulties in learning a second language are multiplying. Until recently, little research has been conducted to identify the nature of this problem. Recent attempts by the authors have focused upon subtle but ongoing language difficulties in these individuals as the source of their struggle to learn a foreign language. The present paper attempts to expand upon this concept by outlining a theoretical framework based upon a linguistic coding model that hypothesizes deficits in the processing of phonological, syntactic, and/or semantic information. Traditional psychoeducational assessment batteries of standardized intelligence and achievement tests generally are not sensitive to these linguistic coding deficits unless closely analyzed or, more often, used in conjunction with a more comprehensive language assessment battery. Students who have been waived from a foreign language requirement and their proposed type(s) of linguistic coding deficits are profiled. Tentative conclusions about the nature of these foreign language learning deficits are presented along with specific suggestions for tests to be used in psychoeducational evaluations.

  9. Coding for Electronic Mail

    Science.gov (United States)

    Rice, R. F.; Lee, J. J.

    1986-01-01

    Scheme for coding facsimile messages promises to reduce data transmission requirements to one-tenth the current level. Coding scheme paves the way for true electronic mail in which handwritten, typed, or printed messages or diagrams are sent virtually instantaneously, between buildings or between continents. Scheme, called Universal System for Efficient Electronic Mail (USEEM), uses unsupervised character recognition and adaptive noiseless coding of text. Image quality of resulting delivered messages is improved over messages transmitted by conventional coding. Coding scheme is compatible with direct-entry electronic mail as well as facsimile reproduction. Text transmitted in this scheme is automatically translated to word-processor form.

  10. Overview of codes and tools for nuclear engineering education

    Science.gov (United States)

    Yakovlev, D.; Pryakhin, A.; Medvedeva, L.

    2017-01-01

    Recent world trends in nuclear education have developed in the direction of social education, networking, virtual tools and codes. MEPhI, as a global leader on the world education market, implements new advanced technologies for distance and online learning and for student research work. MEPhI has produced special codes, tools and web resources based on an internet platform to support education in the field of nuclear technology. At the same time, MEPhI actively uses codes and tools from third parties. Several types of tools are considered: calculation codes, nuclear data visualization tools, virtual labs, PC-based educational simulators for nuclear power plants (NPP), CLP4NET, education web-platforms, and distance courses (MOOCs and controlled and managed content systems). The university pays special attention to integrated products such as CLP4NET, which is not itself a learning course, but serves to automate the process of learning through distance technologies. CLP4NET organizes all tools in the same information space. Up to now, MEPhI has achieved significant results in the field of distance education and online system implementation.

  11. Large-Signal Code TESLA: Current Status and Recent Development

    National Research Council Canada - National Science Library

    Chernyavskiy, Igor A; Vlasov, Alexander N; Cooke, Simon J; Abe, David K; Levush, Baruch; Antonsen, Jr., Thomas M; Nguyen, Khanh T

    2008-01-01

    .... One such tool is the large-signal code TESLA, which was successfully applied to the modeling of single-beam and multiple-beam klystron devices at the Naval Research Laboratory and which is now used by a number of U.S. companies...

  12. Quality management of eLearning for medical education: current situation and outlook.

    Science.gov (United States)

    Abrusch, Jasmin; Marienhagen, Jörg; Böckers, Anja; Gerhardt-Szép, Susanne

    2015-01-01

    In 2008, the German Council of Science advised universities to establish a quality management system (QMS) that conforms to international standards. The system was to be implemented within 5 years, i.e., until 2014 at the latest. The aim of the present study was to determine whether a QMS suitable for the electronic learning (eLearning) domain of medical education, to be used across Germany, has meanwhile been identified. We approached all medical universities in Germany (n=35), using an anonymous questionnaire (8 domains, 50 items). Our results (response rate 46.3%) indicated a very reluctant application of QMS in eLearning and a major information deficit at the various institutions. The authors conclude that, under the limitations of this study, there seems to be a considerable need to improve the current knowledge on QMS for eLearning, and that clear guidelines and standards for their implementation should be further defined.

  13. Learning binary code via PCA of angle projection for image retrieval

    Science.gov (United States)

    Yang, Fumeng; Ye, Zhiqiang; Wei, Xueqi; Wu, Congzhong

    2018-01-01

    With the benefits of low storage costs and high query speeds, binary code representation methods are widely researched for efficiently retrieving large-scale data. In image hashing methods, learning a hashing function that embeds high-dimensional features into Hamming space is the key step for accurate retrieval. The principal component analysis (PCA) technique is widely used in compact hashing methods; most of these methods adopt PCA projection functions to project the original data into several dimensions of real values, and then each of these projected dimensions is quantized into one bit by thresholding. The variances of the projected dimensions differ, and the real-valued projection produces a large quantization error. To avoid the large quantization error of real-valued projection, in this paper we propose to use a cosine similarity (angle) projection for each dimension; the angle projection preserves the original structure and is more compact with cosine values. We combined our method with the ITQ hashing algorithm, and extensive experiments on the public CIFAR-10 and Caltech-256 datasets validate the effectiveness of the proposed method.
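    The PCA-then-threshold pipeline the paper builds on can be sketched in a few lines: project centered features onto the top principal directions, normalize each projection vector so only its direction (cosine similarity) matters, then keep the sign of each coordinate as one bit. This is a generic sketch of that baseline, not the authors' exact angle projection or their ITQ rotation.

```python
import numpy as np

rng = np.random.default_rng(0)
features = rng.standard_normal((500, 32))        # toy image features
features = features - features.mean(axis=0)      # PCA needs centered data

k = 8                                             # bits per binary code
_, _, vt = np.linalg.svd(features, full_matrices=False)
proj = features @ vt[:k].T                        # real-valued PCA projections

# angle view: normalize rows so only the direction (cosine) matters
proj = proj / np.linalg.norm(proj, axis=1, keepdims=True)

codes = (proj > 0).astype(np.uint8)               # one bit per dimension

def hamming(a, b):
    """Number of differing bits between two binary codes."""
    return int(np.count_nonzero(a != b))
```

Retrieval then ranks database items by `hamming` distance to the query code, which is where the low storage cost and high query speed of binary representations come from.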

  14. Discrete Sparse Coding.

    Science.gov (United States)

    Exarchakis, Georgios; Lücke, Jörg

    2017-11-01

    Sparse coding algorithms with continuous latent variables have been the subject of a large number of studies. However, discrete latent spaces for sparse coding have been largely ignored. In this work, we study sparse coding with latents described by discrete instead of continuous prior distributions. We consider the general case in which the latents (while being sparse) can take on any value of a finite set of possible values and in which we learn the prior probability of any value from data. This approach can be applied to any data generated by discrete causes, and it can be applied as an approximation of continuous causes. As the prior probabilities are learned, the approach then allows for estimating the prior shape without assuming specific functional forms. To efficiently train the parameters of our probabilistic generative model, we apply a truncated expectation-maximization approach (expectation truncation) that we modify to work with a general discrete prior. We evaluate the performance of the algorithm by applying it to a variety of tasks: (1) we use artificial data to verify that the algorithm can recover the generating parameters from a random initialization, (2) we use image patches of natural images and discuss the role of the prior for the extraction of image components, (3) we use extracellular recordings of neurons to present a novel method of analysis for spiking neurons that includes an intuitive discretization strategy, and (4) we apply the algorithm to the task of encoding audio waveforms of human speech. The diverse set of numerical experiments presented in this letter suggests that discrete sparse coding algorithms can scale efficiently to work with realistic data sets and provide novel statistical quantities to describe the structure of the data.
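    For a handful of latents, the flavor of inference with a discrete prior can be shown by brute force: enumerate every latent configuration over the finite value set and pick the maximum a posteriori one. The paper's truncated EM replaces this full enumeration with a selected subset of configurations; the dictionary, noise level and prior below are illustrative assumptions.

```python
import itertools
import math

import numpy as np

rng = np.random.default_rng(42)
D, H = 8, 4                         # observed dimension, number of latents
W = rng.standard_normal((D, H))     # dictionary (generative weights)
VALUES = (0.0, 1.0)                 # finite discrete latent value set
PI = 0.2                            # prior probability that a latent is active
SIGMA = 0.01                        # observation noise standard deviation

s_true = np.array([1.0, 0.0, 0.0, 1.0])
y = W @ s_true + SIGMA * rng.standard_normal(D)

def neg_log_posterior(s):
    """Unnormalized -log p(s | y) under Gaussian noise and a discrete prior."""
    resid = y - W @ s
    log_prior = sum(math.log(PI) if v != 0.0 else math.log(1.0 - PI)
                    for v in s)
    return float(resid @ resid) / (2.0 * SIGMA ** 2) - log_prior

# exhaustive MAP search over all |VALUES|**H discrete configurations
s_map = min((np.array(c) for c in itertools.product(VALUES, repeat=H)),
            key=neg_log_posterior)
```

The cost of the exhaustive search grows as |VALUES|^H, which is exactly why expectation truncation restricts the sum over latent states to the most plausible candidates at realistic H.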

  15. Summary of the Current Status of Lessons Learned From Fukushima Accident

    International Nuclear Information System (INIS)

    Pasamehmetoglu, Kemal

    2013-01-01

    This presentation introduced the current status of the lessons learned from the Fukushima accident, and in particular, the recommendations released by an NRC Near-Term Task Force to enhance reactor safety in the 21st century. The near-term recommendations are focused on emergency power and emergency cooling availability during station blackout accidents

  16. Navigating the changing learning landscape: perspective from bioinformatics.ca.

    Science.gov (United States)

    Brazas, Michelle D; Ouellette, B F Francis

    2013-09-01

    With the advent of YouTube channels in bioinformatics, open platforms for problem solving in bioinformatics, active web forums in computing analyses and online resources for learning to code or use a bioinformatics tool, the more traditional continuing education bioinformatics training programs have had to adapt. Bioinformatics training programs that solely rely on traditional didactic methods are being superseded by these newer resources. Yet such face-to-face instruction is still invaluable in the learning continuum. Bioinformatics.ca, which hosts the Canadian Bioinformatics Workshops, has blended more traditional learning styles with current online and social learning styles. Here we share our growing experiences over the past 12 years and look toward what the future holds for bioinformatics training programs.

  17. The Future of e-Learning in Medical Education: Current Trend and Future Opportunity

    Directory of Open Access Journals (Sweden)

    Sara Kim

    2006-09-01

    Full Text Available A wide range of e-learning modalities are widely integrated in medical education. However, some of the key questions related to the role of e-learning remain unanswered, such as (1) what is an effective approach to integrating technology into pre-clinical vs. clinical training?; (2) what evidence exists regarding the type and format of e-learning technology suitable for medical specialties and clinical settings?; (3) which design features are known to be effective in designing on-line patient simulation cases, tutorials, or clinical exams?; and (4) what guidelines exist for determining an appropriate blend of instructional strategies, including online learning, face-to-face instruction, and performance-based skill practices? Based on the existing literature and a variety of e-learning examples of synchronous learning tools and simulation technology, this paper addresses the following three questions: (1) what is the current trend of e-learning in medical education?; (2) what do we know about the effective use of e-learning?; and (3) what is the role of e-learning in facilitating newly emerging competency-based training? As e-learning continues to be widely integrated in training future physicians, it is critical that our efforts in conducting evaluative studies should target specific e-learning features that can best mediate intended learning goals and objectives. Without an evolving knowledge base on how best to design e-learning applications, the gap between what we know about technology use and how we deploy e-learning in training settings will continue to widen.

  18. Effects of transcranial direct current stimulation on motor learning in healthy individuals: a systematic review

    Directory of Open Access Journals (Sweden)

    Águida Foerster

    Full Text Available Introduction: Transcranial direct current stimulation (tDCS) has been used to modify cortical excitability and promote motor learning. Objective: To systematically review published data investigating the effects of transcranial direct current stimulation on motor learning in healthy individuals. Methods: Randomized or quasi-randomized studies that evaluated the effects of tDCS on motor learning were included, and the risk of bias was examined with the Cochrane Collaboration's tool. The following electronic databases were used, with no language restriction: PubMed, Scopus, Web of Science, LILACS, and CINAHL. Results: 160 studies were found; after reading the titles and abstracts, 17 were selected, but only 4 were included. All studies involved healthy, right-handed adults. All studies assessed motor learning by the Jebsen Taylor Test or by the Serial Finger Tapping Task (SFTT). Almost all studies were randomized and all were blinded for participants. Some studies presented differences in the SFTT protocol. Conclusion: The results are insufficient to draw conclusions on whether tDCS influences motor learning. Furthermore, there was significant heterogeneity in the stimulation parameters used. Further research is needed to investigate which parameters are most important for motor learning improvement and to measure whether the effects are long-lasting or limited in time.

  19. Offshore Code Comparison Collaboration within IEA Wind Task 23: Phase IV Results Regarding Floating Wind Turbine Modeling; Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Jonkman, J.; Larsen, T.; Hansen, A.; Nygaard, T.; Maus, K.; Karimirad, M.; Gao, Z.; Moan, T.; Fylling, I.

    2010-04-01

    Offshore wind turbines are designed and analyzed using comprehensive simulation codes that account for the coupled dynamics of the wind inflow, aerodynamics, elasticity, and controls of the turbine, along with the incident waves, sea current, hydrodynamics, and foundation dynamics of the support structure. This paper describes the latest findings of the code-to-code verification activities of the Offshore Code Comparison Collaboration, which operates under Subtask 2 of the International Energy Agency Wind Task 23. In the latest phase of the project, participants used an assortment of codes to model the coupled dynamic response of a 5-MW wind turbine installed on a floating spar buoy in 320 m of water. Code predictions were compared from load-case simulations selected to test different model features. The comparisons have resulted in a greater understanding of offshore floating wind turbine dynamics and modeling techniques, and better knowledge of the validity of various approximations. The lessons learned from this exercise have improved the participants' codes, thus improving the standard of offshore wind turbine modeling.

  20. Coding for urologic office procedures.

    Science.gov (United States)

    Dowling, Robert A; Painter, Mark

    2013-11-01

    This article summarizes current best practices for documenting, coding, and billing common office-based urologic procedures. Topics covered include general principles, basic and advanced urologic coding, creation of medical records that support compliant coding practices, bundled codes and unbundling, global periods, modifiers for procedure codes, when to bill for evaluation and management services during the same visit, coding for supplies, and laboratory and radiology procedures pertinent to urology practice. Detailed information is included for the most common urology office procedures, and suggested resources and references are provided. This information is of value to physicians, office managers, and their coding staff. Copyright © 2013 Elsevier Inc. All rights reserved.

  1. Content Analysis Coding Schemes for Online Asynchronous Discussion

    Science.gov (United States)

    Weltzer-Ward, Lisa

    2011-01-01

    Purpose: Researchers commonly utilize coding-based analysis of classroom asynchronous discussion contributions as part of studies of online learning and instruction. However, this analysis is inconsistent from study to study with over 50 coding schemes and procedures applied in the last eight years. The aim of this article is to provide a basis…

  2. Semantic and phonological coding in poor and normal readers.

    Science.gov (United States)

    Vellutino, F R; Scanlon, D M; Spearing, D

    1995-02-01

    Three studies were conducted evaluating semantic and phonological coding deficits as alternative explanations of reading disability. In the first study, poor and normal readers in second and sixth grade were compared on various tests evaluating semantic development as well as on tests evaluating rapid naming and pseudoword decoding as independent measures of phonological coding ability. In a second study, the same subjects were given verbal memory and visual-verbal learning tasks using high and low meaning words as verbal stimuli and Chinese ideographs as visual stimuli. On the semantic tasks, poor readers performed below the level of the normal readers only at the sixth grade level, but, on the rapid naming and pseudoword learning tasks, they performed below the normal readers at the second as well as at the sixth grade level. On both the verbal memory and visual-verbal learning tasks, performance in poor readers approximated that of normal readers when the word stimuli were high in meaning but not when they were low in meaning. These patterns were essentially replicated in a third study that used some of the same semantic and phonological measures used in the first experiment, and verbal memory and visual-verbal learning tasks that employed word lists and visual stimuli (novel alphabetic characters) that more closely approximated those used in learning to read. It was concluded that semantic coding deficits are an unlikely cause of reading difficulties in most poor readers at the beginning stages of reading skills acquisition, but accrue as a consequence of prolonged reading difficulties in older readers. It was also concluded that phonological coding deficits are a probable cause of reading difficulties in most poor readers.

  3. Direct current induced short-term modulation of the left dorsolateral prefrontal cortex while learning auditory presented nouns

    Directory of Open Access Journals (Sweden)

    Meyer Martin

    2009-07-01

    Abstract Background: Little is known about the contribution of transcranial direct current stimulation (tDCS) to the exploration of memory functions. The aim of the present study was to examine the behavioural effects of right- or left-hemisphere frontal direct current delivery, while committing auditorily presented nouns to memory, on short-term learning and subsequent long-term retrieval. Methods: Twenty subjects, divided into two groups, performed an episodic verbal memory task during anodal, cathodal and sham current application over the right or left dorsolateral prefrontal cortex (DLPFC). Results: Our results imply that only cathodal tDCS elicits behavioural effects on verbal memory performance. In particular, left-sided application of cathodal tDCS impaired short-term verbal learning compared to baseline. We did not observe tDCS effects on long-term retrieval. Conclusion: Our results imply that the left DLPFC is a crucial area involved in short-term verbal learning mechanisms. However, we found further support that direct current delivery at an intensity of 1.5 mA to the DLPFC during short-term learning does not disrupt longer-lasting consolidation processes, which are mainly known to be related to mesial temporal lobe areas. In the present study, we have shown that the tDCS technique has the potential to modulate short-term verbal learning mechanisms.

  4. Efficient coding of spectrotemporal binaural sounds leads to emergence of the auditory space representation

    Directory of Open Access Journals (Sweden)

    Wiktor eMlynarski

    2014-03-01

    To date, a number of studies have shown that receptive field shapes of early sensory neurons can be reproduced by optimizing coding efficiency of natural stimulus ensembles. A still unresolved question is whether the efficient coding hypothesis explains the formation of neurons which explicitly represent environmental features of different functional importance. This paper proposes that the spatial selectivity of higher auditory neurons emerges as a direct consequence of learning efficient codes for natural binaural sounds. Firstly, it is demonstrated that a linear efficient coding transform, Independent Component Analysis (ICA), trained on spectrograms of naturalistic simulated binaural sounds, extracts spatial information present in the signal. A simple hierarchical ICA extension allowing for decoding of sound position is proposed. Furthermore, it is shown that units revealing spatial selectivity can be learned from a binaural recording of a natural auditory scene. In both cases a relatively small subpopulation of learned spectrogram features suffices to perform accurate sound localization. Representation of the auditory space is therefore learned in a purely unsupervised way by maximizing coding efficiency and without any task-specific constraints. These results imply that efficient coding is a useful strategy for learning structures which allow for making behaviorally vital inferences about the environment.
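    The unmixing step described above can be sketched with an off-the-shelf ICA implementation. The example below is a toy illustration only: it mixes two synthetic sources with different level weights per "ear" (a crude stand-in for interaural level differences) and checks that FastICA recovers them unsupervised. It is not the paper's spectrogram-based pipeline.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 4000)
# Two independent "sources" (stand-ins for sound events at different positions)
s1 = np.sign(np.sin(3 * t))        # square wave
s2 = rng.laplace(size=t.size)      # sparse, speech-like amplitude statistics
S = np.c_[s1, s2]

# Binaural mixing: each "ear" receives a differently level-weighted mixture,
# mimicking the interaural level differences that carry spatial information
A = np.array([[1.0, 0.3],
              [0.4, 1.0]])
X = S @ A.T

ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)       # recovered components, fully unsupervised

# Match recovered components to true sources (up to sign and permutation)
corr = np.abs(np.corrcoef(S.T, S_est.T)[:2, 2:])
print(corr.max(axis=1))
```

    Each row of `corr` holds the correlations of one true source with both recovered components; a value near 1 in each row means the spatial mixture was successfully unmixed without labels.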

  5. XSOR codes users manual

    International Nuclear Information System (INIS)

    Jow, Hong-Nian; Murfin, W.B.; Johnson, J.D.

    1993-11-01

    This report describes the source term estimation codes, XSORs. The codes are written for three pressurized water reactors (Surry, Sequoyah, and Zion) and two boiling water reactors (Peach Bottom and Grand Gulf). The ensemble of codes has been named ''XSOR''. The purpose of the XSOR codes is to estimate the source terms which would be released to the atmosphere in severe accidents. A source term includes the release fractions of several radionuclide groups, the timing and duration of releases, the rates of energy release, and the elevation of releases. The codes have been developed by Sandia National Laboratories for the US Nuclear Regulatory Commission (NRC) in support of the NUREG-1150 program. The XSOR codes are fast-running parametric codes and are used as surrogates for detailed mechanistic codes. The XSOR codes also provide the capability to explore phenomena and their uncertainty which are not currently modeled by the mechanistic codes. The uncertainty distributions of input parameters may be used by an XSOR code to estimate the uncertainty of source terms.

  6. Computer codes for safety analysis

    International Nuclear Information System (INIS)

    Holland, D.F.

    1986-11-01

    Computer codes for fusion safety analysis have been under development in the United States for about a decade. This paper will discuss five codes that are currently under development by the Fusion Safety Program. The purpose and capability of each code will be presented, a sample application given, followed by a discussion of the present status and future development plans.

  7. The Redox Code.

    Science.gov (United States)

    Jones, Dean P; Sies, Helmut

    2015-09-20

    The redox code is a set of principles that defines the positioning of the nicotinamide adenine dinucleotide (NAD, NADP) and thiol/disulfide and other redox systems as well as the thiol redox proteome in space and time in biological systems. The code is richly elaborated in an oxygen-dependent life, where activation/deactivation cycles involving O₂ and H₂O₂ contribute to spatiotemporal organization for differentiation, development, and adaptation to the environment. Disruption of this organizational structure during oxidative stress represents a fundamental mechanism in system failure and disease. Methodology in assessing components of the redox code under physiological conditions has progressed, permitting insight into spatiotemporal organization and allowing for identification of redox partners in redox proteomics and redox metabolomics. Complexity of redox networks and redox regulation is being revealed step by step, yet much still needs to be learned. Detailed knowledge of the molecular patterns generated from the principles of the redox code under defined physiological or pathological conditions in cells and organs will contribute to understanding the redox component in health and disease. Ultimately, there will be a scientific basis to a modern redox medicine.

  8. Enhanced motor learning following task-concurrent dual transcranial direct current stimulation.

    Directory of Open Access Journals (Sweden)

    Sophia Karok

    OBJECTIVE: Transcranial direct current stimulation (tDCS) of the primary motor cortex (M1) has beneficial effects on motor performance and motor learning in healthy subjects and is emerging as a promising tool for motor neurorehabilitation. Applying tDCS concurrently with a motor task has recently been found to be more effective than applying stimulation before the motor task. This study extends this finding to examine whether such task-concurrent stimulation further enhances motor learning with a dual M1 montage. METHOD: Twenty healthy, right-handed subjects received anodal tDCS over the right M1, dual tDCS (anodal current over the right M1 and cathodal over the left M1) and sham tDCS in a repeated-measures design. Stimulation was applied for 10 min at 1.5 mA during an explicit motor learning task. Response times (RT) and accuracy were measured at baseline, during, directly after and 15 min after stimulation. Motor cortical excitability was recorded from both hemispheres before and after stimulation using single-pulse transcranial magnetic stimulation. RESULTS: Task-concurrent stimulation with a dual M1 montage significantly reduced RTs by 23% as early as the onset of stimulation (p<0.01), with this effect increasing to 30% at the final measurement. Polarity-specific changes in cortical excitability were observed, with motor evoked potentials significantly reduced by 12% in the left M1 and increased by 69% in the right M1. CONCLUSION: Performance improvement occurred earliest in the dual M1 condition, with a stable and lasting effect. Unilateral anodal stimulation resulted only in a trend-level improvement compared to sham. Therefore, task-concurrent dual M1 stimulation is best suited for obtaining the desired neuromodulatory effects of tDCS in explicit motor learning.

  9. Report number codes

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, R.N. (ed.)

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.

  10. Report number codes

    International Nuclear Information System (INIS)

    Nelson, R.N.

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name

  11. TASS code topical report. V.1 TASS code technical manual

    International Nuclear Information System (INIS)

    Sim, Suk K.; Chang, W. P.; Kim, K. D.; Kim, H. C.; Yoon, H. Y.

    1997-02-01

    TASS 1.0 has been developed at KAERI for initial and reload non-LOCA safety analysis of the operating PWRs as well as the PWRs under construction in Korea. The TASS code will replace the various vendors' non-LOCA safety analysis codes currently used for the Westinghouse and ABB-CE type PWRs in Korea; this can be achieved through TASS code input modifications specific to each reactor type. The TASS code can be run interactively through keyboard operation. A semi-modular configuration used in developing the TASS code enables the user to easily implement new models. TASS has been programmed in FORTRAN 77, which makes it easy to install and port to different computer environments. The TASS code can be utilized for steady-state simulation as well as non-LOCA transient simulations such as power excursions, reactor coolant pump trips, load rejections, loss of feedwater, steam line breaks, steam generator tube ruptures, rod withdrawal and drop, and anticipated transients without scram (ATWS). Malfunctions of the control systems, components, and operator actions, and the transients caused by such malfunctions, can be easily simulated using the TASS code. This technical report describes the TASS 1.0 code models, including the reactor thermal-hydraulic, reactor core and control models. This TASS code technical manual has been prepared as part of the TASS code manual, which includes the TASS code user's manual and the TASS code validation report, and will be submitted to the regulatory body as a TASS code topical report for licensing non-LOCA safety analysis of the Westinghouse and ABB-CE type PWRs operating and under construction in Korea. (author). 42 refs., 29 tabs., 32 figs

  12. 75 FR 20833 - Building Energy Codes

    Science.gov (United States)

    2010-04-21

    ...-0012] Building Energy Codes AGENCY: Office of Energy Efficiency and Renewable Energy, Department of... the current model building energy codes or their equivalent. DOE is interested in better understanding... codes, Standard 90.1-2007, Energy Standard for Buildings Except Low-Rise Residential Buildings (or...

  13. Learning Joint-Sparse Codes for Calibration-Free Parallel MR Imaging.

    Science.gov (United States)

    Wang, Shanshan; Tan, Sha; Gao, Yuan; Liu, Qiegen; Ying, Leslie; Xiao, Taohui; Liu, Yuanyuan; Liu, Xin; Zheng, Hairong; Liang, Dong

    2018-01-01

    The integration of compressed sensing and parallel imaging (CS-PI) has gained popularity in recent years as a way to accelerate magnetic resonance (MR) imaging. Among these methods, calibration-free techniques have shown encouraging performance owing to their ability to handle sensitivity information robustly. Unfortunately, existing calibration-free methods have only explored joint sparsity with direct analysis transform projections. To further exploit joint sparsity and improve reconstruction accuracy, this paper proposes to Learn joINt-sparse coDes for caliBration-free parallEl mR imaGing (LINDBERG) by modeling the parallel MR imaging problem as a minimization objective with a norm constraining data fidelity, a Frobenius norm enforcing sparse representation error, and a mixed norm triggering joint sparsity across channels. A corresponding algorithm has been developed to alternately update the sparse representation, sensitivity-encoded images, and k-space data. The final image is then produced as the square root of the sum of squares of all channel images. Experimental results on both physical phantom and in vivo data sets show that the proposed method is comparable and even superior to state-of-the-art CS-PI reconstruction approaches. Specifically, LINDBERG shows strong capability in suppressing noise and artifacts while reconstructing MR images from highly undersampled multichannel measurements.
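    The joint-sparsity ingredient of such mixed-norm objectives can be illustrated with the proximal operator of the ℓ2,1 norm, which shrinks whole rows of a coefficient matrix so that the same atoms stay active across all coil channels. This is a generic sketch of that operator, not the LINDBERG algorithm itself.

```python
import numpy as np

def prox_l21(Z, tau):
    """Proximal operator of tau * ||Z||_{2,1}: row-wise soft thresholding.
    Each row of Z holds one atom's coefficients across all channels;
    shrinking entire rows keeps the same atoms active in every channel."""
    norms = np.linalg.norm(Z, axis=1, keepdims=True)
    scale = np.maximum(1.0 - tau / np.maximum(norms, 1e-12), 0.0)
    return Z * scale

Z = np.array([[3.0, 4.0],    # row norm 5   -> shrunk but kept
              [0.3, 0.4],    # row norm 0.5 -> zeroed entirely (joint sparsity)
              [0.0, 2.0]])   # row norm 2   -> shrunk but kept
P = prox_l21(Z, tau=1.0)
print(P)
```

    The key property is visible in the middle row: a weak atom is removed from *both* channels at once, which is exactly the cross-channel coupling an entrywise (ℓ1) threshold would not provide.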

  14. Detecting non-coding selective pressure in coding regions

    Directory of Open Access Journals (Sweden)

    Blanchette Mathieu

    2007-02-01

    Abstract Background: Comparative genomics approaches, in which orthologous DNA regions are compared and inter-species conserved regions are identified, have proven extremely powerful for identifying non-coding regulatory regions located in intergenic or intronic regions. However, non-coding functional elements can also be located within coding regions, as is common for exonic splicing enhancers, some transcription factor binding sites, and RNA secondary structure elements affecting mRNA stability, localization, or translation. Since these functional elements are located in regions that are themselves highly conserved because they code for a protein, they generally escape detection by comparative genomics approaches. Results: We introduce a comparative genomics approach for detecting non-coding functional elements located within coding regions. Codon evolution is modeled as a mixture of codon substitution models, where each component of the mixture describes the evolution of codons under a specific type of coding selective pressure. We show how to compute the posterior distribution of the entropy and parsimony scores under this null model of codon evolution. The method is applied to a set of growth hormone 1 orthologous mRNA sequences and a known exonic splicing element is detected. The analysis of a set of CORTBP2 orthologous genes reveals a region of several hundred base pairs under strong non-coding selective pressure whose function remains unknown. Conclusion: Non-coding functional elements, in particular those involved in post-transcriptional regulation, are likely to be much more prevalent than is currently known. With the numerous genome sequencing projects underway, comparative genomics approaches like the one proposed here are likely to become increasingly powerful at detecting such elements.
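    The entropy score mentioned above can be illustrated on a single alignment column: unusually low codon entropy in an already-coding region is the kind of signal such methods look for. This is a bare Shannon-entropy sketch, not the paper's posterior computation under the codon mixture model.

```python
from collections import Counter
from math import log2

def column_entropy(codons):
    """Shannon entropy (bits) of one codon column in a multiple alignment.
    Lower entropy means stronger conservation, a hint of selective pressure
    beyond what the protein-coding constraint alone would require."""
    counts = Counter(codons)
    n = len(codons)
    return sum(-(c / n) * log2(c / n) for c in counts.values())

conserved = ["GGA"] * 8                       # identical in all 8 species
variable  = ["GGA", "GGC", "GGT", "GGG"] * 2  # synonymous (all glycine) variation
print(column_entropy(conserved))  # 0.0
print(column_entropy(variable))   # 2.0
```

    Note that both columns encode the same amino acid in every species; it is the suppression of *synonymous* variation (entropy 0.0 vs 2.0 bits) that suggests a non-coding functional element overlapping the codons.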

  15. TEACHERS’ AND STUDENTS’ ATTITUDE TOWARD CODE ALTERNATION IN PAKISTANI ENGLISH CLASSROOMS

    Directory of Open Access Journals (Sweden)

    Aqsa Tahir

    2016-11-01

    This research is an attempt to explore students' and teachers' attitudes towards code alternation in English classrooms in Pakistan. In a country like Pakistan, where the official language is English, the national language is Urdu, and every province has its own language, most people are bilingual or multilingual. Therefore, the aim of this study was to find out when and why teachers code-switch in L2 English classrooms. It also explored students' language preferences while learning a second language, as well as teachers' code-switching patterns and the students' priorities. Ten teachers responded to an open-ended questionnaire and 100 students responded to a closed-ended questionnaire. The teachers' responses indicated that they mostly code-switch when students' comprehension is poor and they do not grasp concepts easily in L2; they never encourage students to speak Urdu. The students' results showed that they mostly prefer code-switching into their L1 for better understanding and participation in class. The analysis revealed that students favored English only when receiving test instructions, receiving results, and learning grammatical concepts. In most cases, students showed flexibility in language usage. A majority of students (68%) agreed that they learn better when their teachers code-switch into L1.

  16. Machine Learning for Realistic Ball Detection in RoboCup SPL

    OpenAIRE

    Bloisi, Domenico; Del Duchetto, Francesco; Manoni, Tiziano; Suriani, Vincenzo

    2017-01-01

    In this technical report, we describe the use of a machine learning approach for detecting the realistic black and white ball currently in use in the RoboCup Standard Platform League. Our aim is to provide a ready-to-use software module that can be useful for the RoboCup SPL community. To this end, the approach is integrated within the official B-Human code release 2016. The complete code for the approach presented in this work can be downloaded from the SPQR Team homepage at http://spqr.diag...

  17. Tri-code inductance control rod position indicator with several multi-coding-bars

    International Nuclear Information System (INIS)

    Shi Jibin; Jiang Yueyuan; Wang Wenran

    2004-01-01

    A control rod position indicator, named the tri-code inductance control rod position indicator with multi-coding-bars, which possesses a simple structure, reliable operation and high precision, has been developed. The detector of the indicator is composed of K coils, a compensatory coil and K coding bars. Each coding bar consists of several sections of strong magnetic cores, several sections of weak magnetic cores and several non-magnetic portions. As the control rod is withdrawn, the coding bars move in the centers of the coils while a constant alternating current passes through the coils and causes them to produce inductance alternating-voltage signals. The outputs of the coils are picked up and processed, and the tri-codes indicating rod position can be obtained. Moreover, the coding principle of the detector and its related structure are introduced. The analysis shows that the indicator offers advantages over the coil-coding rod position indicator, so it can meet the demands of rod position indication in the nuclear heating reactor (NHR). (authors)
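    One plausible reading of the tri-code scheme is that each coil reports one ternary digit, distinguishing a non-magnetic section from a weak or strong magnetic core, so K coils can resolve 3^K rod positions. The decoding sketch below uses this hypothetical mapping for illustration only; the actual code table is fixed by the detector design and is not given in the record.

```python
def decode_position(levels):
    """Decode K coil readings into a rod position step.
    Hypothetical mapping: 0 = non-magnetic section, 1 = weak magnetic core,
    2 = strong magnetic core, read most significant coil first.
    The position is then the base-3 value of the digit string."""
    pos = 0
    for digit in levels:
        if digit not in (0, 1, 2):
            raise ValueError("each coil reading must be a ternary digit")
        pos = pos * 3 + digit
    return pos

print(decode_position([2, 0, 1]))  # 2*9 + 0*3 + 1 = 19
print(decode_position([1, 1, 1]))  # 13
```

    With only 5 coils this already distinguishes 3^5 = 243 positions, which illustrates why a ternary code needs fewer coils than a coil-per-position indicator.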

  18. SU-A-210-01: Why Should We Learn Radiation Oncology Billing?

    International Nuclear Information System (INIS)

    Wu, H.

    2015-01-01

    The purpose of this student annual meeting is to address topics that are becoming more relevant to medical physicists, but are not frequently addressed, especially for students and trainees just entering the field. The talk is divided into two parts: medical billing and regulations. Hsinshun Wu – Why should we learn radiation oncology billing? Many medical physicists do not like to be involved with medical billing or coding during their career. They believe billing is not their responsibility and sometimes they even refuse to participate in the billing process if given the chance. This presentation will talk about a physicist’s long career and share his own experience that knowing medical billing is not only important and necessary for every young medical physicist, but that good billing knowledge could provide a valuable contribution to his/her medical physics development. Learning Objectives: The audience will learn the basic definition of Current Procedural Terminology (CPT) codes performed in a Radiation Oncology Department. Understand the differences between hospital coding and physician-based or freestanding coding. Apply proper CPT coding for each Radiation Oncology procedure. Each procedure with its specific CPT code will be discussed in detail. The talk will focus on the process of care and use of actual workflow to understand each CPT code. Example coding of a typical Radiation Oncology procedure. Special procedure coding such as brachytherapy, proton therapy, radiosurgery, and SBRT. Maryann Abogunde – Medical physics opportunities at the Nuclear Regulatory Commission (NRC) The NRC’s responsibilities include the regulation of medical uses of byproduct (radioactive) materials and oversight of medical use end-users (licensees) through a combination of regulatory requirements, licensing, safety oversight including inspection and enforcement, operational experience evaluation, and regulatory support activities. This presentation will explore the

  19. SU-A-210-01: Why Should We Learn Radiation Oncology Billing?

    Energy Technology Data Exchange (ETDEWEB)

    Wu, H. [Willis-Knighton Medical Center (United States)

    2015-06-15

    The purpose of this student annual meeting is to address topics that are becoming more relevant to medical physicists, but are not frequently addressed, especially for students and trainees just entering the field. The talk is divided into two parts: medical billing and regulations. Hsinshun Wu – Why should we learn radiation oncology billing? Many medical physicists do not like to be involved with medical billing or coding during their career. They believe billing is not their responsibility and sometimes they even refuse to participate in the billing process if given the chance. This presentation will talk about a physicist’s long career and share his own experience that knowing medical billing is not only important and necessary for every young medical physicist, but that good billing knowledge could provide a valuable contribution to his/her medical physics development. Learning Objectives: The audience will learn the basic definition of Current Procedural Terminology (CPT) codes performed in a Radiation Oncology Department. Understand the differences between hospital coding and physician-based or freestanding coding. Apply proper CPT coding for each Radiation Oncology procedure. Each procedure with its specific CPT code will be discussed in detail. The talk will focus on the process of care and use of actual workflow to understand each CPT code. Example coding of a typical Radiation Oncology procedure. Special procedure coding such as brachytherapy, proton therapy, radiosurgery, and SBRT. Maryann Abogunde – Medical physics opportunities at the Nuclear Regulatory Commission (NRC) The NRC’s responsibilities include the regulation of medical uses of byproduct (radioactive) materials and oversight of medical use end-users (licensees) through a combination of regulatory requirements, licensing, safety oversight including inspection and enforcement, operational experience evaluation, and regulatory support activities. This presentation will explore the

  20. Task-specific effect of transcranial direct current stimulation on motor learning

    Directory of Open Access Journals (Sweden)

    Cinthia Maria Saucedo Marquez

    2013-07-01

    Transcranial direct current stimulation (tDCS) is a relatively new non-invasive brain stimulation technique that modulates neural processes. When applied to the human primary motor cortex (M1), tDCS has beneficial effects on motor skill learning and consolidation in healthy controls and in patients. However, it remains unclear whether tDCS improves motor learning in a general manner or whether these effects depend on which motor task is acquired. Here we compare whether the effect of tDCS differs when the same individual acquires (1) a Sequential Finger Tapping Task (SEQTAP) and (2) a Visual Isometric Pinch Force Task (FORCE). Both tasks have been shown to be sensitive to tDCS applied over M1; however, the underlying processes mediating learning and memory formation might benefit differently from anodal tDCS. Thirty healthy subjects were randomly assigned to an anodal-tDCS group or a sham group. Using a double-blind, sham-controlled cross-over design, tDCS was applied over M1 while subjects acquired each of the motor tasks over 3 consecutive days, with the order randomized across subjects. We found that anodal tDCS affected each task differently: the SEQTAP task benefited from anodal tDCS during learning, whereas the FORCE task showed improvements only at retention. These findings suggest that anodal tDCS applied over M1 has a task-dependent effect on learning and memory formation.

  1. Fast convolutional sparse coding using matrix inversion lemma

    Czech Academy of Sciences Publication Activity Database

    Šorel, Michal; Šroubek, Filip

    2016-01-01

    Roč. 55, č. 1 (2016), s. 44-51 ISSN 1051-2004 R&D Projects: GA ČR GA13-29225S Institutional support: RVO:67985556 Keywords : Convolutional sparse coding * Feature learning * Deconvolution networks * Shift-invariant sparse coding Subject RIV: JD - Computer Applications, Robotics Impact factor: 2.337, year: 2016 http://library.utia.cas.cz/separaty/2016/ZOI/sorel-0459332.pdf

  2. KENO-V code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The KENO-V code is the current release of the Oak Ridge multigroup Monte Carlo criticality code development. The original KENO, with 16-group Hansen-Roach cross sections and P1 scattering, was one of the first multigroup Monte Carlo codes, and it and its successors have always been a much-used research tool for criticality studies. KENO-V is able to accept large neutron cross section libraries (a 218-group set is distributed with the code) and has a general PN scattering capability. A supergroup feature allows execution of large problems on small computers, but at the expense of increased calculation time and system input/output operations. This supergroup feature is activated automatically by the code in a manner which utilizes as much computer memory as is available. The primary purpose of KENO-V is to calculate the system k-eff, from small bare critical assemblies to large reflected arrays of differing fissile and moderator elements. In this respect KENO-V neither has nor requires the many options and sophisticated biasing techniques of general Monte Carlo codes.
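    The statistical idea behind a Monte Carlo criticality estimate, counting fission secondaries per neutron history, can be shown in a toy one-group, infinite-medium model. This is nothing like KENO-V's multigroup transport with real geometry and scattering; the cross sections below are made up purely for illustration.

```python
import random

def k_inf_estimate(n_histories, sigma_f, sigma_c, nu, seed=1):
    """Toy one-group, infinite-medium analog Monte Carlo estimate of k-inf.
    Every history ends in absorption; a fraction sigma_f/(sigma_f+sigma_c)
    of absorptions are fissions, each releasing nu neutrons on average.
    The k estimate is the mean number of secondaries per history."""
    rng = random.Random(seed)
    p_fission = sigma_f / (sigma_f + sigma_c)
    secondaries = 0.0
    for _ in range(n_histories):
        if rng.random() < p_fission:   # absorption was a fission
            secondaries += nu
    return secondaries / n_histories

sigma_f, sigma_c, nu = 0.05, 0.03, 2.43   # made-up one-group constants
k_mc = k_inf_estimate(200_000, sigma_f, sigma_c, nu)
k_exact = nu * sigma_f / (sigma_f + sigma_c)  # analytic four-factor-style value
print(round(k_exact, 4), round(k_mc, 4))      # MC estimate converges to exact
```

    Production codes add geometry, leakage, energy groups and variance-reduction, but the estimator at the core is this same tally of fission neutrons produced per source neutron.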

  3. An in-depth study of sparse codes on abnormality detection

    DEFF Research Database (Denmark)

    Ren, Huamin; Pan, Hong; Olsen, Søren Ingvor

    2016-01-01

    Sparse representation has been applied successfully in abnormal event detection, in which the baseline is to learn a dictionary accompanied by sparse codes. While much emphasis is put on discriminative dictionary construction, there are no comparative studies of sparse codes regarding abnormality detection. Comparisons are carried out from various angles to better understand the applicability of sparse codes, including computation time, reconstruction error, sparsity, detection accuracy, and their performance combining various detection methods. The experiment results show that combining OMP codes with maximum coordinate…
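    A minimal example of computing OMP sparse codes against a fixed dictionary, the kind of codes compared in such studies, can be written with scikit-learn. This is an illustrative sketch on synthetic data, not the paper's abnormality-detection pipeline.

```python
import numpy as np
from sklearn.linear_model import orthogonal_mp

rng = np.random.default_rng(0)
# Random overcomplete dictionary with unit-norm atoms (64-dim signals, 256 atoms)
D = rng.standard_normal((64, 256))
D /= np.linalg.norm(D, axis=0)

# Build a signal from exactly 5 atoms; OMP should recover it almost perfectly
true_idx = rng.choice(256, size=5, replace=False)
x = D[:, true_idx] @ rng.standard_normal(5)

code = orthogonal_mp(D, x, n_nonzero_coefs=5)   # greedy sparse coding
err = np.linalg.norm(x - D @ code) / np.linalg.norm(x)
print(np.count_nonzero(code), err)              # sparse code, tiny residual
```

    The sparsity level (`n_nonzero_coefs`) and the reconstruction residual are exactly the quantities an abnormality detector thresholds: events that the learned dictionary cannot reconstruct sparsely are flagged as abnormal.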

  4. Error correcting coding for OTN

    DEFF Research Database (Denmark)

    Justesen, Jørn; Larsen, Knud J.; Pedersen, Lars A.

    2010-01-01

    Forward error correction codes for 100 Gb/s optical transmission are currently receiving much attention from transport network operators and technology providers. We discuss the performance of hard-decision decoding using product-type codes that cover a single OTN frame or a small number of such frames. In particular, we argue that a three-error-correcting BCH code is the best choice for the component code in such systems.
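    As a back-of-the-envelope check on such designs, the rate and minimum distance of a product code follow directly from its components. The sketch below uses textbook parameters for a three-error-correcting binary BCH code of length 1023 (k = 993, since n - k <= m*t with m = 10); these are illustrative values, not necessarily the exact construction the authors propose.

```python
# Product code built from two identical binary BCH(1023, 993) components,
# each correcting t = 3 errors (minimum distance >= 2t + 1 = 7).
n, k, t = 1023, 993, 3

rate_component = k / n                 # rate of one BCH component
rate_product = rate_component ** 2     # rows AND columns are encoded
d_min = (2 * t + 1) ** 2               # product-code distance = product of
                                       # component minimum distances
overhead = 1 / rate_product - 1        # redundancy relative to payload
print(round(rate_product, 4), d_min, f"{overhead:.1%}")
```

    A rate near 0.94 (about 6% overhead) with a guaranteed distance of 49 illustrates why short, high-rate BCH components are attractive for hard-decision product decoding at 100 Gb/s line rates.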

  5. Learn ggplot2 using Shiny App

    CERN Document Server

    Moon, Keon-Woong

    2016-01-01

    This book and app are for practitioners, professionals, researchers, and students who want to learn how to make plots within the R environment using ggplot2, step by step and without coding. In widespread use in the statistical communities, R is a free software language and environment for statistical programming and graphics. Many users find R to have a steep learning curve but to be extremely useful once it is overcome. ggplot2 is an extremely popular package tailored for producing graphics within R, but it requires coding and has a steep learning curve itself, and Shiny is an open-source R package that provides a web framework for building web applications using R without requiring HTML, CSS, or JavaScript. This manual, integrating R, ggplot2, and Shiny, introduces a new Shiny app, Learn ggplot2, that allows users to make plots easily without coding. With the Learn ggplot2 Shiny app, users can make plots using ggplot2 without having to code each step, reducing typos and error messages and allowing users to bec...

  6. Channel coding techniques for wireless communications

    CERN Document Server

    Deergha Rao, K

    2015-01-01

    The book discusses modern channel coding techniques for wireless communications such as turbo codes, low-density parity check (LDPC) codes, space–time (ST) coding, RS (or Reed–Solomon) codes and convolutional codes. Many illustrative examples are included in each chapter for easy understanding of the coding techniques. The text is integrated with MATLAB-based programs to enhance the understanding of the subject’s underlying theories. It includes current topics of increasing importance such as turbo codes, LDPC codes, Luby transform (LT) codes, Raptor codes, and ST coding in detail, in addition to the traditional codes such as cyclic codes, BCH (or Bose–Chaudhuri–Hocquenghem) and RS codes and convolutional codes. Multiple-input and multiple-output (MIMO) communications is a multiple antenna technology, which is an effective method for high-speed or high-reliability wireless communications. PC-based MATLAB m-files for the illustrative examples are provided on the book page on Springer.com for free dow...

  7. Content Coding of Psychotherapy Transcripts Using Labeled Topic Models.

    Science.gov (United States)

    Gaut, Garren; Steyvers, Mark; Imel, Zac E; Atkins, David C; Smyth, Padhraic

    2017-03-01

    Psychotherapy represents a broad class of medical interventions received by millions of patients each year. Unlike most medical treatments, its primary mechanisms are linguistic; i.e., the treatment relies directly on a conversation between a patient and provider. However, the evaluation of patient-provider conversation suffers from critical shortcomings, including intensive labor requirements, coder error, nonstandardized coding systems, and inability to scale up to larger data sets. To overcome these shortcomings, psychotherapy analysis needs a reliable and scalable method for summarizing the content of treatment encounters. We used a publicly available psychotherapy corpus from Alexander Street Press comprising a large collection of transcripts of patient-provider conversations to compare coding performance for two machine learning methods. We used the labeled latent Dirichlet allocation (L-LDA) model to learn associations between text and codes, to predict codes in psychotherapy sessions, and to localize specific passages of within-session text representative of a session code. We compared the L-LDA model to a baseline lasso regression model using predictive accuracy and model generalizability (measured by calculating the area under the curve (AUC) from the receiver operating characteristic curve). The L-LDA model outperforms the lasso logistic regression model at predicting session-level codes, with average AUC scores of 0.79 and 0.70, respectively. For fine-grained coding, L-LDA and logistic regression are able to identify specific talk-turns representative of symptom codes. However, model performance for talk-turn identification is not yet as reliable as human coders. We conclude that the L-LDA model has the potential to be an objective, scalable method for accurate automated coding of psychotherapy sessions that performs better than comparable discriminative methods at session-level coding and can also predict fine-grained codes.
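The study above compares models by the area under the ROC curve. As a small illustration of that metric (not the authors' code; the labels and scores below are made up), AUC can be computed directly as the probability that a randomly chosen positive example is scored above a randomly chosen negative one:

```python
# AUC via the Mann-Whitney formulation: the fraction of
# (positive, negative) pairs in which the positive example
# receives the higher score (ties count as 0.5).

def auc(labels, scores):
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical session-level scores for one code.
labels  = [1, 1, 1, 0, 0, 0, 0, 1]
model_a = [0.9, 0.8, 0.4, 0.35, 0.2, 0.1, 0.45, 0.7]
print(auc(labels, model_a))  # 0.9375
```

A score of 0.5 corresponds to chance-level ranking and 1.0 to perfect separation, which is why AUC is a convenient common scale for comparing L-LDA against the lasso baseline.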

  8. Insurance billing and coding.

    Science.gov (United States)

    Napier, Rebecca H; Bruelheide, Lori S; Demann, Eric T K; Haug, Richard H

    2008-07-01

    The purpose of this article is to highlight the importance of understanding various numeric and alpha-numeric codes for accurately billing dental and medically related services to private pay or third-party insurance carriers. In the United States, Current Dental Terminology (CDT) codes are most commonly used by dentists to submit claims, whereas current procedural terminology (CPT) and International Classification of Diseases, Ninth Revision, Clinical Modification (ICD.9.CM) codes are more commonly used by physicians to bill for their services. The CPT and ICD.9.CM coding systems complement each other in that CPT codes provide the procedure and service information and ICD.9.CM codes provide the reason or rationale for a particular procedure or service. These codes are more commonly used for "medical necessity" determinations, and general dentists and specialists who routinely perform care, including trauma-related care, biopsies, and dental treatment as a result of or in anticipation of a cancer-related treatment, are likely to use these codes. Claim submissions for care provided can be completed electronically or by means of paper forms.
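The pairing described above, where a CPT-style code names the procedure and an ICD-9-CM-style code supplies the rationale, can be sketched as a simple claim-line validator. All codes, descriptions, and function names below are placeholders invented for illustration, not real billing codes:

```python
# A claim line pairs a procedure (CPT-style) code with a diagnosis
# (ICD-9-CM-style) code establishing medical necessity. Every code
# and description here is a placeholder, not a real billing code.

def claim_line(cpt, icd9, registry):
    if cpt not in registry["procedures"]:
        raise ValueError(f"unknown procedure code {cpt}")
    if icd9 not in registry["diagnoses"]:
        raise ValueError(f"unknown diagnosis code {icd9}")
    return {"procedure": cpt, "rationale": icd9}

registry = {
    "procedures": {"00000": "placeholder biopsy procedure"},
    "diagnoses": {"000.0": "placeholder lesion diagnosis"},
}
line = claim_line("00000", "000.0", registry)
print(line)  # {'procedure': '00000', 'rationale': '000.0'}
```

The point of the structure is the complementarity the article describes: a claim line is incomplete unless both halves are present and valid.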

  9. Lower hybrid current drive in shaped tokamaks

    International Nuclear Information System (INIS)

    Kesner, J.

    1993-01-01

    A time dependent lower hybrid current drive tokamak simulation code has been developed. This code combines the BALDUR tokamak simulation code and the Bonoli/Englade lower hybrid current drive code and permits the study of the interaction of lower hybrid current drive with neutral beam heating in shaped cross-section plasmas. The code is time dependent and includes the beam driven and bootstrap currents in addition to the current driven by the lower hybrid system. Examples of simulations are shown for the PBX-M experiment which include the effect of cross section shaping on current drive, ballooning mode stabilization by current profile control and sawtooth stabilization. A critical question in current drive calculations is the radial transport of the energetic electrons. The authors have developed a response function technique to calculate radial transport in the presence of an electric field. The consequences of the combined influences of radial diffusion and electric field acceleration are discussed

  10. Current status and applications of integrated safety assessment and simulation code system for ISA

    Energy Technology Data Exchange (ETDEWEB)

    Izquierdo, J. M.; Hortal, J.; Perea, M. Sanchez; Melendez, E. [Modeling and Simulation Area (MOSI), Nuclear Safety Council (CSN), Madrid (Spain); Queral, E.; Rivas-Lewicky, J. [Energy and Fuels Department, Technical University of Madrid (UPM), Madrid (Spain)

    2017-03-15

    This paper reviews the current status of the unified approach known as integrated safety assessment (ISA), as well as the associated SCAIS (simulation codes system for ISA) computer platform. These constitute a proposal resulting from collaborative work among the Nuclear Safety Council (CSN), the Technical University of Madrid (UPM), and NFQ Solutions S.L., aiming to allow independent regulatory verification of industry quantitative risk assessments. The paper discusses the classical treatment of time in conventional probabilistic safety assessment (PSA) sequences and states important conclusions that can be used to avoid systematic and unacceptable underestimation of failure exceedance frequencies. The unified ISA method meets this challenge by coupling deterministic and probabilistic mutual influences. The feasibility of the approach is illustrated with examples of its application to a real-size plant.

  11. Proposing a Web-Based Tutorial System to Teach Malay Language Braille Code to the Sighted

    Science.gov (United States)

    Wah, Lee Lay; Keong, Foo Kok

    2010-01-01

    The "e-KodBrailleBM Tutorial System" is a web-based tutorial system which is specially designed to teach, facilitate and support the learning of Malay Language Braille Code to individuals who are sighted. The targeted group includes special education teachers, pre-service teachers, and parents. Learning Braille code involves memorisation…

  12. Imagery and Verbal Coding Approaches in Chinese Vocabulary Instruction

    Science.gov (United States)

    Shen, Helen H.

    2010-01-01

    This study consists of two instructional experiments. Within the framework of dual coding theory, the study compares the learning effects of two instructional encoding methods used in Chinese vocabulary instruction among students learning beginning Chinese as a foreign language. One method uses verbal encoding only, and the other method uses…

  13. Developing a Code of Practice for Learning Analytics

    Science.gov (United States)

    Sclater, Niall

    2016-01-01

    Ethical and legal objections to learning analytics are barriers to development of the field, thus potentially denying students the benefits of predictive analytics and adaptive learning. Jisc, a charitable organization that champions the use of digital technologies in UK education and research, has attempted to address this with the development of…

  14. The ALI-ARMS Code for Modeling Atmospheric non-LTE Molecular Band Emissions: Current Status and Applications

    Science.gov (United States)

    Kutepov, A. A.; Feofilov, A. G.; Manuilova, R. O.; Yankovsky, V. A.; Rezac, L.; Pesnell, W. D.; Goldberg, R. A.

    2008-01-01

    The Accelerated Lambda Iteration (ALI) technique was developed in stellar astrophysics at the beginning of the 1990s for solving the non-LTE radiative transfer problem in atomic lines and multiplets in stellar atmospheres. It was later successfully applied to modeling the non-LTE emissions and radiative cooling/heating in the vibrational-rotational bands of molecules in planetary atmospheres. Like standard lambda iteration, ALI operates with matrices of minimal dimension; however, it provides a higher convergence rate and better stability by removing from the iteration the photons trapped in the optically thick line cores. In the current ALI-ARMS (ALI for Atmospheric Radiation and Molecular Spectra) code version, additional acceleration is provided by the opacity distribution function (ODF) approach and by "decoupling". The former allows replacing the band branches by single lines of special shape, whereas the latter treats the non-linearity caused by strong near-resonant vibration-vibrational level coupling without additionally linearizing the statistical equilibrium equations. The latest applications of the code to non-LTE diagnostics of molecular band emissions of the Earth's and Martian atmospheres, as well as to non-LTE IR cooling/heating calculations, are discussed.

  15. Active Learning for Directed Exploration of Complex Systems

    Science.gov (United States)

    Burl, Michael C.; Wang, Esther

    2009-01-01

    Physics-based simulation codes are widely used in science and engineering to model complex systems that would be infeasible to study otherwise. Such codes provide the highest-fidelity representation of system behavior, but are often so slow to run that insight into the system is limited. For example, conducting an exhaustive sweep over a d-dimensional input parameter space with k steps along each dimension requires k^d simulation trials (translating into k^d CPU-days for one of our current simulations). An alternative is directed exploration, in which the next simulation trials are cleverly chosen at each step. Given the results of previous trials, supervised learning techniques (SVM, KDE, GP) are applied to build up simplified predictive models of system behavior. These models are then used within an active learning framework to identify the most valuable trials to run next. Several active learning strategies are examined, including a recently-proposed information-theoretic approach. Performance is evaluated on a set of thirteen synthetic oracles, which serve as surrogates for the more expensive simulations and enable the experiments to be replicated by other researchers.
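The exhaustive k^d sweep versus directed exploration trade-off described above can be sketched in a few lines. This is not the authors' method; as a stand-in for a surrogate model's posterior uncertainty, the sketch uses distance to the nearest already-simulated point (a crude proxy for GP variance), and the "simulator" is a cheap made-up function:

```python
# Directed exploration sketch: instead of sweeping all k^d grid
# points, run the expensive simulator only where the surrogate is
# most "uncertain". Uncertainty here is proxied by distance to the
# nearest already-simulated point (a stand-in for GP variance).
import numpy as np

rng = np.random.default_rng(0)

def simulator(x):                      # stand-in for an expensive code
    return np.sin(3 * x) + 0.1 * x

candidates = np.linspace(0, 4, 200)    # the full 1-D parameter sweep
tried = list(rng.choice(candidates, 3, replace=False))
results = [simulator(x) for x in tried]

for _ in range(10):                    # 10 directed trials, not 200
    dists = [min(abs(c - t) for t in tried) for c in candidates]
    x_next = candidates[int(np.argmax(dists))]   # most "uncertain" point
    tried.append(x_next)
    results.append(simulator(x_next))

print(len(tried))  # 13 simulator runs instead of 200
```

A real active learning framework would replace the distance proxy with model-based acquisition (e.g. GP variance or the information-theoretic criterion the paper examines), but the loop structure, fit, score, pick, simulate, is the same.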

  16. Finite element circuit theory of the numerical code EDDYMULT for solving eddy current problems in a multi-torus system

    International Nuclear Information System (INIS)

    Nakamura, Yukiharu; Ozeki, Takahisa

    1986-07-01

    The finite element circuit theory is extended to the general eddy current problem in a multi-torus system, which consists of various torus conductors and axisymmetric coil systems. The numerical procedures are devised to avoid practical restrictions on computer storage and computing time: a reduction technique for eddy current eigenmodes saves storage, and the introduction of a shape function into the double area integral of mode coupling saves time. The numerical code EDDYMULT based on this theory is developed for use in designing tokamak devices, from the viewpoints of evaluating electromagnetic loading on the device components and of tokamak equilibrium control analysis. (author)

  17. Transcranial direct current stimulation of the posterior parietal cortex modulates arithmetic learning.

    Science.gov (United States)

    Grabner, Roland H; Rütsche, Bruno; Ruff, Christian C; Hauser, Tobias U

    2015-07-01

    The successful acquisition of arithmetic skills is an essential step in the development of mathematical competencies and has been associated with neural activity in the left posterior parietal cortex (PPC). It is unclear, however, whether this brain region plays a causal role in arithmetic skill acquisition and whether arithmetic learning can be modulated by means of non-invasive brain stimulation of this key region. In the present study we addressed these questions by applying transcranial direct current stimulation (tDCS) over the left PPC during a short-term training that simulates the typical path of arithmetic skill acquisition (specifically the transition from effortful procedural to memory-based problem-solving strategies). Sixty participants received either anodal, cathodal or sham tDCS while practising complex multiplication and subtraction problems. The stability of the stimulation-induced learning effects was assessed in a follow-up test 24 h after the training. Learning progress was modulated by tDCS. Cathodal tDCS (compared with sham) decreased learning rates during training and resulted in poorer performance which lasted over 24 h after stimulation. Anodal tDCS showed an operation-specific improvement for subtraction learning. Our findings extend previous studies by demonstrating that the left PPC is causally involved in arithmetic learning (and not only in arithmetic performance) and that even a short-term tDCS application can modulate the success of arithmetic knowledge acquisition. Moreover, our finding of operation-specific anodal stimulation effects suggests that the enhancing effects of tDCS on learning can selectively affect just one of several cognitive processes mediated by the stimulated area. © 2015 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  18. Policy Pathways: Modernising Building Energy Codes

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-08-01

    Buildings are the largest consumers of energy worldwide and will continue to be a source of increasing energy demand in the future. Globally, the sector’s final energy consumption doubled between 1971 and 2010 to reach 2 794 million tonnes of oil equivalent (Mtoe), driven primarily by population increase and economic growth. Under current policies, the global energy demand of buildings is projected by the IEA experts to grow by an additional 838 Mtoe by 2035 compared to 2010. The challenges of the projected increase of energy consumption due to the built environment vary by country. In IEA member countries, much of the future buildings stock is already in place, and so the main challenge is to renovate existing buildings stock. In non-IEA countries, more than half of the buildings stock needed by 2050 has yet to be built. The IEA and the UNDP partnered to analyse current practices in the design and implementation of building energy codes. The aim is to consolidate existing efforts and to encourage more attention to the role of the built environment in a low-carbon and climate-resilient world. This joint IEA-UNDP Policy Pathway aims to share lessons learned between IEA member countries and non-IEA countries. The objective is to spread best practices, limit pressures on global energy supply, improve energy security, and contribute to environmental sustainability. Part of the IEA Policy Pathway series, Modernising building energy codes to secure our global energy future sets out key steps in planning, implementation, monitoring and evaluation. The Policy Pathway series aims to help policy makers implement the IEA 25 Energy Efficiency Policy Recommendations endorsed by IEA Ministers (2011).

  19. Some aspects of grading Java code submissions in MOOCs

    Directory of Open Access Journals (Sweden)

    Sándor Király

    2017-07-01

    Full Text Available Recently, massive open online courses (MOOCs) have been offering a new online approach in the field of distance learning and online education. A typical MOOC course consists of video lectures, reading material and easily accessible tests for students. For a computer programming course, it is important to provide interactive, dynamic, online coding exercises and more complex programming assignments for learners, and it is expedient for students to receive prompt feedback on their coding submissions. Although learning management systems often include an automated programme evaluation subsystem capable of assessing submitted source files, MOOC systems typically lack such a grader for evaluating students' assignments, with the result that course staff would be required to assess thousands of programmes submitted by the participants of the course without the benefit of an automatic grader. This paper presents a new concept for grading programming submissions of students, with improved techniques based on the Java unit testing framework that enable automatic grading of code chunks. Examples are also given, such as the creation of unique exercises by dynamically generating the parameters of assignments in a MOOC programming course, combined with a form of coding-style recognition to teach coding standards.
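The two ideas in the abstract above, unit-test-based grading and per-student parameter generation, can be sketched together. The paper uses the Java unit testing framework (JUnit); the analogous sketch below uses Python's unittest instead, and all names (student_solution, make_grader) are illustrative:

```python
# Autograding sketch analogous to the paper's JUnit approach, in
# Python: the submission is checked by unit tests whose parameters
# are generated from a per-student seed, so each learner gets a
# unique instance of the same exercise.
import random
import unittest

def student_solution(nums):            # pretend this was submitted
    return sorted(nums)

def make_grader(seed):
    rng = random.Random(seed)          # per-student parameters
    data = [rng.randint(-100, 100) for _ in range(20)]

    class Grader(unittest.TestCase):
        def test_sorted_output(self):
            self.assertEqual(student_solution(data), sorted(data))

        def test_input_not_mutated(self):
            copy = list(data)
            student_solution(data)
            self.assertEqual(data, copy)

    return Grader

suite = unittest.defaultTestLoader.loadTestsFromTestCase(make_grader(seed=42))
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # True -> full marks for this rubric
```

Because the test data is derived from a seed, the grader is deterministic per student yet different across students, which discourages answer sharing without requiring staff to write thousands of distinct exercises.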

  20. Overview of current RFSP-code capabilities for CANDU core analysis

    International Nuclear Information System (INIS)

    Rouben, B.

    1996-01-01

    RFSP (Reactor Fuelling Simulation Program) is the major finite-reactor computer code in use at Atomic Energy of Canada Limited for the design and analysis of CANDU reactor cores. An overview is given of the major computational capabilities available in RFSP. (author) 11 refs., 29 figs

  1. Real-Coded Quantum-Inspired Genetic Algorithm-Based BP Neural Network Algorithm

    Directory of Open Access Journals (Sweden)

    Jianyong Liu

    2015-01-01

    Full Text Available A method using the real-coded quantum-inspired genetic algorithm (RQGA) to optimize the weights and thresholds of a BP neural network is proposed, to overcome the defect that the gradient descent method easily falls into local optima during learning. The quantum genetic algorithm (QGA) has good directed global optimization ability, but the conventional QGA is based on binary coding, and the coding and decoding processes slow down the calculation. RQGA is therefore introduced to explore the search space, and an improved varied learning rate is adopted to train the BP neural network. Simulation tests show that the proposed algorithm rapidly converges to solutions that satisfy the constraint conditions.
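The core idea above, evolving real-valued network weights with a genetic algorithm instead of relying on gradient descent alone, can be sketched with a plain real-coded GA (not the quantum-inspired variant, and not the authors' code). The network, task, and hyperparameters below are made up for illustration:

```python
# Real-coded GA evolving the 9 weights of a tiny 2-2-1 network on
# XOR, a task where plain gradient descent can stall in local
# optima. Plain GA only; the paper's RQGA adds quantum-inspired
# encoding on top of this idea.
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([0, 1, 1, 0], float)           # XOR targets

def forward(w, X):                          # 2-2-1 network, 9 weights
    W1, b1 = w[:4].reshape(2, 2), w[4:6]
    W2, b2 = w[6:8], w[8]
    h = np.tanh(X @ W1 + b1)
    return 1 / (1 + np.exp(-(h @ W2 + b2)))

def fitness(w):                             # negative mean squared error
    return -np.mean((forward(w, X) - y) ** 2)

pop = rng.normal(0, 2, size=(60, 9))        # real-coded chromosomes
for gen in range(200):
    scores = np.array([fitness(w) for w in pop])
    elite = pop[np.argsort(scores)[-20:]]   # truncation selection
    pa = elite[rng.integers(0, 20, 60)]
    pb = elite[rng.integers(0, 20, 60)]
    alpha = rng.random((60, 1))
    pop = alpha * pa + (1 - alpha) * pb     # blend crossover
    pop += rng.normal(0, 0.1, pop.shape)    # gaussian mutation

best = max(pop, key=fitness)
print(np.round(forward(best, X)))           # ideally [0. 1. 1. 0.]
```

In a hybrid scheme like the paper's, the GA's best individual would then seed BP training with the varied learning rate, combining global search with local refinement.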

  2. Differentiation of ileostomy from colostomy procedures: assessing the accuracy of current procedural terminology codes and the utility of natural language processing.

    Science.gov (United States)

    Vo, Elaine; Davila, Jessica A; Hou, Jason; Hodge, Krystle; Li, Linda T; Suliburk, James W; Kao, Lillian S; Berger, David H; Liang, Mike K

    2013-08-01

    Large databases provide a wealth of information for researchers, but identifying patient cohorts often relies on the use of current procedural terminology (CPT) codes. In particular, studies of stoma surgery have been limited by the accuracy of CPT codes in identifying and differentiating ileostomy procedures from colostomy procedures. It is important to make this distinction because the prevalence of complications associated with stoma formation and reversal differs dramatically between types of stoma. Natural language processing (NLP) is a process that allows text-based searching. The Automated Retrieval Console is an NLP-based software that allows investigators to design and perform NLP-assisted document classification. In this study, we evaluated the role of CPT codes and NLP in differentiating ileostomy from colostomy procedures. Using CPT codes, we conducted a retrospective study that identified all patients undergoing a stoma-related procedure at a single institution between January 2005 and December 2011. All operative reports during this time were reviewed manually to abstract the following variables: formation or reversal, and ileostomy or colostomy. Sensitivity and specificity for validation of the CPT codes against the master surgery schedule were calculated. Operative reports were evaluated by use of NLP to differentiate ileostomy- from colostomy-related procedures. Sensitivity and specificity for identifying patients with ileostomy or colostomy procedures were calculated for CPT codes and NLP for the entire cohort. CPT codes performed well in identifying stoma procedures (sensitivity 87.4%, specificity 97.5%). A total of 664 stoma procedures were identified by CPT codes between 2005 and 2011. The CPT codes were adequate in identifying stoma formation (sensitivity 97.7%, specificity 72.4%) and stoma reversal (sensitivity 74.1%, specificity 98.7%), but they were inadequate in identifying ileostomy (sensitivity 35.0%, specificity 88.1%) and colostomy (75
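The validation above rests on computing sensitivity and specificity of code-derived labels against the manual chart review. A minimal sketch of that calculation (not the study's code; the labels below are made up) looks like this:

```python
# Validation sketch: compare labels derived from CPT codes (or an
# NLP classifier) against manually reviewed operative reports and
# report sensitivity and specificity, as the study does.

def sens_spec(truth, pred):
    tp = sum(t and p for t, p in zip(truth, pred))
    tn = sum((not t) and (not p) for t, p in zip(truth, pred))
    fn = sum(t and (not p) for t, p in zip(truth, pred))
    fp = sum((not t) and p for t, p in zip(truth, pred))
    return tp / (tp + fn), tn / (tn + fp)

# 1 = ileostomy, 0 = colostomy (hypothetical chart-review truth).
truth = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
cpt   = [1, 0, 0, 1, 0, 0, 1, 0, 0, 0]   # CPT-derived labels
sens, spec = sens_spec(truth, cpt)
print(sens, spec)  # sensitivity 0.5, specificity ~0.83
```

Low sensitivity with decent specificity, as in this toy example, is exactly the failure mode the study reports for CPT-based ileostomy identification: many true cases are missed even though few are falsely labeled.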

  3. Classification of multispectral or hyperspectral satellite imagery using clustering of sparse approximations on sparse representations in learned dictionaries obtained using efficient convolutional sparse coding

    Science.gov (United States)

    Moody, Daniela; Wohlberg, Brendt

    2018-01-02

    An approach for land cover classification, seasonal and yearly change detection and monitoring, and identification of changes in man-made features may use a clustering of sparse approximations (CoSA) on sparse representations in learned dictionaries. The learned dictionaries may be derived using efficient convolutional sparse coding to build multispectral or hyperspectral, multiresolution dictionaries that are adapted to regional satellite image data. Sparse image representations of images over the learned dictionaries may be used to perform unsupervised k-means clustering into land cover categories. The clustering process behaves as a classifier in detecting real variability. This approach may combine spectral and spatial textural characteristics to detect geologic, vegetative, hydrologic, and man-made features, as well as changes in these features over time.
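The pipeline above, sparse codes over a dictionary followed by unsupervised k-means clustering, can be sketched in miniature. This is not the CoSA implementation: the dictionary below is random rather than learned by convolutional sparse coding, the "patches" are synthetic 8-dimensional vectors, and the sparse coding is a single soft-threshold step:

```python
# CoSA-style pipeline in miniature: represent each "spectral patch"
# as a sparse code over a dictionary, then k-means cluster the
# codes into land-cover-like categories. The real method learns the
# dictionary with convolutional sparse coding; ours is random.
import numpy as np

rng = np.random.default_rng(0)

# Two synthetic "spectral" classes of 8-dim patches.
a = rng.normal(0, 0.1, (50, 8)) + np.linspace(0, 1, 8)
b = rng.normal(0, 0.1, (50, 8)) + np.linspace(1, 0, 8)
patches = np.vstack([a, b])

D = rng.normal(size=(8, 16))
D /= np.linalg.norm(D, axis=0)          # unit-norm dictionary atoms

def sparse_code(x, lam=0.5):            # one ISTA step from zero
    z = D.T @ x
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0)

codes = np.array([sparse_code(p) for p in patches])

def kmeans(X, k, iters=20):             # plain Lloyd's algorithm
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):     # keep old center if cluster empties
                centers[j] = X[labels == j].mean(0)
    return labels

labels = kmeans(codes, k=2)
print(labels[:50].mean(), labels[50:].mean())  # the two classes separate
```

Clustering the codes rather than the raw pixels is the point: the sparse representation concentrates class-discriminative structure into a few active atoms, so an unlabeled k-means behaves like a classifier.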

  4. Color Coding of Circuit Quantities in Introductory Circuit Analysis Instruction

    Science.gov (United States)

    Reisslein, Jana; Johnson, Amy M.; Reisslein, Martin

    2015-01-01

    Learning the analysis of electrical circuits represented by circuit diagrams is often challenging for novice students. An open research question in electrical circuit analysis instruction is whether color coding of the mathematical symbols (variables) that denote electrical quantities can improve circuit analysis learning. The present study…

  5. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation

    International Nuclear Information System (INIS)

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This manual covers an array of modules written for the SCALE package, consisting of drivers, system libraries, cross section and materials properties libraries, input/output routines, storage modules, and help files

  6. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This manual covers an array of modules written for the SCALE package, consisting of drivers, system libraries, cross section and materials properties libraries, input/output routines, storage modules, and help files.

  7. Neural mechanisms of reinforcement learning in unmedicated patients with major depressive disorder.

    Science.gov (United States)

    Rothkirch, Marcus; Tonn, Jonas; Köhler, Stephan; Sterzer, Philipp

    2017-04-01

    According to current concepts, major depressive disorder is strongly related to dysfunctional neural processing of motivational information, entailing impairments in reinforcement learning. While computational modelling can reveal the precise nature of neural learning signals, it has not so far been used to study learning-related neural dysfunctions in unmedicated patients with major depressive disorder. We thus aimed at comparing the neural coding of reward and punishment prediction errors, representing indicators of neural learning-related processes, between unmedicated patients with major depressive disorder and healthy participants. To this end, a group of unmedicated patients with major depressive disorder (n = 28) and a group of age- and sex-matched healthy control participants (n = 30) completed an instrumental learning task involving monetary gains and losses during functional magnetic resonance imaging. The two groups did not differ in their learning performance. Patients and control participants showed the same level of prediction error-related activity in the ventral striatum and the anterior insula. In contrast, neural coding of reward prediction errors in the medial orbitofrontal cortex was reduced in patients. Moreover, neural reward prediction error signals in the medial orbitofrontal cortex and ventral striatum showed negative correlations with anhedonia severity. Using a standard instrumental learning paradigm, we found no evidence for an overall impairment of reinforcement learning in medication-free patients with major depressive disorder. Importantly, however, the attenuated neural coding of reward in the medial orbitofrontal cortex and the relation between anhedonia and reduced reward prediction error-signalling in the medial orbitofrontal cortex and ventral striatum likely reflect an impairment in experiencing pleasure from rewarding events as a key mechanism of anhedonia in major depressive disorder. © The Author (2017). Published by Oxford
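The learning signal analysed in the study above is the reward prediction error, delta = r - V, from computational models of instrumental learning. A minimal Rescorla-Wagner-style sketch (not the authors' model; the task probabilities and learning rate are made up) shows how the regressor is generated:

```python
# Reward prediction error, delta = r - V, updating value estimates
# as in a simple Rescorla-Wagner / Q-learning model. In model-based
# fMRI, the per-trial delta is the regressor correlated with BOLD
# activity. Task parameters here are invented for illustration.
import random

random.seed(3)
V = {"A": 0.0, "B": 0.0}         # learned values of two stimuli
p_reward = {"A": 0.8, "B": 0.2}  # true (hidden) reward probabilities
alpha = 0.1                      # learning rate

for _ in range(500):
    s = random.choice(["A", "B"])
    r = 1.0 if random.random() < p_reward[s] else 0.0
    delta = r - V[s]             # prediction error (the fMRI regressor)
    V[s] += alpha * delta

print(V)  # values approach the true reward probabilities
```

Because delta shrinks as V converges on the true probability, its trial-by-trial trace carries the learning dynamics that the study correlates with striatal and orbitofrontal activity.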

  8. Applications guide to the RSIC-distributed version of the MCNP code (coupled Monte Carlo neutron-photon Code)

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1985-09-01

    An overview of the RSIC-distributed version of the MCNP code (a coupled Monte Carlo neutron-photon code) is presented. All general features of the code, from machine hardware requirements to theoretical details, are discussed. The current nuclide cross-section and other libraries available in the standard code package are specified, and a realistic example of the flexible geometry input is given. Standard and nonstandard source, estimator, and variance-reduction procedures are outlined. Examples of correct usage and possible misuse of certain code features are presented graphically and in standard output listings. Finally, itemized summaries of sample problems, various MCNP code documentation, and future work are given

  9. A review of high beam current RFQ accelerators and funnels

    International Nuclear Information System (INIS)

    Schneider, J.D.

    1998-01-01

    The authors review the design features of several high-current (>20 mA) and high-power (>1 mA average) proton or H⁻ injectors, RFQs, and funnels. They include a summary of observed performance and mention a sampling of new designs, including the proposed incorporation of beam choppers. Different programs and organizations have chosen to build the RFQ in diverse configurations. Although the majority of RFQs are either low-current or very low duty-factor, several versions have included high-current and/or high-power designs for either protons or H⁻ ions. The challenges of cooling, handling high space-charge forces, and coupling with injectors and subsequent accelerators are significant. In all instances, beam tests were a valuable learning experience, because these as-built structures did not always perform exactly as predicted by the earlier design codes. The authors summarize the key operational parameters, indicate what was achieved, and highlight what was learned in these tests. Based on this generally good performance and high promise, even more challenging designs are being considered for new applications that include even higher powers, beam funnels, and choppers

  10. Software Quality and Security in Teachers' and Students' Codes When Learning a New Programming Language

    Science.gov (United States)

    Boutnaru, Shlomi; Hershkovitz, Arnon

    2015-01-01

    In recent years, schools (as well as universities) have added cyber security to their computer science curricula. This topic is still new for most of the current teachers, who would normally have a standard computer science background. Therefore the teachers are trained and then teaching their students what they have just learned. In order to…

  11. Survey of coded aperture imaging

    International Nuclear Information System (INIS)

    Barrett, H.H.

    1975-01-01

    The basic principle and limitations of coded aperture imaging for x-ray and gamma cameras are discussed. Current trends include (1) use of time-varying apertures, (2) use of "dilute" apertures with transmission much less than 50%, and (3) attempts to derive transverse tomographic sections, unblurred by other planes, from coded images

  12. IFR code for secondary particle dynamics

    International Nuclear Information System (INIS)

    Teague, M.R.; Yu, S.S.

    1985-01-01

    A numerical simulation has been constructed to obtain a detailed, quantitative estimate of the electromagnetic fields and currents existing in the Advanced Test Accelerator under conditions of laser guiding. The code treats the secondary electrons by particle simulation and the beam dynamics by a time-dependent envelope model. The simulation gives a fully relativistic description of secondary electrons moving in self-consistent electromagnetic fields. The calculations are made using coordinates t, x, y, z for the electrons and t, ct-z, r for the axisymmetric electromagnetic fields and currents. Code results, showing in particular current enhancement effects, will be given

  13. Consensus Convolutional Sparse Coding

    KAUST Repository

    Choudhury, Biswarup

    2017-12-01

    Convolutional sparse coding (CSC) is a promising direction for unsupervised learning in computer vision. In contrast to recent supervised methods, CSC allows for convolutional image representations to be learned that are equally useful for high-level vision tasks and low-level image reconstruction and can be applied to a wide range of tasks without problem-specific retraining. Due to their extreme memory requirements, however, existing CSC solvers have so far been limited to low-dimensional problems and datasets using a handful of low-resolution example images at a time. In this paper, we propose a new approach to solving CSC as a consensus optimization problem, which lifts these limitations. By learning CSC features from large-scale image datasets for the first time, we achieve significant quality improvements in a number of imaging tasks. Moreover, the proposed method enables new applications in high-dimensional feature learning that has been intractable using existing CSC methods. This is demonstrated for a variety of reconstruction problems across diverse problem domains, including 3D multispectral demosaicing and 4D light field view synthesis.

  14. Consensus Convolutional Sparse Coding

    KAUST Repository

    Choudhury, Biswarup

    2017-04-11

    Convolutional sparse coding (CSC) is a promising direction for unsupervised learning in computer vision. In contrast to recent supervised methods, CSC allows for convolutional image representations to be learned that are equally useful for high-level vision tasks and low-level image reconstruction and can be applied to a wide range of tasks without problem-specific retraining. Due to their extreme memory requirements, however, existing CSC solvers have so far been limited to low-dimensional problems and datasets using a handful of low-resolution example images at a time. In this paper, we propose a new approach to solving CSC as a consensus optimization problem, which lifts these limitations. By learning CSC features from large-scale image datasets for the first time, we achieve significant quality improvements in a number of imaging tasks. Moreover, the proposed method enables new applications in high-dimensional feature learning that has been intractable using existing CSC methods. This is demonstrated for a variety of reconstruction problems across diverse problem domains, including 3D multispectral demosaicking and 4D light field view synthesis.

  15. Consensus Convolutional Sparse Coding

    KAUST Repository

    Choudhury, Biswarup; Swanson, Robin; Heide, Felix; Wetzstein, Gordon; Heidrich, Wolfgang

    2017-01-01

    Convolutional sparse coding (CSC) is a promising direction for unsupervised learning in computer vision. In contrast to recent supervised methods, CSC allows for convolutional image representations to be learned that are equally useful for high-level vision tasks and low-level image reconstruction and can be applied to a wide range of tasks without problem-specific retraining. Due to their extreme memory requirements, however, existing CSC solvers have so far been limited to low-dimensional problems and datasets using a handful of low-resolution example images at a time. In this paper, we propose a new approach to solving CSC as a consensus optimization problem, which lifts these limitations. By learning CSC features from large-scale image datasets for the first time, we achieve significant quality improvements in a number of imaging tasks. Moreover, the proposed method enables new applications in high-dimensional feature learning that has been intractable using existing CSC methods. This is demonstrated for a variety of reconstruction problems across diverse problem domains, including 3D multispectral demosaicing and 4D light field view synthesis.
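
The consensus formulation at the heart of these papers can be illustrated on a toy problem. The sketch below is not the authors' CSC solver; it is a minimal consensus ADMM example in which several data blocks each hold a local copy of a shared variable and the copies are averaged into a global consensus at every iteration. All names and parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy consensus optimization: minimize sum_i (1/2)||A_i z - b_i||^2 over a
# shared variable z. Each block i keeps a local copy x_i and a dual u_i;
# the z-update averages the copies (the "consensus" step).
A = [rng.standard_normal((20, 5)) for _ in range(3)]
z_true = rng.standard_normal(5)
b = [Ai @ z_true for Ai in A]

rho = 10.0
x = [np.zeros(5) for _ in range(3)]
u = [np.zeros(5) for _ in range(3)]
z = np.zeros(5)

for _ in range(300):
    # Local closed-form minimizer of (1/2)||A_i x - b_i||^2 + (rho/2)||x - z + u_i||^2
    x = [np.linalg.solve(Ai.T @ Ai + rho * np.eye(5),
                         Ai.T @ bi + rho * (z - ui))
         for Ai, bi, ui in zip(A, b, u)]
    z = np.mean([xi + ui for xi, ui in zip(x, u)], axis=0)  # consensus step
    u = [ui + xi - z for ui, xi in zip(u, x)]               # dual update
```

With noiseless data the consensus variable converges to the stacked least-squares solution; in the papers, the same splitting lets each worker hold only a subset of the training images, which is what removes the memory bottleneck.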

  16. Verification of aero-elastic offshore wind turbine design codes under IEA Wind Task XXIII

    DEFF Research Database (Denmark)

    Vorpahl, Fabian; Strobel, Michael; Jonkman, Jason M.

    2014-01-01

    with the incident waves, sea current, hydrodynamics and foundation dynamics of the support structure. A large set of time series simulation results such as turbine operational characteristics, external conditions, and load and displacement outputs was compared and interpreted. Load cases were defined and run...... to differences in the model fidelity, aerodynamic implementation, hydrodynamic load discretization and numerical difficulties within the codes. The comparisons resulted in a more thorough understanding of the modeling techniques and better knowledge of when various approximations are not valid. More importantly...... is to summarize the lessons learned and present results that code developers can compare to. The set of benchmark load cases defined and simulated during the course of this project—the raw data for this paper—is available to the offshore wind turbine simulation community and is already being used for testing...

  17. Genetic Code Analysis Toolkit: A novel tool to explore the coding properties of the genetic code and DNA sequences

    Science.gov (United States)

    Kraljić, K.; Strüngmann, L.; Fimmel, E.; Gumbel, M.

    2018-01-01

    The genetic code is degenerate, and it is assumed that this redundancy provides error detection and correction mechanisms in the translation process. However, the biological meaning of the code's structure is still under active research. This paper presents the Genetic Code Analysis Toolkit (GCAT), which provides workflows and algorithms for the analysis of the structure of nucleotide sequences. In particular, sets or sequences of codons can be transformed and tested for circularity, comma-freeness, dichotomic partitions and others. GCAT comes with a fertile editor custom-built to work with the genetic code and a batch mode for multi-sequence processing. With the ability to read FASTA files or load sequences from GenBank, the tool can be used for the mathematical and statistical analysis of existing sequence data. GCAT is Java-based and provides a plug-in concept for extensibility. Availability: Open source. Homepage: http://www.gcat.bio/
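
One of the properties GCAT tests, comma-freeness, is easy to state algorithmically: a codon set is comma-free if no codon of the set appears at a frame-shifted position inside the concatenation of any two codons from the set. The sketch below is independent of GCAT (which should be consulted for the full battery of tests, including the more involved circularity check):

```python
def is_comma_free(codons):
    """A codon set X is comma-free if, for every pair u, v in X, the two
    out-of-frame trinucleotides inside the 6-letter word uv are not in X."""
    codons = set(codons)
    for u in codons:
        for v in codons:
            w = u + v
            if w[1:4] in codons or w[2:5] in codons:
                return False
    return True

print(is_comma_free({"AAT", "ACT"}))  # True: no frame-shifted reading collides
print(is_comma_free({"ATG", "TGA"}))  # False: "ATGATG" contains "TGA" out of frame
```

Note that comma-freeness implies circularity but not vice versa; a full circularity test needs a cycle search over a codon graph, which is exactly the kind of machinery the toolkit packages up.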

  18. Greedy vs. L1 convex optimization in sparse coding

    DEFF Research Database (Denmark)

    Ren, Huamin; Pan, Hong; Olsen, Søren Ingvor

    2015-01-01

    Sparse representation has been applied successfully in many image analysis applications, including abnormal event detection, in which a baseline is to learn a dictionary from the training data and detect anomalies from its sparse codes. During this procedure, sparse codes which can be achieved...... solutions. Considering the property of abnormal event detection, i.e., only normal videos are used as training data due to practical reasons, effective codes in classification application may not perform well in abnormality detection. Therefore, we compare the sparse codes and comprehensively evaluate...... their performance from various aspects to better understand their applicability, including computation time, reconstruction error, sparsity, detection...
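
The two solver families being compared can be sketched on a toy problem. Below, a greedy pursuit (orthogonal matching pursuit) and an L1 solver (iterative soft thresholding, ISTA) compute sparse codes over the same dictionary; an orthonormal dictionary is used so the expected answers are exactly known. This is an illustrative sketch, not the solvers benchmarked in the paper:

```python
import numpy as np

def omp(D, y, k):
    """Greedy: pick the atom most correlated with the residual, k times."""
    residual, idx = y.copy(), []
    for _ in range(k):
        idx.append(int(np.argmax(np.abs(D.T @ residual))))
        coef, *_ = np.linalg.lstsq(D[:, idx], y, rcond=None)
        residual = y - D[:, idx] @ coef
    x = np.zeros(D.shape[1])
    x[idx] = coef
    return x

def ista(D, y, lam, n_iter=200):
    """L1: proximal gradient descent on (1/2)||Dx - y||^2 + lam * ||x||_1."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        g = x - D.T @ (D @ x - y) / L      # gradient step
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft threshold
    return x

rng = np.random.default_rng(0)
D, _ = np.linalg.qr(rng.standard_normal((10, 10)))  # orthonormal toy dictionary
x_true = np.zeros(10)
x_true[[2, 5, 7]] = [1.5, -2.0, 1.0]
y = D @ x_true

x_omp = omp(D, y, k=3)        # recovers the support with unbiased coefficients
x_ista = ista(D, y, lam=0.1)  # same support, coefficients shrunk toward zero
```

On an orthonormal dictionary ISTA reduces to one soft-thresholding step, exposing the L1 shrinkage bias (1.4, -1.9, 0.9 instead of 1.5, -2.0, 1.0), while OMP returns the unbiased values; which behavior is preferable is exactly the kind of application-dependent trade-off the paper evaluates.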

  19. Development of 2D particle-in-cell code to simulate high current, low ...

    Indian Academy of Sciences (India)

    Abstract. A code for 2D space-charge dominated beam dynamics study in beam transport lines is developed. The code is used for particle-in-cell (PIC) simulation of a z-uniform beam in a channel containing solenoids and drift space. It can also simulate a transport line where quadrupoles are used for focusing the beam.

  20. Software and the Scientist: Coding and Citation Practices in Geodynamics

    Science.gov (United States)

    Hwang, Lorraine; Fish, Allison; Soito, Laura; Smith, MacKenzie; Kellogg, Louise H.

    2017-11-01

    In geodynamics as in other scientific areas, computation has become a core component of research, complementing field observation, laboratory analysis, experiment, and theory. Computational tools for data analysis, mapping, visualization, modeling, and simulation are essential for all aspects of the scientific workflow. Specialized scientific software is often developed by geodynamicists for their own use, and this effort represents a distinctive intellectual contribution. Drawing on a geodynamics community that focuses on developing and disseminating scientific software, we assess the current practices of software development and attribution, as well as attitudes about the need and best practices for software citation. We analyzed publications by participants in the Computational Infrastructure for Geodynamics and conducted mixed method surveys of the solid earth geophysics community. From this we learned that coding skills are typically learned informally. Participants considered good code as trusted, reusable, readable, and not overly complex, and considered a good coder as one that participates in the community in an open and reasonable manner, contributing to both long- and short-term community projects. Participants strongly supported citing software, reflected by the high rate at which software packages were named in the literature and the high rate of citations in the references. However, lacking are clear instructions from developers on how to cite and education of users on what to cite. In addition, citations did not always lead to discoverability of the resource. A unique identifier to the software package itself, community education, and citation tools would contribute to better attribution practices.

  1. Fast Sparse Coding for Range Data Denoising with Sparse Ridges Constraint

    Directory of Open Access Journals (Sweden)

    Zhi Gao

    2018-05-01

    Full Text Available Light detection and ranging (LiDAR) sensors have been widely deployed on intelligent systems such as unmanned ground vehicles (UGVs) and unmanned aerial vehicles (UAVs) to perform localization, obstacle detection, and navigation tasks. Thus, research into range data processing with competitive performance in terms of both accuracy and efficiency has attracted increasing attention. Sparse coding has revolutionized signal processing and led to state-of-the-art performance in a variety of applications. However, dictionary learning, which plays the central role in sparse coding techniques, is computationally demanding, resulting in its limited applicability in real-time systems. In this study, we propose sparse coding algorithms with a fixed pre-learned ridge dictionary to realize range data denoising via leveraging the regularity of laser range measurements in man-made environments. Experiments on both synthesized data and real data demonstrate that our method obtains accuracy comparable to that of sophisticated sparse coding methods, but with much higher computational efficiency.

  2. Fast Sparse Coding for Range Data Denoising with Sparse Ridges Constraint.

    Science.gov (United States)

    Gao, Zhi; Lao, Mingjie; Sang, Yongsheng; Wen, Fei; Ramesh, Bharath; Zhai, Ruifang

    2018-05-06

    Light detection and ranging (LiDAR) sensors have been widely deployed on intelligent systems such as unmanned ground vehicles (UGVs) and unmanned aerial vehicles (UAVs) to perform localization, obstacle detection, and navigation tasks. Thus, research into range data processing with competitive performance in terms of both accuracy and efficiency has attracted increasing attention. Sparse coding has revolutionized signal processing and led to state-of-the-art performance in a variety of applications. However, dictionary learning, which plays the central role in sparse coding techniques, is computationally demanding, resulting in its limited applicability in real-time systems. In this study, we propose sparse coding algorithms with a fixed pre-learned ridge dictionary to realize range data denoising via leveraging the regularity of laser range measurements in man-made environments. Experiments on both synthesized data and real data demonstrate that our method obtains accuracy comparable to that of sophisticated sparse coding methods, but with much higher computational efficiency.
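
The core idea, sparse coding over a fixed dictionary instead of a learned one, can be sketched in a few lines. The toy below denoises a synthetic 1-D profile by thresholding its coefficients in a fixed orthonormal DCT basis; the paper's pre-learned ridge dictionary and real range scans are replaced here by illustrative stand-ins:

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II matrix: a fixed, pre-computed 'dictionary'."""
    i = np.arange(n)
    M = np.cos(np.pi * (2 * i[None, :] + 1) * i[:, None] / (2 * n))
    M[0] /= np.sqrt(2)
    return M * np.sqrt(2.0 / n)

rng = np.random.default_rng(1)
n = 64
t = np.linspace(0.0, 1.0, n)
clean = 2.0 + np.sin(2 * np.pi * t)            # smooth synthetic "range profile"
noisy = clean + 0.05 * rng.standard_normal(n)  # simulated sensor noise

Phi = dct_matrix(n)              # rows are the fixed atoms
coef = Phi @ noisy               # analysis: coefficients w.r.t. the dictionary
coef[np.abs(coef) < 0.15] = 0.0  # sparsify: keep only significant atoms
denoised = Phi.T @ coef          # synthesis from the sparse code
```

Because nothing is learned online, the per-scan cost is two matrix products and a threshold, which is the kind of saving that makes the fixed-dictionary approach attractive for real-time use.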

  3. Breathing (and Coding?) a Bit Easier: Changes to International Classification of Disease Coding for Pulmonary Hypertension.

    Science.gov (United States)

    Mathai, Stephen C; Mathew, Sherin

    2018-04-20

    The International Classification of Diseases (ICD) coding system is broadly utilized by healthcare providers, hospitals, healthcare payers, and governments to track health trends and statistics at the global, national, and local levels and to provide a reimbursement framework for medical care based upon diagnosis and severity of illness. The current iteration of the ICD system, ICD-10, was implemented in 2015. While many changes to the prior ICD-9 system were included in the ICD-10 system, the newer revision failed to adequately reflect advances in the clinical classification of certain diseases such as pulmonary hypertension (PH). Recently, a proposal to modify the ICD-10 codes for PH was considered and ultimately adopted for inclusion as updates to the ICD-10 coding system. While these revisions better reflect the current clinical classification of PH, in the future, further changes should be considered to improve the accuracy and ease of coding for all forms of PH. Copyright © 2018. Published by Elsevier Inc.

  4. Chromosome preference of disease genes and vectorization for the prediction of non-coding disease genes.

    Science.gov (United States)

    Peng, Hui; Lan, Chaowang; Liu, Yuansheng; Liu, Tao; Blumenstein, Michael; Li, Jinyan

    2017-10-03

    Disease-related protein-coding genes have been widely studied, but disease-related non-coding genes remain largely unknown. This work introduces a new vector to represent diseases, and applies the newly vectorized data to a positive-unlabeled learning algorithm to predict and rank disease-related long non-coding RNA (lncRNA) genes. This novel vector representation for diseases consists of two sub-vectors: one is composed of 45 elements, characterizing the information entropies of the distribution of a disease's genes over 45 chromosome substructures. This idea is supported by our observation that some substructures (e.g., the chromosome 6 p-arm) are highly preferred by disease-related protein-coding genes, while some (e.g., the 21 p-arm) are not favored at all. The second sub-vector is 30-dimensional, characterizing the distribution of disease-gene-enriched KEGG pathways in comparison with our manually created pathway groups. The second sub-vector complements the first one to differentiate between various diseases. Our prediction method outperforms the state-of-the-art methods on benchmark datasets for prioritizing disease-related lncRNA genes. The method also works well when only the sequence information of an lncRNA gene is known, or even when a given disease has no currently recognized long non-coding genes.
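
The first sub-vector can be sketched directly from its description: for one disease, count its known genes on each chromosome substructure, normalize the counts to a distribution, and record each substructure's Shannon-entropy contribution. The code below is a plausible reading of the abstract, not the authors' implementation; a real run would use 45 substructure counts per disease rather than the 4-element toy here:

```python
import numpy as np

def entropy_vector(counts):
    """Per-substructure entropy contributions -p_i * log2(p_i), where p_i is
    the fraction of the disease's genes falling on substructure i."""
    counts = np.asarray(counts, dtype=float)
    p = counts / counts.sum()
    # Guard the log against p = 0: empty substructures contribute zero entropy.
    return np.where(p > 0, -p * np.log2(np.where(p > 0, p, 1.0)), 0.0)

# A disease spread evenly over 4 substructures is maximally "uninformative" ...
even = entropy_vector([3, 3, 3, 3])      # each entry 0.25 * log2(4) = 0.5
# ... while one concentrated on a preferred arm (like chr6p) has zero entropy.
focused = entropy_vector([12, 0, 0, 0])
```

Keeping the per-substructure contributions as a vector, rather than summing them into one number, is what preserves the chromosome-preference signal the abstract describes.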

  5. Modulation of motor performance and motor learning by transcranial direct current stimulation.

    Science.gov (United States)

    Reis, Janine; Fritsch, Brita

    2011-12-01

    Transcranial direct current stimulation (tDCS) has shown preliminary success in improving motor performance and motor learning in healthy individuals, and restitution of motor deficits in stroke patients. This brief review highlights some recent work. Within the past years, behavioural studies have confirmed and specified the timing- and polarity-specific effects of tDCS on motor skill learning and motor adaptation. There is strong evidence that timely co-application of (hand/arm) training and anodal tDCS to the contralateral M1 can improve motor learning. Improvements in motor function as measured by clinical scores have been described for combined tDCS and training in stroke patients. For this purpose, electrode montages have been modified with respect to interhemispheric imbalance after brain injury. Cathodal tDCS applied to the unlesioned M1 or bihemispheric M1 stimulation appears to be well tolerated and useful to induce improvements in motor function. Mechanistic studies in humans and animals are discussed with regard to physiological motor learning. tDCS is well tolerated, easy to use and capable of inducing lasting improvements in motor function. This method holds promise for the rehabilitation of motor disabilities, although acute studies in patients with brain injury are so far lacking.

  6. Low-Rank Sparse Coding for Image Classification

    KAUST Repository

    Zhang, Tianzhu; Ghanem, Bernard; Liu, Si; Xu, Changsheng; Ahuja, Narendra

    2013-01-01

    In this paper, we propose a low-rank sparse coding (LRSC) method that exploits local structure information among features in an image for the purpose of image-level classification. LRSC represents densely sampled SIFT descriptors, in a spatial neighborhood, collectively as low-rank, sparse linear combinations of code words. As such, it casts the feature coding problem as a low-rank matrix learning problem, which is different from previous methods that encode features independently. This LRSC has a number of attractive properties. (1) It encourages sparsity in feature codes, locality in codebook construction, and low-rankness for spatial consistency. (2) LRSC encodes local features jointly by considering their low-rank structure information, and is computationally attractive. We evaluate the LRSC by comparing its performance on a set of challenging benchmarks with that of 7 popular coding and other state-of-the-art methods. Our experiments show that by representing local features jointly, LRSC not only outperforms the state-of-the-art in classification accuracy but also improves the time complexity of methods that use a similar sparse linear representation model for feature coding.

  7. Low-Rank Sparse Coding for Image Classification

    KAUST Repository

    Zhang, Tianzhu

    2013-12-01

    In this paper, we propose a low-rank sparse coding (LRSC) method that exploits local structure information among features in an image for the purpose of image-level classification. LRSC represents densely sampled SIFT descriptors, in a spatial neighborhood, collectively as low-rank, sparse linear combinations of code words. As such, it casts the feature coding problem as a low-rank matrix learning problem, which is different from previous methods that encode features independently. This LRSC has a number of attractive properties. (1) It encourages sparsity in feature codes, locality in codebook construction, and low-rankness for spatial consistency. (2) LRSC encodes local features jointly by considering their low-rank structure information, and is computationally attractive. We evaluate the LRSC by comparing its performance on a set of challenging benchmarks with that of 7 popular coding and other state-of-the-art methods. Our experiments show that by representing local features jointly, LRSC not only outperforms the state-of-the-art in classification accuracy but also improves the time complexity of methods that use a similar sparse linear representation model for feature coding.

  8. E-LEARNING: CURRENT STATE, TRENDS AND FUTURE PROSPECTS

    Directory of Open Access Journals (Sweden)

    Г А Краснова

    2017-12-01

    Full Text Available The article is devoted to the main trends in the development of e-learning in formal and non-formal education in different countries. The article discusses the main quantitative and qualitative characteristics of the e-learning market. The authors identify the main reasons for the development of e-learning in higher education, and note that the demand for e-learning by various groups of users will push education authorities and educational institutions to develop different forms of e-learning and implement new business models for universities. Most universities in Europe and the United States have adopted, or will adopt, institutional strategies for the development of e-learning.

  9. The development of automaticity in short-term memory search: Item-response learning and category learning.

    Science.gov (United States)

    Cao, Rui; Nosofsky, Robert M; Shiffrin, Richard M

    2017-05-01

    In short-term-memory (STM)-search tasks, observers judge whether a test probe was present in a short list of study items. Here we investigated the long-term learning mechanisms that lead to the highly efficient STM-search performance observed under conditions of consistent-mapping (CM) training, in which targets and foils never switch roles across trials. In item-response learning, subjects learn long-term mappings between individual items and target versus foil responses. In category learning, subjects learn high-level codes corresponding to separate sets of items and learn to attach old versus new responses to these category codes. To distinguish between these 2 forms of learning, we tested subjects in categorized varied mapping (CV) conditions: There were 2 distinct categories of items, but the assignment of categories to target versus foil responses varied across trials. In cases involving arbitrary categories, CV performance closely resembled standard varied-mapping performance without categories and departed dramatically from CM performance, supporting the item-response-learning hypothesis. In cases involving prelearned categories, CV performance resembled CM performance, as long as there was sufficient practice or steps taken to reduce trial-to-trial category-switching costs. This pattern of results supports the category-coding hypothesis for sufficiently well-learned categories. Thus, item-response learning occurs rapidly and is used early in CM training; category learning is much slower but is eventually adopted and is used to increase the efficiency of search beyond that available from item-response learning. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  10. Status of SPACE Safety Analysis Code Development

    International Nuclear Information System (INIS)

    Lee, Dong Hyuk; Yang, Chang Keun; Kim, Se Yun; Ha, Sang Jun

    2009-01-01

    In 2006, the Korean the Korean nuclear industry started developing a thermal-hydraulic analysis code for safety analysis of PWR(Pressurized Water Reactor). The new code is named as SPACE(Safety and Performance Analysis Code for Nuclear Power Plant). The SPACE code can solve two-fluid, three-field governing equations in one dimensional or three dimensional geometry. The SPACE code has many component models required for modeling a PWR, such as reactor coolant pump, safety injection tank, etc. The programming language used in the new code is C++, for new generation of engineers who are more comfortable with C/C++ than old FORTRAN language. This paper describes general characteristics of SPACE code and current status of SPACE code development

  11. Role of Symbolic Coding and Rehearsal Processes in Observational Learning

    Science.gov (United States)

    Bandura, Albert; Jeffery, Robert W.

    1973-01-01

    Results were interpreted as supporting a social learning view of observational learning that emphasizes central processing of response information in the acquisition phase, and motor reproduction and incentive processes in the overt enactment of what has been learned. (Author)

  12. Supervised Transfer Sparse Coding

    KAUST Repository

    Al-Shedivat, Maruan

    2014-07-27

    A combination of the sparse coding and transfer learning techniques was shown to be accurate and robust in classification tasks where training and testing objects have a shared feature space but are sampled from different underlying distributions, i.e., belong to different domains. The key assumption in such case is that in spite of the domain disparity, samples from different domains share some common hidden factors. Previous methods often assumed that all the objects in the target domain are unlabeled, and thus the training set solely comprised objects from the source domain. However, in real world applications, the target domain often has some labeled objects, or one can always manually label a small number of them. In this paper, we explore such possibility and show how a small number of labeled data in the target domain can significantly leverage classification accuracy of the state-of-the-art transfer sparse coding methods. We further propose a unified framework named supervised transfer sparse coding (STSC) which simultaneously optimizes sparse representation, domain transfer and classification. Experimental results on three applications demonstrate that a little manual labeling and then learning the model in a supervised fashion can significantly improve classification accuracy.

  13. Does transcranial direct current stimulation affect the learning of a fine sequential hand motor skill with motor imagery?

    NARCIS (Netherlands)

    Sobierajewicz, Jagna; Jaskowski, Wojciech; van der Lubbe, Robert Henricus Johannes

    2017-01-01

    Learning a fine sequential hand motor skill, comparable to playing the piano or learning to type, improves not only due to physical practice, but also due to motor imagery. Previous studies revealed that transcranial direct current stimulation (tDCS) and motor imagery independently affect motor

  14. Navigation towards a goal position: from reactive to generalised learned control

    Energy Technology Data Exchange (ETDEWEB)

    Freire da Silva, Valdinei [Laboratorio de Tecnicas Inteligentes - LTI, Escola Politecnica da Universidade de Sao Paulo, Av. Prof. Luciano Gualberto, trav.3, n.158, Cidade Universitaria Sao Paulo (Brazil); Selvatici, Antonio Henrique [Universidade Nove de Julho, Rua Vergueiro, 235, Sao Paulo (Brazil); Reali Costa, Anna Helena, E-mail: valdinei.freire@gmail.com, E-mail: antoniohps@uninove.br, E-mail: anna.reali@poli.usp.br [Laboratorio de Tecnicas Inteligentes - LTI, Escola Politecnica da Universidade de Sao Paulo, Av. Prof. Luciano Gualberto, trav.3, n.158, Cidade Universitaria Sao Paulo (Brazil)

    2011-03-01

    The task of navigating to a target position in space is a fairly common task for a mobile robot. It is desirable that this task is performed even in previously unknown environments. One reactive architecture explored before addresses this challenge by defining a hand-coded coordination of primitive behaviours, encoded by the Potential Fields method. Our first approach to improve the performance of this architecture adds a learning step to autonomously find the best way to coordinate primitive behaviours with respect to an arbitrary performance criterion. Because of the limitations presented by the Potential Fields method, especially in relation to non-convex obstacles, we are investigating the use of Relational Reinforcement Learning as a method to not only learn to act in the current environment, but also to generalise prior knowledge to the current environment in order to achieve the goal more quickly in a non-convex structured environment. We show the results of our previous efforts in reaching goal positions along with our current research on generalised approaches.

  15. Navigation towards a goal position: from reactive to generalised learned control

    International Nuclear Information System (INIS)

    Freire da Silva, Valdinei; Selvatici, Antonio Henrique; Reali Costa, Anna Helena

    2011-01-01

    The task of navigating to a target position in space is a fairly common task for a mobile robot. It is desirable that this task is performed even in previously unknown environments. One reactive architecture explored before addresses this challenge by defining a hand-coded coordination of primitive behaviours, encoded by the Potential Fields method. Our first approach to improve the performance of this architecture adds a learning step to autonomously find the best way to coordinate primitive behaviours with respect to an arbitrary performance criterion. Because of the limitations presented by the Potential Fields method, especially in relation to non-convex obstacles, we are investigating the use of Relational Reinforcement Learning as a method to not only learn to act in the current environment, but also to generalise prior knowledge to the current environment in order to achieve the goal more quickly in a non-convex structured environment. We show the results of our previous efforts in reaching goal positions along with our current research on generalised approaches.
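
The primitive behaviours being coordinated come from the classic Potential Fields method: an attractive force pulls the robot toward the goal while each obstacle exerts a short-range repulsive force. A minimal sketch of that reactive baseline follows (gains, radii, and the scenario are illustrative; this is the hand-coded layer, not the relational reinforcement learning extension, and it inherits the local-minimum problem with non-convex obstacles noted in the abstract):

```python
import numpy as np

def potential_step(pos, goal, obstacles, k_att=1.0, k_rep=0.2, d0=1.0, step=0.05):
    """One gradient-descent step on the classic attractive/repulsive potential."""
    force = -k_att * (pos - goal)          # attractive: pulls toward the goal
    for obs in obstacles:
        diff = pos - obs
        d = np.linalg.norm(diff)
        if 0 < d < d0:                     # repulsive only within radius d0
            force += k_rep * (1.0 / d - 1.0 / d0) / d**3 * diff
    return pos + step * force

pos = np.array([0.0, 0.0])
goal = np.array([2.0, 0.0])
obstacles = [np.array([1.0, 0.6])]         # one obstacle just off the direct path
for _ in range(400):
    pos = potential_step(pos, goal, obstacles)
```

The learning step described above replaces the fixed gains (`k_att`, `k_rep` here) with a coordination learned against an arbitrary performance criterion.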

  16. Flaws in current human training protocols for spontaneous Brain-Computer Interfaces: lessons learned from instructional design

    Directory of Open Access Journals (Sweden)

    Fabien eLotte

    2013-09-01

    Full Text Available While recent research on Brain-Computer Interfaces (BCI) has highlighted their potential for many applications, they remain barely used outside laboratories. The main reason is their lack of robustness. Indeed, with current BCI, mental state recognition is usually slow and often incorrect. Spontaneous BCI (i.e., mental imagery-based BCI) often rely on mutual learning efforts by the user and the machine, with BCI users learning to produce stable EEG patterns (spontaneous BCI control being widely acknowledged as a skill) while the computer learns to automatically recognize these EEG patterns, using signal processing. Most research so far was focused on signal processing, mostly neglecting the human in the loop. However, how well the user masters the BCI skill is also a key element explaining BCI robustness. Indeed, if the user is not able to produce stable and distinct EEG patterns, then no signal processing algorithm would be able to recognize them. Unfortunately, despite the importance of BCI training protocols, they have been scarcely studied so far, and used mostly unchanged for years. In this paper, we advocate that current human training approaches for spontaneous BCI are most likely inappropriate. We notably study the instructional design literature in order to identify the key requirements and guidelines for a successful training procedure that promotes good and efficient skill learning. This literature study highlights that current spontaneous BCI user training procedures satisfy very few of these requirements and hence are likely to be suboptimal. We therefore identify the flaws in BCI training protocols according to instructional design principles, at several levels: in the instructions provided to the user, in the tasks he/she has to perform, and in the feedback provided. For each level, we propose new research directions that are theoretically expected to address some of these flaws and to help users learn the BCI skill more efficiently.

  17. Non-Protein Coding RNAs

    CERN Document Server

    Walter, Nils G; Batey, Robert T

    2009-01-01

    This book assembles chapters from experts in the Biophysics of RNA to provide a broadly accessible snapshot of the current status of this rapidly expanding field. The 2006 Nobel Prize in Physiology or Medicine was awarded to the discoverers of RNA interference, highlighting just one example of a large number of non-protein coding RNAs. Because non-protein coding RNAs outnumber protein coding genes in mammals and other higher eukaryotes, it is now thought that the complexity of organisms is correlated with the fraction of their genome that encodes non-protein coding RNAs. Essential biological processes as diverse as cell differentiation, suppression of infecting viruses and parasitic transposons, higher-level organization of eukaryotic chromosomes, and gene expression itself are found to largely be directed by non-protein coding RNAs. The biophysical study of these RNAs employs X-ray crystallography, NMR, ensemble and single molecule fluorescence spectroscopy, optical tweezers, cryo-electron microscopy, and ot...

  18. Deep learning with Python

    CERN Document Server

    Chollet, Francois

    2018-01-01

    DESCRIPTION Deep learning is applicable to a widening range of artificial intelligence problems, such as image classification, speech recognition, text classification, question answering, text-to-speech, and optical character recognition. Deep Learning with Python is structured around a series of practical code examples that illustrate each new concept introduced and demonstrate best practices. By the time you reach the end of this book, you will have become a Keras expert and will be able to apply deep learning in your own projects. KEY FEATURES • Practical code examples • In-depth introduction to Keras • Teaches the difference between Deep Learning and AI ABOUT THE TECHNOLOGY Deep learning is the technology behind photo tagging systems at Facebook and Google, self-driving cars, speech recognition systems on your smartphone, and much more. AUTHOR BIO Francois Chollet is the author of Keras, one of the most widely used libraries for deep learning in Python. He has been working with deep neural ...

  19. A System for English Vocabulary Acquisition Based on Code-Switching

    Science.gov (United States)

    Mazur, Michal; Karolczak, Krzysztof; Rzepka, Rafal; Araki, Kenji

    2016-01-01

    Vocabulary plays an important part in second language learning and there are many existing techniques to facilitate word acquisition. One of these methods is code-switching, or mixing the vocabulary of two languages in one sentence. In this paper the authors propose an experimental system for computer-assisted English vocabulary learning in…

  20. Variational full wave calculation of fast wave current drive in DIII-D using the ALCYON code

    International Nuclear Information System (INIS)

    Becoulet, A.; Moreau, D.

    1992-04-01

    Initial fast wave current drive simulations performed with the ALCYON code for the 60 MHz DIII-D experiment are presented. Two typical shots of the 1991 summer campaign were selected, with magnetic field intensities of 1 and 2 teslas respectively. The results for the wave electromagnetic field in the plasma chamber are displayed. They exhibit a strong enrichment of the poloidal mode number m-spectrum, which leads to the upshift of the parallel wavenumber, k∥, and to the wave absorption. The m-spectrum is bounded when the local poloidal wavenumber reaches the Alfven wavenumber, and the k∥ upshifts do not destroy the wave directionality. Linear estimations of the driven current are made. The current density profiles are found to be peaked, and we find that about 88 kA can be driven in the 1 tesla/1.7 keV phase with 1.7 MW coupled to the electrons. In the 2 tesla/3.4 keV case, 47 kA are driven with a total power of 1.5 MW, 44% of which is absorbed on the hydrogen minority through the second harmonic ion cyclotron resonance. The global efficiency is then 0.18 × 10^19 A m^-2 W^-1 if one considers only the effective power going to the electrons.
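The global efficiency quoted above appears to be the conventional normalized current-drive figure of merit; a sketch of that definition, using standard symbols (assumed here, not taken from the paper):

```latex
\eta_{CD} \;=\; \frac{\bar{n}_e \, R_0 \, I_{CD}}{P}
\qquad \left[\,10^{19}\ \mathrm{A\,m^{-2}\,W^{-1}}\,\right]
```

where \bar{n}_e is the line-averaged electron density (in units of 10^19 m^-3), R_0 the major radius, I_CD the driven current, and P the coupled power; the product of these units reproduces the 10^19 A m^-2 W^-1 quoted in the abstract.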

  1. Learning from a provisioning site: code of conduct compliance and behaviour of whale sharks in Oslob, Cebu, Philippines.

    Science.gov (United States)

    Schleimer, Anna; Araujo, Gonzalo; Penketh, Luke; Heath, Anna; McCoy, Emer; Labaja, Jessica; Lucey, Anna; Ponzo, Alessandro

    2015-01-01

    While shark-based tourism is a rapidly growing global industry, there is ongoing controversy about the effects of provisioning on the target species. This study investigated the effect of feeding on whale sharks (Rhincodon typus) at a provisioning site in Oslob, Cebu, in terms of arrival time, avoidance and feeding behaviour using photo-identification and focal follows. Additionally, compliance to the code of conduct in place was monitored to assess tourism pressure on the whale sharks. Newly identified sharks gradually arrived earlier to the provisioning site after their initial sighting, indicating that the animals learn to associate the site with food rewards. Whale sharks with a long resighting history showed anticipatory behaviour and were recorded at the site on average 5 min after the arrival of feeder boats. Results from a generalised linear mixed model indicated that animals with a longer resighting history were less likely to show avoidance behaviour to touches or boat contact. Similarly, sequential data on feeding behaviour was modelled using a generalised estimating equations approach, which suggested that experienced whale sharks were more likely to display vertical feeding behaviour. It was proposed that the continuous source of food provides a strong incentive for the modification of behaviours, i.e., learning, through conditioning. Whale sharks are large opportunistic filter feeders in a mainly oligotrophic environment, where the ability to use novel food sources by modifying their behaviour could be of great advantage. Non-compliance to the code of conduct in terms of minimum distance to the shark (2 m) increased from 79% in 2012 to 97% in 2014, suggesting a high tourism pressure on the whale sharks in Oslob. The long-term effects of the observed behavioural modifications along with the high tourism pressure remain unknown. However, management plans are traditionally based on the precautionary principle, which aims to take preventive actions even…

  2. Research progress on the roles of microRNAs in governing synaptic plasticity, learning and memory.

    Science.gov (United States)

    Wei, Chang-Wei; Luo, Ting; Zou, Shan-Shan; Wu, An-Shi

    2017-11-01

    The importance of non-coding RNA in biological processes has become apparent in recent years, and the mechanisms of its transcriptional regulation have also been identified. MicroRNAs (miRNAs) represent a class of small regulatory non-coding RNAs, approximately 22 nucleotides in length, that mediate gene silencing by identifying specific sequences in their target messenger RNAs (mRNAs). Many miRNAs are highly expressed in the central nervous system in a spatially and temporally controlled manner in normal physiology, as well as in certain pathological conditions. There is growing evidence that a considerable number of specific miRNAs play important roles in synaptic plasticity, learning and memory function. In addition, the dysfunction of these molecules may also contribute to the etiology of several neurodegenerative diseases. Here we provide an overview of the current literature, which supports the view that non-coding RNA-mediated regulation of gene function represents an important but underappreciated layer of epigenetic control that facilitates learning and memory. Copyright © 2017. Published by Elsevier Inc.

  3. Current Status and Prospects for E-learning in the Promotion of Distance Education in Bangladesh

    Directory of Open Access Journals (Sweden)

    Abu Sadeque Md. SELIM

    2006-01-01

    Full Text Available The issue of e-learning as an advanced system for training and educating mass people using information and communication technologies (ICTs) has received an increasing level of interest in recent years in most western countries. In spite of socio-economic constraints, ICTs are rapidly expanding in developing countries, thus offering a new scope for the use of e-learning in the promotion of distance education. In Bangladesh, e-learning was first introduced as early as the 1960s as a radio broadcast, followed by a pilot project, the School Broadcasting Program (SBP), in the 1980s, and then expanded by the establishment of the National Institute of Educational Media and Technology (NIEMT), which was later transformed into the Bangladesh Institute of Distance Education (BIDE) in 1985. Significant progress has been made since the establishment of the Bangladesh Open University (BOU) in 1992 as the first and only national distance learning university. Within a decade of its establishment, enrollment at BOU reached nearly 400,000 students, placing it among the mega-universities. BOU has been offering a variety of formal and non-formal academic programs, from certificate to Masters levels, using print, TV and radio broadcasts, audio-cassettes and face-to-face tutorials as the media for delivering its academic courses. Considering the rapid expansion of computers and the internet in Bangladesh after 1998, it is now an appropriate time to consider including interactive ICTs, i.e. e-learning, in delivering course materials of BOU or other institutes to promote distance education in Bangladesh. In this paper, we discuss the current situation and future prospects for e-learning in Bangladesh in light of the current trend of ICT expansion in the country.

  4. On Coding Non-Contiguous Letter Combinations

    Directory of Open Access Journals (Sweden)

    Frédéric Dandurand

    2011-06-01

    Full Text Available Starting from the hypothesis that printed word identification initially involves the parallel mapping of visual features onto location-specific letter identities, we analyze the type of information that would be involved in optimally mapping this location-specific orthographic code onto a location-invariant lexical code. We assume that some intermediate level of coding exists between individual letters and whole words, and that this involves the representation of letter combinations. We then investigate the nature of this intermediate level of coding given the constraints of optimality. This intermediate level of coding is expected to compress data while retaining as much information as possible about word identity. Information conveyed by letters is a function of how much they constrain word identity and how visible they are. Optimization of this coding is a combination of minimizing resources (using the most compact representations and maximizing information. We show that in a large proportion of cases, non-contiguous letter sequences contain more information than contiguous sequences, while at the same time requiring less precise coding. Moreover, we found that the best predictor of human performance in orthographic priming experiments was within-word ranking of conditional probabilities, rather than average conditional probabilities. We conclude that from an optimality perspective, readers learn to select certain contiguous and non-contiguous letter combinations as information that provides the best cue to word identity.
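The abstract's notion of how much a letter pair constrains word identity can be made concrete with a toy computation. The lexicon and the log-count measure below are illustrative assumptions, not the authors' actual corpus or metric:

```python
from itertools import combinations
from math import log2

def open_bigrams(word, contiguous=None):
    """Ordered letter pairs (i < j). contiguous=True keeps only adjacent
    pairs, False only non-adjacent pairs, None keeps all pairs."""
    pairs = []
    for i, j in combinations(range(len(word)), 2):
        if contiguous is True and j != i + 1:
            continue
        if contiguous is False and j == i + 1:
            continue
        pairs.append(word[i] + word[j])
    return pairs

def mean_information(lexicon, contiguous):
    """Average information (bits) a bigram of the given type carries about
    word identity: -log2(fraction of lexicon words containing the pair)."""
    infos = []
    for word in lexicon:
        for bg in set(open_bigrams(word, contiguous)):
            matches = sum(bg in open_bigrams(w) for w in lexicon)
            infos.append(-log2(matches / len(lexicon)))
    return sum(infos) / len(infos)

# Toy lexicon (an illustrative assumption, not the authors' corpus).
lexicon = ["cart", "card", "care", "core", "carp", "dart"]
print("contiguous:", mean_information(lexicon, contiguous=True))
print("non-contiguous:", mean_information(lexicon, contiguous=False))
```

Fewer matching words means a more informative pair, so comparing the two averages mirrors the contiguous versus non-contiguous analysis described in the abstract.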

  5. Atlas C++ Coding Standard Specification

    CERN Document Server

    Albrand, S; Barberis, D; Bosman, M; Jones, B; Stavrianakou, M; Arnault, C; Candlin, D; Candlin, R; Franck, E; Hansl-Kozanecka, Traudl; Malon, D; Qian, S; Quarrie, D; Schaffer, R D

    2001-01-01

    This document defines the ATLAS C++ coding standard, that should be adhered to when writing C++ code. It has been adapted from the original "PST Coding Standard" document (http://pst.cern.ch/HandBookWorkBook/Handbook/Programming/programming.html) CERN-UCO/1999/207. The "ATLAS standard" comprises modifications, further justification and examples for some of the rules in the original PST document. All changes were discussed in the ATLAS Offline Software Quality Control Group and feedback from the collaboration was taken into account in the "current" version.

  6. Current and future multimodal learning analytics data challenges

    DEFF Research Database (Denmark)

    Spikol, Daniel; Prieto, Luis P.; Rodriguez-Triana, M.J.

    2017-01-01

    Multimodal Learning Analytics (MMLA) captures, integrates and analyzes learning traces from different sources in order to obtain a more holistic understanding of the learning process, wherever it happens. MMLA leverages the increasingly widespread availability of diverse sensors, high-frequency data collection technologies and sophisticated machine learning and artificial intelligence techniques. The aim of this workshop is twofold: first, to expose participants to, and develop, different multimodal datasets that reflect how MMLA can bring new insights and opportunities to investigate complex learning processes and environments; second, to collaboratively identify a set of grand challenges for further MMLA research, built upon the foundations of previous workshops on the topic.

  7. Advancing Kohlberg through Codes: Using Professional Codes To Reach the Moral Reasoning Objective in Undergraduate Ethics Courses.

    Science.gov (United States)

    Whitehouse, Ginny; Ingram, Michael T.

    The development of moral reasoning as a key course objective in undergraduate communication ethics classes can be accomplished by the critical and deliberate introduction of professional codes of ethics and the internalization of values found in those codes. Notably, "fostering moral reasoning skills" and "surveying current ethical…

  8. NESTLE: A nodal kinetics code

    International Nuclear Information System (INIS)

    Al-Chalabi, R.M.; Turinsky, P.J.; Faure, F.-X.; Sarsour, H.N.; Engrand, P.R.

    1993-01-01

    The NESTLE nodal kinetics code has been developed for utilization as a stand-alone code for steady-state and transient reactor neutronic analysis and for incorporation into system transient codes, such as TRAC and RELAP. The latter is desirable to increase the simulation fidelity over that obtained from currently employed zero- and one-dimensional neutronic models and now feasible due to advances in computer performance and efficiency of nodal methods. As a stand-alone code, requirements are that it operate on a range of computing platforms from memory-limited personal computers (PCs) to supercomputers with vector processors. This paper summarizes the features of NESTLE that reflect the utilization and requirements just noted.

  9. Model-Driven Engineering of Machine Executable Code

    Science.gov (United States)

    Eichberg, Michael; Monperrus, Martin; Kloppenburg, Sven; Mezini, Mira

    Implementing static analyses of machine-level executable code is labor intensive and complex. We show how to leverage model-driven engineering to facilitate the design and implementation of programs doing static analyses. Further, we report on important lessons learned on the benefits and drawbacks while using the following technologies: using the Scala programming language as target of code generation, using XML-Schema to express a metamodel, and using XSLT to implement (a) transformations and (b) a lint like tool. Finally, we report on the use of Prolog for writing model transformations.

  10. Supervised dictionary learning for inferring concurrent brain networks.

    Science.gov (United States)

    Zhao, Shijie; Han, Junwei; Lv, Jinglei; Jiang, Xi; Hu, Xintao; Zhao, Yu; Ge, Bao; Guo, Lei; Liu, Tianming

    2015-10-01

    Task-based fMRI (tfMRI) has been widely used to explore functional brain networks via predefined stimulus paradigms in the fMRI scan. Traditionally, the general linear model (GLM) has been a dominant approach to detect task-evoked networks. However, GLM focuses on task-evoked or event-evoked brain responses and possibly ignores intrinsic brain functions. In comparison, dictionary learning and sparse coding methods have attracted much attention recently, and these methods have shown the promise of automatically and systematically decomposing fMRI signals into meaningful task-evoked and intrinsic concurrent networks. Nevertheless, two notable limitations of current data-driven dictionary learning methods are that prior knowledge of the task paradigm is not sufficiently utilized and that establishing correspondences among dictionary atoms in different brains has been challenging. In this paper, we propose a novel supervised dictionary learning and sparse coding method for inferring functional networks from tfMRI data, which combines the advantages of model-driven and data-driven methods. The basic idea is to fix the task stimulus curves as predefined model-driven dictionary atoms and only optimize the other portion of data-driven dictionary atoms. Application of this novel methodology on the publicly available human connectome project (HCP) tfMRI datasets has achieved promising results.
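A minimal sketch of the fixed-atom idea described above. The data sizes, boxcar regressors, ISTA coder, and synthetic signals are all illustrative assumptions; the paper's actual optimization and HCP data are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)
T, V = 200, 50            # time points, signals (synthetic stand-in for tfMRI)
K_fixed, K_free = 2, 6    # fixed task atoms, learnable atoms

# Hypothetical task stimulus curves (boxcar regressors), clamped as atoms.
task = np.zeros((T, K_fixed))
task[20:60, 0] = 1.0
task[100:140, 1] = 1.0

D_free = rng.standard_normal((T, K_free))
X = rng.standard_normal((T, V))

def sparse_code(D, X, lam=0.1, iters=50):
    """ISTA for min_A 0.5 * ||X - D @ A||_F^2 + lam * ||A||_1."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    A = np.zeros((D.shape[1], X.shape[1]))
    for _ in range(iters):
        G = A + D.T @ (X - D @ A) / L      # gradient step
        A = np.sign(G) * np.maximum(np.abs(G) - lam / L, 0.0)  # soft-threshold
    return A

for _ in range(10):                        # alternating minimization
    D = np.column_stack([task, D_free])
    A = sparse_code(D, X)
    # Dictionary update: least-squares refit of the free atoms only,
    # against the residual left after the fixed task atoms' contribution.
    resid = X - task @ A[:K_fixed]
    A_free = A[K_fixed:]
    gram = A_free @ A_free.T + 1e-6 * np.eye(K_free)
    D_free = resid @ A_free.T @ np.linalg.inv(gram)
    D_free /= np.maximum(np.linalg.norm(D_free, axis=0), 1e-12)

D = np.column_stack([task, D_free])
err = np.linalg.norm(X - D @ sparse_code(D, X))
print("reconstruction error:", float(err))
```

The point of the alternating loop is that the task regressors stay clamped while only the free atoms adapt, which is the hybrid model-driven/data-driven step the abstract describes.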

  11. Phonological coding during reading.

    Science.gov (United States)

    Leinenger, Mallorie

    2014-11-01

    The exact role that phonological coding (the recoding of written, orthographic information into a sound-based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early [prelexical] or that phonological codes come online late [postlexical]) are discussed. Next, the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eye-tracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model, Van Orden, 1987; dual-route model, e.g., M. Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001; parallel distributed processing model, Seidenberg & McClelland, 1989) are discussed. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  12. Secret Codes, Remainder Arithmetic, and Matrices.

    Science.gov (United States)

    Peck, Lyman C.

    This pamphlet is designed for use as enrichment material for able junior and senior high school students who are interested in mathematics. No more than a clear understanding of basic arithmetic is expected. Students are introduced to ideas from number theory and modern algebra by learning mathematical ways of coding and decoding secret messages.…

  13. Assessment of systems codes and their coupling with CFD codes in thermal–hydraulic applications to innovative reactors

    Energy Technology Data Exchange (ETDEWEB)

    Bandini, G., E-mail: giacomino.bandini@enea.it [Italian National Agency for New Technologies, Energy and Sustainable Economic Development (ENEA) (Italy); Polidori, M. [Italian National Agency for New Technologies, Energy and Sustainable Economic Development (ENEA) (Italy); Gerschenfeld, A.; Pialla, D.; Li, S. [Commissariat à l’Energie Atomique (CEA) (France); Ma, W.M.; Kudinov, P.; Jeltsov, M.; Kööp, K. [Royal Institute of Technology (KTH) (Sweden); Huber, K.; Cheng, X.; Bruzzese, C.; Class, A.G.; Prill, D.P. [Karlsruhe Institute of Technology (KIT) (Germany); Papukchiev, A. [Gesellschaft für Anlagen- und Reaktorsicherheit (GRS) (Germany); Geffray, C.; Macian-Juan, R. [Technische Universität München (TUM) (Germany); Maas, L. [Institut de Radioprotection et de Sûreté Nucléaire (IRSN) (France)

    2015-01-15

    Highlights: • The assessment of RELAP5, TRACE and CATHARE system codes on integral experiments is presented. • Code benchmark of CATHARE, DYN2B, and ATHLET on PHENIX natural circulation experiment. • Grid-free pool modelling based on proper orthogonal decomposition for system codes is explained. • The code coupling methodologies are explained. • The coupling of several CFD/system codes is tested against integral experiments. - Abstract: The THINS project of the 7th Framework EU Program on nuclear fission safety is devoted to the investigation of crosscutting thermal–hydraulic issues for innovative nuclear systems. A significant effort in the project has been dedicated to the qualification and validation of system codes currently employed in thermal–hydraulic transient analysis for nuclear reactors. This assessment is based either on already available experimental data or on the data provided by test campaigns carried out in the frame of THINS project activities. Data provided by the TALL and CIRCE facilities were used in the assessment of system codes for HLM reactors, while the PHENIX ultimate natural circulation test was used as reference for a benchmark exercise among system codes for sodium-cooled reactor applications. In addition, a promising grid-free pool model based on proper orthogonal decomposition is proposed to overcome the limits shown by the thermal–hydraulic system codes in the simulation of pool-type systems. Furthermore, multi-scale system-CFD solutions have been developed and validated for innovative nuclear system applications. For this purpose, data from the PHENIX experiments have been used, and data are provided by the tests conducted with the new configuration of the TALL-3D facility, which accommodates a 3D test section within the primary circuit. The TALL-3D measurements are currently used for the validation of the coupling between system and CFD codes.

  14. Deciphering the genetic regulatory code using an inverse error control coding framework.

    Energy Technology Data Exchange (ETDEWEB)

    Rintoul, Mark Daniel; May, Elebeoba Eni; Brown, William Michael; Johnston, Anna Marie; Watson, Jean-Paul

    2005-03-01

    We have found that developing a computational framework for reconstructing error control codes for engineered data and ultimately for deciphering genetic regulatory coding sequences is a challenging and uncharted area that will require advances in computational technology for exact solutions. Although exact solutions are desired, computational approaches that yield plausible solutions would be considered sufficient as a proof of concept to the feasibility of reverse engineering error control codes and the possibility of developing a quantitative model for understanding and engineering genetic regulation. Such evidence would help move the idea of reconstructing error control codes for engineered and biological systems from the high risk high payoff realm into the highly probable high payoff domain. Additionally this work will impact biological sensor development and the ability to model and ultimately develop defense mechanisms against bioagents that can be engineered to cause catastrophic damage. Understanding how biological organisms are able to communicate their genetic message efficiently in the presence of noise can improve our current communication protocols, a continuing research interest. Towards this end, project goals include: (1) Develop parameter estimation methods for n for block codes and for n, k, and m for convolutional codes. Use methods to determine error control (EC) code parameters for gene regulatory sequence. (2) Develop an evolutionary computing computational framework for near-optimal solutions to the algebraic code reconstruction problem. Method will be tested on engineered and biological sequences.
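The block-code parameters n and k mentioned in goal (1) can be illustrated with the classic (7,4) Hamming code, in which k = 4 message bits are protected by n − k = 3 parity bits and any single-bit error is correctable. This is a textbook example for intuition, not the project's reconstruction method:

```python
import numpy as np

# Systematic generator G = [I | P] and parity-check H = [P^T | I] for the
# (n = 7, k = 4) Hamming code: 4 message bits, 3 parity bits.
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def encode(msg):
    """Map a length-4 message to its length-7 codeword (mod-2 arithmetic)."""
    return msg @ G % 2

def correct(word):
    """Fix at most one flipped bit: the syndrome equals the column of H
    at the error position (an all-zero syndrome means no detected error)."""
    syndrome = H @ word % 2
    if syndrome.any():
        pos = next(i for i in range(7) if np.array_equal(H[:, i], syndrome))
        word = word.copy()
        word[pos] ^= 1
    return word

code = encode(np.array([1, 0, 1, 1]))
corrupted = code.copy()
corrupted[2] ^= 1                                # inject a single-bit error
print(np.array_equal(correct(corrupted), code))  # True
```

Estimating n and k from raw data, as the project proposes, amounts to recovering the structure of G and H without being told them in advance.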

  15. Building Standards and Codes for Energy Conservation

    Science.gov (United States)

    Gross, James G.; Pierlert, James H.

    1977-01-01

    Current activity intended to lead to energy conservation measures in building codes and standards is reviewed by members of the Office of Building Standards and Codes Services of the National Bureau of Standards. For journal availability see HE 508 931. (LBH)

  16. MCNP code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The MCNP code is the major Monte Carlo coupled neutron-photon transport research tool at the Los Alamos National Laboratory, and it represents the most extensive Monte Carlo development program in the United States which is available in the public domain. The present code is the direct descendent of the original Monte Carlo work of Fermi, von Neumann, and Ulam at Los Alamos in the 1940s. Development has continued uninterrupted since that time, and the current version of MCNP (or its predecessors) has always included state-of-the-art methods in the Monte Carlo simulation of radiation transport, basic cross section data, geometry capability, variance reduction, and estimation procedures. The authors of the present code have oriented its development toward general user application. The documentation, though extensive, is presented in a clear and simple manner with many examples, illustrations, and sample problems. In addition to providing the desired results, the output listings give a wealth of detailed information (some optional) concerning each stage of the calculation. The code system is continually updated to take advantage of advances in computer hardware and software, including interactive modes of operation, diagnostic interrupts and restarts, and a variety of graphical and video aids.

  17. Physics of codes

    International Nuclear Information System (INIS)

    Cooper, R.K.; Jones, M.E.

    1989-01-01

    The title given this paper is a bit presumptuous, since one can hardly expect to cover the physics incorporated into all the codes already written and currently being written. The authors focus on those codes which have been found to be particularly useful in the analysis and design of linacs. At that, the authors are a bit parochial and discuss primarily those codes used for the design of radio-frequency (rf) linacs, although the discussions of TRANSPORT and MARYLIE have little to do with the time structures of the beams being analyzed. The plan of this paper is first to describe rather simply the concepts of emittance and brightness, then to describe briefly each of the codes TRANSPORT, PARMTEQ, TBCI, MARYLIE, and ISIS, indicating what physics is and is not included in each of them. It is expected that the vast majority of what is covered will apply equally well to protons and electrons (and other particles). This material is intended to be tutorial in nature and can in no way be expected to be exhaustive. 31 references, 4 figures

  18. Writing robust C++ code for critical applications

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    **C++** is one of the most **complex**, expressive and powerful languages out there. However, its complexity makes it hard to write **robust** code. When using C++ to code **critical** applications, ensuring **reliability** is one of the key topics. Testing, debugging and profiling are all a major part of this kind of work. In the BE department we use C++ to write a big part of the controls system for beam operation, which implies putting a big focus on system stability and ensuring smooth operation. This talk will try to: - Highlight potential problems when writing C++ code, giving guidelines on writing defensive code that could have avoided such issues - Explain how to avoid common pitfalls (both in writing C++ code and at the debugging & profiling phase) - Showcase some tools and tricks useful to C++ development The attendees' proficiency in C++ should not be a concern. Anyone is free to join, even people that do not know C++, if only to learn the pitfalls a language may have. This may benefit f...

  19. Optimal codes as Tanner codes with cyclic component codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Pinero, Fernando; Zeng, Peng

    2014-01-01

    In this article we study a class of graph codes with cyclic code component codes as affine variety codes. Within this class of Tanner codes we find some optimal binary codes. We use a particular subgraph of the point-line incidence plane of A(2,q) as the Tanner graph, and we are able to describe ...

  20. What to do with a Dead Research Code

    Science.gov (United States)

    Nemiroff, Robert J.

    2016-01-01

    The project has ended -- should all of the computer codes that enabled the project be deleted? No. Like research papers, research codes typically carry valuable information past project end dates. Several possible end states for the life of a research code are reviewed. Historically, codes are typically left dormant on an increasingly obscure local disk directory until forgotten. These codes will likely become any or all of: lost, impossible to compile and run, and difficult to decipher, and they will likely be deleted when the code's proprietor moves on or dies. It is argued here, though, that it would be better for both code authors and astronomy generally if project codes were archived after use in some way. Archiving is advantageous for code authors because archived codes might increase the author's ADS citable publications, while astronomy as a science gains transparency and reproducibility. Paper-specific codes should be included in the publication of the journal papers they support, just like figures and tables. General codes that support multiple papers, possibly written by multiple authors, including their supporting websites, should be registered with a code registry such as the Astrophysics Source Code Library (ASCL). Codes developed on GitHub can be archived with a third-party service such as, currently, BackHub. An important code version might be uploaded to a web archiving service like, currently, Zenodo or Figshare, so that this version receives a Digital Object Identifier (DOI), enabling it to be found at a stable address into the future. Similar archiving services that are not DOI-dependent include perma.cc and the Internet Archive Wayback Machine at archive.org. Perhaps most simply, copies of important codes with lasting value might be kept on a cloud service like, for example, Google Drive, while activating Google's Inactive Account Manager.

  1. Current and proposed revisions, changes, and modifications to American codes and standards to address packaging, handling, and transportation of radioactive materials and how they relate to comparable international regulations

    International Nuclear Information System (INIS)

    Borter, W.H.; Froehlich, C.H.

    2004-01-01

    This paper addresses current and proposed revisions, additions, and modifications to the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code (BPVC) (i.e., ''the Code'') Section III, Division 3 and American National Standards Institute (ANSI)/ASME N14.6. It provides insight into the ongoing processes of the associated committees and highlights important revisions, changes, and modifications to this Code and Standard. The ASME Code has developed and issued Division 3 to address items associated with the transportation and storage of radioactive materials. It currently addresses only ''General Requirements'' in Subsection WA and ''Class TP (Type B) Containments'' (Transportation Packages) in Subsection WB, but is in the process of adding a new Subsection WC to address ''Class SC'' (Storage Containments). ANSI/ASME Standard N14.6 interacts with components constructed to Division 3 by addressing special lifting devices for radioactive material shipping containers. This Standard is in the process of a complete re-write. This Code and Standard can be classified as ''dynamic'' in that their committees meet at least four times a year to evaluate proposed modifications and additions that reflect current safety practices in the nuclear industry. These evaluations include the possible addition of new materials, fabrication processes, examination methods, and testing requirements. An overview of this ongoing process is presented in this paper, along with highlights of the more important proposed revisions, changes, and modifications and how they relate to United States (US) and international regulations and guidance such as International Atomic Energy Agency (IAEA) Requirement No. TS-R-1.

  2. Tensor Dictionary Learning for Positive Definite Matrices.

    Science.gov (United States)

    Sivalingam, Ravishankar; Boley, Daniel; Morellas, Vassilios; Papanikolopoulos, Nikolaos

    2015-11-01

Sparse models have proven to be extremely successful in image processing and computer vision. However, a majority of the effort has been focused on sparse representation of vectors and low-rank models for general matrices. The success of sparse modeling, along with the popularity of region covariances, has inspired the development of sparse coding approaches for these positive definite descriptors. While in earlier work the dictionary was formed from all, or a random subset of, the training signals, it is clearly advantageous to learn a concise dictionary from the entire training set. In this paper, we propose a novel approach for dictionary learning over positive definite matrices. The dictionary is learned by alternating minimization between sparse coding and dictionary update stages, and different atom update methods are described. A discriminative version of the dictionary learning approach is also proposed, which simultaneously learns dictionaries for different classes in classification or clustering. Experimental results demonstrate the advantage of learning dictionaries from data from both reconstruction and classification viewpoints. Finally, a software library is presented comprising C++ binaries for all the positive definite sparse coding and dictionary learning approaches presented here.
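The alternating minimization described above can be sketched in a toy setting. The following Python example is an illustration only, not the paper's method: it uses plain vectors with 1-sparse codes rather than positive definite tensor descriptors, and all function names are hypothetical. It alternates a sparse coding stage with a dictionary update stage, initializing atoms from training signals as in the earlier work mentioned:

```python
import math

def normalize(v):
    n = math.sqrt(sum(vi * vi for vi in v)) or 1.0
    return [vi / n for vi in v]

def sparse_code_1(x, D):
    """1-sparse coding: pick the unit-norm atom with the largest |<x, d>|."""
    scores = [sum(xi * di for xi, di in zip(x, d)) for d in D]
    k = max(range(len(D)), key=lambda j: abs(scores[j]))
    return k, scores[k]                      # atom index and coefficient

def learn_dictionary(X, n_atoms, n_iter=10):
    D = [normalize(x) for x in X[:n_atoms]]  # init atoms from training signals
    for _ in range(n_iter):
        codes = [sparse_code_1(x, D) for x in X]   # sparse coding stage
        for j in range(n_atoms):                   # dictionary update stage:
            acc = [0.0] * len(X[0])                # coefficient-weighted mean
            for x, (k, a) in zip(X, codes):        # of the signals this atom
                if k == j:                         # currently reconstructs
                    acc = [ai + a * xi for ai, xi in zip(acc, x)]
            if any(acc):
                D[j] = normalize(acc)
    return D

# two clusters of collinear signals; each learned atom recovers one direction
X = [[2.0, 0.0], [0.0, 1.0], [3.0, 0.0], [0.0, 4.0]]
D = learn_dictionary(X, n_atoms=2)
k, a = sparse_code_1(X[3], D)
print([round(a * di, 6) for di in D[k]])   # [0.0, 4.0] — exact reconstruction
```

With data concentrated along a few directions, each signal ends up represented exactly by one scaled atom, which is the degenerate 1-sparse case of the coding/update alternation.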

  3. Current Capability of Atomic Structure Theory

    International Nuclear Information System (INIS)

    Kim, Yong Ki

    1993-01-01

Current capability of atomic structure theory is reviewed, and the advantages, disadvantages and major features of popular atomic structure codes are described. Comparisons between theoretical and experimental data on transition energies and lifetimes of excited levels are presented to illustrate the current capability of atomic structure codes.

  4. Developing improved MD codes for understanding processive cellulases

    International Nuclear Information System (INIS)

Crowley, M F; Nimlos, M R; Himmel, M E; Uberbacher, E C; Brooks, C L III; Walker, R C

    2008-01-01

    The mechanism of action of cellulose-degrading enzymes is illuminated through a multidisciplinary collaboration that uses molecular dynamics (MD) simulations and expands the capabilities of MD codes to allow simulations of enzymes and substrates on petascale computational facilities. There is a class of glycoside hydrolase enzymes called cellulases that are thought to decrystallize and processively depolymerize cellulose using biochemical processes that are largely not understood. Understanding the mechanisms involved and improving the efficiency of this hydrolysis process through computational models and protein engineering presents a compelling grand challenge. A detailed understanding of cellulose structure, dynamics and enzyme function at the molecular level is required to direct protein engineers to the right modifications or to understand if natural thermodynamic or kinetic limits are in play. Much can be learned about processivity by conducting carefully designed molecular dynamics (MD) simulations of the binding and catalytic domains of cellulases with various substrate configurations, solvation models and thermodynamic protocols. Most of these numerical experiments, however, will require significant modification of existing code and algorithms in order to efficiently use current (terascale) and future (petascale) hardware to the degree of parallelism necessary to simulate a system of the size proposed here. This work will develop MD codes that can efficiently use terascale and petascale systems, not just for simple classical MD simulations, but also for more advanced methods, including umbrella sampling with complex restraints and reaction coordinates, transition path sampling, steered molecular dynamics, and quantum mechanical/molecular mechanical simulations of systems the size of cellulose degrading enzymes acting on cellulose

  5. The Current Mental State of School Students in Online Learning Conditions

    Directory of Open Access Journals (Sweden)

    Kovalevskaya E.V.,

    2015-08-01

Full Text Available This article discusses the results of a study of the current mental state of high school students who are active subjects of career self-determination in an interactive learning setting. Four groups of interactive training methods are distinguished: psychological training, art therapy, cognitive training, and game training. The main task solved by the researcher in a formative experiment with each of these methods is to establish significant differences in well-being, activity, and mood as indicators of the students' current mental state in the classroom. We found that the most significant improvements in current mental state take place when art therapy and game training are used, so these techniques should be used in groups of students with low motivation to work, as well as in an adverse psychological climate. The improvement of the current mental state after psychological training was less significant, because this method brings up and seeks solutions to the most important intrapersonal issues and requires deeper reflection.

  6. SCDAP/RELAP5 code development and assessment

    International Nuclear Information System (INIS)

    Allison, C.M.; Hohorst, J.K.

    1996-01-01

    The SCDAP/RELAP5 computer code is designed to describe the overall reactor coolant system thermal-hydraulic response, core damage progression, and fission product release during severe accidents. The code is being developed at the Idaho National Engineering Laboratory under the primary sponsorship of the Office of Nuclear Regulatory Research of the U.S. Nuclear Regulatory Commission. The current version of the code is SCDAP/RELAP5/MOD3.1e. Although MOD3.1e contains a number of significant improvements since the initial version of MOD3.1 was released, new models to treat the behavior of the fuel and cladding during reflood have had the most dramatic impact on the code's calculations. This paper provides a brief description of the new reflood models, presents highlights of the assessment of the current version of MOD3.1, and discusses future SCDAP/RELAP5/MOD3.2 model development activities

  7. A restructuring of TF package for MIDAS computer code

    International Nuclear Information System (INIS)

    Park, S. H.; Song, Y. M.; Kim, D. H.

    2002-01-01

The TF package, which defines interpolation and extrapolation conditions through user-defined tables, has been restructured in the MIDAS computer code. To do this, the data-transfer methods of the current MELCOR code were modified and adopted into the TF package. The data structure of the current MELCOR code, written in FORTRAN77, makes the meaning of variables hard to grasp and wastes memory. New features of FORTRAN90 make it possible to allocate storage dynamically and to use user-defined data types, which leads to efficient memory treatment and an easily understood code. The restructuring of the TF package addressed in this paper covers module development and subroutine modification, and treats both MELGEN, which generates the restart file, and MELCOR, which performs the calculation. The restructured code was validated by comparing its results with those from the existing code, and the results were confirmed to be the same. This suggests that a similar approach could be extended to the entire code package. Code restructuring is expected to accelerate the domestication of the code, thanks to a direct understanding of each variable and easy implementation of modified or newly developed models.

  8. Distributed Learning, Recognition, and Prediction by ART and ARTMAP Neural Networks.

    Science.gov (United States)

    Carpenter, Gail A.

    1997-11-01

    A class of adaptive resonance theory (ART) models for learning, recognition, and prediction with arbitrarily distributed code representations is introduced. Distributed ART neural networks combine the stable fast learning capabilities of winner-take-all ART systems with the noise tolerance and code compression capabilities of multilayer perceptrons. With a winner-take-all code, the unsupervised model dART reduces to fuzzy ART and the supervised model dARTMAP reduces to fuzzy ARTMAP. With a distributed code, these networks automatically apportion learned changes according to the degree of activation of each coding node, which permits fast as well as slow learning without catastrophic forgetting. Distributed ART models replace the traditional neural network path weight with a dynamic weight equal to the rectified difference between coding node activation and an adaptive threshold. Thresholds increase monotonically during learning according to a principle of atrophy due to disuse. However, monotonic change at the synaptic level manifests itself as bidirectional change at the dynamic level, where the result of adaptation resembles long-term potentiation (LTP) for single-pulse or low frequency test inputs but can resemble long-term depression (LTD) for higher frequency test inputs. This paradoxical behavior is traced to dual computational properties of phasic and tonic coding signal components. A parallel distributed match-reset-search process also helps stabilize memory. Without the match-reset-search system, dART becomes a type of distributed competitive learning network.

  9. Speech and audio processing for coding, enhancement and recognition

    CERN Document Server

    Togneri, Roberto; Narasimha, Madihally

    2015-01-01

This book describes the basic principles underlying the generation, coding, transmission and enhancement of speech and audio signals, including advanced statistical and machine learning techniques for speech and speaker recognition, with an overview of the key innovations in these areas. Key research undertaken in speech coding, speech enhancement, speech recognition, emotion recognition and speaker diarization is also presented, along with recent advances and new paradigms in these areas. The book offers readers a single-source reference on the significant applications of speech and audio processing to speech coding, speech enhancement and speech/speaker recognition; enables readers involved in algorithm development and implementation issues for speech coding to understand the historical development and future challenges in speech coding research; and discusses speech coding methods yielding bit-streams that are multi-rate and scalable for Voice-over-IP (VoIP) networks; …

  10. Visual search asymmetries within color-coded and intensity-coded displays.

    Science.gov (United States)

    Yamani, Yusuke; McCarley, Jason S

    2010-06-01

    Color and intensity coding provide perceptual cues to segregate categories of objects within a visual display, allowing operators to search more efficiently for needed information. Even within a perceptually distinct subset of display elements, however, it may often be useful to prioritize items representing urgent or task-critical information. The design of symbology to produce search asymmetries (Treisman & Souther, 1985) offers a potential technique for doing this, but it is not obvious from existing models of search that an asymmetry observed in the absence of extraneous visual stimuli will persist within a complex color- or intensity-coded display. To address this issue, in the current study we measured the strength of a visual search asymmetry within displays containing color- or intensity-coded extraneous items. The asymmetry persisted strongly in the presence of extraneous items that were drawn in a different color (Experiment 1) or a lower contrast (Experiment 2) than the search-relevant items, with the targets favored by the search asymmetry producing highly efficient search. The asymmetry was attenuated but not eliminated when extraneous items were drawn in a higher contrast than search-relevant items (Experiment 3). Results imply that the coding of symbology to exploit visual search asymmetries can facilitate visual search for high-priority items even within color- or intensity-coded displays. PsycINFO Database Record (c) 2010 APA, all rights reserved.

  11. Tokamak Simulation Code modeling of NSTX

    International Nuclear Information System (INIS)

    Jardin, S.C.; Kaye, S.; Menard, J.; Kessel, C.; Glasser, A.H.

    2000-01-01

    The Tokamak Simulation Code [TSC] is widely used for the design of new axisymmetric toroidal experiments. In particular, TSC was used extensively in the design of the National Spherical Torus eXperiment [NSTX]. The authors have now benchmarked TSC with initial NSTX results and find excellent agreement for plasma and vessel currents and magnetic flux loops when the experimental coil currents are used in the simulations. TSC has also been coupled with a ballooning stability code and with DCON to provide stability predictions for NSTX operation. TSC has also been used to model initial CHI experiments where a large poloidal voltage is applied to the NSTX vacuum vessel, causing a force-free current to appear in the plasma. This is a phenomenon that is similar to the plasma halo current that sometimes develops during a plasma disruption

  12. VOA: a 2-d plasma physics code

    International Nuclear Information System (INIS)

    Eltgroth, P.G.

    1975-12-01

    A 2-dimensional relativistic plasma physics code was written and tested. The non-thermal components of the particle distribution functions are represented by expansion into moments in momentum space. These moments are computed directly from numerical equations. Currently three species are included - electrons, ions and ''beam electrons''. The computer code runs on either the 7600 or STAR machines at LLL. Both the physics and the operation of the code are discussed

  13. Country Report on Building Energy Codes in China

    Energy Technology Data Exchange (ETDEWEB)

    Shui, Bin; Evans, Meredydd; Lin, H.; Jiang, Wei; Liu, Bing; Song, Bo; Somasundaram, Sriram

    2009-04-15

This report is part of a series of reports on building energy efficiency codes in countries associated with the Asian Pacific Partnership (APP) - Australia, South Korea, Japan, China, India, and the United States of America (U.S.). This report gives an overview of the development of building energy codes in China, including national energy policies related to building energy codes, the history of building energy codes, and recent national projects and activities to promote building energy codes. The report also provides a review of current building energy codes (such as building envelope and HVAC) for commercial and residential buildings in China.

  14. Country Report on Building Energy Codes in Japan

    Energy Technology Data Exchange (ETDEWEB)

    Evans, Meredydd; Shui, Bin; Takagi, T.

    2009-04-15

This report is part of a series of reports on building energy efficiency codes in countries associated with the Asian Pacific Partnership (APP) - Australia, South Korea, Japan, China, India, and the United States of America (U.S.). This report gives an overview of the development of building energy codes in Japan, including national energy policies related to building energy codes, the history of building energy codes, and recent national projects and activities to promote building energy codes. The report also provides a review of current building energy codes (such as building envelope, HVAC, and lighting) for commercial and residential buildings in Japan.

  15. Country Report on Building Energy Codes in Australia

    Energy Technology Data Exchange (ETDEWEB)

    Shui, Bin; Evans, Meredydd; Somasundaram, Sriram

    2009-04-02

This report is part of a series of reports on building energy efficiency codes in countries associated with the Asian Pacific Partnership (APP) - Australia, South Korea, Japan, China, India, and the United States of America (U.S.). This report gives an overview of the development of building energy codes in Australia, including national energy policies related to building energy codes, the history of building energy codes, and recent national projects and activities to promote building energy codes. The report also provides a review of current building energy codes (such as building envelope, HVAC, and lighting) for commercial and residential buildings in Australia.

  16. Country Report on Building Energy Codes in Canada

    Energy Technology Data Exchange (ETDEWEB)

    Shui, Bin; Evans, Meredydd

    2009-04-06

This report is part of a series of reports on building energy efficiency codes in countries associated with the Asian Pacific Partnership (APP) - Australia, South Korea, Japan, China, India, and the United States of America. This report gives an overview of the development of building energy codes in Canada, including national energy policies related to building energy codes, the history of building energy codes, and recent national projects and activities to promote building energy codes. The report also provides a review of current building energy codes (such as building envelope, HVAC, lighting, and water heating) for commercial and residential buildings in Canada.

  17. Tokamak plasma power balance calculation code (TPC code) outline and operation manual

    International Nuclear Information System (INIS)

    Fujieda, Hirobumi; Murakami, Yoshiki; Sugihara, Masayoshi.

    1992-11-01

This report is a detailed description of the TPC code, which calculates the power balance of a tokamak plasma according to the ITER guidelines. The TPC code runs on a personal computer (Macintosh or J-3100/IBM-PC). Using input data such as the plasma shape, toroidal magnetic field, plasma current, electron temperature, electron density, impurities and heating power, the TPC code can determine the operation point of the fusion reactor (the ion temperature is assumed to be equal to the electron temperature). The supplied flux (volt·seconds) and burn time are also estimated from coil design parameters. The calculated energy confinement time is compared with various L-mode scaling laws, and the confinement enhancement factor (H-factor) is evaluated. The divertor heat load is predicted using simple scaling models (constant-χ, Bohm-type-χ and JT-60U empirical scaling models). Frequently used data can be stored in a 'device file' and used as the default values. The TPC code can generate 2-D mesh data, and the POPCON plot is drawn by a contour-line plotting program (CONPLT). An operation manual for the CONPLT code is also included. (author)

  18. Learning dictionaries of sparse codes of 3D movements of body joints for real-time human activity understanding.

    Science.gov (United States)

    Qi, Jin; Yang, Zhiyong

    2014-01-01

Real-time human activity recognition is essential for human-robot interactions for assisted healthy independent living. Most previous work in this area has been performed on traditional two-dimensional (2D) videos, using both global and local methods. Since 2D videos are sensitive to changes of lighting condition, view angle, and scale, researchers have begun in recent years to explore applications of 3D information in human activity understanding. Unfortunately, features that work well on 2D videos usually don't perform well on 3D videos, and there is no consensus on what 3D features should be used. Here we propose a model of human activity recognition based on 3D movements of body joints. Our method has three steps: learning dictionaries of sparse codes of 3D movements of joints, sparse coding, and classification. In the first step, space-time volumes of 3D movements of body joints are obtained via dense sampling, and independent component analysis is then performed to construct a dictionary of sparse codes for each activity. In the second step, the space-time volumes are projected onto the dictionaries and a set of sparse histograms of the projection coefficients is constructed as feature representations of the activities. Finally, the sparse histograms are used as inputs to a support vector machine to recognize human activities. We tested this model on three databases of human activities and found that it outperforms the state-of-the-art algorithms. Thus, this model can be used for real-time human activity recognition in many applications.

  19. Learning dictionaries of sparse codes of 3D movements of body joints for real-time human activity understanding.

    Directory of Open Access Journals (Sweden)

    Jin Qi

Full Text Available Real-time human activity recognition is essential for human-robot interactions for assisted healthy independent living. Most previous work in this area has been performed on traditional two-dimensional (2D) videos, using both global and local methods. Since 2D videos are sensitive to changes of lighting condition, view angle, and scale, researchers have begun in recent years to explore applications of 3D information in human activity understanding. Unfortunately, features that work well on 2D videos usually don't perform well on 3D videos, and there is no consensus on what 3D features should be used. Here we propose a model of human activity recognition based on 3D movements of body joints. Our method has three steps: learning dictionaries of sparse codes of 3D movements of joints, sparse coding, and classification. In the first step, space-time volumes of 3D movements of body joints are obtained via dense sampling, and independent component analysis is then performed to construct a dictionary of sparse codes for each activity. In the second step, the space-time volumes are projected onto the dictionaries and a set of sparse histograms of the projection coefficients is constructed as feature representations of the activities. Finally, the sparse histograms are used as inputs to a support vector machine to recognize human activities. We tested this model on three databases of human activities and found that it outperforms the state-of-the-art algorithms. Thus, this model can be used for real-time human activity recognition in many applications.

  20. Bi-level image compression with tree coding

    DEFF Research Database (Denmark)

    Martins, Bo; Forchhammer, Søren

    1996-01-01

    Presently, tree coders are the best bi-level image coders. The current ISO standard, JBIG, is a good example. By organising code length calculations properly a vast number of possible models (trees) can be investigated within reasonable time prior to generating code. Three general-purpose coders...... are constructed by this principle. A multi-pass free tree coding scheme produces superior compression results for all test images. A multi-pass fast free template coding scheme produces much better results than JBIG for difficult images, such as halftonings. Rissanen's algorithm `Context' is presented in a new...

  1. MIDAS/PK code development using point kinetics model

    International Nuclear Information System (INIS)

    Song, Y. M.; Park, S. H.

    1999-01-01

In this study, the MIDAS/PK code has been developed for analyzing ATWS (Anticipated Transients Without Scram) events, which can be severe accident initiators. MIDAS is an integrated computer code based on the MELCOR code, developed by the Korea Atomic Energy Research Institute to support severe accident risk reduction strategies. The Chexal-Layman correlation in the current MELCOR, which was developed under BWR conditions, appears to be inappropriate for a PWR. To provide ATWS analysis capability to the MIDAS code, a point kinetics module, PKINETIC, was first developed as a stand-alone code whose reference model was selected from current accident analysis codes. In the next step, the MIDAS/PK code was developed by coupling PKINETIC with the MIDAS code, inter-connecting several thermal-hydraulic parameters between the two codes. Since the major concern in ATWS analysis is the primary peak pressure during the first few minutes of the accident, the peak pressures from the PKINETIC module and MIDAS/PK are compared with RETRAN calculations, showing good agreement between them. The MIDAS/PK code is considered valuable for deterministic analysis of the plant response during ATWS, especially for the early domestic Westinghouse plants which rely on operator procedures instead of an AMSAC (ATWS Mitigating System Actuation Circuitry) against ATWS. This ATWS analysis capability is also important from the viewpoint of accident management and mitigation.
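The point kinetics model underlying a module such as PKINETIC can be illustrated with a one-delayed-group toy. The sketch below is an assumption for illustration only (representative PWR-like constants, explicit Euler integration), not the actual PKINETIC reference model:

```python
def point_kinetics(rho, t_end, dt=1e-4, beta=0.0065, Lam=2e-5, lam=0.08):
    """One-delayed-group point kinetics, integrated with explicit Euler:
         dn/dt = (rho - beta)/Lam * n + lam * c
         dc/dt = beta/Lam * n - lam * c
    Starts from the rho = 0 equilibrium (n = 1, c = beta/(Lam*lam))
    and returns the relative power n at t_end."""
    n, c = 1.0, beta / (Lam * lam)
    for _ in range(int(round(t_end / dt))):
        dn = ((rho - beta) / Lam) * n + lam * c
        dc = (beta / Lam) * n - lam * c
        n, c = n + dt * dn, c + dt * dc
    return n

print(point_kinetics(0.0, 1.0))      # zero reactivity: power stays at 1
print(point_kinetics(0.00065, 1.0))  # 0.1 $ step: prompt jump, then slow rise
```

With zero reactivity the solution sits at equilibrium; a small positive step produces the familiar prompt jump followed by a slow rise governed by the delayed-neutron precursors.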

  2. Learning design: reflections upon the current landscape

    Directory of Open Access Journals (Sweden)

    Brock Craft

    2012-08-01

Full Text Available The mounting wealth of open and readily available information and the accelerated evolution of social, mobile and creative technologies call for a re-conceptualisation of the role of educators: from providers of knowledge to designers of learning. This call is reverberated by the rising trend of research in learning design (LD). Addressing this, the Art and Science of Learning Design workshop brought together leading voices in the field, and provided a forum for discussing its key issues. It focused on three major themes: (1) practices, methods and methodologies, (2) tools and resources and (3) theoretical frameworks. This paper proposes a definition of LD, reviews the main contributions from the workshop, and suggests some challenges for future research.

  3. The Current Perspectives, Theories and Practices of Mobile Learning

    Science.gov (United States)

    Keskin, Nilgun Ozdamar; Metcalf, David

    2011-01-01

    Mobile learning (m-learning) is a highly popular multidisciplinary study field around the world. It has attracted a great deal of attention from researchers in different disciplines who have realized the potential to apply mobile technologies to enhance learning. Thus, mobile learning has been defined differently by different people. This study is…

  4. Tree Coding of Bilevel Images

    DEFF Research Database (Denmark)

    Martins, Bo; Forchhammer, Søren

    1998-01-01

    Presently, sequential tree coders are the best general purpose bilevel image coders and the best coders of halftoned images. The current ISO standard, Joint Bilevel Image Experts Group (JBIG), is a good example. A sequential tree coder encodes the data by feeding estimates of conditional...... is one order of magnitude slower than JBIG, obtains excellent and highly robust compression performance. A multipass free tree coding scheme produces superior compression results for all test images. A multipass free template coding scheme produces significantly better results than JBIG for difficult...... images such as halftones. By utilizing randomized subsampling in the template selection, the speed becomes acceptable for practical image coding...

  5. Alternatively Constrained Dictionary Learning For Image Superresolution.

    Science.gov (United States)

    Lu, Xiaoqiang; Yuan, Yuan; Yan, Pingkun

    2014-03-01

Dictionaries are crucial in sparse coding-based algorithms for image superresolution. Sparse coding is a typical unsupervised learning method for studying the relationship between the patches of high- and low-resolution images. However, most sparse coding methods for image superresolution fail to simultaneously consider the geometrical structure of the dictionary and the corresponding coefficients, which may result in noticeable superresolution reconstruction artifacts. In other words, when a low-resolution image and its corresponding high-resolution image are represented in their feature spaces, the two sets of dictionaries and the obtained coefficients have intrinsic links, which have not yet been well studied. Motivated by developments in nonlocal self-similarity and manifold learning, a novel sparse coding method is reported that preserves the geometrical structure of the dictionary and the sparse coefficients of the data. Moreover, the proposed method can preserve the incoherence of dictionary entries and provides the sparse coefficients and the learned dictionary from a new perspective, with both reconstruction and discrimination properties to enhance the learning performance. Furthermore, to utilize the proposed model more effectively for single-image superresolution, this paper also proposes a novel dictionary-pair learning method, named two-stage dictionary training. Extensive experiments are carried out on a large set of images in comparison with other popular algorithms for the same purpose, and the results clearly demonstrate the effectiveness of the proposed sparse representation model and the corresponding dictionary learning algorithm.

  6. Efficient convolutional sparse coding

    Science.gov (United States)

    Wohlberg, Brendt

    2017-06-20

Computationally efficient algorithms may be applied for fast dictionary learning solving the convolutional sparse coding problem in the Fourier domain. More specifically, efficient convolutional sparse coding may be derived within an alternating direction method of multipliers (ADMM) framework that utilizes fast Fourier transforms (FFT) to solve the main linear system in the frequency domain. Such algorithms may enable a significant reduction in computational cost over conventional approaches by implementing a linear solver for the most critical and computationally expensive component of the conventional iterative algorithm. The theoretical computational cost of the algorithm may be reduced from O(M³N) to O(MN log N), where N is the dimensionality of the data and M is the number of elements in the dictionary. This significant improvement in efficiency may greatly increase the range of problems that can practically be addressed via convolutional sparse representations.
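The frequency-domain trick rests on the convolution theorem: circular convolution in the signal domain equals elementwise multiplication in the frequency domain. A minimal self-contained Python sketch of that identity (an illustration only, not the ADMM solver; it uses an O(N²) DFT for clarity, whereas the practical speedup comes from replacing it with an FFT):

```python
import cmath

def dft(x):
    """Naive O(N^2) discrete Fourier transform."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N)
                for n in range(N)) for k in range(N)]

def idft(X):
    """Inverse DFT with 1/N normalization."""
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N)
                for k in range(N)) / N for n in range(N)]

def circ_conv_direct(x, h):
    """Circular convolution computed directly in the signal domain."""
    N = len(x)
    return [sum(x[m] * h[(n - m) % N] for m in range(N)) for n in range(N)]

def circ_conv_freq(x, h):
    """Same result via the convolution theorem: multiply the spectra."""
    X, H = dft(x), dft(h)
    return [y.real for y in idft([a * b for a, b in zip(X, H)])]

x, h = [1.0, 2.0, 3.0, 4.0], [1.0, 1.0, 0.0, 0.0]
print(circ_conv_direct(x, h))   # [5.0, 3.0, 5.0, 7.0]
```

Both routes give the same vector; swapping the naive DFT for an FFT turns the frequency-domain route into the O(N log N) path the abstract refers to.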

  7. Seismic Signal Compression Using Nonparametric Bayesian Dictionary Learning via Clustering

    Directory of Open Access Journals (Sweden)

    Xin Tian

    2017-06-01

Full Text Available We introduce a seismic signal compression method based on nonparametric Bayesian dictionary learning via clustering. The seismic data is compressed patch by patch, and the dictionary is learned online. Clustering is introduced into the dictionary learning: a set of dictionaries is generated, and each dictionary is used for the sparse coding of one cluster. In this way, the signals in a cluster can be well represented by their corresponding dictionary. A nonparametric Bayesian dictionary learning method is used to learn the dictionaries, which naturally infers an appropriate dictionary size for each cluster. A uniform quantizer and an adaptive arithmetic coding algorithm are adopted to code the sparse coefficients. Comparisons with other state-of-the-art approaches validate the effectiveness of the proposed method in the experiments.
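The coefficient coding stage described above can be sketched in miniature. The toy Python example below (hypothetical function names; the adaptive arithmetic coder itself is omitted) shows the uniform quantization step that turns sparse coefficients into small integer indices suitable for entropy coding, with reconstruction error bounded by half the step size:

```python
def quantize(coeffs, delta):
    """Uniform mid-tread quantizer: map each coefficient to an integer index."""
    return [round(c / delta) for c in coeffs]

def dequantize(indices, delta):
    """Reconstruct coefficients; error per sample is at most delta/2."""
    return [i * delta for i in indices]

coeffs = [0.91, -2.47, 0.02, 3.14]
idx = quantize(coeffs, delta=0.5)    # small integers, cheap to entropy-code
rec = dequantize(idx, delta=0.5)
print(idx)                           # [2, -5, 0, 6]
print(rec)                           # [1.0, -2.5, 0.0, 3.0]
```

The integer indices, heavily skewed toward zero for sparse coefficients, are exactly the kind of low-entropy stream an adaptive arithmetic coder compresses well.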

  8. Tangent: Automatic Differentiation Using Source Code Transformation in Python

    OpenAIRE

    van Merriënboer, Bart; Wiltschko, Alexander B.; Moldovan, Dan

    2017-01-01

    Automatic differentiation (AD) is an essential primitive for machine learning programming systems. Tangent is a new library that performs AD using source code transformation (SCT) in Python. It takes numeric functions written in a syntactic subset of Python and NumPy as input, and generates new Python functions which calculate a derivative. This approach to automatic differentiation is different from existing packages popular in machine learning, such as TensorFlow and Autograd. Advantages ar...
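As a minimal illustration of the source-transformation idea (a toy sketch, not Tangent's actual API or implementation; the helper `grad_source` and its coverage of only `+` and `*` are assumptions for this example), one can walk a Python AST and emit new source text for the derivative:

```python
import ast

def grad_source(expr: str, var: str = "x") -> str:
    """Return Python source for d(expr)/d(var), via the sum and product rules.
    Handles only +, *, names and constants; Python 3.9+ for ast.unparse."""
    def d(node):
        if isinstance(node, ast.BinOp) and isinstance(node.op, ast.Add):
            return ast.BinOp(d(node.left), ast.Add(), d(node.right))
        if isinstance(node, ast.BinOp) and isinstance(node.op, ast.Mult):
            # product rule: (u*v)' = u'*v + u*v'
            return ast.BinOp(
                ast.BinOp(d(node.left), ast.Mult(), node.right),
                ast.Add(),
                ast.BinOp(node.left, ast.Mult(), d(node.right)))
        if isinstance(node, ast.Name):
            return ast.Constant(1 if node.id == var else 0)
        if isinstance(node, ast.Constant):
            return ast.Constant(0)
        raise NotImplementedError(ast.dump(node))

    tree = ast.parse(expr, mode="eval")
    return ast.unparse(ast.fix_missing_locations(d(tree.body)))

dfdx = grad_source("x*x + 3*x")    # e.g. '1 * x + x * 1 + (0 * x + 3 * 1)'
print(eval(dfdx, {"x": 2.0}))      # 7.0
```

The transformation produces ordinary Python source for the derivative, which can then be compiled and called like any hand-written function; Tangent applies the same principle to full NumPy functions.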

  9. Aids to Computer-Based Multimedia Learning.

    Science.gov (United States)

    Mayer, Richard E.; Moreno, Roxana

    2002-01-01

    Presents a cognitive theory of multimedia learning that draws on dual coding theory, cognitive load theory, and constructivist learning theory and derives some principles of instructional design for fostering multimedia learning. These include principles of multiple representation, contiguity, coherence, modality, and redundancy. (SLD)

  10. A three-dimensional magnetostatics computer code for insertion devices

    International Nuclear Information System (INIS)

    Chubar, O.; Elleaume, P.; Chavanne, J.

    1998-01-01

    RADIA is a three-dimensional magnetostatics computer code optimized for the design of undulators and wigglers. It solves boundary magnetostatics problems with magnetized and current-carrying volumes using the boundary integral approach. The magnetized volumes can be arbitrary polyhedrons with non-linear (iron) or linear anisotropic (permanent magnet) characteristics. The current-carrying elements can be straight or curved blocks with rectangular cross sections. Boundary conditions are simulated by the technique of mirroring. Analytical formulae used for the computation of the field produced by a magnetized volume of a polyhedron shape are detailed. The RADIA code is written in object-oriented C++ and interfaced to Mathematica (Mathematica is a registered trademark of Wolfram Research, Inc.). The code outperforms currently available finite-element packages with respect to the CPU time of the solver and accuracy of the field integral estimations. An application of the code to the case of a wedge-pole undulator is presented

  11. Structuring and coding in health care records: a qualitative analysis using diabetes as a case study

    Directory of Open Access Journals (Sweden)

    Ann R R Robertson

    2015-03-01

    Full Text Available Background   Globally, diabetes mellitus presents a substantial burden to individuals and healthcare systems. Structuring and/or coding of medical records underpin attempts to improve information sharing and searching, potentially bringing benefits for both clinical care and secondary uses. Aims and objectives   We investigated if, how and why records for adults with diabetes were structured and/or coded, and explored stakeholders’ perceptions of current practice. Methods   We carried out a qualitative, theoretically-informed case study of documenting healthcare information for diabetes patients in family practice and hospital settings, using semi-structured interviews, observations, systems demonstrations and documentary data. Results   We conducted 22 interviews and four on-site observations, and reviewed 25 documents. For secondary uses – research, audit, public health and service planning – the benefits of highly structured and coded diabetes data were clearly articulated. Reported clinical benefits in terms of managing and monitoring diabetes, and perhaps encouraging patient self-management, were modest. We observed marked differences in levels of record structuring and/or coding between settings, and found little evidence that these data were being exploited to improve information sharing between them. Conclusions   Using high levels of data structuring and coding in medical records for diabetes patients has potential to be exploited more fully, and lessons might be learned from successful developments elsewhere in the UK.

  12. The chronotron: a neuron that learns to fire temporally precise spike patterns.

    Directory of Open Access Journals (Sweden)

    Răzvan V Florian

    Full Text Available In many cases, neurons process information carried by the precise timings of spikes. Here we show how neurons can learn to generate specific temporally precise output spikes in response to input patterns of spikes having precise timings, thus processing and memorizing information that is entirely temporally coded, both as input and as output. We introduce two new supervised learning rules for spiking neurons with temporal coding of information (chronotrons, one that provides high memory capacity (E-learning, and one that has a higher biological plausibility (I-learning. With I-learning, the neuron learns to fire the target spike trains through synaptic changes that are proportional to the synaptic currents at the timings of real and target output spikes. We study these learning rules in computer simulations where we train integrate-and-fire neurons. Both learning rules allow neurons to fire at the desired timings, with sub-millisecond precision. We show how chronotrons can learn to classify their inputs, by firing identical, temporally precise spike trains for different inputs belonging to the same class. When the input is noisy, the classification also leads to noise reduction. We compute lower bounds for the memory capacity of chronotrons and explore the influence of various parameters on chronotrons' performance. The chronotrons can model neurons that encode information in the time of the first spike relative to the onset of salient stimuli or neurons in oscillatory networks that encode information in the phases of spikes relative to the background oscillation. Our results show that firing one spike per cycle optimizes memory capacity in neurons encoding information in the phase of firing relative to a background rhythm.
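    The neurons being trained here are integrate-and-fire units. As background, a minimal leaky integrate-and-fire simulation can be sketched; the E-learning and I-learning rules themselves are not reproduced, and all parameters are illustrative rather than taken from the paper:

```python
def lif_spike_times(input_current, dt=0.1, tau=10.0, v_thresh=1.0, v_reset=0.0):
    """Euler integration of a leaky integrate-and-fire neuron:
        dv/dt = (-v + I(t)) / tau
    A spike is emitted and v reset whenever v crosses v_thresh.
    Returns the spike times (same units as dt, e.g. ms)."""
    v, spikes = 0.0, []
    for step, current in enumerate(input_current):
        v += dt * (-v + current) / tau
        if v >= v_thresh:
            spikes.append(step * dt)
            v = v_reset
    return spikes

# A constant suprathreshold input produces regular firing.
spikes = lif_spike_times([2.0] * 1000)   # 100 ms of constant drive
```

    A chronotron-style learning rule would adjust input weights so that these output spike times land on target times; here the timing is fixed by the constant input.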

  13. IllinoisGRMHD: an open-source, user-friendly GRMHD code for dynamical spacetimes

    International Nuclear Information System (INIS)

    Etienne, Zachariah B; Paschalidis, Vasileios; Haas, Roland; Mösta, Philipp; Shapiro, Stuart L

    2015-01-01

    In the extreme violence of merger and mass accretion, compact objects like black holes and neutron stars are thought to launch some of the most luminous outbursts of electromagnetic and gravitational wave energy in the Universe. Modeling these systems realistically is a central problem in theoretical astrophysics, but has proven extremely challenging, requiring the development of numerical relativity codes that solve Einstein's equations for the spacetime, coupled to the equations of general relativistic (ideal) magnetohydrodynamics (GRMHD) for the magnetized fluids. Over the past decade, the Illinois numerical relativity (ILNR) group's dynamical spacetime GRMHD code has proven itself as a robust and reliable tool for theoretical modeling of such GRMHD phenomena. However, the code was written ‘by experts and for experts’ of the code, with a steep learning curve that would severely hinder community adoption if it were open-sourced. Here we present IllinoisGRMHD, which is an open-source, highly extensible rewrite of the original closed-source GRMHD code of the ILNR group. Reducing the learning curve was the primary focus of this rewrite, with the goal of facilitating community involvement in the code's use and development, as well as the minimization of human effort in generating new science. IllinoisGRMHD also saves computer time, generating roundoff-precision identical output to the original code on adaptive-mesh grids, but nearly twice as fast at scales of hundreds to thousands of cores. (paper)

  14. Current breathomics-a review on data pre-processing techniques and machine learning in metabolomics breath analysis

    DEFF Research Database (Denmark)

    Smolinska, A.; Hauschild, A. C.; Fijten, R. R. R.

    2014-01-01

    been extensively developed. Yet, the application of machine learning methods for fingerprinting VOC profiles in the breathomics is still in its infancy. Therefore, in this paper, we describe the current state of the art in data pre-processing and multivariate analysis of breathomics data. We start...... different conditions (e.g. disease stage, treatment). Independently of the utilized analytical method, the most important question, 'which VOCs are discriminatory?', remains the same. Answers can be given by several modern machine learning techniques (multivariate statistics) and, therefore, are the focus...

  15. A restructuring of COR package for MIDAS computer code

    International Nuclear Information System (INIS)

    Park, S.H.; Kim, K.R.; Kim, D.H.

    2004-01-01

    The COR package, which calculates the thermal response of the core and the lower plenum internal structures and models the relocation of the core and lower plenum structural materials, has been restructured for the MIDAS computer code. MIDAS is being developed as an integrated severe accident analysis code with a user-friendly graphical user interface and a modernized data structure. To do this, the data transferring methods of the current MELCOR code are modified and adopted into the COR package. The data structure of the current MELCOR code, written in FORTRAN77, makes the meaning of the variables difficult to grasp and wastes memory. New features of FORTRAN90 make it possible to allocate storage dynamically and to use user-defined data types, which leads to efficient memory treatment and an easier understanding of the code. The restructuring of the COR package addressed in this paper includes module development and subroutine modification. Verification has been done by comparing the results of the modified code with those of the existing code. As the trends are similar, this implies that the same approach could be extended to the entire code package. It is expected that the code restructuring will accelerate the code's domestication thanks to a direct understanding of each variable and an easy implementation of modified or newly developed models. (author)

  16. The GNASH preequilibrium-statistical nuclear model code

    International Nuclear Information System (INIS)

    Arthur, E. D.

    1988-01-01

    The following report is based on materials presented in a series of lectures at the International Center for Theoretical Physics, Trieste, which were designed to describe the GNASH preequilibrium statistical model code and its use. An overview of the code is provided, with emphasis upon the code's calculational capabilities and the theoretical models that have been implemented in it. Two sample problems are discussed: the first deals with neutron reactions on 58 Ni; the second illustrates the fission model capabilities implemented in the code and involves n + 235 U reactions. Finally, a description is provided of current theoretical model and code development underway. Examples of calculated results using these new capabilities are also given. 19 refs., 17 figs., 3 tabs

  17. Double-digit coding of examination math problems

    Directory of Open Access Journals (Sweden)

    Agnieszka Sułowska

    2013-09-01

    Full Text Available Various methods are used worldwide to evaluate student solutions to examination tasks. Usually the results simply provide information about student competency and, after aggregation, are also used as a tool for making comparisons between schools. In particular, the standard evaluation methods do not allow conclusions to be drawn about possible improvements of teaching methods. There are, however, task assessment methods which allow description not only of student achievement, but also of possible causes of failure. One such method, which can be applied to extended response tasks, is double-digit coding, which has been used in some international educational research. This paper presents the first Polish experiences of applying this method to examination tasks in mathematics, using a special coding key to carry out the evaluation. Lessons learned during the coding key construction and its application in the assessment process are described.

  18. Improving coding accuracy in an academic practice.

    Science.gov (United States)

    Nguyen, Dana; O'Mara, Heather; Powell, Robert

    2017-01-01

    Practice management has become an increasingly important component of graduate medical education. This applies to every practice environment: private, academic, and military. One of the most critical aspects of practice management is documentation and coding for physician services, as they directly affect the financial success of any practice. Our quality improvement project aimed to implement a new and innovative method for teaching billing and coding in a longitudinal fashion in a family medicine residency. We hypothesized that implementation of a new teaching strategy would increase coding accuracy rates among residents and faculty. Design: single group, pretest-posttest. Setting: military family medicine residency clinic. Study populations: 7 faculty physicians and 18 resident physicians participated as learners in the project. Educational intervention: monthly structured coding learning sessions in the academic curriculum that involved learner-presented cases, small group case review, and large group discussion. Main outcome measures: overall coding accuracy (compliance) percentage and coding accuracy per year group for the subjects who were able to participate longitudinally. Statistical tests used: average coding accuracy for the population; paired t test to assess improvement between 2 intervention periods, both aggregate and by year group. Overall coding accuracy rates remained stable over the course of time regardless of the modality of the educational intervention. A paired t test was conducted to compare coding accuracy rates at baseline (mean (M)=26.4%, SD=10%) to accuracy rates after all educational interventions were complete (M=26.8%, SD=12%); t(24)=-0.127, P=.90. Didactic teaching and small group discussion sessions did not improve overall coding accuracy in a residency practice. Future interventions could focus on educating providers at the individual level.
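    The paired t test reported in this abstract compares each subject's accuracy before and after the intervention. A minimal pure-Python sketch of computing such a statistic, with made-up accuracy values (the P value lookup against the t distribution is omitted):

```python
import math
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired t statistic: t = mean(d) / (sd(d) / sqrt(n)), d = post - pre.

    Degrees of freedom are n - 1; compare |t| against the t distribution
    to obtain the P value (not done here)."""
    d = [b - a for a, b in zip(pre, post)]
    n = len(d)
    return mean(d) / (stdev(d) / math.sqrt(n)), n - 1

# Hypothetical accuracy percentages for five providers, before and after.
pre  = [26, 30, 22, 28, 25]
post = [27, 29, 23, 30, 26]
t, dof = paired_t(pre, post)
```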

  19. The code of ethics for nurses.

    Science.gov (United States)

    Zahedi, F; Sanjari, M; Aala, M; Peymani, M; Aramesh, K; Parsapour, A; Maddah, Ss Bagher; Cheraghi, Ma; Mirzabeigi, Gh; Larijani, B; Dastgerdi, M Vahid

    2013-01-01

    Nurses are increasingly confronted with complex concerns in their practice. Codes of ethics provide fundamental guidance for nursing, as for many other professions. Although there are authentic international codes of ethics for nurses, a national code offers additional assistance to clinical nurses in their complex roles in care of patients, education, research and management of some parts of the health care system in the country. A national code can provide nurses with culturally-adapted guidance and help them to make ethical decisions more closely aligned with the Iranian-Islamic background. Given the general acknowledgement of the need, the National Code of Ethics for Nurses was compiled as a joint project (2009-2011). The Code was approved by the Health Policy Council of the Ministry of Health and Medical Education and communicated to all universities, healthcare centers, hospitals and research centers early in 2011. The focus of this article is on the course of action through which the Code was compiled, amended and approved. The main concepts of the code will also be presented here. No doubt, development of the codes should be considered an ongoing process. There is an overall responsibility to keep the codes current, updated with new progress in science and emerging challenges, and pertinent to nursing practice.

  20. Code of practice in industrial radiography

    International Nuclear Information System (INIS)

    Karma, S. E. M.

    2010-12-01

    The aim of this research is to develop a draft of a new radiation protection code of practice in industrial radiography that takes account of the one issued in 1998 and meets current international recommendations. Another aim of this study was to assess the current situation of radiation protection in some of the industrial radiography departments in Sudan. To achieve these aims, a draft code of practice has been developed based on relevant international and local recommendations. The developed code includes the following main issues: regulatory responsibilities, the radiation protection program, and the design of radiation installations. The practical part of this study included scientific visits to two industrial radiography departments in Sudan to assess the degree of compliance of those departments with what is stated in the developed code. The results of the visits revealed that most of the departments do not have an effective radiation protection program, which could lead to exposing workers and the public to unnecessary doses. Some recommendations were stated that, if implemented, could improve the status of radiation protection in industrial radiography departments. (Author)

  1. Student perception of travel service learning experience in Morocco.

    Science.gov (United States)

    Puri, Aditi; Kaddoura, Mahmoud; Dominick, Christine

    2013-08-01

    This study explores the perceptions of health profession students participating in academic service learning in Morocco with respect to adapting health care practices to cultural diversity. Authors utilized semi-structured, open-ended interviews to explore the perceptions of health profession students. Nine dental hygiene and nursing students who traveled to Morocco to provide oral and general health services were interviewed. After interviews were recorded, they were transcribed verbatim to ascertain descriptive validity and to generate inductive and deductive codes that constitute the major themes of the data analysis. Thereafter, NVIVO 8 was used to rapidly determine the frequency of applied codes. The authors compared the codes and themes to establish interpretive validity. Codes and themes were initially determined independently by co-authors and applied to the data subsequently. The authors compared the applied codes to establish intra-rater reliability. International service learning experiences led to perceptions of growth as a health care provider among students. The application of knowledge and skills learned in academic programs and service learning settings were found to help in bridging the theory-practice gap. The specific experience enabled students to gain an understanding of diverse health care and cultural practices in Morocco. Students perceived that the experience gained in international service learning can heighten awareness of diverse cultural and health care practices to foster professional growth of health professionals.

  2. The impact of cerebellar transcranial direct current stimulation (tDCS) on learning fine-motor sequences.

    Science.gov (United States)

    Shimizu, Renee E; Wu, Allan D; Samra, Jasmine K; Knowlton, Barbara J

    2017-01-05

    The cerebellum has been shown to be important for skill learning, including the learning of motor sequences. We investigated whether cerebellar transcranial direct current stimulation (tDCS) would enhance learning of fine motor sequences. Because the ability to generalize or transfer to novel task variations or circumstances is a crucial goal of real world training, we also examined the effect of tDCS on performance of novel sequences after training. In Study 1, participants received either anodal, cathodal or sham stimulation while simultaneously practising three eight-element key press sequences in a non-repeating, interleaved order. Immediately after sequence practice with concurrent tDCS, a transfer session was given in which participants practised three interleaved novel sequences. No stimulation was given during transfer. An inhibitory effect of cathodal tDCS was found during practice, such that the rate of learning was slowed in comparison to the anodal and sham groups. In Study 2, participants received anodal or sham stimulation and a 24 h delay was added between the practice and transfer sessions to reduce mental fatigue. Although this consolidation period benefitted subsequent transfer for both tDCS groups, anodal tDCS enhanced transfer performance. Together, these studies demonstrate polarity-specific effects on fine motor sequence learning and generalization.This article is part of the themed issue 'New frontiers for statistical learning in the cognitive sciences'. © 2016 The Author(s).

  3. EquiFACS: The Equine Facial Action Coding System.

    Directory of Open Access Journals (Sweden)

    Jen Wathan

    Full Text Available Although previous studies of horses have investigated their facial expressions in specific contexts, e.g. pain, until now there has been no methodology available that documents all the possible facial movements of the horse and provides a way to record all potential facial configurations. This is essential for an objective description of horse facial expressions across a range of contexts that reflect different emotional states. Facial Action Coding Systems (FACS) provide a systematic methodology of identifying and coding facial expressions on the basis of underlying facial musculature and muscle movement. FACS are anatomically based and document all possible facial movements rather than a configuration of movements associated with a particular situation. Consequently, FACS can be applied as a tool for a wide range of research questions. We developed FACS for the domestic horse (Equus caballus) through anatomical investigation of the underlying musculature and subsequent analysis of naturally occurring behaviour captured on high quality video. Discrete facial movements were identified and described in terms of the underlying muscle contractions, in correspondence with previous FACS systems. The reliability with which others could learn this system (EquiFACS) and consistently code behavioural sequences was high, and this included people with no previous experience of horses. A wide range of facial movements were identified, including many that are also seen in primates and other domestic animals (dogs and cats). EquiFACS provides a method that can now be used to document the facial movements associated with different social contexts and thus to address questions relevant to understanding social cognition and comparative psychology, as well as informing current veterinary and animal welfare practices.

  4. Recent advances in coding theory for near error-free communications

    Science.gov (United States)

    Cheung, K.-M.; Deutsch, L. J.; Dolinar, S. J.; Mceliece, R. J.; Pollara, F.; Shahshahani, M.; Swanson, L.

    1991-01-01

    Channel and source coding theories are discussed. The following subject areas are covered: large constraint length convolutional codes (the Galileo code); decoder design (the big Viterbi decoder); Voyager's and Galileo's data compression scheme; current research in data compression for images; neural networks for soft decoding; neural networks for source decoding; finite-state codes; and fractals for data compression.
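    One of the topics listed, Viterbi decoding of convolutional codes, can be shown at toy scale. This sketch uses the textbook rate-1/2, constraint-length-3 code with generators 7 and 5 (octal), not the large-constraint-length Galileo code the abstract refers to:

```python
def conv_encode(bits):
    """Rate-1/2, K=3 convolutional encoder, generators 7 and 5 (octal)."""
    s = 0                              # state = last two input bits
    out = []
    for u in bits + [0, 0]:            # two flush bits drive the state to 0
        s1, s2 = (s >> 1) & 1, s & 1
        out += [u ^ s1 ^ s2, u ^ s2]
        s = (u << 1) | s1
    return out

def viterbi_decode(received, nbits):
    """Hard-decision Viterbi decoding by minimum Hamming distance."""
    INF = float("inf")
    metric = [0, INF, INF, INF]        # encoder starts in state 0
    paths = [[], [], [], []]
    for i in range(0, len(received), 2):
        r0, r1 = received[i], received[i + 1]
        new_metric = [INF] * 4
        new_paths = [None] * 4
        for s in range(4):
            if metric[s] == INF:
                continue
            s1, s2 = (s >> 1) & 1, s & 1
            for u in (0, 1):           # try both input bits from this state
                m = metric[s] + ((u ^ s1 ^ s2) != r0) + ((u ^ s2) != r1)
                ns = (u << 1) | s1
                if m < new_metric[ns]:
                    new_metric[ns] = m
                    new_paths[ns] = paths[s] + [u]
        metric, paths = new_metric, new_paths
    return paths[0][:nbits]            # survivor ending in state 0, minus flush

msg = [1, 0, 1, 1, 0, 0, 1, 0]
rx = conv_encode(msg)
rx[5] ^= 1                             # flip one channel bit
decoded = viterbi_decode(rx, len(msg))
```

    With a free distance of 5, this toy code corrects the single flipped bit; production decoders like the "big Viterbi decoder" handle far larger state spaces and soft decisions.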

  5. Learning about Learning: A Conundrum and a Possible Resolution

    Science.gov (United States)

    Barnett, Ronald

    2011-01-01

    What is it to learn in the modern world? We can identify four "learning epochs" through which our understanding of learning has passed: a metaphysical view; an empirical view; an experiential view; and, currently, a "learning-amid-contestation" view. In this last and current view, learning has its place in a world in which, the more one learns,…

  6. CANAL code

    International Nuclear Information System (INIS)

    Gara, P.; Martin, E.

    1983-01-01

    The CANAL code presented here optimizes a realistic iron-free extraction channel which has to provide a given transversal magnetic field law in the median plane: the current bars may be curved, have finite lengths and cooling ducts, and move in a restricted transversal area; terminal connectors may be added, and images of the bars in pole pieces may be included. A special option optimizes a real set of circular coils [fr

  7. Machine-Learning Algorithms to Code Public Health Spending Accounts.

    Science.gov (United States)

    Brady, Eoghan S; Leider, Jonathon P; Resnick, Beth A; Alfonso, Y Natalia; Bishai, David

    Government public health expenditure data sets require time- and labor-intensive manipulation to summarize results that public health policy makers can use. Our objective was to compare the performances of machine-learning algorithms with manual classification of public health expenditures to determine if machines could provide a faster, cheaper alternative to manual classification. We used machine-learning algorithms to replicate the process of manually classifying state public health expenditures, using the standardized public health spending categories from the Foundational Public Health Services model and a large data set from the US Census Bureau. We obtained a data set of 1.9 million individual expenditure items from 2000 to 2013. We collapsed these data into 147 280 summary expenditure records, and we followed a standardized method of manually classifying each expenditure record as public health, maybe public health, or not public health. We then trained 9 machine-learning algorithms to replicate the manual process. We calculated recall, precision, and coverage rates to measure the performance of individual and ensembled algorithms. Compared with manual classification, the machine-learning random forests algorithm produced 84% recall and 91% precision. With algorithm ensembling, we achieved our target criterion of 90% recall by using a consensus ensemble of ≥6 algorithms while still retaining 93% coverage, leaving only 7% of the summary expenditure records unclassified. Machine learning can be a time- and cost-saving tool for estimating public health spending in the United States. It can be used with standardized public health spending categories based on the Foundational Public Health Services model to help parse public health expenditure information from other types of health-related spending, provide data that are more comparable across public health organizations, and evaluate the impact of evidence-based public health resource allocation.
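    The consensus-ensemble rule described here (classify a record only when at least 6 of the 9 algorithms agree, otherwise leave it unclassified) can be sketched independently of any particular learning library; the records, labels, and vote counts below are invented for illustration:

```python
from collections import Counter

def consensus_label(votes, k):
    """Majority label if at least k algorithms agree; None = unclassified."""
    label, count = Counter(votes).most_common(1)[0]
    return label if count >= k else None

def evaluate(all_votes, truth, positive, k):
    """Coverage: fraction of records given any label.
    Recall: fraction of truly positive records labelled positive."""
    labels = [consensus_label(v, k) for v in all_votes]
    coverage = sum(l is not None for l in labels) / len(labels)
    true_pos = sum(t == positive for t in truth)
    found = sum(l == positive and t == positive for l, t in zip(labels, truth))
    return coverage, found / true_pos

# Four records, nine hypothetical algorithm votes each, consensus of >= 6.
votes = [["ph"] * 7 + ["not"] * 2,
         ["ph"] * 5 + ["not"] * 4,   # no consensus -> left unclassified
         ["not"] * 9,
         ["ph"] * 6 + ["not"] * 3]
truth = ["ph", "ph", "not", "ph"]
coverage, recall = evaluate(votes, truth, "ph", k=6)
```

    Raising k trades coverage for confidence, which is exactly the recall/coverage trade-off the abstract reports.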

  8. ASME nuclear codes and standards: Recent technical initiatives

    International Nuclear Information System (INIS)

    Feigel, R. E.

    1995-01-01

    Although nuclear power construction is currently in a hiatus in the US, ASME and its volunteer committees remain committed to continual improvement of the technical requirements in its nuclear codes. This paper provides an overview of several significant recent revisions to ASME's nuclear codes. Additionally, other important initiatives currently being addressed by ASME committees will be described. With the largest population of operating light water nuclear plants in the world and worldwide use of its nuclear codes, ASME continues to support technical advancements in its nuclear codes and standards. While revisions of various magnitude are an ongoing process, several recent revisions embody significant changes based on state-of-the-art design philosophy and substantial industry experience. In the design area, a significant revision has recently been approved which will significantly reduce conservatisms in seismic piping design as well as provide simplified design rules. Major revisions have also been made to the requirements for nuclear material manufacturers and suppliers, which should result in a clearer understanding of this difficult administrative area of the code. In the area of Section XI inservice rules, substantial studies are underway to investigate the application of probabilistic, risk-based inspection in lieu of the current deterministic inspection philosophy. While much work is still required in this area, it is an important potential application of the emerging field of risk-based inspection.

  9. Cognitive Architectures for Multimedia Learning

    Science.gov (United States)

    Reed, Stephen K.

    2006-01-01

    This article provides a tutorial overview of cognitive architectures that can form a theoretical foundation for designing multimedia instruction. Cognitive architectures include a description of memory stores, memory codes, and cognitive operations. Architectures that are relevant to multimedia learning include Paivio's dual coding theory,…

  10. Coding and transmission of subband coded images on the Internet

    Science.gov (United States)

    Wah, Benjamin W.; Su, Xiao

    2001-09-01

    Subband-coded images can be transmitted over the Internet using either the TCP or the UDP protocol. Delivery by TCP gives superior decoding quality but with very long delays when the network is unreliable, whereas delivery by UDP has negligible delays but degraded quality when packets are lost. Although images are currently delivered over the Internet by TCP, in this paper we study the use of UDP to deliver multi-description reconstruction-based subband-coded images. First, in order to facilitate recovery from UDP packet losses, we propose a joint sender-receiver approach for designing optimized reconstruction-based subband transforms (ORB-ST) in multi-description coding (MDC). Second, we carefully evaluate the delay-quality trade-offs between the TCP delivery of SDC images and the UDP and combined TCP/UDP delivery of MDC images. Experimental results show that our proposed ORB-ST performs well in real Internet tests, and UDP and combined TCP/UDP delivery of MDC images provide a range of attractive alternatives to TCP delivery.
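    The core idea of multi-description coding, that each received description alone yields a usable reconstruction, can be illustrated with the simplest possible scheme: even/odd sample splitting with neighbour interpolation. This is a toy stand-in for the concept, not the paper's ORB-ST transform:

```python
import numpy as np

def split_descriptions(signal):
    """Two descriptions: even- and odd-indexed samples, sent in separate packets."""
    return signal[0::2], signal[1::2]

def reconstruct_from_even(even, n):
    """If the odd description's packet is lost, estimate its samples
    by averaging the surviving neighbours."""
    out = np.empty(n)
    out[0::2] = even
    for i in range(1, n, 2):
        right = out[i + 1] if i + 1 < n else out[i - 1]  # edge: repeat neighbour
        out[i] = 0.5 * (out[i - 1] + right)
    return out

signal = np.arange(8.0)               # toy "image row"
even, odd = split_descriptions(signal)
approx = reconstruct_from_even(even, len(signal))
```

    Losing one UDP packet then degrades quality smoothly instead of destroying the image, which is the trade-off against TCP retransmission delay studied in the paper.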

  11. Learning to Act Like a Lawyer: A Model Code of Professional Responsibility for Law Students

    Directory of Open Access Journals (Sweden)

    David M. Tanovich

    2009-02-01

    Full Text Available Law students are the future of the legal profession. How well prepared are they when they leave law school to assume the professional and ethical obligations that they owe themselves, the profession and the public? This question has led to a growing interest in Canada in the teaching of legal ethics. It has also led to a greater emphasis on the development of clinical and experiential learning, as exemplified in the scholarship and teaching of Professor Rose Voyvodic. Less attention, however, has been placed on identifying the general ethical responsibilities of law students when not working in a clinic or other legal context. This can be seen in the presence of very few Canadian articles exploring the issue, and more significantly, in the paucity of law school discipline policies or codes of conduct that set out the professional obligations owed by law students. This article develops an idea that Professor Voyvodic and I talked about on a number of occasions. It argues that all law schools should have a code of conduct which is separate and distinct from their general University code and which resembles, with appropriate modifications, the relevant set of rules of professional responsibility law students will be bound by when called to the Bar. A student code of conduct which educates law students about their professional obligations is an important step in deterring unethical conduct while in law school and preparing students for ethical practice. The idea of a law school code of professional responsibility raises a number of questions. Why is it necessary for law schools to have their own student code of conduct? The article provides a threefold response. First, law students are members of the legal profession and a code of conduct should reflect this. Second, it must be relevant and comprehensive in order to ensure that it can inspire students to be ethical lawyers. And, third, as a practical matter, the last few years have witnessed a number of

  12. Unsupervised clustering with spiking neurons by sparse temporal coding and multi-layer RBF networks

    NARCIS (Netherlands)

    S.M. Bohte (Sander); J.A. La Poutré (Han); J.N. Kok (Joost)

    2000-01-01

    We demonstrate that spiking neural networks encoding information in spike times are capable of computing and learning clusters from realistic data. We show how a spiking neural network based on spike-time coding and Hebbian learning can successfully perform unsupervised clustering on

  13. Using Quick Response Codes in the Classroom: Quality Outcomes.

    Science.gov (United States)

    Zurmehly, Joyce; Adams, Kellie

    2017-10-01

    With smart device technology emerging, educators are challenged with redesigning teaching strategies using technology to allow students to participate dynamically and provide immediate answers. To facilitate integration of technology and to actively engage students, quick response codes were included in a medical surgical lecture. Quick response codes are two-dimensional square patterns that enable the coding or storage of more than 7000 characters that can be accessed via a quick response code scanning application. The aim of this quasi-experimental study was to explore quick response code use in a lecture and measure students' satisfaction (met expectations, increased interest, helped understand, and provided practice and prompt feedback) and engagement (liked most, liked least, wanted changed, and kept involved), assessed using an investigator-developed instrument. Although there was no statistically significant correlation of quick response use to examination scores, satisfaction scores were high, and there was a small yet positive association between how students perceived their learning with quick response codes and overall examination scores. Furthermore, on open-ended survey questions, students responded that they were satisfied with the use of quick response codes, appreciated the immediate feedback, and planned to use them in the clinical setting. Quick response codes offer a way to integrate technology into the classroom to provide students with instant positive feedback.

  14. Computer-assisted Particle-in-Cell code development

    International Nuclear Information System (INIS)

    Kawata, S.; Boonmee, C.; Teramoto, T.; Drska, L.; Limpouch, J.; Liska, R.; Sinor, M.

    1997-12-01

    This report presents a new approach for electromagnetic Particle-in-Cell (PIC) code development by a computer: in general, PIC codes share a common structure, consisting of a particle pusher, a field solver, charge and current density deposition, and field interpolation. Because of this common structure, the main part of a PIC code can be generated mechanically by a computer. In this report we use the packages FIDE and GENTRAN of the REDUCE computer algebra system for discretizations of field equations and a particle equation, and for an automatic generation of Fortran codes. The approach proposed is successfully applied to the development of a 1.5-dimensional PIC code. By using the generated PIC code the Weibel instability in a plasma is simulated. The obtained growth rate agrees well with the theoretical value. (author)
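The common PIC cycle the report describes (particle pusher, field solver, charge deposition, field interpolation) can be sketched as a single update step. The code below is a minimal 1-D electrostatic illustration in Python, not the report's REDUCE-generated 1.5-dimensional electromagnetic code; the function name and numerical choices (cloud-in-cell weighting, FFT Poisson solve, leapfrog push) are assumptions made for illustration.

```python
import numpy as np

def pic_step(x, v, q, m, grid_n, L, dt):
    """One step of a minimal 1-D electrostatic PIC cycle in a periodic box."""
    dx = L / grid_n
    gi = (x / dx).astype(int) % grid_n          # left grid node of each particle
    frac = x / dx - (x / dx).astype(int)        # fractional position in the cell
    # 1) Charge deposition (cloud-in-cell / linear weighting)
    rho = np.zeros(grid_n)
    np.add.at(rho, gi, q * (1 - frac) / dx)
    np.add.at(rho, (gi + 1) % grid_n, q * frac / dx)
    # 2) Field solve: Poisson equation in Fourier space, phi_k = rho_k/(eps0 k^2)
    rho_k = np.fft.rfft(rho)
    k = 2 * np.pi * np.fft.rfftfreq(grid_n, d=dx)
    phi_k = np.zeros_like(rho_k)
    phi_k[1:] = rho_k[1:] / (8.854e-12 * k[1:] ** 2)   # drop the k=0 (mean) mode
    E = -np.gradient(np.fft.irfft(phi_k, grid_n), dx)  # E = -dphi/dx
    # 3) Field interpolation back to the particle positions
    Ep = E[gi] * (1 - frac) + E[(gi + 1) % grid_n] * frac
    # 4) Particle push (leapfrog)
    v = v + (q / m) * Ep * dt
    x = (x + v * dt) % L
    return x, v
```

A uniform particle load produces a uniform charge density and hence a vanishing field, which makes a convenient sanity check for the cycle.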

  15. A restructuring of CF package for MIDAS computer code

    International Nuclear Information System (INIS)

    Park, S. H.; Kim, K. R.; Kim, D. H.; Cho, S. W.

    2004-01-01

    CF package, which evaluates user-specified 'control functions' and applies them to define or control various aspects of the computation, has been restructured for the MIDAS computer code. MIDAS is being developed as an integrated severe accident analysis code with a user-friendly graphical user interface and a modernized data structure. To this end, the data-transfer methods of the current MELCOR code were modified and adopted into the CF package. The data structure of the current MELCOR code, written in FORTRAN77, makes the meaning of the variables difficult to grasp and wastes memory; the difficulty is compounded in the CF package because its data consist of location information pointing into other packages' data. New features of FORTRAN90 make it possible to allocate storage dynamically and to use user-defined data types, which leads to efficient memory treatment and an easier understanding of the code. The restructuring of the CF package addressed in this paper includes module development and subroutine modification, and treats MELGEN, which generates the data file, as well as MELCOR, which performs the calculation. Verification has been done by comparing the results of the modified code with those from the existing code. As the trends are similar to each other, this suggests that the same approach could be extended to the entire code package. It is expected that the code restructuring will accelerate code domestication thanks to direct understanding of each variable and easy implementation of modified or newly developed models.
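The restructuring argument, replacing flat FORTRAN77 arrays indexed by offsets with FORTRAN90 derived types and allocatable components, can be illustrated language-neutrally. The sketch below uses Python in place of Fortran; the `ControlFunction` record and the `CF_*` slot names are hypothetical, chosen only to contrast the two storage styles.

```python
from dataclasses import dataclass, field

# FORTRAN77-style packed storage: one flat array, meaning carried by offsets.
# Reading this requires already knowing which slot holds which quantity.
packed = [0.0, 3.0, 4.0]
CF_VALUE, CF_ARG1, CF_ARG2 = 0, 1, 2
packed[CF_VALUE] = packed[CF_ARG1] + packed[CF_ARG2]

# FORTRAN90-style derived type: named, dynamically sized components
# (mirroring a TYPE with an ALLOCATABLE array component).
@dataclass
class ControlFunction:              # hypothetical record for illustration
    name: str
    arguments: list = field(default_factory=list)
    value: float = 0.0

cf = ControlFunction("ADD", [3.0, 4.0])
cf.value = sum(cf.arguments)        # the intent is readable from the names
```

Both fragments compute the same thing; the second makes the meaning of each field explicit and lets the argument list grow to whatever size the input deck requires, which is the memory and readability gain the abstract describes.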

  16. Pre-Service Teachers' Perception of Quick Response (QR) Code Integration in Classroom Activities

    Science.gov (United States)

    Ali, Nagla; Santos, Ieda M.; Areepattamannil, Shaljan

    2017-01-01

    Quick Response (QR) codes have been discussed in the literature as adding value to teaching and learning. Despite their potential in education, more research is needed to inform practice and advance knowledge in this field. This paper investigated the integration of the QR code in classroom activities and the perceptions of the integration by…

  17. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Control modules C4, C6

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U. S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume is part of the manual related to the control modules for the newest updated version of this computational package.

  18. MDEP Technical Report TR-CSWG-01. Technical Report: Regulatory Frameworks for the Use of Nuclear Pressure Boundary Codes and Standards in MDEP Countries

    International Nuclear Information System (INIS)

    2013-01-01

    The Codes and Standards Working Group (CSWG) is one of the issue-specific working groups that the MDEP members are undertaking; its long term goal is harmonisation of regulatory and code requirements for design and construction of pressure-retaining components in order to improve the effectiveness and efficiency of the regulatory design reviews, increase quality of safety assessments, and to enable each regulator to become stronger in its ability to make safety decisions. The CSWG has interacted closely with the Standards Development Organisations (SDOs) and CORDEL in code comparison and code convergence. The Code Comparison Report STP-NU-051 has been issued by SDO members to identify the extent of similarities and differences amongst the pressure-boundary codes and standards used in various countries. Besides the differences in the codes and standards themselves, the way in which codes and standards are applied to systems, structures and components also affects the design and construction of a nuclear power plant. Therefore, to accomplish the goal of potential harmonisation, it is also vital that the regulators learn about each other's procedures, processes, and regulations. To facilitate the learning process, the CSWG meets regularly to discuss issues relevant to licensing new reactors and using codes and standards in licensing safety reviews. The CSWG communicates very frequently with the SDOs to discuss similarities and differences among the various codes and how to proceed with potential harmonisation. It should be noted that the IAEA is invited to all of the issue-specific working groups within MDEP to ensure consistency with IAEA standards. The primary focus of this technical report is to consolidate information shared and accomplishments achieved by the member countries. This report seeks to document how each MDEP regulator utilises national or regional mechanical codes and standards in its safety reviews and licensing of new reactors. The preparation of this report

  19. Influence of Action-Effect Associations Acquired by Ideomotor Learning on Imitation

    Science.gov (United States)

    Bunlon, Frédérique; Marshall, Peter J.; Quandt, Lorna C.; Bouquet, Cedric A.

    2015-01-01

    According to the ideomotor theory, actions are represented in terms of their perceptual effects, offering a solution for the correspondence problem of imitation (how to translate the observed action into a corresponding motor output). This effect-based coding of action is assumed to be acquired through action-effect learning. Accordingly, performing an action leads to the integration of the perceptual codes of the action effects with the motor commands that brought them about. While ideomotor theory is invoked to account for imitation, the influence of action-effect learning on imitative behavior remains unexplored. In two experiments, imitative performance was measured in a reaction time task following a phase of action-effect acquisition. During action-effect acquisition, participants freely executed a finger movement (index or little finger lifting), and then observed a similar (compatible learning) or a different (incompatible learning) movement. In Experiment 1, finger movements of left and right hands were presented as action-effects during acquisition. In Experiment 2, only right-hand finger movements were presented during action-effect acquisition and in the imitation task the observed hands were oriented orthogonally to participants’ hands in order to avoid spatial congruency effects. Experiments 1 and 2 showed that imitative performance was improved after compatible learning, compared to incompatible learning. In Experiment 2, although action-effect learning involved perception of finger movements of right hand only, imitative capabilities of right- and left-hand finger movements were equally affected. These results indicate that an observed movement stimulus processed as the effect of an action can later prime execution of that action, confirming the ideomotor approach to imitation. We further discuss these findings in relation to previous studies of action-effect learning and in the framework of current ideomotor approaches to imitation. 
PMID:25793755

  20. A proto-code of ethics and conduct for European nurse directors.

    Science.gov (United States)

    Stievano, Alessandro; De Marinis, Maria Grazia; Kelly, Denise; Filkins, Jacqueline; Meyenburg-Altwarg, Iris; Petrangeli, Mauro; Tschudin, Verena

    2012-03-01

    The proto-code of ethics and conduct for European nurse directors was developed as a strategic and dynamic document for nurse managers in Europe. It invites critical dialogue, reflective thinking about different situations, and the development of specific codes of ethics and conduct by nursing associations in different countries. The term proto-code is used for this document so that specifically country-orientated or organization-based and practical codes can be developed from it to guide professionals in more particular or situation-explicit reflection and values. The proto-code of ethics and conduct for European nurse directors was designed and developed by the European Nurse Directors Association's (ENDA) advisory team. This article gives short explanations of the code's preamble and two main parts: Nurse directors' ethical basis, and Principles of professional practice, which is divided into six specific points: competence, care, safety, staff, life-long learning and multi-sectorial working.

  1. Calculation code NIRVANA for free boundary MHD equilibrium

    International Nuclear Information System (INIS)

    Ninomiya, Hiromasa; Suzuki, Yasuo; Kameari, Akihisa

    1975-03-01

    The calculation method and code of solving the free boundary problem for MHD equilibrium has been developed. Usage of the code "NIRVANA" is described. The toroidal plasma current density determined as a function of the flux function PSI is substituted by a group of the ring currents, whereby the equation of MHD equilibrium is transformed into an integral equation. Either of the two iterative methods is chosen to solve the integral equation, depending on the assumptions made of the plasma surface points. Calculation of the magnetic field configurations is possible when the plasma surface coincides self-consistently with the magnetic flux including the separatrix points. The code is usable in calculation of the circular or non-circular shell-less Tokamak equilibrium. (auth.)
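The substitution of the toroidal current density by a group of ring currents rests on the standard elliptic-integral expression for the poloidal flux of a single circular filament, which can then be superposed linearly. The sketch below (Python with SciPy; the function names are illustrative, and this is not the NIRVANA code itself) computes that flux:

```python
import numpy as np
from scipy.special import ellipk, ellipe

MU0 = 4e-7 * np.pi  # vacuum permeability, T*m/A

def psi_ring(r, z, a, I):
    """Poloidal flux psi = r * A_phi at (r, z), r > 0, of a circular filament
    of radius a carrying current I, centred at r = a, z = 0 (the standard
    complete-elliptic-integral result for a current loop)."""
    k2 = 4 * a * r / ((a + r) ** 2 + z ** 2)       # elliptic modulus squared
    k = np.sqrt(k2)
    A_phi = (MU0 * I / (np.pi * k)) * np.sqrt(a / r) * (
        (1 - k2 / 2) * ellipk(k2) - ellipe(k2))
    return r * A_phi

def psi_plasma(r, z, rings):
    """Superpose filaments (a_i, z_i, I_i) approximating the current density."""
    return sum(psi_ring(r, z - zi, ai, Ii) for ai, zi, Ii in rings)
```

Because the field equation is linear in the sources, the flux of the discretized plasma current is just the sum over filaments; the iteration described in the abstract then adjusts the filament currents until the plasma boundary coincides with a flux surface.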

  2. Computer code determination of tolerable accel current and voltage limits during startup of an 80 kV MFTF sustaining neutral beam source

    International Nuclear Information System (INIS)

    Mayhall, D.J.; Eckard, R.D.

    1979-01-01

    We have used a Lawrence Livermore Laboratory (LLL) version of the WOLF ion source extractor design computer code to determine tolerable accel current and voltage limits during startup of a prototype 80 kV Mirror Fusion Test Facility (MFTF) sustaining neutral beam source. Arc current limits are also estimated. The source extractor has gaps of 0.236, 0.721, and 0.155 cm. The effective ion mass is 2.77 AMU. The measured optimum accel current density is 0.266 A/cm². The gradient grid electrode runs at 5/6 V_a (accel voltage). The suppressor electrode voltage is zero for V_a < 3 kV and -3 kV for V_a ≥ 3 kV. The accel current density for optimum beam divergence is obtained for 1 ≤ V_a ≤ 80 kV, as are the beam divergence and emittance.
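Startup limits of this kind are governed by space-charge-limited flow; for a single planar gap the Child-Langmuir law gives the scaling J = (4/9) ε0 √(2q/m) V^{3/2} / d². The sketch below is a generic illustration of that scaling using the abstract's parameters, not the WOLF extractor model: a real multi-gap extractor operates well below this single-gap planar limit, which is why the measured optimum (0.266 A/cm²) is much lower than the ideal planar value.

```python
import math

EPS0 = 8.8541878128e-12   # vacuum permittivity, F/m
QE = 1.602176634e-19      # elementary charge, C
AMU = 1.66053906660e-27   # atomic mass unit, kg

def child_langmuir_j(voltage, gap, ion_mass_amu):
    """Space-charge-limited current density (A/m^2) for a planar diode:
    J = (4/9) * eps0 * sqrt(2 q / m) * V^{3/2} / d^2."""
    m = ion_mass_amu * AMU
    return (4.0 / 9.0) * EPS0 * math.sqrt(2.0 * QE / m) * voltage ** 1.5 / gap ** 2

# Planar limit for the abstract's first gap (0.236 cm) at V_a = 80 kV,
# effective ion mass 2.77 AMU
j_planar = child_langmuir_j(80e3, 0.236e-2, 2.77)
```

The V^{3/2} dependence is the reason accel current limits must be re-evaluated at every voltage step during startup rather than scaled linearly.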

  3. Teachers’ Learning Design Practice for Students as Learning Designers

    DEFF Research Database (Denmark)

    Levinsen, Karin Tweddell; Sørensen, Birgitte Holm

    2018-01-01

    This paper contributes elements of an emerging learning design methodology. The paper takes as its starting point the theory of Students as Learning Designers, which was developed by Sørensen and Levinsen and based on more than a decade of research-and-development projects in Danish primary...... schools (first to 10th grade). The research focussed on information and communication technology (ICT) within the Scandinavian tradition of Problem Oriented Project Pedagogy (POPP), Problem Based Learning (PBL) and students’ production. In recent years, the projects that provide the grounding...... for the theory have focussed specifically on learning designs that constitute students as learning designers of digital productions (both multimodal and coded productions). This includes learning designs that contribute to students’ empowerment, involvement and autonomy within the teacher-designed frameworks...

  4. A comprehensive study of sparse codes on abnormality detection

    DEFF Research Database (Denmark)

    Ren, Huamin; Pan, Hong; Olsen, Søren Ingvor

    2017-01-01

    Sparse representation has been applied successfully in abnormal event detection, in which the baseline is to learn a dictionary accompanied by sparse codes. While much emphasis is put on discriminative dictionary construction, there are no comparative studies of sparse codes regarding abnormality detection. We comprehensively study two types of sparse code solutions - greedy algorithms and convex L1-norm solutions - and their impact on abnormality detection performance. We also propose our framework of combining sparse codes with different detection methods. Our comparative experiments are carried...
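The two families of sparse-code solvers the study compares can be tried directly in scikit-learn, which exposes a greedy solver (orthogonal matching pursuit) and a convex L1 solver (LARS-based lasso) behind the same `SparseCoder` interface. The sketch below uses a random, untrained dictionary purely for illustration; it is not the paper's learned dictionary or its detection framework.

```python
import numpy as np
from sklearn.decomposition import SparseCoder

rng = np.random.default_rng(0)
n_atoms, n_features = 32, 16

# Random normalized dictionary (rows are atoms); in practice this would be
# learned from training data, e.g. with dictionary learning.
D = rng.normal(size=(n_atoms, n_features))
D /= np.linalg.norm(D, axis=1, keepdims=True)

# A signal that is exactly 3-sparse in D
true_code = np.zeros(n_atoms)
true_code[[3, 11, 20]] = [1.0, -0.5, 2.0]
x = (true_code @ D).reshape(1, -1)

# Greedy solution: orthogonal matching pursuit with a fixed sparsity level
omp = SparseCoder(dictionary=D, transform_algorithm="omp",
                  transform_n_nonzero_coefs=3)
code_omp = omp.transform(x)

# Convex L1 solution: LARS-based lasso with a small penalty alpha
lasso = SparseCoder(dictionary=D, transform_algorithm="lasso_lars",
                    transform_alpha=1e-4)
code_l1 = lasso.transform(x)
```

The greedy path fixes the number of nonzeros directly, while the L1 path controls sparsity indirectly through the penalty; comparing the two codes on the same dictionary is exactly the kind of experiment the abstract describes.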

  5. Game-Coding Workshops in New Zealand Public Libraries: Evaluation of a Pilot Project

    Science.gov (United States)

    Bolstad, Rachel

    2016-01-01

    This report evaluates a game coding workshop offered to young people and adults in seven public libraries round New Zealand. Participants were taken step by step through the process of creating their own simple 2D videogame, learning the basics of coding, computational thinking, and digital game design. The workshops were free and drew 426 people…

  6. Final Technical Report: Hydrogen Codes and Standards Outreach

    Energy Technology Data Exchange (ETDEWEB)

    Hall, Karen I.

    2007-05-12

    This project contributed significantly to the development of new codes and standards, both domestically and internationally. The NHA collaborated with codes and standards development organizations to identify technical areas of expertise that would be required to produce the codes and standards that industry and DOE felt were required to facilitate commercialization of hydrogen and fuel cell technologies and infrastructure. NHA staff participated directly in technical committees and working groups where issues could be discussed with the appropriate industry groups. In other cases, the NHA recommended specific industry experts to serve on technical committees and working groups where the need for this specific industry expertise would be on-going, and where this approach was likely to contribute to timely completion of the effort. The project also facilitated dialog between codes and standards development organizations, hydrogen and fuel cell experts, the government and national labs, researchers, code officials, industry associations, as well as the public regarding the timeframes for needed codes and standards, industry consensus on technical issues, procedures for implementing changes, and general principles of hydrogen safety. The project facilitated hands-on learning, as participants in several NHA workshops and technical meetings were able to experience hydrogen vehicles, witness hydrogen refueling demonstrations, see metal hydride storage cartridges in operation, and view other hydrogen energy products.

  7. From concatenated codes to graph codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Høholdt, Tom

    2004-01-01

    We consider codes based on simple bipartite expander graphs. These codes may be seen as the first step leading from product type concatenated codes to more complex graph codes. We emphasize constructions of specific codes of realistic lengths, and study the details of decoding by message passing...

  8. Teacher feedback during active learning: current practices in primary schools.

    Science.gov (United States)

    van den Bergh, Linda; Ros, Anje; Beijaard, Douwe

    2013-06-01

    Feedback is one of the most powerful tools that teachers can use to enhance student learning. It appears difficult for teachers to give qualitatively good feedback, especially during active learning. In this context, teachers should provide facilitative feedback that is focused on the development of meta-cognition and social learning. The purpose of the present study is to contribute to the existing knowledge about feedback and to give directions to improve teacher feedback in the context of active learning. The participants comprised 32 teachers who practiced active learning in the domain of environmental studies in the sixth, seventh, or eighth grade of 13 Dutch primary schools. A total of 1,465 teacher-student interactions were examined. Video observations were made of active learning lessons in the domain of environmental studies. A category system was developed based on the literature and empirical data. Teacher-student interactions were assessed using this system. Results: About half of the teacher-student interactions contained feedback. This feedback was usually focused on the tasks that were being performed by the students and on the ways in which these tasks were processed. Only 5% of the feedback was explicitly related to a learning goal. In their feedback, the teachers were directing (rather than facilitating) the learning processes. During active learning, feedback on meta-cognition and social learning is important. Feedback should be explicitly related to learning goals. In practice, these kinds of feedback appear to be scarce. Therefore, giving feedback during active learning seems to be an important topic for teachers' professional development. © 2012 The British Psychological Society.

  9. Quality indicators for learner-centered postgraduate medical e-learning.

    Science.gov (United States)

    de Leeuw, Robert A; Westerman, Michiel; Scheele, Fedde

    2017-04-27

    The objectives of this study were to identify the needs and expectations of learners and educational experts in postgraduate medical e-learning, and to contribute to the current literature. We performed four focus-group discussions with e-learning end-users (learners) and didactic experts. The participants were postgraduate learners with varying levels of experience, educational experts from a Dutch e-learning task group, and commercial experts from a Dutch e-learning company. Verbatim transcribed interview recordings were analyzed using King's template analysis. The initial template was created with reference to recent literature on postgraduate medical e-learning quality indicators. The transcripts were coded, after which the emerging differences in template interpretation were discussed until a consensus was reached within the team. The final template consisted of three domains of positive e-learning influencers (motivators, learning enhancers, and real-world translation) and three domains of negatively influential parameters (barriers, learning discouragers, and poor preparation). The interpretation of the final template showed three subjects which form the basis of e-learning, namely, Motivate, Learn and Apply. This study forms a basis for learning in general and could be applied to many educational instruments. Individual characteristics should be adapted to the target audience. Three subjects form the basis of, and six themes cover all items needed for, good (enough) postgraduate e-learning. Further research should be carried out with learners and real-world e-learning to validate this template.

  10. Investigating the Simulink Auto-Coding Process

    Science.gov (United States)

    Gualdoni, Matthew J.

    2016-01-01

    Model based program design is the most clear and direct way to develop algorithms and programs for interfacing with hardware. While coding "by hand" results in a more tailored product, the ever-growing size and complexity of modern-day applications can cause the project work load to quickly become unreasonable for one programmer. This has generally been addressed by splitting the product into separate modules to allow multiple developers to work in parallel on the same project, however this introduces new potentials for errors in the process. The fluidity, reliability and robustness of the code relies on the abilities of the programmers to communicate their methods to one another; furthermore, multiple programmers invites multiple potentially differing coding styles into the same product, which can cause a loss of readability or even module incompatibility. Fortunately, Mathworks has implemented an auto-coding feature that allows programmers to design their algorithms through the use of models and diagrams in the graphical programming environment Simulink, allowing the designer to visually determine what the hardware is to do. From here, the auto-coding feature handles converting the project into another programming language. This type of approach allows the designer to clearly see how the software will be directing the hardware without the need to try and interpret large amounts of code. In addition, it speeds up the programming process, minimizing the amount of man-hours spent on a single project, thus reducing the chance of human error as well as project turnover time. One such project that has benefited from the auto-coding procedure is Ramses, a portion of the GNC flight software on-board Orion that has been implemented primarily in Simulink. Currently, however, auto-coding Ramses into C++ requires 5 hours of code generation time. 
This causes issues if the tool ever needs to be debugged, as this code generation will need to occur with each edit to any part of

  11. A restructuring of RN1 package for MIDAS computer code

    International Nuclear Information System (INIS)

    Park, S. H.; Kim, D. H.; Kim, K. R.

    2003-01-01

    RN1 package, which is one of two fission product-related packages in MELCOR, has been restructured for the MIDAS computer code. MIDAS is being developed as an integrated severe accident analysis code with a user-friendly graphical user interface and a modernized data structure. To this end, the data-transfer methods of the current MELCOR code were modified and adopted into the RN1 package. The data structure of the current MELCOR code, written in FORTRAN77, makes the meaning of the variables difficult to grasp and wastes memory. New features of FORTRAN90 make it possible to allocate storage dynamically and to use user-defined data types, which leads to efficient memory treatment and an easier understanding of the code. The restructuring of the RN1 package addressed in this paper includes module development and subroutine modification, and treats MELGEN, which generates the data file, as well as MELCOR, which performs the calculation. Verification has been done by comparing the results of the modified code with those from the existing code. As the trends are similar to each other, this suggests that the same approach could be extended to the entire code package. It is expected that the code restructuring will accelerate code domestication thanks to direct understanding of each variable and easy implementation of modified or newly developed models.

  12. A restructuring of RN2 package for MIDAS computer code

    International Nuclear Information System (INIS)

    Park, S. H.; Kim, D. H.

    2003-01-01

    RN2 package, which is one of two fission product-related packages in MELCOR, has been restructured for the MIDAS computer code. MIDAS is being developed as an integrated severe accident analysis code with a user-friendly graphical user interface and a modernized data structure. To this end, the data-transfer methods of the current MELCOR code were modified and adopted into the RN2 package. The data structure of the current MELCOR code, written in FORTRAN77, makes the meaning of the variables difficult to grasp and wastes memory. New features of FORTRAN90 make it possible to allocate storage dynamically and to use user-defined data types, which leads to efficient memory treatment and an easier understanding of the code. The restructuring of the RN2 package addressed in this paper includes module development and subroutine modification, and treats MELGEN, which generates the data file, as well as MELCOR, which performs the calculation. Validation has been done by comparing the results of the modified code with those from the existing code. As the trends are similar to each other, this suggests that the same approach could be extended to the entire code package. It is expected that the code restructuring will accelerate code domestication thanks to direct understanding of each variable and easy implementation of modified or newly developed models.

  13. Python for probability, statistics, and machine learning

    CERN Document Server

    Unpingco, José

    2016-01-01

    This book covers the key ideas that link probability, statistics, and machine learning illustrated using Python modules in these areas. The entire text, including all the figures and numerical results, is reproducible using the Python codes and their associated Jupyter/IPython notebooks, which are provided as supplementary downloads. The author develops key intuitions in machine learning by working meaningful examples using multiple analytical methods and Python codes, thereby connecting theoretical concepts to concrete implementations. Modern Python modules like Pandas, Sympy, and Scikit-learn are applied to simulate and visualize important machine learning concepts like the bias/variance trade-off, cross-validation, and regularization. Many abstract mathematical ideas, such as convergence in probability theory, are developed and illustrated with numerical examples. This book is suitable for anyone with an undergraduate-level exposure to probability, statistics, or machine learning and with rudimentary knowl...
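As a flavour of the book's themes (scikit-learn, cross-validation, regularization), here is a minimal, self-contained sketch of selecting a ridge penalty by cross-validation on synthetic data. It is an illustrative example in the book's spirit, not code taken from the book.

```python
import numpy as np
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import cross_val_score

# Synthetic linear data: five coefficients (one exactly zero) plus noise
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 5))
w = np.array([1.5, -2.0, 0.0, 0.7, 3.0])
y = X @ w + rng.normal(0, 0.5, 200)

# RidgeCV selects the regularization strength by internal cross-validation,
# trading bias against variance; cross_val_score then estimates out-of-sample
# R^2 for the resulting pipeline.
model = RidgeCV(alphas=[1e-3, 1e-1, 1e1, 1e3]).fit(X, y)
cv_r2 = cross_val_score(model, X, y, cv=5).mean()
```

With a clear linear signal and moderate noise, the cross-validated R² is high and the chosen `model.alpha_` is one of the candidate penalties, which makes the bias/variance machinery easy to inspect interactively.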

  14. Combined RF current drive and bootstrap current in tokamaks

    International Nuclear Information System (INIS)

    Schultz, S. D.; Bers, A.; Ram, A. K.

    1999-01-01

    By calculating radio frequency current drive (RFCD) and the bootstrap current in a consistent kinetic manner, we find synergistic effects in the total noninductive current density in tokamaks [1]. We include quasilinear diffusion in the Drift Kinetic Equation (DKE) in order to generalize neoclassical theory to highly non-Maxwellian electron distributions due to RFCD. The parallel plasma current is evaluated numerically with the help of the FASTEP Fokker-Planck code [2]. Current drive efficiency is found to be significantly affected by neoclassical effects, even in cases where only circulating electrons interact with the waves. Predictions of the current drive efficiency are made for lower hybrid and electron cyclotron wave current drive scenarios in the presence of bootstrap current

  15. Blended learning in anesthesia education: current state and future model.

    Science.gov (United States)

    Kannan, Jaya; Kurup, Viji

    2012-12-01

    Educators in anesthesia residency programs across the country are facing a number of challenges as they attempt to integrate blended learning techniques in their curriculum. Compared with the rest of higher education, which has made advances to varying degrees in the adoption of online learning, anesthesiology education has been sporadic in its active integration of blended learning. The purpose of this review is to discuss the challenges in anesthesiology education and the relevance of the Universal Design for Learning framework in addressing them. There is a wide chasm between student demand for online education and the availability of trained faculty to teach. The design of the learning interface is important and will significantly affect the learning experience for the student. This review examines recent literature pertaining to this field, both in the realm of higher education in general and medical education in particular, and proposes the application of a comprehensive learning model that is new to anesthesiology education and relevant to its goals of promoting self-directed learning.

  16. BAR-MOM code and its application

    International Nuclear Information System (INIS)

    Wang Shunuan

    2002-01-01

    The BAR-MOM code, which calculates the height of the fission barrier B_f, the ground-state energy, the stability limit of the compound nucleus with respect to fission (i.e., the angular momentum, or spin value, L_max at which the fission barrier disappears), and the three principal-axis moments of inertia at the saddle point for a nucleus with atomic number Z (19 < Z < 102), mass number A, and angular momentum L in units of ℏ, is presented, and the model used is introduced briefly. The generalized BAR-MOM code, which extends the results to Z ≥ 102 by using a more recent parameterization of the Thomas-Fermi fission barrier, is also introduced briefly. We have studied the models used in the BAR-MOM code and run it successfully for a given nucleus with mass number A, atomic number Z, and angular momentum L on a PC using Fortran-90. Test calculations performed to check the implementation of the program show that the results of the present work are in good agreement with the original ones.

  17. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Functional modules, F9-F11

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume consists of the section of the manual dealing with three of the functional modules in the code. Those are the Morse-SGC for the SCALE system, Heating 7.2, and KENO V.a. The manual describes the latest released versions of the codes.

  18. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation: Functional modules, F9-F11

    International Nuclear Information System (INIS)

    1997-03-01

    This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This volume consists of the section of the manual dealing with three of the functional modules in the code. Those are the Morse-SGC for the SCALE system, Heating 7.2, and KENO V.a. The manual describes the latest released versions of the codes

  19. Current Status of Experimental and Theoretical Work on Sodium/Fuel Interaction (SFI) at Karlsruhe 'Code Developments'

    International Nuclear Information System (INIS)

    Beutel, H.; Bojarsky, E.; Reiser, H.; Caldarola, L.; Jacobs, H.; Zyszkowski, W.

    1976-01-01

    The theoretical work follows two main lines: A. Code development; B. Theoretical work on fragmentation. Two computer codes have been developed. The first code contains a heat transfer model (during the vaporization phase) based on the inverse Leidenfrost phenomenon (which has been observed experimentally in water). The exact solution of the heat diffusion equation in a sphere is included in the code. The code accounts for the time history of each fuel particle by means of specially averaged temperature values. The presence of fission gases can also be taken into account. A size distribution of fuel particles has also been incorporated in the code, as well as the effect of friction due to the channel walls and that of the pressure losses at the channel outlet. An extensive parametric study has been carried out with this code. The main conclusions are the following: 1. Total mechanical work strongly decreases with the fragmentation and/or mixing time constants. 2. Vapour blanketing during the vaporization phase is effective only if accompanied by a relatively slow process of fragmentation and mixing. In this case total mechanical work strongly decreases with the degree of vapour blanketing. 3. Total mechanical work rises with the initial length of the sodium piston. 4. The time to empty the 120 cm long channel is 15-20 msecs for values of the fragmentation and/or mixing time constants of the order of 5-10 msecs. 5. Effects due to particle size distribution and gas content are important only for a rapid fragmentation and mixing process. It must be pointed out that (as far as the gas is concerned) this conclusion is valid only within the limits of the effects (due to the gas) which have been considered in the model. Propagation effects can be analysed by using the second code. The interaction region can be subdivided into an arbitrary number of sections, each containing fuel and coolant. The thermal conductivity of the liquid sodium has also been taken into account, as well as the

  20. Association of Amine-Receptor DNA Sequence Variants with Associative Learning in the Honeybee.

    Science.gov (United States)

    Lagisz, Malgorzata; Mercer, Alison R; de Mouzon, Charlotte; Santos, Luana L S; Nakagawa, Shinichi

    2016-03-01

    Octopamine- and dopamine-based neuromodulatory systems play a critical role in learning and learning-related behaviour in insects. To further our understanding of these systems and the resulting phenotypes, we quantified DNA sequence variations at six loci coding octopamine- and dopamine-receptors and their association with aversive and appetitive learning traits in a population of honeybees. We identified 79 polymorphic sequence markers (mostly SNPs and a few insertions/deletions) located within or close to six candidate genes. Intriguingly, we found that levels of sequence variation in the protein-coding regions studied were low, indicating that sequence variation in the coding regions of receptor genes critical to learning and memory is strongly selected against. Non-coding and upstream regions of the same genes, however, were less conserved, and sequence variations in these regions were weakly associated with between-individual differences in learning-related traits. While these associations do not directly imply a specific molecular mechanism, they suggest that cross-talk between dopamine and octopamine signalling pathways may influence olfactory learning and memory in the honeybee.

  1. SCDAP/RELAP5/MOD3 code development

    International Nuclear Information System (INIS)

    Allison, C.M.; Siefken, J.L.; Coryell, E.W.

    1992-01-01

    The SCDAP/RELAP5/MOD3 computer code is designed to describe the overall reactor coolant system (RCS) thermal-hydraulic response, core damage progression, and fission product release and transport during severe accidents. The code is being developed at the Idaho National Engineering Laboratory (INEL) under the primary sponsorship of the Office of Nuclear Regulatory Research of the US Nuclear Regulatory Commission (NRC). Code development activities are currently focused on three main areas - (a) code usability, (b) early phase melt progression model improvements, and (c) advanced reactor thermal-hydraulic model extensions. This paper describes the first two activities. A companion paper describes the advanced reactor model improvements being performed under RELAP5/MOD3 funding.

  2. A Synthesis of Language Learning Strategies: Current Issues, Problems and Claims Made in Learner Strategy Research

    Science.gov (United States)

    Barjesteh, Hamed; Mukundan, Jayakaran; Vaseghi, Reza

    2014-01-01

    The current paper presented the theoretical assumptions behind language learning strategies (LLS) and an overview of the methods used to identify learners' strategies, and then summarized what has been reported from a large number of descriptive studies of strategies used by language learners. Moreover, the paper tried to present the variety of…

  3. Does Service-Learning Increase Student Learning?: A Meta-Analysis

    Science.gov (United States)

    Warren, Jami L.

    2012-01-01

    Research studies reflect mixed results on whether or not service-learning increases student learning outcomes. The current study seeks to reconcile these findings by extending a meta-analysis conducted by Novak, Markey, and Allen (2007) in which these authors examined service-learning and student learning outcomes. In the current study, 11…

  4. A Life-Cycle Risk-Informed Systems Structured Nuclear Code

    International Nuclear Information System (INIS)

    Hill, Ralph S. III

    2002-01-01

    Current American Society of Mechanical Engineers (ASME) nuclear codes and standards rely primarily on deterministic and mechanistic approaches to design. The design code is a separate volume from the code for inservice inspections and both are separate from the standards for operations and maintenance. The ASME code for inservice inspections and code for nuclear plant operations and maintenance have adopted risk-informed methodologies for inservice inspection, preventive maintenance, and repair and replacement decisions. The American Institute of Steel Construction and the American Concrete Institute have incorporated risk-informed probabilistic methodologies into their design codes. It is proposed that the ASME nuclear code should undergo a planned evolution that integrates the various nuclear codes and standards and adopts a risk-informed approach across a facility life-cycle - encompassing design, construction, operation, maintenance and closure. (author)

  5. Relationship between various pressure vessel and piping codes

    International Nuclear Information System (INIS)

    Canonico, D.A.

    1976-01-01

    Section VIII of the ASME Code provides stress allowable values for material specifications that are provided in Section II, Parts A and B. Since the adoption of the ASME Code over 60 years ago, the incidence of failure has been greatly reduced. The Codes are currently based on strength criteria, and advancements in the technology of fracture toughness and fracture mechanics should permit an even greater degree of reliability and safety. This lecture discusses the various Sections of the Code. It describes the basis for the establishment of design stress allowables and promotes the idea of the use of fracture mechanics.

  6. Toward an Integration of Deep Learning and Neuroscience

    Science.gov (United States)

    Marblestone, Adam H.; Wayne, Greg; Kording, Konrad P.

    2016-01-01

    Neuroscience has focused on the detailed implementation of computation, studying neural codes, dynamics and circuits. In machine learning, however, artificial neural networks tend to eschew precisely designed codes, dynamics or circuits in favor of brute force optimization of a cost function, often using simple and relatively uniform initial architectures. Two recent developments have emerged within machine learning that create an opportunity to connect these seemingly divergent perspectives. First, structured architectures are used, including dedicated systems for attention, recursion and various forms of short- and long-term memory storage. Second, cost functions and training procedures have become more complex and are varied across layers and over time. Here we think about the brain in terms of these ideas. We hypothesize that (1) the brain optimizes cost functions, (2) the cost functions are diverse and differ across brain locations and over development, and (3) optimization operates within a pre-structured architecture matched to the computational problems posed by behavior. In support of these hypotheses, we argue that a range of implementations of credit assignment through multiple layers of neurons are compatible with our current knowledge of neural circuitry, and that the brain's specialized systems can be interpreted as enabling efficient optimization for specific problem classes. Such a heterogeneously optimized system, enabled by a series of interacting cost functions, serves to make learning data-efficient and precisely targeted to the needs of the organism. We suggest directions by which neuroscience could seek to refine and test these hypotheses. PMID:27683554

  7. Review of Current Student-Monitoring Techniques used in eLearning-Focused recommender Systems and Learning analytics. The Experience API & LIME model Case Study

    Directory of Open Access Journals (Sweden)

    Alberto Corbi

    2014-09-01

    Recommender systems require input information in order to operate properly and deliver content or behaviour suggestions to end users. eLearning scenarios are no exception. Users are current students, and recommendations can be built upon paths (both formal and informal), relationships, behaviours, friends, followers, actions, grades, tutor interaction, etc. A recommender system must somehow retrieve, categorize and work with all these details. There are several ways to do so: from raw and inelegant database access to more curated web APIs or even via HTML scraping. New server-centric user-action logging and monitoring standard technologies have been presented in recent years by several groups, organizations and standards bodies. The Experience API (xAPI), detailed in this article, is one of these. In the first part of this paper we analyse current learner-monitoring techniques as an initialization phase for eLearning recommender systems. We next review standardization efforts in this area; finally, we focus on xAPI and its potential interaction with the LIME model, which is also summarized below.

  8. Hello Ruby adventures in coding

    CERN Document Server

    Liukas, Linda

    2015-01-01

    "Code is the 21st century literacy and the need for people to speak the ABCs of Programming is imminent." --Linda Liukas Meet Ruby--a small girl with a huge imagination. In Ruby's world anything is possible if you put your mind to it. When her dad asks her to find five hidden gems, Ruby is determined to solve the puzzle with the help of her new friends, including the Wise Snow Leopard, the Friendly Foxes, and the Messy Robots. As Ruby stomps around her world, kids will be introduced to the basic concepts behind coding and programming through storytelling. Learn how to break big problems into small problems, repeat tasks, look for patterns, create step-by-step plans, and think outside the box. With hands-on activities included in every chapter, future coders will be thrilled to put their own imaginations to work.

  9. Dopamine reward prediction error coding.

    Science.gov (United States)

    Schultz, Wolfram

    2016-03-01

    Reward prediction errors consist of the differences between received and predicted rewards. They are crucial for basic forms of learning about rewards and make us strive for more rewards-an evolutionary beneficial trait. Most dopamine neurons in the midbrain of humans, monkeys, and rodents signal a reward prediction error; they are activated by more reward than predicted (positive prediction error), remain at baseline activity for fully predicted rewards, and show depressed activity with less reward than predicted (negative prediction error). The dopamine signal increases nonlinearly with reward value and codes formal economic utility. Drugs of addiction generate, hijack, and amplify the dopamine reward signal and induce exaggerated, uncontrolled dopamine effects on neuronal plasticity. The striatum, amygdala, and frontal cortex also show reward prediction error coding, but only in subpopulations of neurons. Thus, the important concept of reward prediction errors is implemented in neuronal hardware.
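
    The error-driven learning described above is commonly formalized by the Rescorla-Wagner (or temporal-difference) update. The following sketch is illustrative only and is not taken from the paper; all names are ours:

```python
def rescorla_wagner(rewards, alpha=0.1, v0=0.0):
    """Update a reward prediction V from prediction errors.

    delta = r - V: positive when reward exceeds the prediction,
    ~0 for a fully predicted reward, and negative when reward is
    less than predicted (mirroring the dopamine signal)."""
    v = v0
    errors = []
    for r in rewards:
        delta = r - v          # reward prediction error
        v += alpha * delta     # error-driven learning
        errors.append(delta)
    return v, errors

# A constant reward becomes fully predicted: the error decays toward 0,
# as the phasic dopamine response does for a fully predicted reward.
v, errors = rescorla_wagner([1.0] * 50)
```

    As the reward becomes predicted, the error fades toward baseline, which is the learning-theoretic core of the abstract's description.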

  10. Evaluation of the DRAGON code for VHTR design analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Taiwo, T. A.; Kim, T. K.; Nuclear Engineering Division

    2006-01-12

    This letter report summarizes three activities that were undertaken in FY 2005 to gather information on the DRAGON code and to perform limited evaluations of the code performance when used in the analysis of the Very High Temperature Reactor (VHTR) designs. These activities include: (1) Use of the code to model the fuel elements of the helium-cooled and liquid-salt-cooled VHTR designs. Results were compared to those from another deterministic lattice code (WIMS8) and a Monte Carlo code (MCNP). (2) The preliminary assessment of the nuclear data library currently used with the code and libraries that have been provided by the IAEA WIMS-D4 Library Update Project (WLUP). (3) DRAGON workshop held to discuss the code capabilities for modeling the VHTR.

  11. Evaluation of the DRAGON code for VHTR design analysis

    International Nuclear Information System (INIS)

    Taiwo, T. A.; Kim, T. K.; Nuclear Engineering Division

    2006-01-01

    This letter report summarizes three activities that were undertaken in FY 2005 to gather information on the DRAGON code and to perform limited evaluations of the code performance when used in the analysis of the Very High Temperature Reactor (VHTR) designs. These activities include: (1) Use of the code to model the fuel elements of the helium-cooled and liquid-salt-cooled VHTR designs. Results were compared to those from another deterministic lattice code (WIMS8) and a Monte Carlo code (MCNP). (2) The preliminary assessment of the nuclear data library currently used with the code and libraries that have been provided by the IAEA WIMS-D4 Library Update Project (WLUP). (3) DRAGON workshop held to discuss the code capabilities for modeling the VHTR

  12. An On-Chip Learning Neuromorphic Autoencoder With Current-Mode Transposable Memory Read and Virtual Lookup Table.

    Science.gov (United States)

    Cho, Hwasuk; Son, Hyunwoo; Seong, Kihwan; Kim, Byungsub; Park, Hong-June; Sim, Jae-Yoon

    2018-02-01

    This paper presents an IC implementation of an on-chip learning neuromorphic autoencoder unit in the form of a rate-based spiking neural network. With a current-mode signaling scheme embedded in a 500 × 500 6b SRAM-based memory, the proposed architecture achieves simultaneous processing of multiplications and accumulations. In addition, a transposable memory read for both forward and backward propagations and a virtual lookup table are also proposed to perform unsupervised learning of a restricted Boltzmann machine. The IC is fabricated in a 28-nm CMOS process and is verified in a three-layer network of an encoder-decoder pair for training and recovery of images with two-dimensional pixels. With a dataset of 50 digits, the IC shows a normalized root mean square error of 0.078. Measured energy efficiencies are 4.46 pJ per synaptic operation for inference and 19.26 pJ per synaptic weight update for learning, respectively. The learning performance is also estimated by simulation when the proposed hardware architecture is extended to batch training on the 60 000-image MNIST dataset.
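
    The unsupervised learning the chip supports, restricted Boltzmann machine training by contrastive divergence, can be sketched in software. The forward pass uses W and the reconstruction pass uses W.T, which is exactly what a transposable memory read provides in hardware. This is a minimal, biasless sketch under our own naming, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(W, v0, lr=0.1):
    """One contrastive-divergence (CD-1) update of an RBM weight
    matrix W: forward passes use W, the reconstruction pass uses
    W.T (the transposable read in the hardware)."""
    h0 = sigmoid(v0 @ W)                          # positive phase
    h_sample = (rng.random(h0.shape) < h0) * 1.0  # stochastic hidden state
    v1 = sigmoid(h_sample @ W.T)                  # reconstruction
    h1 = sigmoid(v1 @ W)                          # negative phase
    return W + lr * (np.outer(v0, h0) - np.outer(v1, h1))

def reconstruct(W, v):
    return sigmoid(sigmoid(v @ W) @ W.T)

# Train on a single binary pattern; reconstruction error should fall.
v = np.array([1.0, 0.0, 1.0, 0.0, 1.0, 0.0])
W = rng.normal(0.0, 0.1, (6, 3))
err_before = np.mean((reconstruct(W, v) - v) ** 2)
for _ in range(200):
    W = cd1_step(W, v)
err_after = np.mean((reconstruct(W, v) - v) ** 2)
```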

  13. Current deposition profiles in advanced geometries

    International Nuclear Information System (INIS)

    Wright, J.C.; Phillips, C.K.; Bonoli, P.T.

    1997-01-01

    In advanced toroidal devices, plasma shaping can have a significant effect on quantities of interest, including the radio frequency (RF) deposited power and current. Most 2D RF modeling codes use a parameterization of current drive efficiencies to calculate fast wave driven currents. This parameterization is derived from a ray-tracing model in a low-beta model equilibrium. There are difficulties in applying it to a spectrum of waves, and it cannot account for multiple resonances and coherency effects between the electrons and the waves. By evaluating a formulation of the quasilinear diffusion coefficient in an arbitrary inhomogeneous geometry with the fields from a full wave code, we address the effects of wave spectra, plasma inhomogeneity, and plasma profile on the evaluation of current deposition profiles. Current profiles are calculated directly from the quasilinear diffusion using the adjoint formulation, with the magnetic equilibrium specified consistently in both the adjoint routine and the full wave code. Results are benchmarked by comparing a power deposition calculation from conductivity to one from the quasilinear expression. RF driven current profiles for various devices, including tokamaks with different aspect ratios, will be presented. copyright 1997 American Institute of Physics

  14. Toric Varieties and Codes, Error-correcting Codes, Quantum Codes, Secret Sharing and Decoding

    DEFF Research Database (Denmark)

    Hansen, Johan Peder

    We present toric varieties and associated toric codes and their decoding. Toric codes are applied to construct Linear Secret Sharing Schemes (LSSS) with strong multiplication by the Massey construction. Asymmetric Quantum Codes are obtained from toric codes by the A.R. Calderbank, P.W. Shor and A.M. Steane construction of stabilizer codes (CSS) from linear codes containing their dual codes.
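
    The CSS construction mentioned in the abstract requires a classical linear code that contains its dual. A quick way to verify that property is sketched below with the [7,4] Hamming code (the textbook example behind the Steane [[7,1,3]] code) rather than a toric code:

```python
import numpy as np

# Parity-check matrix H of the binary [7,4] Hamming code.
# Its rows generate the dual code C^perp (the [7,3] simplex code).
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]])

# x is a codeword of C iff H @ x = 0 (mod 2), so C^perp is contained
# in C iff H annihilates its own rows: H @ H.T = 0 (mod 2).
dual_containing = not np.any((H @ H.T) % 2)
print(dual_containing)  # True
```

    For an [n,k] code with C⊥ ⊆ C, the CSS recipe then yields an [[n, 2k − n]] quantum code, here [[7, 2·4 − 7]] = [[7, 1]].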

  15. Additional extensions to the NASCAP computer code, volume 3

    Science.gov (United States)

    Mandell, M. J.; Cooke, D. L.

    1981-01-01

    The ION computer code is designed to calculate charge exchange ion densities, electric potentials, plasma temperatures, and current densities external to a neutralized ion engine in R-Z geometry. The present version assumes the beam ion current and density to be known and specified, and the neutralizing electrons to originate from a hot-wire ring surrounding the beam orifice. The plasma is treated as being resistive, with an electron relaxation time comparable to the plasma frequency. Together with the thermal and electrical boundary conditions described below and other straightforward engine parameters, these assumptions suffice to determine the required quantities. The ION code, written in ASCII FORTRAN for UNIVAC 1100 series computers, is designed to be run interactively, although it can also be run in batch mode. The input is free-format, and the output is mainly graphical, using the machine-independent graphics developed for the NASCAP code. The executive routine calls the code's major subroutines in user-specified order, and the code allows great latitude for restart and parameter change.

  16. Politicas de uniformes y codigos de vestuario (Uniforms and Dress-Code Policies). ERIC Digest.

    Science.gov (United States)

    Lumsden, Linda

    This digest in Spanish examines schools' dress-code policies and discusses the legal considerations and research findings about the effects of such changes. Most revisions to dress codes involve the use of uniforms, typically as a way to curb school violence and create a positive learning environment. A recent survey of secondary school principals…

  17. Improved lossless intra coding for H.264/MPEG-4 AVC.

    Science.gov (United States)

    Lee, Yung-Lyul; Han, Ki-Hun; Sullivan, Gary J

    2006-09-01

    A new lossless intra coding method based on sample-by-sample differential pulse code modulation (DPCM) is presented as an enhancement of the H.264/MPEG-4 AVC standard. The H.264/AVC design includes a multidirectional spatial prediction method to reduce spatial redundancy by using neighboring samples as a prediction for the samples in a block of data to be encoded. In the new lossless intra coding method, the spatial prediction is performed based on samplewise DPCM instead of in the block-based manner used in the current H.264/AVC standard, while the block structure is retained for the residual difference entropy coding process. We show that the new method, based on samplewise DPCM, does not have a major complexity penalty, despite its apparent pipeline dependencies. Experiments show that the new lossless intra coding method reduces the bit rate by approximately 12% in comparison with the lossless intra coding method previously included in the H.264/AVC standard. As a result, the new method is currently being adopted into the H.264/AVC standard in a new enhancement project.
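
    The sample-by-sample prediction idea can be illustrated with a toy horizontal (left-neighbor) DPCM over one row of samples. The actual standard uses H.264's directional intra prediction modes, so this is only a sketch of the principle, with illustrative names:

```python
def dpcm_encode(row, pred0=128):
    """Samplewise DPCM: each sample is predicted by the previous
    reconstructed sample (horizontal prediction) and only the
    residual is passed on to entropy coding."""
    residuals, prev = [], pred0
    for s in row:
        residuals.append(s - prev)
        prev = s                 # lossless: reconstruction == sample
    return residuals

def dpcm_decode(residuals, pred0=128):
    row, prev = [], pred0
    for r in residuals:
        prev += r
        row.append(prev)
    return row

row = [120, 123, 125, 125, 130]
residuals = dpcm_encode(row)      # small residuals, cheaper to entropy-code
restored = dpcm_decode(residuals)
```

    Because the residual chain is exact integer arithmetic, the decode reproduces the input bit-for-bit, which is the lossless property the standard extension relies on.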

  18. Computer codes in particle transport physics

    International Nuclear Information System (INIS)

    Pesic, M.

    2004-01-01

    Simulation of the transport and interaction of various particles in complex media and over a wide energy range (from 1 MeV up to 1 TeV) is a very complicated problem that requires a valid model of the real process in nature and an appropriate solving tool - a computer code and data library. A brief overview of computer codes based on Monte Carlo techniques for simulation of the transport and interaction of hadrons and ions over a wide energy range in three-dimensional (3D) geometry is given. First, attention is paid to the approach to the solution of the problem - a process in nature - by selection of the appropriate 3D model and the corresponding tools - computer codes and cross-section data libraries. The process of data collection and evaluation from experimental measurements, and the theoretical approach to establishing reliable libraries of evaluated cross-section data, is a long, difficult and not straightforward activity. For this reason, world reference data centers and specialized ones are acknowledged, together with the currently available, state-of-the-art evaluated nuclear data libraries, such as ENDF/B-VI, JEF, JENDL, CENDL, BROND, etc. Codes for experimental and theoretical data evaluation (e.g., SAMMY and GNASH), together with codes for data processing (e.g., NJOY, PREPRO and GRUCON), are briefly described. Examples of data evaluation and data processing to generate computer-usable data libraries are shown. Among the numerous and various computer codes developed for particle transport physics, only the most general ones are described: MCNPX, FLUKA and SHIELD. A short overview of the basic applications of these codes, the physical models implemented with their limitations, and the energy ranges of particles and types of interactions covered, is given. General information about the codes also covers the programming language, operating system, calculation speed and code availability. An example of the increased computation speed obtained by running the MCNPX code on an MPI cluster, compared to the sequential option of the code, is also given.
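
    The speedup from parallelizing independent Monte Carlo histories, as with the MCNPX MPI option mentioned above, comes from the histories being statistically independent. Nothing in the following sketch is MCNPX-specific; it is a generic toy (estimating pi by rejection sampling) with the histories split across worker processes:

```python
import random
from multiprocessing import Pool

def batch_hits(args):
    """One worker's share of independent Monte Carlo histories:
    random points in the unit square, counting those inside the
    quarter circle (the hit fraction estimates pi/4)."""
    n, seed = args
    rng = random.Random(seed)
    return sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0 for _ in range(n))

def estimate_pi(total=400_000, workers=4):
    per = total // workers
    with Pool(workers) as pool:   # histories split across processes
        hits = pool.map(batch_hits, [(per, s) for s in range(workers)])
    return 4.0 * sum(hits) / (per * workers)

if __name__ == "__main__":
    print(estimate_pi())
```

    Because each history is independent, the work divides almost perfectly among processes; the same reasoning underlies MPI parallelization of production transport codes.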

  19. Teaching Billing and Coding to Medical Students: A Pilot Study

    Directory of Open Access Journals (Sweden)

    Jiaxin Tran

    2013-08-01

    Complex billing practices cost the US healthcare system billions of dollars annually. Coding for outpatient office visits [known as Evaluation & Management (E&M) services] is particularly fraught with errors. The best way to ensure proper billing and coding by practicing physicians is to teach this as part of the medical school curriculum. Here, in a pilot study, we show that medical students can learn the basic principles well from lectures. This approach is easy to implement into a medical school curriculum.

  20. SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks.

    Science.gov (United States)

    Zenke, Friedemann; Ganguli, Surya

    2018-04-13

    A vast majority of computation in the brain is performed by spiking neural networks. Despite the ubiquity of such spiking, we currently lack an understanding of how biological spiking neural circuits learn and compute in vivo, as well as how we can instantiate such capabilities in artificial spiking circuits in silico. Here we revisit the problem of supervised learning in temporally coding multilayer spiking neural networks. First, by using a surrogate gradient approach, we derive SuperSpike, a nonlinear voltage-based three-factor learning rule capable of training multilayer networks of deterministic integrate-and-fire neurons to perform nonlinear computations on spatiotemporal spike patterns. Second, inspired by recent results on feedback alignment, we compare the performance of our learning rule under different credit assignment strategies for propagating output errors to hidden units. Specifically, we test uniform, symmetric, and random feedback, finding that simpler tasks can be solved with any type of feedback, while more complex tasks require symmetric feedback. In summary, our results open the door to obtaining a better scientific understanding of learning and computation in spiking neural networks by advancing our ability to train them to solve nonlinear problems involving transformations between different spatiotemporal spike time patterns.
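
    The surrogate gradient idea at the heart of SuperSpike is to keep the hard spiking threshold in the forward pass but substitute a smooth, fast-sigmoid-shaped derivative in the backward pass. The toy below shows only that core trick on a single threshold unit, with our own illustrative names and constants; it is not the paper's three-factor rule:

```python
import numpy as np

def surrogate_grad(u, beta=10.0):
    """SuperSpike-style surrogate derivative for the spike
    nonlinearity: a fast-sigmoid shape, 1 / (1 + beta*|u|)**2,
    standing in for the ill-defined derivative of the hard
    threshold in the backward pass."""
    return 1.0 / (1.0 + beta * np.abs(u)) ** 2

# Toy use: drive a single threshold unit to emit a spike for a
# given input by descending the surrogate gradient.
x = np.array([1.0, 1.0])
w = np.array([0.1, 0.1])
theta = 1.0                           # firing threshold
for _ in range(200):
    u = np.dot(w, x) - theta          # membrane potential minus threshold
    spike = float(u > 0.0)            # forward pass: hard threshold
    err = spike - 1.0                 # target: the unit should spike
    w -= 0.5 * err * surrogate_grad(u) * x   # surrogate backward pass
```

    With a true step-function derivative the gradient would be zero almost everywhere and learning would stall; the surrogate keeps a usable error signal flowing, which is what makes multilayer training feasible.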

  1. Cathodal Transcranial Direct Current Stimulation Over Left Dorsolateral Prefrontal Cortex Area Promotes Implicit Motor Learning in a Golf Putting Task.

    Science.gov (United States)

    Zhu, Frank F; Yeung, Andrew Y; Poolton, Jamie M; Lee, Tatia M C; Leung, Gilberto K K; Masters, Rich S W

    2015-01-01

    Implicit motor learning is characterized by low dependence on working memory and stable performance despite stress, fatigue, or multi-tasking. However, current paradigms for implicit motor learning are based on behavioral interventions that are often task-specific and limited when applied in practice. To investigate whether cathodal transcranial direct current stimulation (tDCS) over the left dorsolateral prefrontal cortex (DLPFC) area during motor learning suppressed working memory activity and reduced explicit verbal-analytical involvement in movement control, thereby promoting implicit motor learning. Twenty-seven healthy individuals practiced a golf putting task during a Training Phase while receiving either real cathodal tDCS stimulation over the left DLPFC area or sham stimulation. Their performance was assessed during a Test phase on another day. Verbal working memory capacity was assessed before and after the Training Phase, and before the Test Phase. Compared to sham stimulation, real stimulation suppressed verbal working memory activity after the Training Phase, but enhanced golf putting performance during the Training Phase and the Test Phase, especially when participants were required to multi-task. Cathodal tDCS over the left DLPFC may foster implicit motor learning and performance in complex real-life motor tasks that occur during sports, surgery or motor rehabilitation. Copyright © 2015 Elsevier Inc. All rights reserved.

  2. Advanced hardware design for error correcting codes

    CERN Document Server

    Coussy, Philippe

    2015-01-01

    This book provides thorough coverage of error correcting techniques. It includes essential basic concepts and the latest advances on key topics in design, implementation, and optimization of hardware/software systems for error correction. The book’s chapters are written by internationally recognized experts in this field. Topics include the evolution of error correction techniques, industrial user needs, architectures, and design approaches for the most advanced error correcting codes (Polar Codes, Non-Binary LDPC, Product Codes, etc.). This book provides access to recent results, and is suitable for graduate students and researchers of mathematics, computer science, and engineering. • Examines how to optimize the architecture of hardware design for error correcting codes; • Presents error correction codes from theory to optimized architecture for the current and next generation standards; • Provides coverage of industrial user needs and advanced error correcting techniques.

  3. A simple in-surge pressure analysis using the SPACE code

    International Nuclear Information System (INIS)

    Youn, Bum Soo; Kim, Yo Han; Lee, Dong Hyuk; Yang, Chang Keun; Kim, Se Yun; Ha, Sang Jun

    2010-01-01

    Currently, the nuclear safety analysis codes used in Korea were all developed overseas. These codes require substantial licensing fees, and permission must be obtained for their use in the country. In addition, to win orders for nuclear power plants, Korea must secure a safety analysis code based on independent domestic technology. Therefore, the Korea Electric Power Research Institute (KEPRI) is developing a domestic nuclear safety analysis code, SPACE (Safety and Performance Analysis Code for nuclear power plants). To assess the capability of the pressurizer model in the SPACE code under development, it was compared with an existing commercial nuclear safety analysis code, RETRAN

  4. Automatic coding method of the ACR Code

    International Nuclear Information System (INIS)

    Park, Kwi Ae; Ihm, Jong Sool; Ahn, Woo Hyun; Baik, Seung Kook; Choi, Han Yong; Kim, Bong Gi

    1993-01-01

    The authors developed a computer program for automatic coding of the ACR (American College of Radiology) code. Automatic coding of the ACR code is essential for computerization of the data in the department of radiology. This program was written in the FoxBASE language and has been used for automatic coding of diagnoses in the Department of Radiology, Wallace Memorial Baptist, since May 1992. The ACR dictionary files consisted of 11 files, one for the organ code and the others for the pathology code. The organ code was obtained by typing the organ name or the code number itself from among the upper- and lower-level codes of the selected entry that were simultaneously displayed on the screen. According to the first number of the selected organ code, the corresponding pathology code file was chosen automatically. In a similar fashion to organ code selection, the proper pathology code was obtained. An example of an obtained ACR code is '131.3661'. This procedure was reproducible regardless of the number of fields of data. Because this program was written in 'User's Defined Function' form, decoding of the stored ACR code was achieved by the same program, and incorporation of this program into another data-processing program was possible. This program has the merits of simple operation, accurate and detailed coding, and easy adjustment for other programs. Therefore, this program can be used for automation of routine work in the department of radiology
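
The two-level lookup the abstract describes can be sketched as follows; the dictionary entries are invented placeholders (only the example code '131.3661' comes from the text):

```python
# Sketch of the two-level ACR-style lookup: the first digit of the chosen
# organ code selects the pathology dictionary, and the final code joins the
# two parts as "organ.pathology". All dictionary contents are hypothetical
# placeholders, not real ACR codes.

organs = {"131": "organ A", "761": "organ B"}   # hypothetical entries
pathology_files = {                             # keyed by first organ digit
    "1": {"3661": "pathology X"},               # hypothetical entries
    "7": {"2210": "pathology Y"},
}

def acr_code(organ: str, pathology: str) -> str:
    """Validate the pair against the dictionaries and join as 'organ.pathology'."""
    if organ not in organs:
        raise KeyError(f"unknown organ code {organ}")
    table = pathology_files[organ[0]]           # file chosen by first digit
    if pathology not in table:
        raise KeyError(f"unknown pathology code {pathology}")
    return f"{organ}.{pathology}"

assert acr_code("131", "3661") == "131.3661"
```

The validation step mirrors why the original program's coding was "accurate": only code pairs present in the dictionary files can be stored.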

  5. Current algorithms used in reactor safety codes and the impact of future computer development on these algorithms

    International Nuclear Information System (INIS)

    Mahaffy, J.H.; Liles, D.R.; Woodruff, S.B.

    1985-01-01

    Computational methods and solution procedures used in the US Nuclear Regulatory Commission's reactor safety systems codes, Transient Reactor Analysis Code (TRAC) and Reactor Leak and Power Safety Excursion Code (RELAP), are reviewed. Methods used in TRAC-PF1/MOD1, including the stability-enhancing two-step (SETS) technique, which permits fast computations by allowing time steps larger than the material Courant stability limit, are described in detail, and the differences from RELAP5/MOD2 are noted. Developments in computing, including parallel and vector processing, and their applicability to nuclear reactor safety codes are described. These developments, coupled with appropriate numerical methods, make detailed faster-than-real-time reactor safety analysis a realistic near-term possibility
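
The material Courant limit that SETS circumvents is simple to state for a 1-D mesh: an explicit scheme's time step may not exceed the fluid transit time through any cell. A sketch with illustrative values (the function name is ours, not TRAC's):

```python
# Material Courant limit for an explicit 1-D hydraulic mesh: the time step
# may not exceed the transit time of fluid through any cell. Schemes such
# as SETS relax exactly this restriction, allowing larger steps. The mesh
# and velocities below are illustrative, not from any reactor model.

def courant_limit(dx, u, eps=1e-12):
    """Largest stable explicit time step: min over cells of dx / |u|."""
    return min(d / max(abs(v), eps) for d, v in zip(dx, u))

dt = courant_limit(dx=[0.5, 0.5, 1.0], u=[2.0, 4.0, 1.0])
assert dt == 0.125   # limited by the 0.5 m cell with 4 m/s flow
```

Removing this per-cell restriction is what lets a SETS-based code run faster than real time on slow transients, since the step size can track the physics rather than the mesh.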

  6. Libre knowledge, libre learning and global development

    CSIR Research Space (South Africa)

    Tucker, KC

    2006-10-01

    Full Text Available Findings on formal learning (Rishab Ghosh): in comparison with formal ICT courses, FLOSS provides a better, practical learning environment for many technical skills: writing re-usable code and debugging; working... running and maintaining complex software systems; basic/introductory programming skills; looking for and fixing bugs; becoming familiar with different programming languages; writing code in a way that it can be re-used; designing modular code; documenting code; creating new algorithms...

  7. High-speed architecture for the decoding of trellis-coded modulation

    Science.gov (United States)

    Osborne, William P.

    1992-01-01

    Since 1971, when the Viterbi Algorithm was introduced as the optimal method of decoding convolutional codes, improvements in circuit technology, especially VLSI, have steadily increased its speed and practicality. Trellis-Coded Modulation (TCM) combines convolutional coding with higher level modulation (non-binary source alphabet) to provide forward error correction and spectral efficiency. For binary codes, the current state-of-the-art is a 64-state Viterbi decoder on a single CMOS chip, operating at a data rate of 25 Mbps. Recently, there has been an interest in increasing the speed of the Viterbi Algorithm by improving the decoder architecture, or by simplifying the algorithm itself. Designs employing new architectural techniques are now in existence; however, these techniques are currently applied to simpler binary codes, not to TCM. The purpose of this report is to discuss TCM architectural considerations in general, and to present the design, at the logic gate level, of a specific TCM decoder which applies these considerations to achieve high-speed decoding.
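
The algorithm at the heart of the report can be sketched for the binary case: a hard-decision Viterbi decoder for a rate-1/2, constraint-length-3 convolutional code (generators 7 and 5 octal). This is a generic textbook sketch, not the report's design; a TCM decoder replaces the Hamming branch metric used here with a Euclidean metric over the signal constellation.

```python
# Hard-decision Viterbi decoding for a rate-1/2, constraint-length-3
# convolutional code (generators 7 and 5 octal).

G = [0b111, 0b101]                          # generator polynomials

def encode(bits):
    """Rate-1/2 convolutional encoder; 2 output bits per input bit."""
    state, out = 0, []
    for b in bits:
        reg = (b << 2) | state              # newest bit enters the high end
        out += [bin(reg & g).count("1") & 1 for g in G]
        state = reg >> 1
    return out

def viterbi(received):
    """Keep only the best-metric (survivor) path into each trellis state."""
    metrics = {0: (0, [])}                  # state -> (path metric, input bits)
    for i in range(0, len(received), 2):
        pair, nxt = received[i:i + 2], {}
        for state, (m, path) in metrics.items():
            for b in (0, 1):
                reg = (b << 2) | state
                branch = [bin(reg & g).count("1") & 1 for g in G]
                cost = m + sum(x != y for x, y in zip(branch, pair))
                s = reg >> 1
                if s not in nxt or cost < nxt[s][0]:
                    nxt[s] = (cost, path + [b])
        metrics = nxt
    return min(metrics.values())[1]         # survivor with the best metric

bits = [1, 0, 1, 1, 0, 0]                   # message with two flushing zeros
noisy = encode(bits)
noisy[3] ^= 1                               # one channel bit flipped
assert viterbi(noisy) == bits               # the single error is corrected
```

The add-compare-select step inside the inner loop is exactly the operation that high-speed architectures parallelize across states.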

  8. Surprised at all the entropy: hippocampal, caudate and midbrain contributions to learning from prediction errors.

    Directory of Open Access Journals (Sweden)

    Anne-Marike Schiffer

    Full Text Available Influential concepts in neuroscientific research cast the brain as a predictive machine that revises its predictions when they are violated by sensory input. This relates to the predictive coding account of perception, but also to learning. Learning from prediction errors has been suggested to take place in the hippocampal memory system as well as in the basal ganglia. The present fMRI study used an action-observation paradigm to investigate the contributions of the hippocampus, caudate nucleus and midbrain dopaminergic system to different types of learning: learning in the absence of prediction errors, learning from prediction errors, and responding to the accumulation of prediction errors in unpredictable stimulus configurations. We conducted analyses of the regions of interest's BOLD responses to these different types of learning, implementing a bootstrapping procedure to correct for false positives. We found both the caudate nucleus and the hippocampus to be activated by perceptual prediction errors. The hippocampal responses seemed to relate to the associative mismatch between a stored representation and current sensory input. Moreover, its response was significantly influenced by the average information, or Shannon entropy, of the stimulus material. In accordance with earlier results, the habenula was activated by perceptual prediction errors. Lastly, we found that the substantia nigra was activated by the novelty of sensory input. In sum, we established that the midbrain dopaminergic system, the hippocampus, and the caudate nucleus were to different degrees significantly involved in the three different types of learning: acquisition of new information, learning from prediction errors and responding to unpredictable stimulus developments. We relate learning from perceptual prediction errors to the concept of predictive coding and related information theoretic accounts.

  9. Surprised at all the entropy: hippocampal, caudate and midbrain contributions to learning from prediction errors.

    Science.gov (United States)

    Schiffer, Anne-Marike; Ahlheim, Christiane; Wurm, Moritz F; Schubotz, Ricarda I

    2012-01-01

    Influential concepts in neuroscientific research cast the brain as a predictive machine that revises its predictions when they are violated by sensory input. This relates to the predictive coding account of perception, but also to learning. Learning from prediction errors has been suggested to take place in the hippocampal memory system as well as in the basal ganglia. The present fMRI study used an action-observation paradigm to investigate the contributions of the hippocampus, caudate nucleus and midbrain dopaminergic system to different types of learning: learning in the absence of prediction errors, learning from prediction errors, and responding to the accumulation of prediction errors in unpredictable stimulus configurations. We conducted analyses of the regions of interest's BOLD responses to these different types of learning, implementing a bootstrapping procedure to correct for false positives. We found both the caudate nucleus and the hippocampus to be activated by perceptual prediction errors. The hippocampal responses seemed to relate to the associative mismatch between a stored representation and current sensory input. Moreover, its response was significantly influenced by the average information, or Shannon entropy, of the stimulus material. In accordance with earlier results, the habenula was activated by perceptual prediction errors. Lastly, we found that the substantia nigra was activated by the novelty of sensory input. In sum, we established that the midbrain dopaminergic system, the hippocampus, and the caudate nucleus were to different degrees significantly involved in the three different types of learning: acquisition of new information, learning from prediction errors and responding to unpredictable stimulus developments. We relate learning from perceptual prediction errors to the concept of predictive coding and related information theoretic accounts.
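
The "average information, or Shannon entropy" that the study relates to hippocampal responses is computed directly from the symbol frequencies of the stimulus stream:

```python
# Shannon entropy of a stimulus sequence: H = -sum p_i * log2(p_i) over the
# observed symbol frequencies. The example sequences are illustrative.
from collections import Counter
from math import log2

def shannon_entropy(sequence):
    """Entropy in bits per symbol, estimated from observed frequencies."""
    counts = Counter(sequence)
    n = len(sequence)
    return -sum((c / n) * log2(c / n) for c in counts.values())

assert shannon_entropy("ABAB") == 1.0            # two equiprobable symbols
assert shannon_entropy("AAAA") == 0.0            # fully predictable stream
assert round(shannon_entropy("ABCD"), 3) == 2.0  # four equiprobable symbols
```

A fully predictable stream carries zero entropy; the more unpredictable the stimulus configuration, the higher the value, which is the quantity the fMRI response was regressed against.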

  10. The Current Status of E-learning and Strategies to Enhance Educational Competitiveness in Korean Higher Education

    Directory of Open Access Journals (Sweden)

    Junghoon Leem

    2007-03-01

    Full Text Available The purpose of this study was to examine the current status of e-Learning in Korean higher education and find ways to encourage the further use and development of e-Learning systems that aim to enhance Korea's academic competitiveness. A total of 201 universities in Korea (27 national and public, 163 private, and 11 national universities of education) were examined in this study. At the time of the study, 85 percent of the universities and colleges had investigated implementing e-Learning. There were special e-Learning teams in most national and public universities, as well as private universities and colleges. This study found that teachers and learners alike lacked meaningful support systems and opportunities to actively participate in e-Learning programs. Although this lack of support was endemic, it was more acute in private universities, private colleges, and universities of education than in mid-sized, small-sized, and provincial universities and colleges. Except for a few mid- and small-sized universities and colleges, most large universities and colleges were equipped with technical support such as infrastructure and operational platforms. These same schools, however, did not provide institutional support, nor did they employ the policies needed to further the quality and enhancement of e-Learning offerings. Also, no meaningful link was found between schools and industry, nor was adequate financial support in place for the implementation of e-Learning systems, simply because many universities failed to allocate sufficient funding for e-Learning. In conclusion, the strategies for enhancing university competitiveness through e-Learning are as follows: 1) establishing support strategies according to the types of universities; 2) developing quality assurance systems for e-Learning; 3) enhancing support systems for professors and learners; 4) developing

  11. Sequence Coding and Search System for licensee event reports: code listings. Volume 2

    International Nuclear Information System (INIS)

    Gallaher, R.B.; Guymon, R.H.; Mays, G.T.; Poore, W.P.; Cagle, R.J.; Harrington, K.H.; Johnson, M.P.

    1985-04-01

    Operating experience data from nuclear power plants are essential for safety and reliability analyses, especially analyses of trends and patterns. The licensee event reports (LERs) that are submitted to the Nuclear Regulatory Commission (NRC) by the nuclear power plant utilities contain much of this data. The NRC's Office for Analysis and Evaluation of Operational Data (AEOD) has developed, under contract with NSIC, a system for codifying the events reported in the LERs. The primary objective of the Sequence Coding and Search System (SCSS) is to reduce the descriptive text of the LERs to coded sequences that are both computer-readable and computer-searchable. This system provides a structured format for detailed coding of component, system, and unit effects as well as personnel errors. The database contains all current LERs submitted by nuclear power plant utilities for events occurring since 1981 and is updated on a continual basis. Volume 2 contains all valid and acceptable codes used for searching and encoding the LER data. This volume contains updated material through amendment 1 to revision 1 of the working version of ORNL/NSIC-223, Vol. 2

  12. Development of a new EMP code at LANL

    Science.gov (United States)

    Colman, J. J.; Roussel-Dupré, R. A.; Symbalisty, E. M.; Triplett, L. A.; Travis, B. J.

    2006-05-01

    A new code for modeling the generation of an electromagnetic pulse (EMP) by a nuclear explosion in the atmosphere is being developed. The source of the EMP is the Compton current produced by the prompt radiation (γ-rays, X-rays, and neutrons) of the detonation. As a first step in building a multi-dimensional EMP code we have written three kinetic codes, Plume, Swarm, and Rad. Plume models the transport of energetic electrons in air. The Plume code solves the relativistic Fokker-Planck equation over a specified energy range that can include ~3 keV to 50 MeV and computes the resulting electron distribution function at each cell in a two dimensional spatial grid. The energetic electrons are allowed to transport, scatter, and experience Coulombic drag. Swarm models the transport of lower energy electrons in air, spanning 0.005 eV to 30 keV. The Swarm code performs a full 2-D solution to the Boltzmann equation for electrons in the presence of an applied electric field. Over this energy range the relevant processes to be tracked are elastic scattering, three body attachment, two body attachment, rotational excitation, vibrational excitation, electronic excitation, and ionization. All of these occur due to collisions between the electrons and neutral bodies in air. The Rad code solves the full radiation transfer equation in the energy range of 1 keV to 100 MeV. It includes effects of photo-absorption, Compton scattering, and pair-production. All of these codes employ a spherical coordinate system in momentum space and a cylindrical coordinate system in configuration space. The "z" axis of the momentum and configuration spaces is assumed to be parallel and we are currently also assuming complete spatial symmetry around the "z" axis. Benchmarking for each of these codes will be discussed as well as the way forward towards an integrated modern EMP code.

  13. DNA Barcoding through Quaternary LDPC Codes.

    Science.gov (United States)

    Tapia, Elizabeth; Spetale, Flavio; Krsticevic, Flavia; Angelone, Laura; Bulacio, Pilar

    2015-01-01

    For many parallel applications of Next-Generation Sequencing (NGS) technologies short barcodes able to accurately multiplex a large number of samples are demanded. To address these competitive requirements, the use of error-correcting codes is advised. Current barcoding systems are mostly built from short random error-correcting codes, a feature that strongly limits their multiplexing accuracy and experimental scalability. To overcome these problems on sequencing systems impaired by mismatch errors, the alternative use of binary BCH and pseudo-quaternary Hamming codes has been proposed. However, these codes either fail to provide a fine-scale with regard to size of barcodes (BCH) or have intrinsic poor error correcting abilities (Hamming). Here, the design of barcodes from shortened binary BCH codes and quaternary Low Density Parity Check (LDPC) codes is introduced. Simulation results show that although accurate barcoding systems of high multiplexing capacity can be obtained with any of these codes, using quaternary LDPC codes may be particularly advantageous due to the lower rates of read losses and undetected sample misidentification errors. Even at mismatch error rates of 10^-2 per base, 24-nt LDPC barcodes can be used to multiplex roughly 2000 samples with a sample misidentification error rate in the order of 10^-9 at the expense of a rate of read losses just in the order of 10^-6.

  14. DNA Barcoding through Quaternary LDPC Codes.

    Directory of Open Access Journals (Sweden)

    Elizabeth Tapia

    Full Text Available For many parallel applications of Next-Generation Sequencing (NGS) technologies short barcodes able to accurately multiplex a large number of samples are demanded. To address these competitive requirements, the use of error-correcting codes is advised. Current barcoding systems are mostly built from short random error-correcting codes, a feature that strongly limits their multiplexing accuracy and experimental scalability. To overcome these problems on sequencing systems impaired by mismatch errors, the alternative use of binary BCH and pseudo-quaternary Hamming codes has been proposed. However, these codes either fail to provide a fine-scale with regard to size of barcodes (BCH) or have intrinsic poor error correcting abilities (Hamming). Here, the design of barcodes from shortened binary BCH codes and quaternary Low Density Parity Check (LDPC) codes is introduced. Simulation results show that although accurate barcoding systems of high multiplexing capacity can be obtained with any of these codes, using quaternary LDPC codes may be particularly advantageous due to the lower rates of read losses and undetected sample misidentification errors. Even at mismatch error rates of 10^-2 per base, 24-nt LDPC barcodes can be used to multiplex roughly 2000 samples with a sample misidentification error rate in the order of 10^-9 at the expense of a rate of read losses just in the order of 10^-6.
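
The demultiplexing mechanism all such barcode designs serve can be sketched as minimum-distance decoding; the 4-nt barcodes below are toy examples, far shorter than the 24-nt LDPC barcodes discussed above:

```python
# Minimum-distance demultiplexing: a read's barcode is assigned to the
# closest dictionary barcode, and rejected (a "read loss") when the call is
# ambiguous or too distant. The 4-nt barcodes below are toy placeholders.

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def demultiplex(read_tag, barcodes, max_dist=1):
    """Return the unique barcode within max_dist of read_tag, else None."""
    hits = sorted((hamming(read_tag, bc), bc) for bc in barcodes)
    if hits[0][0] <= max_dist and (len(hits) == 1 or hits[1][0] > hits[0][0]):
        return hits[0][1]
    return None    # rejected: ambiguous call or too many mismatches

codes = ["AACC", "GGTT", "CAGT"]
assert demultiplex("AACG", codes) == "AACC"   # one mismatch, corrected
assert demultiplex("AATT", codes) is None     # >= 2 mismatches to every code
```

The trade-off the abstract quantifies follows from this picture: a larger minimum distance between barcodes lowers misidentification errors, while a stricter rejection rule raises the rate of read losses.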

  15. RELAP5/MOD2 code assessment

    International Nuclear Information System (INIS)

    Nithianandan, C.K.; Shah, N.H.; Schomaker, R.J.; Miller, F.R.

    1985-01-01

    Babcock and Wilcox (B and W) has been working with the code developers at EG and G and the US Nuclear Regulatory Commission in assessing the RELAP5/MOD2 computer code for the past year by simulating selected separate-effects tests. The purpose of this assessment has been to evaluate the code for use in MIST (Ref. 2) and OTIS integral system tests simulations and in the prediction of pressurized water reactor transients. B and W evaluated various versions of the code and made recommendations to improve code performance. As a result, the currently released version (cycle 36.1) has been improved considerably over earlier versions. However, further refinements to some of the constitutive models may still be needed to further improve the predictive capability of RELAP5/MOD2. The following versions of the code were evaluated. (1) RELAP/MOD2/Cycle 22 - first released version; (2) YELAP5/Cycle 32 - EG and G test version of RELAP5/MOD2/Cycle 32; (3) RELAP5/MOD2/Cycle 36 - frozen cycle for international code assessment; (4) updates to cycle 36 based on recommendations developed by B and W during the simulation of a Massachusetts Institute of Technology (MIT) pressurizer test; and (5) cycle 36.1 updates received from EG and G

  16. RELAP5/MOD2 code assessment

    Energy Technology Data Exchange (ETDEWEB)

    Nithianandan, C.K.; Shah, N.H.; Schomaker, R.J.; Miller, F.R.

    1985-11-01

    Babcock and Wilcox (B and W) has been working with the code developers at EG and G and the US Nuclear Regulatory Commission in assessing the RELAP5/MOD2 computer code for the past year by simulating selected separate-effects tests. The purpose of this assessment has been to evaluate the code for use in MIST (Ref. 2) and OTIS integral system tests simulations and in the prediction of pressurized water reactor transients. B and W evaluated various versions of the code and made recommendations to improve code performance. As a result, the currently released version (cycle 36.1) has been improved considerably over earlier versions. However, further refinements to some of the constitutive models may still be needed to further improve the predictive capability of RELAP5/MOD2. The following versions of the code were evaluated. (1) RELAP/MOD2/Cycle 22 - first released version; (2) YELAP5/Cycle 32 - EG and G test version of RELAP5/MOD2/Cycle 32; (3) RELAP5/MOD2/Cycle 36 - frozen cycle for international code assessment; (4) updates to cycle 36 based on recommendations developed by B and W during the simulation of a Massachusetts Institute of Technology (MIT) pressurizer test; and (5) cycle 36.1 updates received from EG and G.

  17. Coding in pigeons: Multiple-coding versus single-code/default strategies.

    Science.gov (United States)

    Pinto, Carlos; Machado, Armando

    2015-05-01

    To investigate the coding strategies that pigeons may use in temporal discrimination tasks, pigeons were trained on a matching-to-sample procedure with three sample durations (2 s, 6 s and 18 s) and two comparisons (red and green hues). One comparison was correct following 2-s samples and the other was correct following both 6-s and 18-s samples. Tests were then run to contrast the predictions of two hypotheses concerning the pigeons' coding strategies: multiple-coding and single-code/default. According to the multiple-coding hypothesis, three response rules are acquired, one for each sample. According to the single-code/default hypothesis, only two response rules are acquired, one for the 2-s sample and a "default" rule for any other duration. In retention interval tests, pigeons preferred the "default" key, a result predicted by the single-code/default hypothesis. In no-sample tests, pigeons preferred the key associated with the 2-s sample, a result predicted by multiple-coding. Finally, in generalization tests, when the sample duration equaled 3.5 s, the geometric mean of 2 s and 6 s, pigeons preferred the key associated with the 6-s and 18-s samples, a result predicted by the single-code/default hypothesis. The pattern of results suggests the need for models that take into account multiple sources of stimulus control. © Society for the Experimental Analysis of Behavior.
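
The two hypotheses amount to different response rules, and writing them out makes the diverging test predictions explicit. Assumptions for this sketch: "red" stands for the comparison trained on 2-s samples, and under multiple-coding an untrained duration is resolved to the arithmetically nearest trained sample.

```python
# The two hypothesized rule sets, written out explicitly. "red" is assumed
# to be the comparison trained on 2-s samples; under multiple-coding an
# untrained duration is assumed to map to the nearest trained sample.

def multiple_coding(sample):
    """Three learned rules, one per trained duration (2, 6, 18 s)."""
    if sample is None:                 # no-sample trial: the 2-s rule dominates
        return "red"
    nearest = min([2, 6, 18], key=lambda d: abs(d - sample))
    return "red" if nearest == 2 else "green"

def single_code_default(sample):
    """One rule for the 2-s sample, a default response for everything else."""
    return "red" if sample == 2 else "green"

# The 3.5-s generalization probe separates the hypotheses: multiple-coding
# predicts the 2-s key (3.5 is nearer 2 than 6), while single-code/default
# predicts the default key, which is what the pigeons chose.
assert multiple_coding(3.5) == "red"
assert single_code_default(3.5) == "green"
```

The no-sample probe points the other way (both rules agree only under multiple-coding's prediction of the 2-s key), which is why the abstract concludes that neither rule set alone captures the data.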

  18. nRC: non-coding RNA Classifier based on structural features.

    Science.gov (United States)

    Fiannaca, Antonino; La Rosa, Massimo; La Paglia, Laura; Rizzo, Riccardo; Urso, Alfonso

    2017-01-01

    Non-coding RNAs (ncRNAs) are small non-coding sequences involved in gene expression regulation of many biological processes and diseases. The recent discovery of a large set of different ncRNAs with biologically relevant roles has opened the way to developing methods able to discriminate between the different ncRNA classes. Moreover, the lack of knowledge about the complete mechanisms in regulative processes, together with the development of high-throughput technologies, has required the help of bioinformatics tools in providing biologists and clinicians with a deeper comprehension of the functional roles of ncRNAs. In this work, we introduce a new ncRNA classification tool, nRC (non-coding RNA Classifier). Our approach is based on feature extraction from the ncRNA secondary structure together with a supervised classification algorithm implementing a deep learning architecture based on convolutional neural networks. We tested our approach on the classification of 13 different ncRNA classes. We obtained classification scores using the most common statistical measures; in particular, we reached an accuracy and sensitivity of about 74%. The proposed method outperforms other similar classification methods based on secondary structure features and machine learning algorithms, including the RNAcon tool that, to date, is the reference classifier. The nRC tool is freely available as a docker image at https://hub.docker.com/r/tblab/nrc/. The source code of the nRC tool is also available at https://github.com/IcarPA-TBlab/nrc.
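
As a toy illustration of the kind of structural feature extraction such classifiers build on (nRC's actual feature set is not reproduced here), counting short substructure k-mers in a dot-bracket secondary-structure string yields a fixed-length vector a downstream classifier could consume:

```python
# Counting structural k-mers in a dot-bracket string. The alphabet, k, and
# function name are assumptions for this sketch, not nRC's actual features.
from collections import Counter
from itertools import product

ALPHABET = ".()"                       # unpaired, pair-open, pair-close

def structure_kmer_vector(dotbracket, k=2):
    """Counts of every structural k-mer, in a fixed canonical order."""
    counts = Counter(dotbracket[i:i + k] for i in range(len(dotbracket) - k + 1))
    return [counts["".join(p)] for p in product(ALPHABET, repeat=k)]

vec = structure_kmer_vector("((..))")
assert len(vec) == 9                   # |ALPHABET| ** k features
assert sum(vec) == 5                   # one window per position
```

Fixed-length vectors of this sort are what makes a convolutional architecture applicable: the network sees the same input shape regardless of sequence length.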

  19. Proceedings of the OECD/CSNI workshop on transient thermal-hydraulic and neutronic codes requirements

    Energy Technology Data Exchange (ETDEWEB)

    Ebert, D.

    1997-07-01

    This is a report on the CSNI Workshop on Transient Thermal-Hydraulic and Neutronic Codes Requirements held at Annapolis, Maryland, USA November 5-8, 1996. This experts' meeting consisted of 140 participants from 21 countries; 65 invited papers were presented. The meeting was divided into five areas: (1) current and prospective plans of thermal hydraulic codes development; (2) current and anticipated uses of thermal-hydraulic codes; (3) advances in modeling of thermal-hydraulic phenomena and associated additional experimental needs; (4) numerical methods in multi-phase flows; and (5) programming language, code architectures and user interfaces. The workshop consensus identified the following important action items to be addressed by the international community in order to maintain and improve the calculational capability: (a) preserve current code expertise and institutional memory, (b) preserve the ability to use the existing investment in plant transient analysis codes, (c) maintain essential experimental capabilities, (d) develop advanced measurement capabilities to support future code validation work, (e) integrate existing analytical capabilities so as to improve performance and reduce operating costs, (f) exploit the proven advances in code architecture, numerics, graphical user interfaces, and modularization in order to improve code performance and scrutability, and (g) more effectively utilize user experience in modifying and improving the codes.

  20. Proceedings of the OECD/CSNI workshop on transient thermal-hydraulic and neutronic codes requirements

    International Nuclear Information System (INIS)

    Ebert, D.

    1997-07-01

    This is a report on the CSNI Workshop on Transient Thermal-Hydraulic and Neutronic Codes Requirements held at Annapolis, Maryland, USA November 5-8, 1996. This experts' meeting consisted of 140 participants from 21 countries; 65 invited papers were presented. The meeting was divided into five areas: (1) current and prospective plans of thermal hydraulic codes development; (2) current and anticipated uses of thermal-hydraulic codes; (3) advances in modeling of thermal-hydraulic phenomena and associated additional experimental needs; (4) numerical methods in multi-phase flows; and (5) programming language, code architectures and user interfaces. The workshop consensus identified the following important action items to be addressed by the international community in order to maintain and improve the calculational capability: (a) preserve current code expertise and institutional memory, (b) preserve the ability to use the existing investment in plant transient analysis codes, (c) maintain essential experimental capabilities, (d) develop advanced measurement capabilities to support future code validation work, (e) integrate existing analytical capabilities so as to improve performance and reduce operating costs, (f) exploit the proven advances in code architecture, numerics, graphical user interfaces, and modularization in order to improve code performance and scrutability, and (g) more effectively utilize user experience in modifying and improving the codes

  1. Study of neoclassical transport and bootstrap current for W7-X in the 1/upsilon regime, using results from the PIES code

    International Nuclear Information System (INIS)

    Nemov, V V; Kalyuzhnyj, V N; Kasilov, S V; Drevlak, M; Nuehrenberg, J; Kernbichler, W; Reiman, A; Monticello, D

    2004-01-01

    For the magnetic field of the Wendelstein 7-X (W7-X) standard high-mirror configuration, computed by the PIES code, taking into account real coil geometry, neoclassical transport and bootstrap current are analysed in the 1/upsilon regime using methods based on the integration along magnetic field lines in a given magnetic field. The zero-beta and ⟨β⟩ = 1% cases are studied. The results are compared to the corresponding results for the vacuum magnetic field directly produced by modular coils. A significant advantage of W7-X over a conventional stellarator resulting from reduced neoclassical transport and from reduced bootstrap current follows from the computations, although the neoclassical transport is somewhat larger than that previously obtained for the ideal W7-X model configuration

  2. RunJumpCode: An Educational Game for Educating Programming

    Science.gov (United States)

    Hinds, Matthew; Baghaei, Nilufar; Ragon, Pedrito; Lambert, Jonathon; Rajakaruna, Tharindu; Houghton, Travers; Dacey, Simon

    2017-01-01

    Programming promotes critical thinking, problem solving and analytic skills through creating solutions that can solve everyday problems. However, learning programming can be a daunting experience for a lot of students. "RunJumpCode" is an educational 2D platformer video game, designed and developed in Unity, to teach players the…

  3. E-learning tools for education: regulatory aspects, current applications in radiology and future prospects.

    Science.gov (United States)

    Pinto, A; Selvaggi, S; Sicignano, G; Vollono, E; Iervolino, L; Amato, F; Molinari, A; Grassi, R

    2008-02-01

    E-learning, an abbreviation of electronic learning, indicates the provision of education and training on the Internet or the World Wide Web. The impact of networks and the Internet on radiology is undoubtedly important, as it is for medicine as a whole. The Internet offers numerous advantages compared with other mass media: it provides access to a large amount of information previously known only to individual specialists; it is flexible, permitting the use of images or video; and it allows linking to Web sites on a specific subject, thus contributing to further expand knowledge. Our purpose is to illustrate the regulatory aspects (including Internet copyright laws), current radiological applications and future prospects of e-learning. Our experience with the installation of an e-learning platform is also presented. We performed a PubMed search on the published literature (without time limits) dealing with e-learning tools and applications in the health sector with specific reference to radiology. The search included all study types in the English language with the following key words: e-learning, education, teaching, online exam, radiology and radiologists. The Fiaso study was referred to for the regulatory aspects of e-learning. The application of e-learning to radiology requires the development of a model that involves selecting and creating e-learning platforms, creating and technologically adapting multimedia teaching modules, creating and managing a unified catalogue of teaching modules, planning training actions, defining training pathways and Continuing Education in Medicine (CME) credits, identifying levels of teaching and technological complexity of support tools, sharing an organisational and methodological model, training the trainers, operators' participation and relational devices, providing training, monitoring progress of the activities, and measuring the effectiveness of training. Since 2004, a platform--LiveLearning--has been used at our

  4. Upgrades to the WIMS-ANL code

    International Nuclear Information System (INIS)

    Woodruff, W. L.

    1998-01-01

    The dusty old source code in WIMS-D4M has been completely rewritten to conform more closely with current FORTRAN coding practices. The revised code contains many improvements in appearance, error checking and control of the output. The output is now tabulated to fit the typical 80-column window or terminal screen. The Segev method for resonance integral interpolation is now an option. Most of the dimension limitations have been removed and replaced with variable dimensions within a compile-time fixed container. The library is no longer restricted to the 69 energy group structure, and two new libraries have been generated for use with the code. Both new libraries are based on ENDF/B-VI data, one retaining the original 69 energy group structure and the other using a 172 group structure. The common source code can be used on PCs running Windows 95 or NT, on Linux-based systems and on UNIX workstations. Comparisons of this version of the code with earlier ENDF/B-V evaluations are provided, as well as comparisons with the new libraries.

  5. Upgrades to the WIMS-ANL code

    International Nuclear Information System (INIS)

    Woodruff, W.L.; Leopando, L.S.

    1998-01-01

    The dusty old source code in WIMS-D4M has been completely rewritten to conform more closely with current FORTRAN coding practices. The revised code contains many improvements in appearance, error checking and control of the output. The output is now tabulated to fit the typical 80-column window or terminal screen. The Segev method for resonance integral interpolation is now an option. Most of the dimension limitations have been removed and replaced with variable dimensions within a compile-time fixed container. The library is no longer restricted to the 69 energy group structure, and two new libraries have been generated for use with the code. Both new libraries are based on ENDF/B-VI data, one retaining the original 69 energy group structure and the other using a 172 group structure. The common source code can be used on PCs running Windows 95 or NT, on Linux-based systems and on UNIX workstations. Comparisons of this version of the code with earlier ENDF/B-V evaluations are provided, as well as comparisons with the new libraries. (author)

  6. Code Cactus; Code Cactus

    Energy Technology Data Exchange (ETDEWEB)

    Fajeau, M; Nguyen, L T; Saunier, J [Commissariat a l' Energie Atomique, Centre d' Etudes Nucleaires de Saclay, 91 - Gif-sur-Yvette (France)

    1966-09-01

    This code handles the following problems: (1) analysis of thermal experiments on a water loop at high or low pressure, in steady-state or transient conditions; (2) analysis of the thermal and hydrodynamic behavior of water-cooled and moderated reactors at either high or low pressure, with boiling permitted; fuel elements are assumed to be flat plates. The code computes the flow distribution in parallel channels, coupled or not by conduction across the plates, for imposed pressure-drop or flow-rate conditions that may be constant or time-varying; the power can be coupled to a reactor kinetics calculation or supplied by the code user. The code, which contains a schematic representation of safety rod behavior, is a one-dimensional, multi-channel code, and has as its complement FLID, a one-channel, two-dimensional code. (authors)
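
    The flow-split problem described above (parallel channels sharing a common pressure drop) can be sketched numerically. This is a hypothetical illustration, not CACTUS itself: it assumes each channel obeys a simple quadratic pressure-drop law dp = k_i * w_i^2 and bisects for the common pressure drop that absorbs the imposed total flow.

    ```python
    import math

    def split_flow(k, w_total, tol=1e-10):
        """Distribute a total flow among parallel channels so that every
        channel sees the same pressure drop, assuming dp = k_i * w_i**2."""
        # Channel flow for a trial pressure drop dp: w_i = sqrt(dp / k_i)
        def total(dp):
            return sum(math.sqrt(dp / ki) for ki in k)

        # Bracket the common pressure drop, then bisect on total flow.
        lo, hi = 0.0, max(k) * w_total ** 2
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if total(mid) > w_total:
                hi = mid
            else:
                lo = mid
        dp = 0.5 * (lo + hi)
        return dp, [math.sqrt(dp / ki) for ki in k]

    # Two channels, the second four times more resistive: it carries half
    # the flow of the first (w ~ 1/sqrt(k)).
    dp, flows = split_flow([1.0, 4.0], w_total=3.0)
    ```

    Real thermal-hydraulic codes add friction correlations, boiling and inter-channel conduction on top of this balance; the bisection on a shared boundary condition is the common core.
    
    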

  7. Game-Based Learning Theory

    Science.gov (United States)

    Laughlin, Daniel

    2008-01-01

    Persistent Immersive Synthetic Environments (PISE) are not just connection points, they are meeting places. They are the new public squares, village centers, malt shops, malls and pubs all rolled into one. They come with a sense of "thereness" that engages the mind like a real place does. It all starts as real code. The code defines "objects." The objects exist in computer space, known as the "grid." The objects and space combine to create a "place." A "world" is created. Before long, the grid and code become obscure, and the "world" maintains focus.

  8. A comparison of two three-dimensional shell-element transient electromagnetics codes

    International Nuclear Information System (INIS)

    Yugo, J.J.; Williamson, D.E.

    1992-01-01

    Electromagnetic forces due to eddy currents strongly influence the design of components for the next generation of fusion devices. An effort has been made to benchmark two computer programs used to generate transient electromagnetic loads: SPARK and EddyCuFF. Two simple transient field problems were analyzed, both of which had previously been analyzed with the SPARK code, with results recorded in the literature. A third problem, based on an ITER inboard blanket benchmark model, was analyzed as well. This problem was driven with a self-consistent, distributed multifilament plasma model generated by an axisymmetric physics code. The benchmark problems showed good agreement between the two shell-element codes. Variations in calculated eddy currents of 1-3% were found for similar, finely meshed models. A difference of 8% was found in induced current and 20% in force for a coarse mesh and a complex, multifilament field driver. Because comparisons were made to results obtained from the literature, model preparation and code execution times were not evaluated.

  9. Game E-Learning Code Master with the MMORPG Concept Using Adobe Flex 3

    OpenAIRE

    Fredy Purnomo; Monika Leslivania; Daniel Daniel; Lisye Mareta Cahya

    2010-01-01

    The research objective is to design a web-based e-learning game that can serve as a learning tool for C language programming and, as an online game, can be enjoyed by everybody easily over the internet. Flex is used in this online game to implement the RIA (Rich Internet Application) concept, so that the e-learning process is more interesting and interactive. The e-learning game is also designed around the MMORPG (Massively Multiplayer Online Role Playing Game) concept. The research ...

  10. It takes two—coincidence coding within the dual olfactory pathway of the honeybee

    OpenAIRE

    Brill, Martin F.; Meyer, Anneke; Rössler, Wolfgang

    2015-01-01

    To rapidly process biologically relevant stimuli, sensory systems have developed a broad variety of coding mechanisms like parallel processing and coincidence detection. Parallel processing (e.g., in the visual system) increases both computational capacity and processing speed by simultaneously coding different aspects of the same stimulus. Coincidence detection is an efficient way to integrate information from different sources. Coincidence has been shown to promote associative learning and...
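
    As a toy illustration of the coincidence-detection idea described above (a generic sketch, not the honeybee model itself), a detector can count spike pairs from two input channels that arrive within a fixed time window of each other:

    ```python
    def coincidences(train_a, train_b, window):
        """Count spike pairs from two *sorted* spike-time lists that fall
        within `window` of each other (a minimal coincidence detector)."""
        count, j = 0, 0
        for t in train_a:
            # advance past spikes in train_b too early to coincide with t
            while j < len(train_b) and train_b[j] < t - window:
                j += 1
            # count every spike of train_b inside [t - window, t + window]
            k = j
            while k < len(train_b) and train_b[k] <= t + window:
                count += 1
                k += 1
        return count

    # Spikes at 1.0 and 9.0 in channel A find partners in channel B
    # within 0.5 time units; the spike at 5.0 does not.
    n = coincidences([1.0, 5.0, 9.0], [1.2, 4.0, 9.1], window=0.5)
    ```

    The two-pointer scan keeps the cost linear in the number of spikes, which is the property that makes coincidence detection cheap enough for fast sensory pathways.
    
    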

  11. Cross-index to DOE-prescribed occupational safety codes and standards

    International Nuclear Information System (INIS)

    1981-01-01

    This Cross-Index volume is the 1981 compilation of detailed information from more than three hundred and fifty DOE-prescribed or OSHA-referenced industrial safety codes and standards; it is revised yearly to provide information from current codes. Condensed data from individual code portions are listed according to reference code, section, paragraph and page. Each code is given a two-digit reference code number or letter in the Contents section (pages C to L) of this volume. This reference code provides ready identification of any code listed in the Cross-Index. The computerized information listings appear on the left-hand portion of each Cross-Index page; to the right of each listing are the reference code letters or numbers and the section, paragraph and page of the referenced code containing expanded information on the individual listing.

  12. Status report on the 'Merging' of the Electron-Cloud Code POSINST with the 3-D Accelerator PIC CODE WARP

    International Nuclear Information System (INIS)

    Vay, J.-L.; Furman, M.A.; Azevedo, A.W.; Cohen, R.H.; Friedman, A.; Grote, D.P.; Stoltz, P.H.

    2004-01-01

    We have integrated the electron-cloud code POSINST [1] with WARP [2]--a 3-D parallel Particle-In-Cell accelerator code developed for Heavy Ion Inertial Fusion--so that the two can interoperate. Both codes are run in the same process, communicate through a Python interpreter (already used in WARP), and share certain key arrays (so far, particle positions and velocities). Currently, POSINST provides primary and secondary sources of electrons, beam bunch kicks, a particle mover, and diagnostics. WARP provides the field solvers and diagnostics. Secondary emission routines are provided by the Tech-X package CMEE

  13. High-fidelity plasma codes for burn physics

    Energy Technology Data Exchange (ETDEWEB)

    Cooley, James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Graziani, Frank [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Marinak, Marty [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Murillo, Michael [Michigan State Univ., East Lansing, MI (United States)

    2016-10-19

    Accurate predictions of equation of state (EOS), ionic and electronic transport properties are of critical importance for high-energy-density plasma science. Transport coefficients inform radiation-hydrodynamic codes and impact diagnostic interpretation, which in turn impacts our understanding of the development of instabilities, the overall energy balance of burning plasmas, and the efficacy of self-heating from charged-particle stopping. Important processes include thermal and electrical conduction, electron-ion coupling, inter-diffusion, ion viscosity, and charged particle stopping. However, uncertainties in these coefficients are not well established. Fundamental plasma science codes, also called high-fidelity plasma codes (HFPC), are a relatively recent computational tool that augments both experimental data and theoretical foundations of transport coefficients. This paper addresses the current status of HFPC codes and their future development, and the role they can play in improving the predictive capability of the multi-physics hydrodynamic codes used in HED design.

  14. Review of current severe accident management approaches in Europe and identification of related modelling requirements for the computer code ASTEC V2.1

    Energy Technology Data Exchange (ETDEWEB)

    Hermsmeyer, S. [European Commission JRC, Petten (Netherlands). Inst. for Energy and Transport; Herranz, L.E.; Iglesias, R. [CIEMAT, Madrid (Spain); and others

    2015-07-15

    The severe accident at the Fukushima-Daiichi nuclear power plant (NPP) has led to a worldwide review of nuclear safety approaches and is bringing a refocussing of R and D in the field. To support these efforts several new Euratom FP7 projects have been launched. The CESAM project focuses on the improvement of the ASTEC computer code. ASTEC is jointly developed by IRSN and GRS and is considered the European reference code for severe accident analyses, since it capitalizes on knowledge from the extensive European R and D in the field. The project aims at the code's enhancement and extension for use in Severe Accident Management (SAM) analysis of the Generation II-III NPPs presently under operation or foreseen in the near future in Europe, spent fuel pools included. The work reported here is concerned with the importance, for the further development of the code, of the SAM strategies to be simulated. To this end, SAM strategies applied in the EU have been compiled. This compilation is mainly based on the public information made available in the frame of the EU 'stress tests' for NPPs and has been complemented by information provided by the different CESAM partners. The context of SAM is explained and the strategies are presented. The modelling capabilities for the simulation of these strategies in the current production version 2.0 of ASTEC are discussed. Furthermore, the requirements for the next version, ASTEC V2.1, that is supported in the CESAM project are highlighted. They are a necessary complement to the list of code improvements drawn from consolidating new fields of application, like SFP and BWR model enhancements, and from new experimental results on severe accident phenomena.

  15. Review of current severe accident management approaches in Europe and identification of related modelling requirements for the computer code ASTEC V2.1

    International Nuclear Information System (INIS)

    Hermsmeyer, S.

    2015-01-01

    The severe accident at the Fukushima-Daiichi nuclear power plant (NPP) has led to a worldwide review of nuclear safety approaches and is bringing a refocussing of R and D in the field. To support these efforts several new Euratom FP7 projects have been launched. The CESAM project focuses on the improvement of the ASTEC computer code. ASTEC is jointly developed by IRSN and GRS and is considered the European reference code for severe accident analyses, since it capitalizes on knowledge from the extensive European R and D in the field. The project aims at the code's enhancement and extension for use in Severe Accident Management (SAM) analysis of the Generation II-III NPPs presently under operation or foreseen in the near future in Europe, spent fuel pools included. The work reported here is concerned with the importance, for the further development of the code, of the SAM strategies to be simulated. To this end, SAM strategies applied in the EU have been compiled. This compilation is mainly based on the public information made available in the frame of the EU 'stress tests' for NPPs and has been complemented by information provided by the different CESAM partners. The context of SAM is explained and the strategies are presented. The modelling capabilities for the simulation of these strategies in the current production version 2.0 of ASTEC are discussed. Furthermore, the requirements for the next version, ASTEC V2.1, that is supported in the CESAM project are highlighted. They are a necessary complement to the list of code improvements drawn from consolidating new fields of application, like SFP and BWR model enhancements, and from new experimental results on severe accident phenomena.

  16. GAMERA - The New Magnetospheric Code

    Science.gov (United States)

    Lyon, J.; Sorathia, K.; Zhang, B.; Merkin, V. G.; Wiltberger, M. J.; Daldorff, L. K. S.

    2017-12-01

    The Lyon-Fedder-Mobarry (LFM) code has been a main-line magnetospheric simulation code for 30 years. The code base, designed in the age of memory-to-memory vector machines, is still in wide use for science production but needs upgrading to ensure long-term sustainability. In this presentation, we will discuss our recent efforts to update and improve that code base and also highlight some recent results. The new project, GAMERA (Grid Agnostic MHD for Extended Research Applications), has kept the original design characteristics of the LFM and made significant improvements. The original design included high-order numerical differencing with very aggressive limiting, the ability to use arbitrary, but logically rectangular, grids, and maintenance of div B = 0 through the use of the Yee grid. Significant improvements include high-order upwinding and a non-clipping limiter. One other improvement with wider applicability is an improved averaging technique for the singularities in polar and spherical grids. The new code adopts a hybrid structure: multi-threaded OpenMP with an overarching MPI layer for large-scale and coupled applications. The MPI layer uses a combination of standard MPI and the Global Array Toolkit from PNL to provide a lightweight mechanism for coupling codes together concurrently. The single-processor code is highly efficient and can run magnetospheric simulations at the default CCMC resolution faster than real time on a MacBook Pro. We have run the new code through the Athena suite of tests, and the results compare favorably with the codes available to the astrophysics community. LFM/GAMERA has been applied to many different situations, ranging from the inner and outer heliosphere to the magnetospheres of Venus, the Earth, Jupiter and Saturn. We present example results for the Earth's magnetosphere, including a coupled ring current (RCM), the magnetospheres of Jupiter and Saturn, and the inner heliosphere.
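
    The div B = 0 property via the Yee grid mentioned above can be illustrated with a minimal 2-D constrained-transport sketch (a toy illustration, not GAMERA code; grid sizes and the random EMF are arbitrary choices): face-centred magnetic field components updated from corner EMFs keep the discrete divergence at machine zero for any EMF values.

    ```python
    import random

    random.seed(1)
    nx, ny = 8, 8
    dx = dy = 1.0
    dt = 0.1

    # Staggered (Yee) layout: Bx lives on x-faces, By on y-faces.
    Bx = [[0.0] * ny for _ in range(nx + 1)]       # (nx+1) x ny faces
    By = [[0.0] * (ny + 1) for _ in range(nx)]     # nx x (ny+1) faces
    # EMF Ez sits on cell corners; any values preserve div B exactly.
    Ez = [[random.random() for _ in range(ny + 1)] for _ in range(nx + 1)]

    # Constrained-transport update: dBx/dt = -dEz/dy, dBy/dt = +dEz/dx
    for _ in range(10):
        for i in range(nx + 1):
            for j in range(ny):
                Bx[i][j] -= dt * (Ez[i][j + 1] - Ez[i][j]) / dy
        for i in range(nx):
            for j in range(ny + 1):
                By[i][j] += dt * (Ez[i + 1][j] - Ez[i][j]) / dx

    def max_div():
        """Largest cell-centred discrete divergence of B on the grid."""
        return max(abs((Bx[i + 1][j] - Bx[i][j]) / dx
                       + (By[i][j + 1] - By[i][j]) / dy)
                   for i in range(nx) for j in range(ny))
    ```

    The four corner EMF contributions to each cell cancel term by term, so the divergence stays at round-off level no matter how long the update runs; that cancellation is the whole point of the Yee staggering.
    
    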

  17. Country Report on Building Energy Codes in the United States

    Energy Technology Data Exchange (ETDEWEB)

    Halverson, Mark A.; Shui, Bin; Evans, Meredydd

    2009-04-30

    This report is part of a series of reports on building energy efficiency codes in countries associated with the Asian Pacific Partnership (APP) - Australia, South Korea, Japan, China, India, and the United States of America (U.S.). This report gives an overview of the development of building energy codes in the U.S., including national energy policies related to building energy codes, the history of building energy codes, and recent national projects and activities to promote building energy codes. The report also provides a review of current building energy codes (covering the building envelope, HVAC, lighting, and water heating) for commercial and residential buildings in the U.S.

  18. DLVM: A modern compiler infrastructure for deep learning systems

    OpenAIRE

    Wei, Richard; Schwartz, Lane; Adve, Vikram

    2017-01-01

    Deep learning software demands reliability and performance. However, many of the existing deep learning frameworks are software libraries that act as an unsafe DSL in Python and a computation graph interpreter. We present DLVM, a design and implementation of a compiler infrastructure with a linear algebra intermediate representation, algorithmic differentiation by adjoint code generation, domain-specific optimizations and a code generator targeting GPU via LLVM. Designed as a modern compiler ...
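
    The "algorithmic differentiation by adjoint code generation" that the abstract mentions can be sketched, in spirit, with a tiny reverse-mode automatic-differentiation class (a generic illustration of the adjoint idea; DLVM's actual IR and API are not shown here):

    ```python
    class Var:
        """Minimal reverse-mode AD value: each operation records its
        parents and local gradients, then backward() propagates adjoints,
        mirroring what adjoint code generation does at compile time."""

        def __init__(self, value, parents=()):
            self.value = value
            self.parents = parents   # (parent_var, local_gradient) pairs
            self.grad = 0.0

        def __add__(self, other):
            return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

        def __mul__(self, other):
            return Var(self.value * other.value,
                       [(self, other.value), (other, self.value)])

        def backward(self, seed=1.0):
            # accumulate the adjoint, then push it to the parents
            self.grad += seed
            for parent, local in self.parents:
                parent.backward(seed * local)

    x = Var(3.0)
    y = Var(4.0)
    z = x * y + x * x   # z = xy + x^2, so dz/dx = y + 2x, dz/dy = x
    z.backward()
    ```

    A compiler like the one described would emit the equivalent of `backward()` as explicit adjoint code over its linear algebra IR instead of interpreting a graph at run time.
    
    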

  19. An Efficient Platform for the Automatic Extraction of Patterns in Native Code

    Directory of Open Access Journals (Sweden)

    Javier Escalada

    2017-01-01

    Different software tools, such as decompilers, code quality analyzers, recognizers of packed executable files, authorship analyzers, and malware detectors, search for patterns in binary code. The use of machine learning algorithms, trained with programs taken from the huge number of applications in existing open source code repositories, allows finding patterns not detected with the manual approach. To this end, we have created a versatile platform for the automatic extraction of patterns from native code, capable of processing big binary files. Its implementation has been parallelized, providing important runtime performance benefits on multicore architectures. Compared to single-processor execution, the best configuration achieves an average speedup factor of 3.5 against the theoretical maximum of 4.
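
    The chunked parallel scan behind such a speedup can be sketched as follows. This is a hypothetical reconstruction, not the platform's code: `parallel_scan` and its seam-overlap policy are illustrative, and threads are used here for portability where the platform runs compute across cores.

    ```python
    from concurrent.futures import ThreadPoolExecutor

    def parallel_scan(data, pattern, workers=4):
        """Count all (possibly overlapping) occurrences of a byte pattern
        by scanning disjoint chunks in parallel. Each chunk is widened by
        len(pattern) - 1 bytes so matches straddling a seam are still seen,
        while the slice size confines match *starts* to the chunk itself,
        so nothing is double-counted."""
        lp = len(pattern)
        step = max(1, len(data) // workers)
        spans = [(s, min(s + step, len(data)))
                 for s in range(0, len(data), step)]

        def scan(span):
            s, e = span
            window = data[s:e + lp - 1]
            return sum(1 for i in range(len(window) - lp + 1)
                       if window[i:i + lp] == pattern)

        with ThreadPoolExecutor(max_workers=workers) as pool:
            return sum(pool.map(scan, spans))

    # "abcab" occurs twice in this blob, once straddling a chunk seam.
    hits = parallel_scan(b"ababcababcab", b"abcab")
    ```

    The seam-overlap trick is what lets the chunks be processed independently; the residual coordination cost is one reason measured speedups (3.5x here) fall short of the theoretical maximum.
    
    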

  20. Some aspects on parental protection in the current Romanian Civil Code

    OpenAIRE

    Cristina Cojocaru

    2013-01-01

    The new Civil Code has come to meet the diversification and growing complexity of social relationships, the growing interference between economic and social life in Romania and that in Europe and the world, and, not least, the closer connection of Romanian law to European law. The issues which could arise, given precisely such interconnection with European law, are those concerning the enforcement of civil law in space, especially when the question would be the ...

  1. Lessons learned in the verification, validation and application of a coupled heat and fluid flow code

    International Nuclear Information System (INIS)

    Tsang, C.F.

    1986-01-01

    A summary is given of the author's recent studies in the verification, validation and application of a coupled heat and fluid flow code. Verification has been done against eight analytic and semi-analytic solutions. These solutions include those involving thermal buoyancy flow and fracture flow. Comprehensive field validation studies over a period of four years are discussed. The studies are divided into three stages: (1) history matching, (2) double-blind prediction and confirmation, and (3) design optimization. At each stage, parameter sensitivity studies are performed. To study the applications of mathematical models, a problem proposed by the International Energy Agency (IEA) is solved using this verified and validated numerical model as well as two simpler models. One of the simpler models is a semi-analytic method that assumes the uncoupling of the heat and fluid flow processes. The other is a graphical method based on a large number of approximations. Variations are added to the basic IEA problem to point out the limits of the range of application of each model. A number of lessons are learned from the above investigations. These are listed and discussed.

  2. Towards advanced code simulators

    International Nuclear Information System (INIS)

    Scriven, A.H.

    1990-01-01

    The Central Electricity Generating Board (CEGB) uses advanced thermohydraulic codes extensively to support PWR safety analyses. A system has been developed to allow fully interactive execution of any code with graphical simulation of the operator desk and mimic display. The system operates in a virtual machine environment, with the thermohydraulic code executing in one virtual machine, communicating via interrupts with any number of other virtual machines, each running other programs and graphics drivers. The driver code itself does not have to be modified from its normal batch form. Shortly following the release of RELAP5 MOD1 in IBM-compatible form in 1983, this code was used as the driver for this system. When RELAP5 MOD2 became available, it was adopted with no changes needed in the basic system. Overall the system has been used for some 5 years for the analysis of LOBI tests, full-scale plant studies and for simple what-if studies. For gaining rapid understanding of system dependencies it has proved invaluable. The graphical mimic system, being independent of the driver code, has also been used with other codes to study core rewetting, to replay results obtained from batch jobs on a CRAY2 computer system and to display suitably processed experimental results from the LOBI facility to aid interpretation. For the above work real-time execution was not necessary. Current work now centers on implementing the RELAP5 code on a true parallel architecture machine. Marconi Simulation have been contracted to investigate the feasibility of using upwards of 100 processors, each capable of a peak of 30 MIPS, to run a highly detailed RELAP5 model in real time, complete with specially written 3D core neutronics and balance-of-plant models. This paper describes the experience of using RELAP5 as an analyzer/simulator, and outlines the proposed methods and problems associated with parallel execution of RELAP5.

  3. Evaluation Codes from an Affine Veriety Code Perspective

    DEFF Research Database (Denmark)

    Geil, Hans Olav

    2008-01-01

    Evaluation codes (also called order domain codes) are traditionally introduced as generalized one-point geometric Goppa codes. In the present paper we give a new point of view on evaluation codes by introducing them instead as particularly nice examples of affine variety codes. Our study includes a reformulation of the usual methods to estimate the minimum distances of evaluation codes into the setting of affine variety codes. Finally we describe the connection to the theory of one-point geometric Goppa codes.

  4. TERRESTRIAL LASER SCANNER DATA DENOISING BY DICTIONARY LEARNING OF SPARSE CODING

    Directory of Open Access Journals (Sweden)

    E. Smigiel

    2013-07-01

    Point cloud processing is basically a signal processing issue. The huge amount of data collected with Terrestrial Laser Scanners or photogrammetry techniques faces the classical questions of signal and image processing. Among others, denoising and compression have to be addressed in this context. That is why one has to turn one's attention to signal theory, which can guide good practice or inspire new ideas from the latest developments in the field. The literature has shown for decades how strong and dynamic the theoretical field is and how efficient the derived algorithms have become. For about ten years, a new technique has appeared: known as compressive sensing or compressive sampling, it is based first on sparsity, an interesting characteristic of many natural signals. Based on this concept, many denoising and compression techniques have shown their efficiency. Sparsity can also be seen as redundancy removal of natural signals. Taken along with incoherent measurements, compressive sensing uses the idea that redundancy can be removed at the very early stage of sampling: instead of sampling the signal at a high rate and removing redundancy in a second stage, the acquisition stage itself is run with redundancy removal. This paper first gives some theoretical aspects of these ideas with simple mathematics. Then, the idea of compressive sensing for a Terrestrial Laser Scanner is examined as a potential research question and, finally, a denoising scheme based on dictionary learning of sparse coding is tested. Both the theoretical discussion and the obtained results show that it is worth staying close to signal processing theory and its community to benefit from its latest developments.
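
    A minimal flavour of the final step, sparse-coding-based denoising, can be given with plain matching pursuit over a fixed cosine dictionary. This is a simplification on two counts: the paper *learns* its dictionary, and the signal, dictionary and parameters below are made-up toy choices.

    ```python
    import math
    import random

    N = 64
    # Dictionary of unit-norm atoms: sampled cosines at several integer
    # frequencies (a fixed stand-in for a learned dictionary).
    atoms = []
    for f in range(1, 9):
        a = [math.cos(2 * math.pi * f * n / N) for n in range(N)]
        norm = math.sqrt(sum(x * x for x in a))
        atoms.append([x / norm for x in a])

    def dot(u, v):
        return sum(x * y for x, y in zip(u, v))

    def matching_pursuit(signal, atoms, n_atoms):
        """Greedy sparse coding: repeatedly pick the best-correlated atom
        and subtract its contribution; the k-sparse reconstruction is the
        denoised signal, since noise correlates weakly with every atom."""
        residual = list(signal)
        recon = [0.0] * len(signal)
        for _ in range(n_atoms):
            best = max(atoms, key=lambda a: abs(dot(residual, a)))
            c = dot(residual, best)
            for i in range(len(signal)):
                residual[i] -= c * best[i]
                recon[i] += c * best[i]
        return recon

    random.seed(0)
    clean = [5 * atoms[1][n] + 3 * atoms[5][n] for n in range(N)]
    noisy = [x + random.gauss(0.0, 0.3) for x in clean]
    denoised = matching_pursuit(noisy, atoms, n_atoms=2)

    err = lambda u, v: sum((a - b) ** 2 for a, b in zip(u, v))
    ```

    Because the clean signal is 2-sparse in this dictionary, keeping only the two strongest atoms discards almost all of the noise energy, which is exactly the mechanism the paper exploits on point clouds.
    
    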

  5. 1995 building energy codes and standards workshops: Summary and documentation

    Energy Technology Data Exchange (ETDEWEB)

    Sandahl, L.J.; Shankle, D.L.

    1996-02-01

    During the spring of 1995, Pacific Northwest National Laboratory (PNNL) conducted four two-day Regional Building Energy Codes and Standards workshops across the US. Workshops were held in Chicago, Denver, Rhode Island, and Atlanta. The workshops were designed to benefit state-level officials including staff of building code commissions, energy offices, public utility commissions, and others involved with adopting/updating, implementing, and enforcing building energy codes in their states. The workshops provided an opportunity for state and other officials to learn more about residential and commercial building energy codes and standards, the role of the US Department of Energy and the Building Standards and Guidelines Program at Pacific Northwest National Laboratory, Home Energy Rating Systems (HERS), Energy Efficient Mortgages (EEM), training issues, and other topics related to the development, adoption, implementation, and enforcement of building energy codes. Participants heard success stories, got tips on enforcement training, and received technical support materials. In addition to receiving information on the above topics, workshop participants had an opportunity to provide input on code adoption issues, building industry training issues, building design issues, and exemplary programs across the US. This paper documents the workshop planning, findings, and follow-up processes.

  6. ARTEMIS: The core simulator of AREVA NP's next generation coupled neutronics/thermal-hydraulics code system ARCADIA®

    International Nuclear Information System (INIS)

    Hobson, Greg; Merk, Stephan; Bolloni, Hans-Wilhelm; Breith, Karl-Albert; Curca-Tivig, Florin; Van Geemert, Rene; Heinecke, Jochen; Hartmann, Bettina; Porsch, Dieter; Tiles, Viatcheslav; Dall'Osso, Aldo; Pothet, Baptiste

    2008-01-01

    AREVA NP has developed a next-generation coupled neutronics/thermal-hydraulics code system, ARCADIA®, to fulfil customers' current demands and even anticipate their future demands in terms of accuracy and performance. The new code system will be implemented world-wide and will replace several code systems currently used in various global regions. An extensive phase of verification and validation of the new code system is currently in progress. One of the principal components of this new system is the core simulator, ARTEMIS. Besides the stand-alone tests on the individual computational modules, integrated tests of the overall code are being performed in order to check for non-regression as well as for verification of the code. Several benchmark problems have been successfully calculated. Full-core depletion cycles of different plant types from AREVA's French, American and German regions (e.g. N4 and KONVOI types) have been performed with ARTEMIS (using APOLLO2-A cross sections) and compared directly with current production codes, e.g. with SCIENCE and CASCADE-3D, and additionally with measurements. (authors)

  7. Continuous Materiality: Through a Hierarchy of Computational Codes

    Directory of Open Access Journals (Sweden)

    Jichen Zhu

    2008-01-01

    The legacy of Cartesian dualism inherent in linguistic theory deeply influences current views on the relation between natural language, computer code, and the physical world. However, the oversimplified distinction between mind and body falls short of capturing the complex interaction between the material and the immaterial. In this paper, we posit a hierarchy of codes to delineate a wide spectrum of continuous materiality. Our research suggests that diagrams in architecture provide a valuable analog for approaching computer code in emergent digital systems. After commenting on ways that Cartesian dualism continues to haunt discussions of code, we turn our attention to diagrams and design morphology. Finally we notice the implications a material understanding of code bears for further research on the relation between human cognition and digital code. Our discussion concludes by noticing several areas that we have projected for ongoing research.

  8. SRAC2006: A comprehensive neutronics calculation code system

    International Nuclear Information System (INIS)

    Okumura, Keisuke; Kugo, Teruhiko; Kaneko, Kunio; Tsuchihashi, Keichiro

    2007-02-01

    The SRAC is a code system applicable to neutronics analysis of a variety of reactor types. Since the publication of the second version of the users manual (JAERI-1302) in 1986, a number of additions and modifications to the functions and the library data have been made to establish a comprehensive neutronics code system. The current system includes major neutron data libraries (JENDL-3.3, JENDL-3.2, ENDF/B-VII, ENDF/B-VI.8, JEFF-3.1, JEF-2.2, etc.), and integrates five elementary codes for neutron transport and diffusion calculation: PIJ, based on the collision probability method and applicable to 16 kinds of lattice models; the SN transport codes ANISN (1D) and TWOTRN (2D); and the diffusion codes TUD (1D) and CITATION (multi-D). The system also includes an auxiliary code, COREBN, for multi-dimensional core burn-up calculation. (author)
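
    The kind of calculation such diffusion modules perform can be illustrated with a one-group, one-dimensional slab eigenvalue problem solved by power iteration (a generic textbook sketch, not SRAC code; the cross-section values are made up):

    ```python
    import math

    # One-group bare slab, zero-flux boundaries:
    #   -D phi'' + Sa phi = (1/k) nuSf phi
    D, Sa, nuSf = 1.3, 0.05, 0.06   # made-up constants (cm, 1/cm)
    L, n = 100.0, 200               # slab width (cm), mesh intervals
    h = L / n

    sub = -D / h ** 2               # off-diagonal of the diffusion operator
    diag = 2 * D / h ** 2 + Sa      # main diagonal

    def solve_tridiag(rhs):
        """Thomas algorithm for the constant-coefficient tridiagonal system."""
        m = len(rhs)
        cp, dp = [0.0] * m, [0.0] * m
        cp[0], dp[0] = sub / diag, rhs[0] / diag
        for i in range(1, m):
            denom = diag - sub * cp[i - 1]
            cp[i] = sub / denom
            dp[i] = (rhs[i] - sub * dp[i - 1]) / denom
        x = [0.0] * m
        x[-1] = dp[-1]
        for i in range(m - 2, -1, -1):
            x[i] = dp[i] - cp[i] * x[i + 1]
        return x

    # Power iteration on the fission source converges to k-effective.
    phi = [1.0] * (n - 1)
    k = 1.0
    for _ in range(300):
        prod_old = sum(nuSf * p for p in phi)
        phi = solve_tridiag([nuSf * p / k for p in phi])
        k *= sum(nuSf * p for p in phi) / prod_old

    # Analytic one-group result for comparison: k = nuSf / (Sa + D * B^2)
    k_analytic = nuSf / (Sa + D * (math.pi / L) ** 2)
    ```

    Production codes replace this toy's single group and slab geometry with many groups, 2D/3D meshes and acceleration schemes, but the outer fission-source iteration is the same structure.
    
    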

  9. The arbitrary order design code TLIE 1.0

    International Nuclear Information System (INIS)

    Zeijts, J. van; Neri, Filippo

    1993-01-01

    We describe the arbitrary order charged particle transfer map code TLIE. This code is a general 6D relativistic design code with a MAD-compatible input language that, among other features, implements user-defined functions and subroutines and nested fitting and optimization. First we describe the mathematics and physics in the code. Aside from generating maps for all the standard accelerator elements, we describe an efficient method for generating nonlinear transfer maps for realistic magnet models. We have implemented the method to arbitrary order in our accelerator design code for cylindrical current sheet magnets. We have also implemented a self-consistent space-charge approach as in CHARLIE. Subsequently we give a description of the input language and, finally, several examples from production runs, such as cases with stacked multipoles with overlapping fringe fields. (Author)

  10. International assessment of PCA codes

    International Nuclear Information System (INIS)

    Neymotin, L.; Lui, C.; Glynn, J.; Archarya, S.

    1993-11-01

    Over the past three years (1991-1993), an extensive international exercise for intercomparison of a group of six Probabilistic Consequence Assessment (PCA) codes was undertaken. The exercise was jointly sponsored by the Commission of European Communities (CEC) and OECD Nuclear Energy Agency. This exercise was a logical continuation of a similar effort undertaken by OECD/NEA/CSNI in 1979-1981. The PCA codes are currently used by different countries for predicting radiological health and economic consequences of severe accidents at nuclear power plants (and certain types of non-reactor nuclear facilities) resulting in releases of radioactive materials into the atmosphere. The codes participating in the exercise were: ARANO (Finland), CONDOR (UK), COSYMA (CEC), LENA (Sweden), MACCS (USA), and OSCAAR (Japan). In parallel with this inter-code comparison effort, two separate groups performed a similar set of calculations using two of the participating codes, MACCS and COSYMA. Results of the intercode and inter-MACCS comparisons are presented in this paper. The MACCS group included four participants: GREECE: Institute of Nuclear Technology and Radiation Protection, NCSR Demokritos; ITALY: ENEL, ENEA/DISP, and ENEA/NUC-RIN; SPAIN: Universidad Politecnica de Madrid (UPM) and Consejo de Seguridad Nuclear; USA: Brookhaven National Laboratory, US NRC and DOE

  11. Learning Analytics for Supporting Seamless Language Learning Using E-Book with Ubiquitous Learning System

    Science.gov (United States)

    Mouri, Kousuke; Uosaki, Noriko; Ogata, Hiroaki

    2018-01-01

    Seamless learning has been recognized as an effective learning approach across various dimensions including formal and informal learning contexts, individual and social learning, and physical world and cyberspace. With the emergence of seamless learning, the majority of the current research focuses on realizing a seamless learning environment at…

  12. Coupled geochemical and solute transport code development

    International Nuclear Information System (INIS)

    Morrey, J.R.; Hostetler, C.J.

    1985-01-01

    A number of coupled geochemical hydrologic codes have been reported in the literature. Some of these codes have directly coupled the source-sink term to the solute transport equation. The current consensus seems to be that directly coupling hydrologic transport and chemical models through a series of interdependent differential equations is not feasible for multicomponent problems with complex geochemical processes (e.g., precipitation/dissolution reactions). A two-step process appears to be the required method of coupling codes for problems where a large suite of chemical reactions must be monitored. Two-step structure requires that the source-sink term in the transport equation is supplied by a geochemical code rather than by an analytical expression. We have developed a one-dimensional two-step coupled model designed to calculate relatively complex geochemical equilibria (CTM1D). Our geochemical module implements a Newton-Raphson algorithm to solve heterogeneous geochemical equilibria, involving up to 40 chemical components and 400 aqueous species. The geochemical module was designed to be efficient and compact. A revised version of the MINTEQ Code is used as a parent geochemical code
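The two-step coupling structure described above can be made concrete with a minimal operator-splitting sketch: each time step first advects the aqueous concentration, then restores local chemical equilibrium cell by cell with a Newton iteration, which is what supplies the source-sink term to the transport equation. This is purely illustrative, assuming a single Langmuir sorption reaction in place of CTM1D's full heterogeneous equilibria; all function names and parameters here are invented:

```python
import numpy as np

def transport_step(c, v, dx, dt):
    """Explicit upwind advection of the aqueous concentration (the
    'hydrologic transport' half of the two-step scheme)."""
    cn = c.copy()
    cn[1:] -= v * dt / dx * (c[1:] - c[:-1])
    return cn

def chemistry_step(c_aq, s, K, smax):
    """Restore local Langmuir sorption equilibrium in every cell (the
    'geochemical' half, supplying the source-sink term).  Solves
    c + smax*K*c/(1 + K*c) = c_aq + s for c by Newton iteration, then
    splits the total between aqueous (c_aq) and sorbed (s) phases."""
    for i in range(len(c_aq)):
        total = c_aq[i] + s[i]
        c = total  # starting guess: everything aqueous
        for _ in range(50):
            f = c + smax * K * c / (1.0 + K * c) - total
            df = 1.0 + smax * K / (1.0 + K * c) ** 2
            step = f / df
            c -= step
            if abs(step) < 1e-12:
                break
        c_aq[i] = c
        s[i] = total - c
    return c_aq, s

# Two-step coupling: alternate one transport sweep and one equilibration sweep.
c = np.zeros(20); c[0] = 1.0
s = np.zeros(20)
for _ in range(10):
    c = transport_step(c, v=1.0, dx=1.0, dt=0.5)
    c, s = chemistry_step(c, s, K=1.0, smax=1.0)
```

The chemistry module sees only per-cell totals, which is what makes the two-step structure feasible when a large suite of reactions must be monitored.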

  13. Module description of TOKAMAK equilibrium code MEUDAS

    International Nuclear Information System (INIS)

    Suzuki, Masaei; Hayashi, Nobuhiko; Matsumoto, Taro; Ozeki, Takahisa

    2002-01-01

    The analysis of an axisymmetric MHD equilibrium serves as a foundation of tokamak research, such as device design, theoretical studies, and the analysis of experimental results. For this reason, also in JAERI, an efficient MHD analysis code has been developed from the start of TOKAMAK research. The free boundary equilibrium code ''MEUDAS'', which uses both the DCR method (Double-Cyclic-Reduction Method) and a Green's function, can specify the pressure and current distributions arbitrarily, and has been applied to the analysis of a broad range of physical subjects as a fast and highly precise code. The MHD convergence calculation technique in ''MEUDAS'' has also been built into various newly developed codes. This report explains in detail each module in ''MEUDAS'' for performing the convergence calculation in solving the MHD equilibrium. (author)
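The Double-Cyclic-Reduction method mentioned above is, at its core, a fast direct solver built on odd-even elimination. As a hedged sketch (not MEUDAS's actual implementation, which works on the discretized 2-D equilibrium operator), cyclic reduction for a single tridiagonal system might look like:

```python
import numpy as np

def cyclic_reduction(a, b, c, d):
    """Solve a tridiagonal system by odd-even cyclic reduction.
    a: sub-diagonal (a[0] must be 0), b: diagonal, c: super-diagonal
    (c[-1] must be 0), d: right-hand side.  The size must be 2**k - 1."""
    n = len(b)
    assert (n & (n + 1)) == 0, "size must be 2**k - 1"
    if n == 1:
        return np.array([d[0] / b[0]])
    # Eliminate the even-indexed unknowns, leaving a half-size
    # tridiagonal system on the odd indices.
    idx = np.arange(1, n, 2)
    alpha = -a[idx] / b[idx - 1]
    beta = -c[idx] / b[idx + 1]
    na = alpha * a[idx - 1]
    nb = b[idx] + alpha * c[idx - 1] + beta * a[idx + 1]
    nc = beta * c[idx + 1]
    nd = d[idx] + alpha * d[idx - 1] + beta * d[idx + 1]
    x = np.zeros(n)
    x[idx] = cyclic_reduction(na, nb, nc, nd)
    # Back-substitute the even-indexed unknowns from their odd neighbours.
    xpad = np.concatenate(([0.0], x, [0.0]))
    ev = np.arange(0, n, 2)
    x[ev] = (d[ev] - a[ev] * xpad[ev] - c[ev] * xpad[ev + 2]) / b[ev]
    return x
```

Each level halves the problem, so the solve costs O(n) arithmetic over O(log n) levels, and the per-level eliminations are independent and easy to vectorize.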

  14. Learning Illustrated: An Exploratory Cross-Sectional Drawing Analysis of Students' Conceptions of Learning

    Science.gov (United States)

    Hsieh, Wen-Min; Tsai, Chin-Chung

    2018-01-01

    Using the draw-a-picture technique, the authors explored the learning conceptions held by students across grade levels. A total of 1,067 Taiwanese students in Grades 2, 4, 6, 8, 10, and 12 participated in this study. Participants were asked to use drawing to illustrate how they conceptualize learning. A coding checklist was developed to analyze…

  15. Application of nuclear air cleaning and treatment codes

    International Nuclear Information System (INIS)

    Kriskovich, J.R.

    1995-01-01

    All modifications to existing ventilation systems, as well as any new ventilation systems used on the Hanford Site are required to meet both American Society of Mechanical Engineers (ASME) codes N509 and N510. Difficulties encountered when applying code N509 at the Hanford Site include the composition of the ventilation air stream and requirements related to ventilation equipment procurement. Also, the existing ventilation systems for the waste tanks at the Hanford Site cannot be tested in accordance with code N510 because of the current configuration of these systems

  16. Application of nuclear air cleaning and treatment codes

    Energy Technology Data Exchange (ETDEWEB)

    Kriskovich, J.R. [Westinghouse Hanford Company, Richland, WA (United States)

    1995-02-01

    All modifications to existing ventilation systems, as well as any new ventilation systems used on the Hanford Site are required to meet both American Society of Mechanical Engineers (ASME) codes N509 and N510. Difficulties encountered when applying code N509 at the Hanford Site include the composition of the ventilation air stream and requirements related to ventilation equipment procurement. Also, the existing ventilation systems for the waste tanks at the Hanford Site cannot be tested in accordance with code N510 because of the current configuration of these systems.

  17. OPR1000 RCP Flow Coastdown Analysis using SPACE Code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Dong-Hyuk; Kim, Seyun [KHNP CRI, Daejeon (Korea, Republic of)

    2016-10-15

    The Korean nuclear industry developed a thermal-hydraulic analysis code for the safety analysis of PWRs, named SPACE (Safety and Performance Analysis Code for Nuclear Power Plant). The current loss-of-flow transient analysis of OPR1000 uses the COAST code to calculate the transient RCS (Reactor Coolant System) flow. The COAST code calculates the RCS loop flow using pump performance curves and RCP (Reactor Coolant Pump) inertia. In this paper, the SPACE code is used to reproduce the RCS flowrates calculated by the COAST code. A loss-of-flow transient is a transient initiated by a reduction of forced reactor coolant circulation. Typical loss-of-flow transients are the complete loss of flow (CLOF) and the locked rotor (LR). The OPR1000 RCP flow coastdown analysis was performed with SPACE using a simplified nodalization. A complete loss of flow (4-RCP trip) was analyzed. The results show good agreement with those from the COAST code, a CE code for calculating RCS flow during loss-of-flow transients. Through this study, we confirmed that the SPACE code can be used instead of the COAST code for RCP flow coastdown analysis.
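The flow coastdown that both codes compute is driven by the pump's stored rotational inertia. As a back-of-the-envelope illustration only (not the model used in COAST or SPACE), if the hydraulic torque scales with the square of pump speed at fixed loop resistance and loop flow is proportional to speed, the coastdown reduces to a first-order ODE with the analytic solution q(t) = q0 / (1 + t/tau), where tau = I/(k*w0):

```python
def coastdown_flow(q0, tau, t_end, dt=1e-3):
    """Euler-integrate dq/dt = -q**2 / (q0 * tau), the coastdown law
    obtained when hydraulic torque ~ speed**2 and loop flow ~ pump speed.
    tau is the coastdown time constant; the analytic solution is
    q(t) = q0 / (1 + t / tau)."""
    q = q0
    for _ in range(int(round(t_end / dt))):
        q -= dt * q * q / (q0 * tau)
    return q
```

For example, the flow has fallen to half its initial value after one time constant, which is the characteristic hyperbolic (rather than exponential) decay of an inertia-driven coastdown.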

  18. Current and Future Trends in Game-Based Learning

    Directory of Open Access Journals (Sweden)

    Carlos Vaz de Carvalho

    2014-05-01

    Full Text Available The first issue of the second volume of the EAI Transactions on Serious Games focuses on the results presented at the European Conference on Game-Based Learning. This event, already in its 8th edition, has set standards for the presentation of research and practice and for pointing out new and future trends in the development of Game-Based Learning. As such, we are quite thrilled to be able to report them here.

  19. An Optimal Linear Coding for Index Coding Problem

    OpenAIRE

    Pezeshkpour, Pouya

    2015-01-01

    An optimal linear coding solution for the index coding problem is established. Instead of the network coding approach, which focuses on graph-theoretic and algebraic methods, a linear coding program for solving both the unicast and groupcast index coding problems is presented. The coding is proved to be the optimal solution from the linear perspective and can be easily utilized for any number of messages. The importance of this work lies mostly in the usage of the presented coding in the groupcast index coding ...
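The classic motivating example behind index coding can be sketched in a few lines: a sender broadcasts a single XOR of all messages, and any receiver that already holds the other messages as side information recovers the one it is missing. This toy example is ours, not the linear program of the paper:

```python
def encode(messages):
    """Broadcast a single coded symbol: the XOR of all messages."""
    coded = 0
    for m in messages:
        coded ^= m
    return coded

def decode(coded, side_info):
    """A receiver that already knows every message except one recovers
    the missing one by XOR-ing its side information out of the symbol."""
    for m in side_info:
        coded ^= m
    return coded
```

One broadcast symbol thus serves all receivers simultaneously, whereas sending each missing message separately would cost as many transmissions as there are receivers.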

  20. Software requirements specification document for the AREST code development

    International Nuclear Information System (INIS)

    Engel, D.W.; McGrail, B.P.; Whitney, P.D.; Gray, W.J.; Williford, R.E.; White, M.D.; Eslinger, P.W.; Altenhofen, M.K.

    1993-11-01

    The Analysis of the Repository Source Term (AREST) computer code was selected in 1992 by the U.S. Department of Energy. The AREST code will be used to analyze the performance of an underground high level nuclear waste repository. The AREST code is being modified by the Pacific Northwest Laboratory (PNL) in order to evaluate the engineered barrier and waste package designs, model regulatory compliance, analyze sensitivities, and support total systems performance assessment modeling. The current version of the AREST code was developed to be a very useful tool for analyzing model uncertainties and sensitivities to input parameters. The code has also been used successfully in supplying source-terms that were used in a total systems performance assessment. The current version, however, has been found to be inadequate for the comparison and selection of a design for the waste package. This is due to the assumptions and simplifications made in the selection of the process and system models. Thus, the new version of the AREST code will be designed to focus on the details of the individual processes and implementation of more realistic models. This document describes the requirements of the new models that will be implemented. Included in this document is a section describing the near-field environmental conditions for this waste package modeling, description of the new process models that will be implemented, and a description of the computer requirements for the new version of the AREST code

  1. Remote-Handled Transuranic Content Codes

    International Nuclear Information System (INIS)

    2001-01-01

    The Remote-Handled Transuranic (RH-TRU) Content Codes (RH-TRUCON) document represents the development of a uniform content code system for RH-TRU waste to be transported in the 72-B cask. It will be used to convert existing waste form numbers, content codes, and site-specific identification codes into a system that is uniform across the U.S. Department of Energy (DOE) sites. The existing waste codes at the sites can be grouped under uniform content codes without any loss of waste characterization information. The RH-TRUCON document provides an all-encompassing description for each content code and compiles this information for all DOE sites. Compliance with waste generation, processing, and certification procedures at the sites (outlined in this document for each content code) ensures that prohibited waste forms are not present in the waste. The content code gives an overall description of the RH-TRU waste material in terms of processes and packaging, as well as the generation location. This helps to provide cradle-to-grave traceability of the waste material so that the various actions required to assess its qualification as payload for the 72-B cask can be performed. The content codes also impose restrictions and requirements on the manner in which a payload can be assembled. The RH-TRU Waste Authorized Methods for Payload Control (RH-TRAMPAC), Appendix 1.3.7 of the 72-B Cask Safety Analysis Report (SAR), describes the current governing procedures applicable for the qualification of waste as payload for the 72-B cask. The logic for this classification is presented in the 72-B Cask SAR. Together, these documents (RH-TRUCON, RH-TRAMPAC, and relevant sections of the 72-B Cask SAR) present the foundation and justification for classifying RH-TRU waste into content codes. Only content codes described in this document can be considered for transport in the 72-B cask. Revisions to this document will be made as additional waste qualifies for transport. Each content code uniquely

  2. Probabilistic hypergraph based hash codes for social image search

    Institute of Scientific and Technical Information of China (English)

    Yi XIE; Hui-min YU; Roland HU

    2014-01-01

    With the rapid development of the Internet, recent years have seen the explosive growth of social media. This brings great challenges in performing efficient and accurate image retrieval on a large scale. Recent work shows that using hashing methods to embed high-dimensional image features and tag information into Hamming space provides a powerful way to index large collections of social images. By learning hash codes through a spectral graph partitioning algorithm, spectral hashing (SH) has shown promising performance among various hashing approaches. However, it is incomplete to model the relations among images only by pairwise simple graphs which ignore the relationship in a higher order. In this paper, we utilize a probabilistic hypergraph model to learn hash codes for social image retrieval. A probabilistic hypergraph model offers a higher order representation among social images by connecting more than two images in one hyperedge. Unlike a normal hypergraph model, a probabilistic hypergraph model considers not only the grouping information, but also the similarities between vertices in hyperedges. Experiments on Flickr image datasets verify the performance of our proposed approach.
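As a simplified stand-in for the learned hash codes discussed above (spectral and hypergraph hashing both amount to embedding the data into a low-dimensional space and binarizing), one can threshold principal-component projections to obtain binary codes and compare them by Hamming distance. This sketch is illustrative only and is not the probabilistic hypergraph method of the paper:

```python
import numpy as np

def pca_hash_codes(X, nbits):
    """Binary codes from thresholded principal-component projections:
    center the features, project onto the top nbits principal
    directions, and keep only the sign of each projection."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return (Xc @ Vt[:nbits].T > 0).astype(np.uint8)

def hamming(u, v):
    """Hamming distance between two binary codes."""
    return int((u != v).sum())
```

Retrieval then reduces to ranking database codes by Hamming distance to the query code, which is why compact binary codes scale to very large image collections.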

  3. NASA space radiation transport code development consortium

    International Nuclear Information System (INIS)

    Townsend, L. W.

    2005-01-01

    Recently, NASA established a consortium involving the Univ. of Tennessee (lead institution), the Univ. of Houston, Roanoke College and various government and national laboratories, to accelerate the development of a standard set of radiation transport computer codes for NASA human exploration applications. This effort involves further improvements of the Monte Carlo codes HETC and FLUKA and the deterministic code HZETRN, including developing nuclear reaction databases necessary to extend the Monte Carlo codes to carry out heavy ion transport, and extending HZETRN to three dimensions. The improved codes will be validated by comparing predictions with measured laboratory transport data, provided by an experimental measurements consortium, and measurements in the upper atmosphere on the balloon-borne Deep Space Test Bed (DSTB). In this paper, we present an overview of the consortium members and the current status and future plans of consortium efforts to meet the research goals and objectives of this extensive undertaking. (authors)

  4. Repetitive Transcranial Direct Current Stimulation Induced Excitability Changes of Primary Visual Cortex and Visual Learning Effects-A Pilot Study.

    Science.gov (United States)

    Sczesny-Kaiser, Matthias; Beckhaus, Katharina; Dinse, Hubert R; Schwenkreis, Peter; Tegenthoff, Martin; Höffken, Oliver

    2016-01-01

    Studies on noninvasive motor cortex stimulation and motor learning have demonstrated cortical excitability as a marker for a learning effect. Transcranial direct current stimulation (tDCS) is a non-invasive tool to modulate cortical excitability. It is as yet unknown how tDCS-induced excitability changes and perceptual learning in visual cortex correlate. Our study aimed to examine the influence of tDCS on visual perceptual learning in healthy humans. Additionally, we measured excitability in primary visual cortex (V1). We hypothesized that anodal tDCS would improve visual learning and that cathodal tDCS would have minor or no effects. Anodal, cathodal or sham tDCS were applied over V1 in a randomized, double-blinded design over four consecutive days (n = 30). During 20 min of tDCS, subjects had to learn a visual orientation-discrimination task (ODT). Excitability parameters were measured by analyzing paired-stimulation behavior of visual-evoked potentials (ps-VEP) and by measuring phosphene thresholds (PTs) before and after the stimulation period of 4 days. Compared with sham-tDCS, anodal tDCS led to a significant improvement of visual discrimination learning, i.e., a learning effect. For cathodal tDCS, no significant effects on learning or on excitability could be seen. Our results showed that anodal tDCS over V1 resulted in improved visual perceptual learning and increased cortical excitability. tDCS is a promising tool to alter V1 excitability and, hence, perceptual visual learning.

  5. Computer codes for shaping the magnetic field of the JINR phasotron

    International Nuclear Information System (INIS)

    Zaplatin, N.L.; Morozov, N.A.

    1983-01-01

    The computer codes used for shaping the magnetic field of the JINR high-current phasotron are presented. Using these codes, the magnetic field mapping was controlled in on-line or off-line regimes. The field parameters were then calculated, and the settings of the ferromagnetic correcting elements and trim coils were chosen. Some computer codes were also developed for measurements of the horizontal component of the magnetic field. Data on the capabilities of some of the codes are presented. The codes were used on the EC-1010 and the CDC-6500 computers.

  6. A Theoretical Analysis of Learning with Graphics--Implications for Computer Graphics Design.

    Science.gov (United States)

    ChanLin, Lih-Juan

    This paper reviews the literature pertinent to learning with graphics. Dual coding theory provides an explanation of how graphics are stored and processed in semantic memory. The level of processing theory suggests how graphics can be employed in learning to encourage deeper processing. In addition to dual coding theory and level of processing…

  7. A program for undergraduate research into the mechanisms of sensory coding and memory decay

    Energy Technology Data Exchange (ETDEWEB)

    Calin-Jageman, R J

    2010-09-28

    This is the final technical report for this DOE project, entitled "A program for undergraduate research into the mechanisms of sensory coding and memory decay". The report summarizes progress on the three research aims: 1) to identify physiological and genetic correlates of long-term habituation, 2) to understand mechanisms of olfactory coding, and 3) to foster a world-class undergraduate neuroscience program. Progress on the first aim has enabled comparison of learning-regulated transcripts across closely related learning paradigms and species, and results suggest that only a small core of transcripts serve truly general roles in long-term memory. Progress on the second aim has enabled testing of several mutant phenotypes for olfactory behaviors, and results show that responses are not fully consistent with the combinatorial coding hypothesis. Finally, 14 undergraduate students participated in this research, the neuroscience program attracted extramural funding, and we completed a successful summer program to enhance transitions for community-college students into 4-year colleges to pursue STEM fields.

  8. Reviewing the current state of machine learning for artificial intelligence with regards to the use of contextual information

    OpenAIRE

    Kinch, Martin W.; Melis, Wim J.C.; Keates, Simeon

    2017-01-01

    This paper will consider the current state of Machine Learning for Artificial Intelligence, more specifically for applications, such as: Speech Recognition, Game Playing and Image Processing. The artificial world tends to make limited use of context in comparison to what currently happens in human life, while it would benefit from improvements in this area. Additionally, the process of transferring knowledge between application domains is another important area where artificial system can imp...

  9. 1-D hybrid code for FRM start-up

    International Nuclear Information System (INIS)

    Stark, R.A.; Miley, G.H.

    1982-01-01

    A one-D hybrid code has been developed to study the start-up of the FRM via neutral-beam injection. The code uses a multi-group numerical model, originally developed by J. Willenberg, to describe fusion product dynamics in a solenoidal plasma. Earlier we described such a model for use in determining self-consistent ion currents and magnetic fields in FRM start-up. However, consideration of electron dynamics during start-up indicates that the electron current will oppose the injected ion current and may even foil the attempt to achieve reversal. For this reason, we have combined the multi-group ion model with a fluid treatment of electron dynamics to form the hybrid code FROST (Field Reversed One-dimensional STart-up). The details of this merger, along with sample results from the operation of FROST, are given.

  10. Teacher Feedback during Active Learning: Current Practices in Primary Schools

    Science.gov (United States)

    van den Bergh, Linda; Ros, Anje; Beijaard, Douwe

    2013-01-01

    Background: Feedback is one of the most powerful tools which teachers can use to enhance student learning. It appears difficult for teachers to give qualitatively good feedback, especially during active learning. In this context, teachers should provide facilitative feedback that is focused on the development of meta-cognition and social learning.…

  11. TRACK The New Beam Dynamics Code

    CERN Document Server

    Mustapha, Brahim; Ostroumov, Peter; Schnirman-Lessner, Eliane

    2005-01-01

    The new ray-tracing code TRACK was developed* to fulfill the special requirements of the RIA accelerator systems. The RIA lattice includes an ECR ion source, a LEBT containing a MHB and a RFQ, followed by three SC linac sections separated by two stripping stations with appropriate magnetic transport systems. No available beam dynamics code meets all the necessary requirements for an end-to-end simulation of the RIA driver linac. The latest version of TRACK was used for end-to-end simulations of the RIA driver including errors and beam loss analysis.** In addition to the standard capabilities, the code includes the following new features: i) multiple charge states; ii) a realistic stripper model; iii) static and dynamic errors; iv) automatic steering to correct for misalignments; v) detailed beam-loss analysis; vi) parallel computing to perform large-scale simulations. Although primarily developed for simulations of the RIA machine, TRACK is a general beam dynamics code. Currently it is being used for the design and ...

  12. Machine learning of the reactor core loading pattern critical parameters

    International Nuclear Information System (INIS)

    Trontl, K.; Pevec, D.; Smuc, T.

    2007-01-01

    The usual approach to loading pattern optimization involves high degree of engineering judgment, a set of heuristic rules, an optimization algorithm and a computer code used for evaluating proposed loading patterns. The speed of the optimization process is highly dependent on the computer code used for the evaluation. In this paper we investigate the applicability of a machine learning model which could be used for fast loading pattern evaluation. We employed a recently introduced machine learning technique, Support Vector Regression (SVR), which has a strong theoretical background in statistical learning theory. Superior empirical performance of the method has been reported on difficult regression problems in different fields of science and technology. SVR is a data driven, kernel based, nonlinear modelling paradigm, in which model parameters are automatically determined by solving a quadratic optimization problem. The main objective of the work reported in this paper was to evaluate the possibility of applying SVR method for reactor core loading pattern modelling. The starting set of experimental data for training and testing of the machine learning algorithm was obtained using a two-dimensional diffusion theory reactor physics computer code. We illustrate the performance of the solution and discuss its applicability, i.e., complexity, speed and accuracy, with a projection to a more realistic scenario involving machine learning from the results of more accurate and time consuming three-dimensional core modelling code. (author)
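SVR itself solves a quadratic optimization problem with an ε-insensitive loss; as a compact illustration of the same data-driven, kernel-based, nonlinear regression idea, the sketch below uses kernel ridge regression with an RBF kernel. It is a simplification of, not a substitute for, the SVR model in the paper, and all parameter values are invented:

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between two point sets."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def krr_fit(X, y, lam=1e-6, gamma=1.0):
    """Fit the dual coefficients: alpha = (K + lam*I)^{-1} y."""
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def krr_predict(X_train, alpha, X_new, gamma=1.0):
    """Predict as a kernel expansion over the training points."""
    return rbf_kernel(X_new, X_train, gamma) @ alpha
```

In the loading-pattern setting, X would hold loading-pattern features and y the critical parameter computed by the reactor physics code; once fitted, the surrogate evaluates candidate patterns far faster than the full core calculation.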

  13. Development of thermal hydraulic models for the reliable regulatory auditing code

    Energy Technology Data Exchange (ETDEWEB)

    Chung, B. D.; Song, C. H.; Lee, Y. J.; Kwon, T. S.; Lee, S. W. [Korea Automic Energy Research Institute, Taejon (Korea, Republic of)

    2004-02-15

    The objective of this project is to develop thermal hydraulic models for use in improving the reliability of the regulatory auditing codes. The current year falls under the second step of the 3-year project, and the main research was focused on the development of a downcomer boiling model. During the current year, the bubble stream model of the downcomer has been developed and installed in the auditing code. A model sensitivity analysis has been performed for the APR1400 LBLOCA scenario using the modified code. A preliminary calculation has been performed for the experimental test facility using the FLUENT and MARS codes. The facility for the air bubble experiment has been installed. The thermal hydraulic phenomena for the VHTR and the supercritical reactor have been identified for future application and model development.

  14. Deciphering Neural Codes of Memory during Sleep

    Science.gov (United States)

    Chen, Zhe; Wilson, Matthew A.

    2017-01-01

    Memories of experiences are stored in the cerebral cortex. Sleep is critical for consolidating hippocampal memory of wake experiences into the neocortex. Understanding representations of neural codes of hippocampal-neocortical networks during sleep would reveal important circuit mechanisms on memory consolidation, and provide novel insights into memory and dreams. Although sleep-associated ensemble spike activity has been investigated, identifying the content of memory in sleep remains challenging. Here, we revisit important experimental findings on sleep-associated memory (i.e., neural activity patterns in sleep that reflect memory processing) and review computational approaches for analyzing sleep-associated neural codes (SANC). We focus on two analysis paradigms for sleep-associated memory, and propose a new unsupervised learning framework (“memory first, meaning later”) for unbiased assessment of SANC. PMID:28390699

  15. Learning How to Learn

    DEFF Research Database (Denmark)

    Lauridsen, Karen M.; Lauridsen, Ole

    Learning Styles in Higher Education – Learning How to Learn. Applying learning styles (LS) in higher education...... by Constructivist learning theory and current basic knowledge of how the brain learns. The LS concept will thus be placed in a broader learning-theoretical context as a strong learning and teaching tool. Participants will be offered the opportunity to have their own LS preferences established before...... teaching leads to positive results and enhanced student learning. However, learning styles should not only be considered a didactic matter for the teacher, but also a tool for the individual students to improve their learning capabilities – not least in contexts where information is not necessarily...

  16. Effects of Anodal Transcranial Direct Current Stimulation on Visually Guided Learning of Grip Force Control

    Directory of Open Access Journals (Sweden)

    Tamas Minarik

    2015-03-01

    Full Text Available Anodal transcranial Direct Current Stimulation (tDCS) has been shown to be an effective non-invasive brain stimulation method for improving cognitive and motor functioning in patients with neurological deficits. tDCS over the motor cortex (M1), for instance, facilitates motor learning in stroke patients. However, the literature on anodal tDCS effects on motor learning in healthy participants is inconclusive, and the effects of tDCS on visuo-motor integration are not well understood. In the present study we examined whether tDCS over the contralateral motor cortex enhances learning of grip-force output in a visually guided feedback task in young and neurologically healthy volunteers. Twenty minutes of 1 mA anodal tDCS were applied over the primary motor cortex (M1) contralateral to the dominant (right) hand during the first half of a 40 min power-grip task. This task required the control of a visual signal by modulating the strength of the power grip for six seconds per trial. Each participant completed a two-session sham-controlled crossover protocol. The stimulation conditions were counterbalanced across participants and the sessions were one week apart. Performance measures comprised time-on-target and target deviation, and were calculated for the periods of stimulation (or sham) and the afterphase, respectively. Statistical analyses revealed significant performance improvements over the stimulation period and the afterphase, but this learning effect was not modulated by tDCS condition. This suggests that the form of visuomotor learning taking place in the present task was not sensitive to neurostimulation. These null effects, together with similar reports for other types of motor tasks, lead to the proposition that tDCS facilitation of motor learning might be restricted to cases or situations where the motor system is challenged, such as motor deficits, advanced age, or very high task demand.

  17. B2-B2.5 code benchmarking

    Energy Technology Data Exchange (ETDEWEB)

    Dekeyser, W.; Baelmans, M; Voskoboynikov, S.; Rozhansky, V.; Reiter, D.; Wiesen, S.; Kotov, V.; Boerner, P.

    2011-01-15

    ITER-IO currently (and for about 15 years) employs the SOLPS4.xxx code for its divertor design, currently version SOLPS4.3. SOLPS4.xxx is a special variant of the B2-EIRENE code, which was originally developed by a European consortium (FZ Juelich, AEA Culham, ERM Belgium/KU Leuven) in the late eighties and early nineties of the last century under NET contracts. Until today, even the very similar edge plasma codes within the SOLPS family, if run on a seemingly identical choice of physical parameters, still sometimes disagree significantly with each other. It is obvious that in computational engineering applications, as they have been carried out for the various ITER divertor aspects with SOLPS4.3 for more than a decade now, any transition from one code to another must be fully backward compatible, or, at least, the origin of differences in the results must be identified and fully understood quantitatively. In this report we document efforts undertaken in 2010 to ultimately eliminate the third issue. For the kinetic EIRENE part within SOLPS this backward compatibility (back until 1996) was basically achieved (V. Kotov, 2004-2006), and SOLPS4.3 is now essentially up to date with the current EIRENE master maintained at FZ Juelich. In order to achieve a similar level of reproducibility for the plasma fluid (B2, B2.5) part, we follow a similar strategy, which is quite distinct from previous SOLPS benchmark attempts: the codes are ''disintegrated'' and pieces of them are run on the smallest (i.e. simplest) problems. Only after full quantitative understanding is achieved is the code model enlarged and integrated again, piece by piece, until, hopefully, a fully backward compatible B2/B2.5 ITER edge plasma simulation is achieved. The status of this code dis-integration effort and its findings until now (Nov. 2010) are documented in the present technical note. This work was initiated in a small workshop by the three partner teams of KU Leuven, St. Petersburg

  18. Void fraction prediction of NUPEC PSBT tests by CATHARE code

    International Nuclear Information System (INIS)

    Del Nevo, A.; Michelotti, L.; Moretti, F.; Rozzia, D.; D'Auria, F.

    2011-01-01

    The current generation of thermal-hydraulic system codes benefits from about sixty years of experiments and forty years of development, and these codes are considered mature tools for providing best-estimate descriptions of phenomena and detailed reactor system representations. However, there is a continuous need to check the codes' capabilities in representing nuclear systems, to draw attention to their weak points, and to identify models that need refinement for best-estimate calculations. Prediction of void fraction and Departure from Nucleate Boiling (DNB) in system thermal-hydraulics is currently based on empirical approaches. The database produced by the Nuclear Power Engineering Corporation (NUPEC), Japan, addresses these issues. It is suitable both for supporting the development of new computational tools based on more mechanistic approaches (i.e. three-field codes, two-phase CFD, etc.) and for validating the current generation of thermal-hydraulic system codes. Selected experiments from this database are used for the OECD/NRC PSBT benchmark. The paper reviews the activity carried out with the CATHARE2 code on the subchannel experiments (four test sections) and presents the rod bundle experiments (different axial power profiles and test sections) available in the database in steady-state and transient conditions. The results demonstrate the accuracy of the code in predicting the void fraction under different thermal-hydraulic conditions. The tests are performed varying the pressure, coolant temperature, mass flow and power. Sensitivity analyses are carried out addressing the nodalization effect and the influence of the initial and boundary conditions of the tests. (author)
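    A validation exercise of this kind is typically summarized by the bias (mean error) and RMS error between the code's predicted void fractions and the measured ones. A minimal sketch of such summary metrics (illustrative only; the benchmark's actual figures of merit may differ):

```python
import math

def void_fraction_accuracy(measured, predicted):
    """Bias (mean error) and RMS error of code predictions against
    measured void fractions -- the usual summary statistics quoted
    when assessing a system code against an experimental database."""
    errors = [p - m for m, p in zip(measured, predicted)]
    n = len(errors)
    bias = sum(errors) / n                          # systematic offset
    rms = math.sqrt(sum(e * e for e in errors) / n)  # overall scatter
    return {"bias": bias, "rms": rms}
```

    A small bias with a small RMS indicates the code tracks the data without a systematic over- or under-prediction; sensitivity runs (nodalization, boundary conditions) can then be compared on the same two numbers.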

  19. Ozone - Current Air Quality Index

    Science.gov (United States)

    Interactive page reporting local air quality conditions by zip code or current location: the current AQI, an AQI forecast, maps, announcements and resources for Hawaii residents and visitors, and background material on the Air Quality Index, ozone, particle pollution, and smoke.
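    For reference, the AQI value such a page reports for ozone is obtained by linear interpolation within tabulated concentration breakpoints: AQI = (Ihi - Ilo) / (Chi - Clo) * (C - Clo) + Ilo. A sketch in Python using the EPA 8-hour ozone breakpoints as of the 2015 standard (verify against the current EPA technical assistance document before relying on them):

```python
import math

# 8-hour ozone breakpoints in ppm (2015 standard): (C_lo, C_hi, I_lo, I_hi)
OZONE_8H_BREAKPOINTS = [
    (0.000, 0.054, 0, 50),     # Good
    (0.055, 0.070, 51, 100),   # Moderate
    (0.071, 0.085, 101, 150),  # Unhealthy for Sensitive Groups
    (0.086, 0.105, 151, 200),  # Unhealthy
    (0.106, 0.200, 201, 300),  # Very Unhealthy
]

def ozone_aqi(conc_ppm):
    """AQI for an 8-hour ozone concentration by linear interpolation
    within the matching breakpoint range."""
    # Truncate to 3 decimals as the AQI procedure does; the tiny epsilon
    # guards against floating-point noise just below a breakpoint.
    c = math.floor(conc_ppm * 1000 + 1e-9) / 1000
    for c_lo, c_hi, i_lo, i_hi in OZONE_8H_BREAKPOINTS:
        if c_lo <= c <= c_hi:
            return round((i_hi - i_lo) / (c_hi - c_lo) * (c - c_lo) + i_lo)
    raise ValueError("concentration outside tabulated breakpoints")
```

    For example, a concentration at the top of the "Good" range (0.054 ppm) maps to an AQI of exactly 50, the boundary between the Good and Moderate categories.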

  20. Game E-Learning Code Master Dengan Konsep Mmorpg Menggunakan Adobe Flex 3

    Directory of Open Access Journals (Sweden)

    Fredy Purnomo

    2010-12-01

    Full Text Available The research objective is to design a web-based e-learning game that can serve both as a learning facility for C language programming and as an online game, so that everybody can enjoy it easily on the internet. Flex is used in this online game to implement the RIA (Rich Internet Application) concept, so that the e-learning process is expected to be more interesting and interactive. The e-learning game is also designed around the MMORPG (Massively Multiplayer Online Role Playing Game) concept. The research method used is analysis and design. Analysis is carried out through literature study, user analysis, and analysis of similar games, while design covers the screen display, gameplay, and system design. The conclusion of this research is that the game provides an interesting learning medium for the C programming language, in accordance with the subject material taught in class, and is also easy to use through the website.