WorldWideScience

Sample records for previously learned codes

  1. Code-Mixing and Code Switching in The Process of Learning

    Directory of Open Access Journals (Sweden)

    Diyah Atiek Mustikawati

    2016-09-01

    Full Text Available This study aimed to describe the forms of code switching and code mixing found in teaching and learning activities in the classroom, and to determine the factors that give rise to them. The research is a descriptive qualitative case study conducted at Al Mawaddah Boarding School, Ponorogo. The analysis shows that code mixing and code switching in learning activities at Al Mawaddah Boarding School occur among Javanese, Arabic, English and Indonesian, through the insertion of words, phrases, idioms, nouns, adjectives, clauses, and sentences. The factors determining code mixing in the learning process include identification of the role, the desire to explain and interpret, and material sourced from the original language and its variations or from a foreign language. The factors determining code switching in the learning process include the speaker (O1), the interlocutor (O2), the presence of a third person (O3), the topic of conversation, evoking a sense of humour, and prestige. The significance of this study is to allow readers to see how language is used in a multilingual society, in particular the rules and characteristics of language variation in classroom teaching and learning activities at Al Mawaddah boarding school. Furthermore, the results of this research provide input to the ustadz/ustadzah and students for developing oral communication skills and effective teaching and learning strategies in boarding schools.

  2. Noise Residual Learning for Noise Modeling in Distributed Video Coding

    DEFF Research Database (Denmark)

    Luong, Huynh Van; Forchhammer, Søren

    2012-01-01

    Distributed video coding (DVC) is a coding paradigm which exploits the source statistics at the decoder side to reduce the complexity at the encoder. The noise model is one of the inherently difficult challenges in DVC. This paper considers Transform Domain Wyner-Ziv (TDWZ) coding and proposes ... noise residual learning techniques that take residues from previously decoded frames into account to estimate the decoding residue more precisely. Moreover, the techniques calculate a number of candidate noise residual distributions within a frame to adaptively optimize the soft side information during ...

  3. Optimal interference code based on machine learning

    Science.gov (United States)

    Qian, Ye; Chen, Qian; Hu, Xiaobo; Cao, Ercong; Qian, Weixian; Gu, Guohua

    2016-10-01

    In this paper, we analyze the characteristics of pseudo-random codes, taking the m-sequence as a case study. Drawing on coding theory, we introduce the jamming methods and simulate the interference effect and probability model in MATLAB. Based on the length of decoding time the adversary spends, we find the optimal formula and optimal coefficients using machine learning, and thereby obtain a new optimal interference code. First, in the recognition phase, the study judges the effect of interference by simulating the length of time over the decoding period of the laser seeker. Then, laser active deception jamming is used to simulate the interference process in the tracking phase; this is the jamming method chosen in this study. To improve the performance of the interference, the model is simulated in MATLAB. We determine the least number of pulse intervals that must be received, from which we can conclude the precise interval number of the laser pointer for m-sequence encoding. To find the shortest spacing, we choose the greatest common divisor method. Then, combining this with the coding regularity found earlier, we restore the pulse interval of the already received pseudo-random code. Finally, we can control the time period of the laser interference, obtain the optimal interference code, and increase the probability of successful interference.
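
    As a rough illustration of two ingredients mentioned above, the following sketch generates an m-sequence with a linear-feedback shift register and recovers a base pulse interval with the greatest common divisor. The tap positions, pulse times, and function names are assumptions made for this example; they are not the authors' implementation, which used MATLAB.

```python
# Hypothetical sketch: generate an m-sequence with a linear-feedback shift register
# (LFSR) and estimate the base pulse interval of a received pulse train via the
# greatest common divisor of the observed spacings.
from math import gcd
from functools import reduce

def m_sequence(taps, state, length):
    """Generate a binary m-sequence from LFSR feedback taps (bit positions)."""
    out = []
    for _ in range(length):
        out.append(state[-1])                 # output the last register cell
        feedback = 0
        for t in taps:
            feedback ^= state[t]              # XOR the tapped cells
        state = [feedback] + state[:-1]       # shift right, feed back into cell 0
    return out

def common_interval(pulse_times):
    """Estimate the base pulse interval as the GCD of observed pulse spacings."""
    spacings = [b - a for a, b in zip(pulse_times, pulse_times[1:])]
    return reduce(gcd, spacings)

if __name__ == "__main__":
    # taps [0, 3] on a 4-stage register give a maximal-length sequence (period 15)
    seq = m_sequence(taps=[0, 3], state=[1, 0, 0, 1], length=15)
    pulses = [10, 30, 70, 90]                 # example received pulse times
    print(seq, common_interval(pulses))       # base interval: 20
```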

  4. Fact or fiction: updates on how protein-coding genes might emerge de novo from previously non-coding DNA.

    Science.gov (United States)

    Schmitz, Jonathan F; Bornberg-Bauer, Erich

    2017-01-01

    Over the last few years, there has been an increasing amount of evidence for the de novo emergence of protein-coding genes, i.e. out of non-coding DNA. Here, we review the current literature and summarize the state of the field. We focus specifically on open questions and challenges in the study of de novo protein-coding genes such as the identification and verification of de novo-emerged genes. The greatest obstacle to date is the lack of high-quality genomic data with very short divergence times which could help precisely pin down the location of origin of a de novo gene. We conclude that, while there is plenty of evidence from a genetics perspective, there is a lack of functional studies of bona fide de novo genes and almost no knowledge about protein structures and how they come about during the emergence of de novo protein-coding genes. We suggest that future studies should concentrate on the functional and structural characterization of de novo protein-coding genes as well as the detailed study of the emergence of functional de novo protein-coding genes.

  5. Three Methods for Occupation Coding Based on Statistical Learning

    Directory of Open Access Journals (Sweden)

    Gweon Hyukjun

    2017-03-01

    Full Text Available Occupation coding, an important task in official statistics, refers to coding a respondent’s text answer into one of many hundreds of occupation codes. To date, occupation coding is still at least partially conducted manually, at great expense. We propose three methods for automatic coding: combining separate models for the detailed occupation codes and for aggregate occupation codes, a hybrid method that combines a duplicate-based approach with a statistical learning algorithm, and a modified nearest neighbor approach. Using data from the German General Social Survey (ALLBUS), we show that the proposed methods improve on both the coding accuracy of the underlying statistical learning algorithm and the coding accuracy of duplicates where duplicates exist. Further, we find that defining duplicates based on n-gram variables (a concept from text mining) is preferable to definitions based on exact string matches.
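
    As an illustration of the n-gram idea mentioned above, the sketch below measures how similar two free-text occupation answers are through their shared character n-grams; the trigram length and the duplicate threshold are assumptions for this example, not values taken from the study.

```python
# Illustrative sketch (not the authors' code): treat a new text answer as a
# "duplicate" of an already coded answer when their character n-gram profiles
# overlap strongly, rather than requiring an exact string match.
from collections import Counter

def char_ngrams(text, n=3):
    """Multiset of character n-grams of a lower-cased, stripped answer."""
    text = text.lower().strip()
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))

def ngram_overlap(a, b, n=3):
    """Jaccard-style overlap of the two n-gram multisets (0 = disjoint, 1 = identical)."""
    ga, gb = char_ngrams(a, n), char_ngrams(b, n)
    shared = sum((ga & gb).values())
    total = sum((ga | gb).values())
    return shared / total if total else 0.0

# A new answer would inherit the code of a previously coded answer whose overlap
# exceeds a chosen threshold (the threshold itself would have to be tuned).
print(round(ngram_overlap("software developer", "software develper"), 2))  # ~0.72
```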

  6. When sparse coding meets ranking: a joint framework for learning sparse codes and ranking scores

    KAUST Repository

    Wang, Jim Jing-Yan

    2017-06-28

    Sparse coding, which represents a data point as a sparse reconstruction code with regard to a dictionary, has been a popular data representation method. Meanwhile, in database retrieval problems, learning the ranking scores from data points plays an important role. Up to now, these two problems have always been considered separately, assuming that data coding and ranking are two independent and irrelevant problems. However, is there any internal relationship between sparse coding and ranking score learning? If yes, how to explore and make use of this internal relationship? In this paper, we try to answer these questions by developing the first joint sparse coding and ranking score learning algorithm. To explore the local distribution in the sparse code space, and also to bridge coding and ranking problems, we assume that in the neighborhood of each data point, the ranking scores can be approximated from the corresponding sparse codes by a local linear function. By considering the local approximation error of ranking scores, the reconstruction error and sparsity of sparse coding, and the query information provided by the user, we construct a unified objective function for learning of sparse codes, the dictionary and ranking scores. We further develop an iterative algorithm to solve this optimization problem.
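
    To make the ingredients of such a unified objective concrete, a hedged sketch in our own notation could look as follows; every symbol here is an assumption for illustration, not the paper's formulation.

```latex
% Sketch of a joint sparse-coding / ranking objective (notation assumed):
% reconstruction + sparsity of the codes, a local linear approximation of the
% ranking scores from the codes, and a term anchoring the user-provided queries.
\min_{D,\;\{s_i\},\;f,\;\{w_i\}}\quad
  \sum_i \Big( \|x_i - D s_i\|_2^2 + \alpha \|s_i\|_1 \Big)
  \;+\; \beta \sum_i \sum_{j \in \mathcal{N}(i)} \big( f_j - w_i^{\top} s_j \big)^2
  \;+\; \gamma \sum_{q \in \mathcal{Q}} \big( f_q - 1 \big)^2
```

    Here x_i are the data points, D the dictionary, s_i the sparse codes, f the ranking scores, w_i a local linear function on the neighborhood N(i), and Q the query set supplied by the user; alpha, beta and gamma trade off sparsity, the local ranking approximation and the query term, and the whole problem would be solved by alternating (iterative) updates as the abstract describes.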

  7. Stochastic organization of output codes in multiclass learning problems.

    Science.gov (United States)

    Utschick, W; Weichselberger, W

    2001-05-01

    The best-known decomposition schemes of multiclass learning problems are one per class coding (OPC) and error-correcting output coding (ECOC). Both methods perform a prior decomposition, that is, before training of the classifier takes place. The impact of output codes on the inferred decision rules can be experienced only after learning. Therefore, we present a novel algorithm for the code design of multiclass learning problems. This algorithm applies a maximum-likelihood objective function in conjunction with the expectation-maximization (EM) algorithm. Minimizing the augmented objective function yields the optimal decomposition of the multiclass learning problem in two-class problems. Experimental results show the potential gain of the optimized output codes over OPC or ECOC methods.
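
    For context, the sketch below contrasts the two prior decomposition schemes named above, one-per-class coding and a random error-correcting output code, using scikit-learn on a toy dataset; it is a baseline comparison under assumed settings, not the EM-based code-design algorithm proposed in the paper.

```python
# Minimal sketch, assuming scikit-learn is available: one-per-class (one-vs-rest)
# coding versus a random error-correcting output code on a small multiclass task.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.multiclass import OneVsRestClassifier, OutputCodeClassifier

X, y = load_digits(return_X_y=True)
base = LogisticRegression(max_iter=1000)

opc = OneVsRestClassifier(base)                                    # one code word per class
ecoc = OutputCodeClassifier(base, code_size=2.0, random_state=0)   # longer random code

print("OPC  accuracy:", cross_val_score(opc, X, y, cv=3).mean())
print("ECOC accuracy:", cross_val_score(ecoc, X, y, cv=3).mean())
```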

  8. Code-specific learning rules improve action selection by populations of spiking neurons.

    Science.gov (United States)

    Friedrich, Johannes; Urbanczik, Robert; Senn, Walter

    2014-08-01

    Population coding is widely regarded as a key mechanism for achieving reliable behavioral decisions. We previously introduced reinforcement learning for population-based decision making by spiking neurons. Here we generalize population reinforcement learning to spike-based plasticity rules that take account of the postsynaptic neural code. We consider spike/no-spike, spike count and spike latency codes. The multi-valued and continuous-valued features in the postsynaptic code allow for a generalization of binary decision making to multi-valued decision making and continuous-valued action selection. We show that code-specific learning rules speed up learning both for the discrete classification and the continuous regression tasks. The suggested learning rules also speed up with increasing population size as opposed to standard reinforcement learning rules. Continuous action selection is further shown to explain realistic learning speeds in the Morris water maze. Finally, we introduce the concept of action perturbation as opposed to the classical weight- or node-perturbation as an exploration mechanism underlying reinforcement learning. Exploration in the action space greatly increases the speed of learning as compared to exploration in the neuron or weight space.

  9. Blending Classroom Teaching and Learning with QR Codes

    Science.gov (United States)

    Rikala, Jenni; Kankaanranta, Marja

    2014-01-01

    The aim of this case study was to explore the feasibility of the Quick Response (QR) codes and mobile devices in the context of Finnish basic education. The interest was especially to explore how mobile devices and QR codes can enhance and blend teaching and learning. The data were collected with a teacher interview and pupil surveys. The learning…

  10. WITHDRAWAL OF PREVIOUS COMPLAINT. A COMPARISON OF THE OLD AND THE NEW CRIMINAL CODE. PROBLEMS OF COMPARATIVE LAW

    Directory of Open Access Journals (Sweden)

    Alin Sorin NICOLESCU

    2015-07-01

    Full Text Available In criminal law, the previous complaint has a double legal valence, material and procedural in nature, constituting both a condition for criminal liability and a functional condition in the cases expressly and limitatively provided by law. For certain offenses, criminal law makes the initiation of criminal proceedings conditional on the introduction of a previous complaint by the injured party, its absence removing criminal liability. From the perspective of substantive criminal law, the conditioning of criminal liability on the existence of the previous complaint, its lack and its withdrawal are regulated by art. 157 and 158 of the New Penal Code, with significant changes in relation to the old regulation of the institution. In procedural terms, the previous complaint is regulated in art. 295-298 of the New Code of Criminal Procedure. Regarding the withdrawal of the previous complaint in the case of offenses for which the initiation of criminal proceedings is subject to the existence of such a complaint, we note that in the current Criminal Code this legal institution is regulated separately, representing both a cause that removes criminal liability and a cause that precludes criminal action. This unilateral act of will of the injured party, the withdrawal of the previous complaint, may be exercised only under certain conditions: it can be promoted only in the case of offenses for which the initiation of criminal proceedings is subject to the introduction of a previous complaint; it is made exclusively by the right holder, by legal representatives, or with the consent of the persons required by law for persons lacking legal capacity or having limited legal capacity; it must intervene before a final judgment is given; and it must represent an express and explicit manifestation of will. A novelty is represented by the possibility of withdrawing the previous complaint if the prosecution was set in motion ex officio, although for

  11. Evaluating QR Code Case Studies Using a Mobile Learning Framework

    Science.gov (United States)

    Rikala, Jenni

    2014-01-01

    The aim of this study was to evaluate the feasibility of Quick Response (QR) codes and mobile devices in the context of Finnish basic education. The feasibility was analyzed through a mobile learning framework, which includes the core characteristics of mobile learning. The study is part of a larger research where the aim is to develop a…

  12. Analysis of previous perceptual and motor experience in breaststroke kick learning

    Directory of Open Access Journals (Sweden)

    Ried Bettina

    2015-12-01

    Full Text Available One of the variables that influence motor learning is the learner’s previous experience, which may provide perceptual and motor elements to be transferred to a novel motor skill. For swimming skills, several motor experiences may prove effective. Purpose. The aim was to analyse the influence of previous experience in playing in water, swimming lessons, and music or dance lessons on learning the breaststroke kick. Methods. The study involved 39 Physical Education students possessing basic swimming skills, but not the breaststroke, who performed 400 acquisition trials followed by 50 retention and 50 transfer trials, during which the stroke index as well as rhythmic and spatial configuration indices were mapped, and who answered a yes/no questionnaire regarding previous experience. Data were analysed by ANOVA (p = 0.05) and the effect size (Cohen’s d, with d ≥ 0.8 indicating a large effect size). Results. The whole sample improved their stroke index and spatial configuration index, but not their rhythmic configuration index. Although differences between groups were not significant, two findings showed large practical effects on learning: only experience of playing in water during childhood showed major, practically relevant positive effects, while having no experience in any of the three fields hampered the learning process. Conclusions. The results point towards a diverse impact of previous experience with rhythmic activities, swimming lessons, and especially playing in water during childhood, on learning the breaststroke kick.

  13. Learning Short Binary Codes for Large-scale Image Retrieval.

    Science.gov (United States)

    Liu, Li; Yu, Mengyang; Shao, Ling

    2017-03-01

    Large-scale visual information retrieval has become an active research area in this big data era. Recently, hashing/binary coding algorithms have proved to be effective for scalable retrieval applications. Most existing hashing methods require relatively long binary codes (i.e., over hundreds of bits, sometimes even thousands of bits) to achieve reasonable retrieval accuracies. However, for some realistic and unique applications, such as on wearable or mobile devices, only short binary codes can be used for efficient image retrieval due to the limitation of computational resources or bandwidth on these devices. In this paper, we propose a novel unsupervised hashing approach called min-cost ranking (MCR) specifically for learning powerful short binary codes (i.e., usually a code length shorter than 100 bits) for scalable image retrieval tasks. By exploring the discriminative ability of each dimension of data, MCR can generate a one-bit binary code for each dimension and simultaneously rank the discriminative separability of each bit according to the proposed cost function. Only top-ranked bits with minimum cost values are then selected and grouped together to compose the final salient binary codes. Extensive experimental results on large-scale retrieval demonstrate that MCR can achieve performance comparable to state-of-the-art hashing algorithms but with significantly shorter codes, leading to much faster large-scale retrieval.
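
    The sketch below gives a rough feel for the per-dimension, one-bit coding and bit-ranking idea described above; the median thresholding and the balance-based cost are simplified assumptions for this example and are not the MCR cost function itself.

```python
# Rough illustration (assumptions, not the MCR algorithm): turn each data dimension
# into one candidate bit by thresholding at its median, score each bit with a simple
# cost, and keep only the lowest-cost bits as the final short binary code.
import numpy as np

def short_binary_codes(X, n_bits=32):
    medians = np.median(X, axis=0)
    bits = (X > medians).astype(np.uint8)        # one candidate bit per dimension
    # assumed cost: bits whose 0/1 split is unbalanced separate the data poorly
    cost = np.abs(bits.mean(axis=0) - 0.5)
    keep = np.argsort(cost)[:n_bits]             # top-ranked = lowest-cost bits
    return bits[:, keep], keep

X = np.random.randn(1000, 512)                   # stand-in 512-dimensional features
codes, selected_dims = short_binary_codes(X, n_bits=64)
print(codes.shape)                               # (1000, 64): short binary codes
```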

  14. Code-Switching Functions in Modern Hebrew Teaching and Learning

    Science.gov (United States)

    Gilead, Yona

    2016-01-01

    The teaching and learning of Modern Hebrew outside of Israel is essential to Jewish education and identity. One of the most contested issues in Modern Hebrew pedagogy is the use of code-switching between Modern Hebrew and learners' first language. Moreover, this is one of the longest running disputes in the broader field of second language…

  15. A Learning Environment for English Vocabulary Using Quick Response Codes

    Science.gov (United States)

    Arikan, Yuksel Deniz; Ozen, Sevil Orhan

    2015-01-01

    This study focuses on the process of developing a learning environment that uses tablets and Quick Response (QR) codes to enhance participants' English language vocabulary knowledge. The author employed the concurrent triangulation strategy, a mixed research design. The study was conducted at a private school in Izmir, Turkey during the 2012-2013…

  16. "My math and me": Nursing students' previous experiences in learning mathematics.

    Science.gov (United States)

    Røykenes, Kari

    2016-01-01

    In this paper, 11 narratives about former experiences in learning mathematics written by nursing students are thematically analyzed. Most students had a positive relationship with the subject in primary school, when they found mathematics fun and were able to master the subject. For some, a change occurred in the transition to lower secondary school. The reasons for this change were found in the subject (increased difficulty), the teachers (movement of teachers, numerous substitute teachers), the class environment and size (many pupils, noise), and the student him- or herself (silent and anonymous pupil). This change was also found in the transition from lower to higher secondary school. By contrast, some students had experienced changes that were positive, and their mathematics teacher was a significant factor in this positive change. The paper emphasizes the importance of previous experiences in learning mathematics to nursing students when learning about drug calculation. Copyright © 2015. Published by Elsevier Ltd.

  17. Deep Learning Methods for Improved Decoding of Linear Codes

    Science.gov (United States)

    Nachmani, Eliya; Marciano, Elad; Lugosch, Loren; Gross, Warren J.; Burshtein, David; Be'ery, Yair

    2018-02-01

    The problem of low complexity, close to optimal, channel decoding of linear codes with short to moderate block length is considered. It is shown that deep learning methods can be used to improve a standard belief propagation decoder, despite the large example space. Similar improvements are obtained for the min-sum algorithm. It is also shown that tying the parameters of the decoders across iterations, so as to form a recurrent neural network architecture, can be implemented with comparable results. The advantage is that significantly fewer parameters are required. We also introduce a recurrent neural decoder architecture based on the method of successive relaxation. Improvements over standard belief propagation are also observed on sparser Tanner graph representations of the codes. Furthermore, we demonstrate that the neural belief propagation decoder can be used to improve the performance, or alternatively reduce the computational complexity, of a close to optimal decoder of short BCH codes.

  18. An Eye-Tracking Study of How Color Coding Affects Multimedia Learning

    Science.gov (United States)

    Ozcelik, Erol; Karakus, Turkan; Kursun, Engin; Cagiltay, Kursat

    2009-01-01

    Color coding has been proposed to promote more effective learning. However, insufficient evidence currently exists to show how color coding leads to better learning. The goal of this study was to investigate the underlying cause of the color coding effect by utilizing eye movement data. Fifty-two participants studied either a color-coded or…

  19. Fact or fiction: updates on how protein-coding genes might emerge de novo from previously non-coding DNA [version 1; referees: 3 approved]

    Directory of Open Access Journals (Sweden)

    Jonathan F Schmitz

    2017-01-01

    Full Text Available Over the last few years, there has been an increasing amount of evidence for the de novo emergence of protein-coding genes, i.e. out of non-coding DNA. Here, we review the current literature and summarize the state of the field. We focus specifically on open questions and challenges in the study of de novo protein-coding genes such as the identification and verification of de novo-emerged genes. The greatest obstacle to date is the lack of high-quality genomic data with very short divergence times which could help precisely pin down the location of origin of a de novo gene. We conclude that, while there is plenty of evidence from a genetics perspective, there is a lack of functional studies of bona fide de novo genes and almost no knowledge about protein structures and how they come about during the emergence of de novo protein-coding genes. We suggest that future studies should concentrate on the functional and structural characterization of de novo protein-coding genes as well as the detailed study of the emergence of functional de novo protein-coding genes.

  20. De novo ORFs in Drosophila are important to organismal fitness and evolved rapidly from previously non-coding sequences.

    Directory of Open Access Journals (Sweden)

    Josephine A Reinhardt

    Full Text Available How non-coding DNA gives rise to new protein-coding genes (de novo genes) is not well understood. Recent work has revealed the origins and functions of a few de novo genes, but common principles governing the evolution or biological roles of these genes are unknown. To better define these principles, we performed a parallel analysis of the evolution and function of six putatively protein-coding de novo genes described in Drosophila melanogaster. Reconstruction of the transcriptional history of de novo genes shows that two de novo genes emerged from novel long non-coding RNAs that arose at least 5 MY prior to evolution of an open reading frame. In contrast, four other de novo genes evolved a translated open reading frame and transcription within the same evolutionary interval suggesting that nascent open reading frames (proto-ORFs), while not required, can contribute to the emergence of a new de novo gene. However, none of the genes arose from proto-ORFs that existed long before expression evolved. Sequence and structural evolution of de novo genes was rapid compared to nearby genes and the structural complexity of de novo genes steadily increases over evolutionary time. Despite the fact that these genes are transcribed at a higher level in males than females, and are most strongly expressed in testes, RNAi experiments show that most of these genes are essential in both sexes during metamorphosis. This lethality suggests that protein-coding de novo genes in Drosophila quickly become functionally important.

  1. TU-CD-BRD-01: Making Incident Learning Practical and Useful: Challenges and Previous Experiences

    International Nuclear Information System (INIS)

    Ezzell, G.

    2015-01-01

    aside for audience members to contribute to the discussion. Learning Objectives: Learn how to promote the use of an incident learning system in a clinic. Learn how to convert “event reporting” into “incident learning”. See examples of practice changes that have come out of learning systems. Learn how the RO-ILS system can be used as a primary internal learning system. Learn how to create succinct, meaningful reports useful to outside readers. Gary Ezzell chairs the AAPM committee overseeing RO-ILS and has received an honorarium from ASTRO for working on the committee reviewing RO-ILS reports. Derek Brown is a director of http://TreatSafely.org. Brett Miller has previously received travel expenses and an honorarium from Varian. Phillip Beron has nothing to report

  2. Chronic impairments in spatial learning and memory in rats previously exposed to chlorpyrifos or diisopropylfluorophosphate.

    Science.gov (United States)

    Terry, A V; Beck, W D; Warner, S; Vandenhuerk, L; Callahan, P M

    2012-01-01

    The acute toxicity of organophosphates (OPs) has been studied extensively; however, much less attention has been given to the subject of repeated exposures that are not associated with overt signs of toxicity (i.e., subthreshold exposures). The objective of this study was to determine if the protracted spatial learning impairments we have observed previously after repeated subthreshold exposures to the insecticide chlorpyrifos (CPF) or the alkylphosphate OP, diisopropylfluorophosphate (DFP) persisted for longer periods after exposure. Male Wistar rats (beginning at two months of age) were initially injected subcutaneously with CPF (10.0 or 18.0 mg/kg) or DFP (0.25 or 0.75 mg/kg) every other day for 30 days. After an extended OP-free washout period (behavioral testing begun 50 days after the last OP exposure), rats previously exposed to CPF, but not DFP, were impaired in a radial arm maze (RAM) win-shift task as well as a delayed non-match to position procedure. Later experiments (i.e., beginning 140 days after the last OP exposure) revealed impairments in the acquisition of a water maze hidden platform task associated with both OPs. However, only rats previously exposed to DFP were impaired in a second phase of testing when the platform location was changed (indicative of deficits of cognitive flexibility). These results indicate, therefore, that repeated, subthreshold exposures to CPF and DFP may lead to chronic deficits in spatial learning and memory (i.e., long after cholinesterase inhibition has abated) and that insecticide and alkylphosphate-based OPs may have differential effects depending on the cognitive domain evaluated. Copyright © 2011 Elsevier Inc. All rights reserved.

  3. Learning Midlevel Auditory Codes from Natural Sound Statistics.

    Science.gov (United States)

    Młynarski, Wiktor; McDermott, Josh H

    2018-03-01

    Interaction with the world requires an organism to transform sensory signals into representations in which behaviorally meaningful properties of the environment are made explicit. These representations are derived through cascades of neuronal processing stages in which neurons at each stage recode the output of preceding stages. Explanations of sensory coding may thus involve understanding how low-level patterns are combined into more complex structures. To gain insight into such midlevel representations for sound, we designed a hierarchical generative model of natural sounds that learns combinations of spectrotemporal features from natural stimulus statistics. In the first layer, the model forms a sparse convolutional code of spectrograms using a dictionary of learned spectrotemporal kernels. To generalize from specific kernel activation patterns, the second layer encodes patterns of time-varying magnitude of multiple first-layer coefficients. When trained on corpora of speech and environmental sounds, some second-layer units learned to group similar spectrotemporal features. Others instantiate opponency between distinct sets of features. Such groupings might be instantiated by neurons in the auditory cortex, providing a hypothesis for midlevel neuronal computation.
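
    As a loose, patch-based stand-in for the first-layer idea described above (a sparse code of spectrograms built from learned spectrotemporal kernels), the sketch below learns a small dictionary from log-spectrogram patches of a synthetic sound; the signal, patch sizes and model parameters are assumptions, and this is not the authors' hierarchical generative model.

```python
# Loose sketch: learn a small dictionary of spectrotemporal features from
# log-spectrogram patches and compute sparse codes for them (first-layer idea only).
import numpy as np
from scipy.signal import spectrogram
from sklearn.decomposition import MiniBatchDictionaryLearning

fs = 16000
t = np.arange(2 * fs) / fs
sound = np.sin(2 * np.pi * 440 * t) + 0.1 * np.random.randn(t.size)  # stand-in sound

_, _, S = spectrogram(sound, fs=fs, nperseg=256)
S = np.log(S + 1e-8)                                   # log power spectrogram

# cut the spectrogram into small time-frequency patches
patch_f, patch_t = 16, 8
patches = np.asarray([S[i:i + patch_f, j:j + patch_t].ravel()
                      for i in range(0, S.shape[0] - patch_f, patch_f)
                      for j in range(0, S.shape[1] - patch_t, patch_t)])

dico = MiniBatchDictionaryLearning(n_components=32, alpha=1.0, random_state=0)
codes = dico.fit_transform(patches)                    # sparse first-layer coefficients
print(codes.shape, dico.components_.shape)             # codes and learned kernels
```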

  4. Sequential Compact Code Learning for Unsupervised Image Hashing.

    Science.gov (United States)

    Liu, Li; Shao, Ling

    2016-12-01

    Effective hashing for large-scale image databases is a popular research area, attracting much attention in computer vision and visual information retrieval. Several recent methods attempt to learn either graph embedding or semantic coding for fast and accurate applications. In this paper, a novel unsupervised framework, termed evolutionary compact embedding (ECE), is introduced to automatically learn the task-specific binary hash codes. It can be regarded as an optimization algorithm that combines genetic programming (GP) with a boosting trick. In our architecture, each bit of ECE is iteratively computed using a weak binary classification function, which is generated by GP evolution, jointly minimizing its empirical risk with the AdaBoost strategy on a training set. We address this as greedy optimization by embedding high-dimensional data points into a similarity-preserved Hamming space with a low dimension. We systematically evaluate ECE on two data sets, SIFT 1M and GIST 1M, showing the effectiveness and the accuracy of our method for a large-scale similarity search.

  5. Using QR Codes to Differentiate Learning for Gifted and Talented Students

    Science.gov (United States)

    Siegle, Del

    2015-01-01

    QR codes are two-dimensional square patterns that are capable of coding information that ranges from web addresses to links to YouTube videos. The codes save time typing and eliminate errors in entering addresses incorrectly. These codes make learning with technology easier for students and motivationally engage them in new ways.

  6. Coding oriented learning in economics, business and finance

    Directory of Open Access Journals (Sweden)

    Francisco Salas-Molina

    2018-02-01

    Full Text Available As the relationship between both students (teachers) and information technology evolves, new tools are required to improve learning (teaching) in social sciences. Economics, business and finance are mainly based on data, and dealing with data requires specific skills and techniques, such as computer programming, in order to get the full potential of most quantitative models. In this paper, we propose a coding oriented learning method based on Python Notebooks which is specifically designed for students of degrees in economics, business and finance. We follow a learning-by-doing strategy that encourages students to implement economic models as a suitable way to improve the understanding of fundamental concepts. As an illustrative example, we also describe a case study in which Python Notebooks are the key tool to teach cash management in a Master in Business Administration program. Since students of today are the decision-makers of tomorrow, a further advantage of the use of a programming language as a teaching tool is the possibility to connect theory to practice by enabling students to implement their own decision support tools.

  7. A Test of Two Alternative Cognitive Processing Models: Learning Styles and Dual Coding

    Science.gov (United States)

    Cuevas, Joshua; Dawson, Bryan L.

    2018-01-01

    This study tested two cognitive models, learning styles and dual coding, which make contradictory predictions about how learners process and retain visual and auditory information. Learning styles-based instructional practices are common in educational environments despite a questionable research base, while the use of dual coding is less…

  8. The Challenges of Teaching Qualitative Coding: Can a Learning Object Help?

    Science.gov (United States)

    Raddon, Mary-Beth; Raby, Rebecca; Sharpe, Erin

    2009-01-01

    Challenged by some of the inherent difficulties in teaching qualitative data analysis, three instructors created an interactive digital learning object entitled "Sleuthing the Layered Text: Investigating Coding." In this paper we assess the effectiveness of that learning object as a tool for teaching qualitative coding. On the face of it, learning…

  9. Supporting Situated Learning Based on QR Codes with Etiquetar App: A Pilot Study

    Science.gov (United States)

    Camacho, Miguel Olmedo; Pérez-Sanagustín, Mar; Alario-Hoyos, Carlos; Soldani, Xavier; Kloos, Carlos Delgado; Sayago, Sergio

    2014-01-01

    EtiquetAR is an authoring tool for supporting the design and enactment of situated learning experiences based on QR tags. Practitioners use etiquetAR for creating, managing and personalizing collections of QR codes with special properties: (1) codes can have more than one link pointing at different multimedia resources, (2) codes can be updated…

  10. Teaching Qualitative Research: Experiential Learning in Group-Based Interviews and Coding Assignments

    Science.gov (United States)

    DeLyser, Dydia; Potter, Amy E.

    2013-01-01

    This article describes experiential-learning approaches to conveying the work and rewards involved in qualitative research. Seminar students interviewed one another, transcribed or took notes on those interviews, shared those materials to create a set of empirical materials for coding, developed coding schemes, and coded the materials using those…

  11. Development of Learning Management in Moral Ethics and Code of Ethics of the Teaching Profession Course

    Science.gov (United States)

    Boonsong, S.; Siharak, S.; Srikanok, V.

    2018-02-01

    The purpose of this research was to develop learning management aimed at enhancing students’ moral ethics and code of ethics of the teaching profession at Rajamangala University of Technology Thanyaburi (RMUTT). The contextual study and the ideas for developing the learning management were obtained through document study, the focus group method, and content analysis of documents about the moral ethics and code of ethics of the teaching profession in the Graduate Diploma for Teaching Profession Program. The main tools of this research were summary papers and analysis papers. The results showed that the learning management for developing the moral ethics and code of ethics of the teaching profession for Graduate Diploma for Teaching Profession students could promote the desired moral ethics and code of ethics character through integrated learning techniques consisting of Service Learning, Contract System, Value Clarification, Role Playing, and Concept Mapping. The learning management was presented in 3 steps.

  12. A Latin Functionalist Dictionary as a Self-Learning Language Device: Previous Experiences to Digitalization

    Science.gov (United States)

    Márquez, Manuel; Chaves, Beatriz

    2016-01-01

    The application of a methodology based on S.C. Dik's Functionalist Grammar linguistic principles, which is addressed to the teaching of Latin to secondary students, has resulted in a quantitative improvement in students' acquisition process of knowledge. To do so, we have used a self-learning tool, an ad hoc dictionary, of which the use in…

  13. A Latin Functionalist Dictionary as a Self-Learning Language Device: Previous Experiences to Digitalization

    Directory of Open Access Journals (Sweden)

    Cruz Manuel Márquez

    2016-07-01

    Full Text Available The application of a methodology based on S.C. Dik’s Functionalist Grammar linguistic principles, which is addressed to the teaching of Latin to secondary students, has resulted in a quantitative improvement in students’ acquisition process of knowledge. To do so, we have used a self-learning tool, an ad hoc dictionary, of which the use in different practices has made students understand, at a basic level, the functioning of this language.

  14. Understanding infants' and children's social learning about foods: previous research and new prospects.

    Science.gov (United States)

    Shutts, Kristin; Kinzler, Katherine D; DeJesus, Jasmine M

    2013-03-01

    Developmental psychologists have devoted significant attention to investigating how children learn from others' actions, emotions, and testimony. Yet most of this research has examined children's socially guided learning about artifacts. The present article focuses on a domain that has received limited attention from those interested in the development of social cognition: food. We begin by reviewing the available literature on infants' and children's development in the food domain and identify situations in which children evidence both successes and failures in their interactions with foods. We focus specifically on the role that other people play in guiding what children eat and argue that understanding patterns of successes and failures in the food domain requires an appreciation of eating as a social phenomenon. We next propose a series of questions for future research and suggest that examining food selection as a social phenomenon can shed light on mechanisms underlying children's learning from others and provide ideas for promoting healthy social relationships and eating behaviors early in development.

  15. Risk Communication Strategies: Lessons Learned from Previous Disasters with a Focus on the Fukushima Radiation Accident.

    Science.gov (United States)

    Svendsen, Erik R; Yamaguchi, Ichiro; Tsuda, Toshihide; Guimaraes, Jean Remy Davee; Tondel, Martin

    2016-12-01

    It has been difficult to both mitigate the health consequences and effectively provide health risk information to the public affected by the Fukushima radiological disaster. Often, there are contrasting public health ethics within these activities which complicate risk communication. Although no risk communication strategy is perfect in such disasters, the ethical principles of risk communication provide good practical guidance. These discussions will be made in the context of similar lessons learned after radiation exposures in Goiania, Brazil, in 1987; the Chernobyl nuclear power plant accident, Ukraine, in 1986; and the attack at the World Trade Center, New York, USA, in 2001. Neither of the two strategies is perfect nor fatally flawed. Yet, this discussion and lessons from prior events should assist decision makers with navigating difficult risk communication strategies in similar environmental health disasters.

  16. Towards a universal code formatter through machine learning

    NARCIS (Netherlands)

    Parr, T. (Terence); J.J. Vinju (Jurgen)

    2016-01-01

    There are many declarative frameworks that allow us to implement code formatters relatively easily for any specific language, but constructing them is cumbersome. The first problem is that "everybody" wants to format their code differently, leading to either many formatter variants or a

  17. Quick Response (QR) Codes for Audio Support in Foreign Language Learning

    Science.gov (United States)

    Vigil, Kathleen Murray

    2017-01-01

    This study explored the potential benefits and barriers of using quick response (QR) codes as a means by which to provide audio materials to middle-school students learning Spanish as a foreign language. Eleven teachers of Spanish to middle-school students created transmedia materials containing QR codes linking to audio resources. Students…

  18. Prevention of Tetanus Outbreak Following Natural Disaster in Indonesia: Lessons Learned from Previous Disasters.

    Science.gov (United States)

    Pascapurnama, Dyshelly Nurkartika; Murakami, Aya; Chagan-Yasutan, Haorile; Hattori, Toshio; Sasaki, Hiroyuki; Egawa, Shinichi

    2016-03-01

    In Indonesia, the Aceh earthquake and tsunami in 2004 killed 127,000 people and caused half a million injuries, while the Yogyakarta earthquake in 2006 caused 5,700 deaths and 37,000 injuries. Because disaster-affected areas are vulnerable to epidemic-prone diseases and tetanus is one such disease that is preventable, we systematically reviewed the literature related to tetanus outbreaks following these two previous natural disasters in Indonesia. Based on our findings, recommendations for proper vaccination and education can be made for future countermeasures. Using specified keywords related to tetanus and disasters, relevant documents were screened from PubMed, the WHO website, and books. Reports offering limited data and those released before 2004 were excluded. In all, 16 publications were reviewed systematically. Results show that 106 cases of tetanus occurred in Aceh, with a case fatality ratio (CFR) of 18.9%; 71 cases occurred in Yogyakarta, with a CFR of 36.6%. For both outbreaks, most patients had been wounded during scavenging or evacuation after the disaster occurred. Poor access to health care because of limited transportation or hospital facilities, and low vaccination coverage and lack of awareness of tetanus risk contributed to delayed treatment and case severity. Tetanus outbreaks after disasters are preventable by increasing vaccination coverage, improving wound care treatment, and establishing a regular surveillance system, in addition to good practices of disaster management and supportive care following national guidelines. Furthermore, health education for communities should be provided to raise awareness of tetanus risk reduction.

  19. Developing a Code of Practice for Learning Analytics

    Science.gov (United States)

    Sclater, Niall

    2016-01-01

    Ethical and legal objections to learning analytics are barriers to development of the field, thus potentially denying students the benefits of predictive analytics and adaptive learning. Jisc, a charitable organization that champions the use of digital technologies in UK education and research, has attempted to address this with the development of…

  20. Enhancing Nursing and Midwifery Student Learning Through the Use of QR Codes.

    Science.gov (United States)

    Downer, Terri; Oprescu, Florin; Forbes, Helen; Phillips, Nikki; McTier, Lauren; Lord, Bill; Barr, Nigel; Bright, Peter; Simbag, Vilma

    A recent teaching and learning innovation using new technologies involves the use of quick response codes, which are read by smartphones and tablets. Integrating this technology as a teaching and learning strategy in nursing and midwifery education has been embraced by academics and students at a regional university.

  1. AAC menu interface: effectiveness of active versus passive learning to master abbreviation-expansion codes.

    Science.gov (United States)

    Gregory, Ellyn; Soderman, Melinda; Ward, Christy; Beukelman, David R; Hux, Karen

    2006-06-01

    This study investigated the accuracy with which 30 young adults without disabilities learned abbreviation expansion codes associated with specific vocabulary items that were stored in an AAC device with two accessing methods: mouse access and keyboard access. Both accessing methods utilized a specialized computer application, called AAC Menu, which allowed for errorless practice. Mouse access prompted passive learning, whereas keyboard access prompted active learning. Results revealed that participants who accessed words via a keyboard demonstrated significantly higher mastery of abbreviation-expansion codes than those who accessed words via a computer mouse.

  2. Learning concepts, language, and literacy in hybrid linguistic codes ...

    African Journals Online (AJOL)

    Vygotskian ideas on children's cognitive development and its interplay with language in an argument for a linguistically 'stable' pedagogy that prepares learners for the world of written language in which they have to express most of their learning ...

  3. Machine-Learning Algorithms to Code Public Health Spending Accounts.

    Science.gov (United States)

    Brady, Eoghan S; Leider, Jonathon P; Resnick, Beth A; Alfonso, Y Natalia; Bishai, David

    Government public health expenditure data sets require time- and labor-intensive manipulation to summarize results that public health policy makers can use. Our objective was to compare the performances of machine-learning algorithms with manual classification of public health expenditures to determine if machines could provide a faster, cheaper alternative to manual classification. We used machine-learning algorithms to replicate the process of manually classifying state public health expenditures, using the standardized public health spending categories from the Foundational Public Health Services model and a large data set from the US Census Bureau. We obtained a data set of 1.9 million individual expenditure items from 2000 to 2013. We collapsed these data into 147,280 summary expenditure records, and we followed a standardized method of manually classifying each expenditure record as public health, maybe public health, or not public health. We then trained 9 machine-learning algorithms to replicate the manual process. We calculated recall, precision, and coverage rates to measure the performance of individual and ensembled algorithms. Compared with manual classification, the machine-learning random forests algorithm produced 84% recall and 91% precision. With algorithm ensembling, we achieved our target criterion of 90% recall by using a consensus ensemble of ≥6 algorithms while still retaining 93% coverage, leaving only 7% of the summary expenditure records unclassified. Machine learning can be a time- and cost-saving tool for estimating public health spending in the United States. It can be used with standardized public health spending categories based on the Foundational Public Health Services model to help parse public health expenditure information from other types of health-related spending, provide data that are more comparable across public health organizations, and evaluate the impact of evidence-based public health resource allocation.
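
    The sketch below illustrates the general shape of such a pipeline, training a few text classifiers on labelled expenditure descriptions and accepting a machine label only when enough of them agree; the models, toy records and agreement threshold are assumptions for this example (the study itself used nine algorithms and a consensus of at least six), not the authors' code.

```python
# Hedged sketch: classify expenditure descriptions with several models and keep a
# label only when the ensemble reaches consensus; otherwise leave it unclassified.
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

records = ["immunization clinic supplies", "road resurfacing contract",
           "restaurant inspection staff", "county jail food services"]
labels = ["public health", "not public health", "public health", "not public health"]

models = [make_pipeline(TfidfVectorizer(), clf) for clf in (
    RandomForestClassifier(n_estimators=200, random_state=0),
    LogisticRegression(max_iter=1000),
    MultinomialNB())]
for m in models:
    m.fit(records, labels)

def consensus_label(text, min_agree=3):
    """Return the majority label only if at least min_agree models vote for it."""
    votes = [m.predict([text])[0] for m in models]
    top = max(set(votes), key=votes.count)
    return top if votes.count(top) >= min_agree else "unclassified"

print(consensus_label("mosquito abatement program"))
```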

  4. Machine learning-based coding unit depth decisions for flexible complexity allocation in high efficiency video coding.

    Science.gov (United States)

    Zhang, Yun; Kwong, Sam; Wang, Xu; Yuan, Hui; Pan, Zhaoqing; Xu, Long

    2015-07-01

    In this paper, we propose a machine learning-based fast coding unit (CU) depth decision method for High Efficiency Video Coding (HEVC), which optimizes the complexity allocation at the CU level under given rate-distortion (RD) cost constraints. First, we analyze the quad-tree CU depth decision process in HEVC and model it as a three-level hierarchical binary decision problem. Second, a flexible CU depth decision structure is presented, which allows the performance of each CU depth decision to be smoothly traded off between coding complexity and RD performance. Then, a three-output joint classifier consisting of multiple binary classifiers with different parameters is designed to control the risk of false prediction. Finally, a sophisticated RD-complexity model is derived to determine the optimal parameters for the joint classifier, which is capable of minimizing the complexity at each CU depth under given RD degradation constraints. Comparative experiments over various sequences show that the proposed CU depth decision algorithm can reduce the computational complexity by 28.82% to 70.93%, and by 51.45% on average, when compared with the original HEVC test model. The Bjøntegaard delta peak signal-to-noise ratio and Bjøntegaard delta bit rate are -0.061 dB and 1.98% on average, which is negligible. The overall performance of the proposed algorithm outperforms those of the state-of-the-art schemes.
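
    The toy sketch below conveys the flavour of a hierarchical, per-depth "split or not" decision with a third, uncertain outcome that falls back to the full rate-distortion search; the features, labels, thresholds and models are assumptions for illustration and not the paper's joint classifier or RD-complexity model.

```python
# Illustrative sketch: per-depth binary "split" classifiers for the CU quad-tree,
# with an uncertain band that defers the decision to the full RD-cost search.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# toy CU features: (texture variance, gradient energy, quantization parameter)
X = rng.random((500, 3))
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)          # assumed proxy label: 1 = split

depth_classifiers = [LogisticRegression().fit(X, y) for _ in range(3)]  # depths 0-2

def decide_depth(features, low=0.3, high=0.7):
    """Descend the quad-tree; hand uncertain CUs back to the full RD search."""
    depth = 0
    for clf in depth_classifiers:
        p_split = clf.predict_proba([features])[0, 1]
        if p_split > high:
            depth += 1                              # confident split: go deeper
        elif p_split < low:
            break                                   # confident non-split: stop here
        else:
            return depth, "full RD-cost check"      # risky case: no early decision
    return depth, "early terminated"

print(decide_depth([0.8, 0.6, 0.2]))
```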

  5. A Machine Learning Perspective on Predictive Coding with PAQ

    OpenAIRE

    Knoll, Byron; de Freitas, Nando

    2011-01-01

    PAQ8 is an open source lossless data compression algorithm that currently achieves the best compression rates on many benchmarks. This report presents a detailed description of PAQ8 from a statistical machine learning perspective. It shows that it is possible to understand some of the modules of PAQ8 and use this understanding to improve the method. However, intuitive statistical explanations of the behavior of other modules remain elusive. We hope the description in this report will be a sta...

  6. Just in time? Using QR codes for multi-professional learning in clinical practice.

    Science.gov (United States)

    Jamu, Joseph Tawanda; Lowi-Jones, Hannah; Mitchell, Colin

    2016-07-01

    Clinical guidelines and policies are widely available on the hospital intranet or from the internet, but can be difficult to access at the required time and place. Clinical staff with smartphones could use Quick Response (QR) codes for contemporaneous access to relevant information to support the Just in Time Learning (JIT-L) paradigm. There are several studies that advocate the use of smartphones to enhance learning amongst medical students and junior doctors in the UK. However, these participants are already technologically orientated. There are limited studies that explore the use of smartphones in nursing practice. QR codes were generated for each topic and positioned at relevant locations on a medical ward. Support and training were provided for staff. Website analytics and semi-structured interviews were performed to evaluate the efficacy, acceptability and feasibility of using QR codes to facilitate Just in Time learning. Use was intermittently high but not sustained. Thematic analysis of interviews revealed a positive assessment of the Just in Time learning paradigm and context-sensitive clinical information. However, there were notable barriers to acceptance, including usability of QR codes and appropriateness of smartphone use in a clinical environment. The use of Just in Time learning for education and reference may be beneficial to healthcare professionals. However, alternative methods of access for less technologically literate users and a change in culture of mobile device use in clinical areas may be needed. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Distinct patterns of functional and structural neuroplasticity associated with learning Morse code.

    Science.gov (United States)

    Schmidt-Wilcke, T; Rosengarth, K; Luerding, R; Bogdahn, U; Greenlee, M W

    2010-07-01

    Learning is based on neuroplasticity, i.e. on the capability of the brain to adapt to new experiences. Different mechanisms of neuroplasticity have been described, ranging from synaptic remodeling to changes in complex neural circuitry. To further study the relationship between changes in neural activity and changes in gray matter density associated with learning, we performed a combined longitudinal functional and morphometric magnetic resonance imaging (MRI) study on healthy volunteers who learned to decipher Morse code. We investigated 16 healthy subjects using functional MR imaging (fMRI) and voxel-based morphometry (VBM) before and after they had learned to decipher Morse code. The same set of Morse-code signals was presented to participants pre- and post-training. We found an increase in task-specific neural activity in brain regions known to be critically involved in language perception and memory, such as the inferior parietal cortex bilaterally and the medial parietal cortex during Morse code deciphering. Furthermore we found an increase in gray matter density in the left occipitotemporal region, extending into the fusiform gyrus. Anatomically neighboring sites of functional and structural neuroplasticity were revealed in the left occipitotemporal/inferior temporal cortex, but these regions only marginally overlapped. Implications of this morpho-functional dissociation for learning concepts are discussed. Copyright (c) 2010 Elsevier Inc. All rights reserved.

  8. Learning Morse Code Alters Microstructural Properties in the Inferior Longitudinal Fasciculus: A DTI Study.

    Science.gov (United States)

    Schlaffke, Lara; Leemans, Alexander; Schweizer, Lauren M; Ocklenburg, Sebastian; Schmidt-Wilcke, Tobias

    2017-01-01

    Learning relies on neuroplasticity, which has mainly been studied in gray matter (GM). However, there is mounting evidence indicating a critical role of white matter changes involved in learning processes. One of the most important learning processes in human development is language acquisition. However, due to the length of this learning process, it has been notoriously difficult to investigate the underlying neuroplastic changes. Here, we report a novel learning paradigm to assess the role of white matter plasticity for language acquisition. By acoustically presenting Morse Code (MC) using an in-house developed audio book as a model for language-type learning, we generated a well-controlled learning environment that allows for the detection of subtle white matter changes related to language-type learning in a much shorter time frame than usual language acquisition. In total, 12 letters of the MC alphabet were learned within six learning sessions, which allowed study participants to perform a word recognition MC decoding task. In this study, we found that learning MC was associated with significant microstructural changes in the left inferior longitudinal fasciculus (ILF). The fractional anisotropy (FA) of this associative fiber bundle connecting the occipital and posterior temporal cortex with the temporal pole as well as the hippocampus and amygdala was increased. Furthermore, white matter plasticity was associated with task performance of MC decoding, indicating that the structural changes were related to learning efficiency. In conclusion, our findings demonstrate an important role of white matter neuroplasticity for acquiring a new language skill.

  9. Learning Morse Code Alters Microstructural Properties in the Inferior Longitudinal Fasciculus: A DTI Study

    Directory of Open Access Journals (Sweden)

    Lara Schlaffke

    2017-07-01

    Full Text Available Learning relies on neuroplasticity, which has mainly been studied in gray matter (GM). However, there is mounting evidence indicating a critical role of white matter changes involved in learning processes. One of the most important learning processes in human development is language acquisition. However, due to the length of this learning process, it has been notoriously difficult to investigate the underlying neuroplastic changes. Here, we report a novel learning paradigm to assess the role of white matter plasticity for language acquisition. By acoustically presenting Morse Code (MC) using an in-house developed audio book as a model for language-type learning, we generated a well-controlled learning environment that allows for the detection of subtle white matter changes related to language-type learning in a much shorter time frame than usual language acquisition. In total, 12 letters of the MC alphabet were learned within six learning sessions, which allowed study participants to perform a word recognition MC decoding task. In this study, we found that learning MC was associated with significant microstructural changes in the left inferior longitudinal fasciculus (ILF). The fractional anisotropy (FA) of this associative fiber bundle connecting the occipital and posterior temporal cortex with the temporal pole as well as the hippocampus and amygdala was increased. Furthermore, white matter plasticity was associated with task performance of MC decoding, indicating that the structural changes were related to learning efficiency. In conclusion, our findings demonstrate an important role of white matter neuroplasticity for acquiring a new language skill.

  10. The use of QR Code as a learning technology: an exploratory study

    Directory of Open Access Journals (Sweden)

    Stefano Besana

    2010-12-01

    Full Text Available This paper discusses a pilot study on the potential benefits of QR (Quick Response) Codes as a tool for facilitating and enhancing learning processes. An analysis is given of the strengths and added value of QR technologies applied to museum visits, with precautions regarding the design of learning environments like the one presented. Some possible future scenarios are identified for implementing these technologies in contexts more strictly related to teaching and education.

  11. REM sleep modifications following a Morse code learning session in humans.

    Science.gov (United States)

    Mandai, O; Guerrien, A; Sockeel, P; Dujardin, K; Leconte, P

    1989-10-01

    Various experimental data indicate that rapid eye movement (REM) sleep is involved in learning processes. In animals, any complex task in a learning environment leads to an increase in the subsequent total REM sleep time, especially just before learning completion. In humans, oculomotor activity during REM sleep seems to constitute an interesting marker of learning performance. In this work, we focus on the qualitative analysis of REM sleep characteristics after a Morse code learning session. Eight male subjects were polygraphically recorded during three consecutive nights. A computer-aided teaching session was performed just before bedrest onset of the experimental night. The learning performance (percentage of saving) was checked on awakening. The Morse code learning led to some modifications in REM sleep components, particularly increases in REM sleep time and in the number of REM episodes. We did not observe any significant modification in the total number of REMs in the experimental night. However, the correlative analysis between learning performance and sleep parameters indicates a higher correlation coefficient (r) for oculomotor activity than for the tonic components. This is consistent with the information-processing hypothesis, in which the temporal distribution of REMs reflects the subject's ability to increase the signal-to-noise ratio of environmental information intake.

  12. Side Information and Noise Learning for Distributed Video Coding using Optical Flow and Clustering

    DEFF Research Database (Denmark)

    Luong, Huynh Van; Rakêt, Lars Lau; Huang, Xin

    2012-01-01

    information for (LDPCA) decoding. The proposed decoder side techniques for side information and noise learning (SING) are integrated in a TDWZ scheme. On test sequences, the proposed SING codec robustly improves the coding efficiency of TDWZ DVC. For WZ frames using a GOP size of 2, up to 4dB improvement...

  13. Progressive Dictionary Learning with Hierarchical Predictive Structure for Scalable Video Coding.

    Science.gov (United States)

    Dai, Wenrui; Shen, Yangmei; Xiong, Hongkai; Jiang, Xiaoqian; Zou, Junni; Taubman, David

    2017-04-12

    Dictionary learning has emerged as a promising alternative to the conventional hybrid coding framework. However, the rigid structure of sequential training and prediction degrades its performance in scalable video coding. This paper proposes a progressive dictionary learning framework with a hierarchical predictive structure for scalable video coding, especially in the low-bitrate region. For pyramidal layers, sparse representation based on a spatio-temporal dictionary is adopted to improve the coding efficiency of enhancement layers (ELs) with a guarantee of reconstruction performance. The overcomplete dictionary is trained to adaptively capture local structures along motion trajectories as well as exploit the correlations between neighboring layers of resolutions. Furthermore, progressive dictionary learning is developed to enable scalability in the temporal domain and restrict error propagation in a closed-loop predictor. Under the hierarchical predictive structure, online learning is leveraged to guarantee the training and prediction performance with an improved convergence rate. To accommodate the state-of-the-art scalable extension of H.264/AVC and the latest HEVC, standardized codec cores are utilized to encode the base and enhancement layers. Experimental results show that the proposed method outperforms the latest SHVC and HEVC simulcast over extensive test sequences with various resolutions.
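
    The central operations described in this record, training an overcomplete dictionary and sparsely coding patches against it, can be sketched generically. The block below uses scikit-learn on synthetic data; the patch size, number of atoms and sparsity level are arbitrary placeholders rather than the paper's settings.

        import numpy as np
        from sklearn.decomposition import DictionaryLearning, sparse_encode

        rng = np.random.default_rng(0)
        patches = rng.standard_normal((500, 64))       # 500 vectorised 8x8 patches (toy data)

        # Learn an overcomplete dictionary: 128 atoms for 64-dimensional patches
        dico = DictionaryLearning(n_components=128, transform_algorithm="omp",
                                  transform_n_nonzero_coefs=5, random_state=0)
        codes = dico.fit_transform(patches)            # one sparse code per patch

        # Reconstruct from the sparse codes and check the residual
        recon = codes @ dico.components_
        print("mean reconstruction error:", np.mean((patches - recon) ** 2))

        # New data can be coded against the fixed dictionary without retraining
        new_codes = sparse_encode(rng.standard_normal((10, 64)), dico.components_,
                                  algorithm="omp", n_nonzero_coefs=5)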

  14. Teacher Candidates Implementing Universal Design for Learning: Enhancing Picture Books with QR Codes

    Science.gov (United States)

    Grande, Marya; Pontrello, Camille

    2016-01-01

    The purpose of this study was to investigate if teacher candidates could gain knowledge of the principles of Universal Design for Learning by enhancing traditional picture books with Quick Response (QR) codes and to determine if the process of making these enhancements would impact teacher candidates' comfort levels with using technology on both…

  15. Using supervised machine learning to code policy issues: Can classifiers generalize across contexts?

    NARCIS (Netherlands)

    Burscher, B.; Vliegenthart, R.; de Vreese, C.H.

    2015-01-01

    Content analysis of political communication usually covers large amounts of material and makes the study of dynamics in issue salience a costly enterprise. In this article, we present a supervised machine learning approach for the automatic coding of policy issues, which we apply to news articles
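
    For orientation, a supervised pipeline of this kind typically maps article text to a bag-of-words representation and trains a classifier per policy issue. The sketch below is a generic scikit-learn baseline, not the authors' feature set or data; the snippets and labels are invented for illustration.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.svm import LinearSVC
        from sklearn.pipeline import make_pipeline
        from sklearn.model_selection import cross_val_score

        # Hypothetical labelled snippets; real training sets contain thousands of hand-coded articles
        texts = ["parliament debates a new healthcare budget",
                 "ministers clash over immigration quotas",
                 "hospital waiting lists keep growing",
                 "border controls tightened after the summit"]
        labels = ["health", "migration", "health", "migration"]

        clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
        print(cross_val_score(clf, texts, labels, cv=2))  # within-context performance estimate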

  16. A Software Tutorial for Learning the Nemeth Code of Braille Mathematics.

    Science.gov (United States)

    Kapperman, Gaylen; Sticken, Jodi

    2002-01-01

    This article describes a software tutorial that can be used by people who are blind to learn the Nemeth Code of Braille mathematics notation. The program was designed for use with the Braille Lite, a note taker that has speech and a refreshable Braille display. The tutorial has 18 lessons. (CR)

  17. Assessment of Programming Language Learning Based on Peer Code Review Model: Implementation and Experience Report

    Science.gov (United States)

    Wang, Yanqing; Li, Hang; Feng, Yuqiang; Jiang, Yu; Liu, Ying

    2012-01-01

    The traditional assessment approach, in which one single written examination counts toward a student's total score, no longer meets new demands of programming language education. Based on a peer code review process model, we developed an online assessment system called "EduPCR" and used a novel approach to assess the learning of computer…

  18. Machine-learning-assisted correction of correlated qubit errors in a topological code

    Directory of Open Access Journals (Sweden)

    Paul Baireuther

    2018-01-01

    Full Text Available A fault-tolerant quantum computation requires an efficient means to detect and correct errors that accumulate in encoded quantum information. In the context of machine learning, neural networks are a promising new approach to quantum error correction. Here we show that a recurrent neural network can be trained, using only experimentally accessible data, to detect errors in a widely used topological code, the surface code, with a performance above that of the established minimum-weight perfect matching (or blossom) decoder. The performance gain is achieved because the neural network decoder can detect correlations between bit-flip (X) and phase-flip (Z) errors. The machine learning algorithm adapts to the physical system, hence no noise model is needed. The long short-term memory layers of the recurrent neural network maintain their performance over a large number of quantum error correction cycles, making it a practical decoder for forthcoming experimental realizations of the surface code.
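
    To make the recurrent-decoder idea concrete, the toy sketch below (PyTorch) feeds a sequence of syndrome measurements, one binary vector per error-correction cycle, through an LSTM and outputs a logit for whether a logical error occurred. All sizes and the random data are placeholders; this is not the network or training setup from the paper.

        import torch
        import torch.nn as nn

        n_cycles, n_syndromes = 8, 16      # arbitrary toy dimensions

        class SyndromeLSTM(nn.Module):
            def __init__(self, hidden=32):
                super().__init__()
                self.lstm = nn.LSTM(n_syndromes, hidden, batch_first=True)
                self.head = nn.Linear(hidden, 1)
            def forward(self, x):                    # x: (batch, n_cycles, n_syndromes)
                _, (h, _) = self.lstm(x)
                return self.head(h[-1]).squeeze(-1)  # logit for "logical error occurred"

        model = SyndromeLSTM()
        x = torch.randint(0, 2, (4, n_cycles, n_syndromes)).float()  # fake syndrome data
        y = torch.tensor([0., 1., 0., 1.])                           # fake logical-error labels
        loss = nn.BCEWithLogitsLoss()(model(x), y)
        loss.backward()                               # an optimizer step would follow in training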

  19. Prefrontal Goal Codes Emerge as Latent States in Probabilistic Value Learning.

    Science.gov (United States)

    Stoianov, Ivilin; Genovesio, Aldo; Pezzulo, Giovanni

    2016-01-01

    The prefrontal cortex (PFC) supports goal-directed actions and exerts cognitive control over behavior, but the underlying coding and mechanism are heavily debated. We present evidence for the role of goal coding in PFC from two converging perspectives: computational modeling and neuronal-level analysis of monkey data. We show that neural representations of prospective goals emerge by combining a categorization process that extracts relevant behavioral abstractions from the input data and a reward-driven process that selects candidate categories depending on their adaptive value; both forms of learning have a plausible neural implementation in PFC. Our analyses demonstrate a fundamental principle: goal coding represents an efficient solution to cognitive control problems, analogous to efficient coding principles in other (e.g., visual) brain areas. The novel analytical-computational approach is of general interest because it applies to a variety of neurophysiological studies.

  20. Unsupervised Transfer Learning via Multi-Scale Convolutional Sparse Coding for Biomedical Applications.

    Science.gov (United States)

    Chang, Hang; Han, Ju; Zhong, Cheng; Snijders, Antoine M; Mao, Jian-Hua

    2018-05-01

    The capabilities of (I) learning transferable knowledge across domains and (II) fine-tuning the pre-learned base knowledge towards tasks with considerably smaller data scale are extremely important. Many of the existing transfer learning techniques are supervised approaches, among which deep learning has demonstrated the power of learning domain-transferable knowledge with large-scale networks trained on massive amounts of labeled data. However, in many biomedical tasks, both the data and the corresponding labels can be very limited, so an unsupervised transfer learning capability is urgently needed. In this paper, we propose a novel multi-scale convolutional sparse coding (MSCSC) method that (I) automatically learns filter banks at different scales in a joint fashion with enforced scale-specificity of learned patterns; and (II) provides an unsupervised solution for learning transferable base knowledge and fine-tuning it towards target tasks. Extensive experimental evaluation demonstrates the effectiveness of MSCSC in both regular and transfer learning tasks in various biomedical domains.

  1. Stitching Codeable Circuits: High School Students' Learning About Circuitry and Coding with Electronic Textiles

    Science.gov (United States)

    Litts, Breanne K.; Kafai, Yasmin B.; Lui, Debora A.; Walker, Justice T.; Widman, Sari A.

    2017-10-01

    Learning about circuitry by connecting a battery, light bulb, and wires is a common activity in many science classrooms. In this paper, we expand students' learning about circuitry with electronic textiles, which use conductive thread instead of wires and sewable LEDs instead of lightbulbs, by integrating the programming of sensor inputs and light outputs and examining how the two domains interact. We implemented an electronic textiles unit with 23 high-school students, aged 16-17 years, who learned how to craft and code circuits with the LilyPad Arduino, an electronic textile construction kit. Our analyses not only confirm significant increases in students' understanding of functional circuits but also showcase students' ability to design and remix program code for controlling circuits. In our discussion, we address opportunities and challenges of introducing codeable circuit design for integrating maker activities that include engineering and computing into classrooms.

  2. Software Quality and Security in Teachers' and Students' Codes When Learning a New Programming Language

    Directory of Open Access Journals (Sweden)

    Arnon Hershkovitz

    2015-09-01

    Full Text Available In recent years, schools (as well as universities) have added cyber security to their computer science curricula. This topic is still new for most of the current teachers, who would normally have a standard computer science background; therefore, the teachers are trained and then teach their students what they have just learned. In order to explore differences in both populations' learning, we compared measures of software quality and security between high-school teachers and students. We collected 109 source files, written in Python by 18 teachers and 31 students, and engineered 32 features, based on common standards for software quality (PEP 8) and security (derived from the CERT Secure Coding Standards). We use a multi-view, data-driven approach, by (a) using hierarchical clustering to bottom-up partition the population into groups based on their code-related features and (b) building a decision tree model that predicts whether a student or a teacher wrote a given code (resulting in a LOOCV kappa of 0.751). Overall, our findings suggest that the teachers' codes are of better quality than the students' – with a sub-group of the teachers, mostly males, demonstrating better coding than their peers and the students – and that the students' codes are slightly better secured than the teachers' codes (although both populations show very low security levels). The findings imply that teachers might benefit from their prior knowledge and experience, but also emphasize the lack of continuous involvement of some of the teachers with code-writing. Therefore, the findings shed light on computer science teachers as lifelong learners. Findings also highlight the difference between quality and security in today's programming paradigms. Implications of these findings are discussed.
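
    A minimal sketch of the feature-engineering-plus-decision-tree idea is given below. The handful of PEP 8- and security-inspired features, the toy sources and the labels are all invented for illustration and are far cruder than the 32 features used in the study.

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier

        def style_features(source):
            """A few crude quality/security indicators for a Python source file (illustrative only)."""
            lines = source.splitlines() or [""]
            return [
                sum(len(l) > 79 for l in lines) / len(lines),                 # overly long lines (PEP 8)
                sum(l.strip().startswith("#") for l in lines) / len(lines),   # comment density
                sum("\t" in l for l in lines) / len(lines),                   # tabs mixed into indentation
                source.count("eval(") + source.count("exec("),                # risky calls (security proxy)
            ]

        # Hypothetical corpus: 1 = written by a teacher, 0 = written by a student
        sources = ["x = 1\n# add one\ny = x + 1\n", "eval('2+2')\n\tz=3\n"]
        X = np.array([style_features(s) for s in sources])
        y = np.array([1, 0])
        clf = DecisionTreeClassifier(max_depth=3).fit(X, y)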

  3. Predictive coding accelerates word recognition and learning in the early stages of language development.

    Science.gov (United States)

    Ylinen, Sari; Bosseler, Alexis; Junttila, Katja; Huotilainen, Minna

    2017-11-01

    The ability to predict future events in the environment and learn from them is a fundamental component of adaptive behavior across species. Here we propose that inferring predictions facilitates speech processing and word learning in the early stages of language development. Twelve- and 24-month olds' electrophysiological brain responses to heard syllables are faster and more robust when the preceding word context predicts the ending of a familiar word. For unfamiliar, novel word forms, however, word-expectancy violation generates a prediction error response, the strength of which significantly correlates with children's vocabulary scores at 12 months. These results suggest that predictive coding may accelerate word recognition and support early learning of novel words, including not only the learning of heard word forms but also their mapping to meanings. Prediction error may mediate learning via attention, since infants' attention allocation to the entire learning situation in natural environments could account for the link between prediction error and the understanding of word meanings. On the whole, the present results on predictive coding support the view that principles of brain function reported across domains in humans and non-human animals apply to language and its development in the infant brain. A video abstract of this article can be viewed at: http://hy.fi/unitube/video/e1cbb495-41d8-462e-8660-0864a1abd02c. [Correction added on 27 January 2017, after first online publication: The video abstract link was added.]. © 2016 John Wiley & Sons Ltd.

  4. Imitation learning based on an intrinsic motivation mechanism for efficient coding.

    Science.gov (United States)

    Triesch, Jochen

    2013-01-01

    A hypothesis regarding the development of imitation learning is presented that is rooted in intrinsic motivations. It is derived from a recently proposed form of intrinsically motivated learning (IML) for efficient coding in active perception, wherein an agent learns to perform actions with its sense organs to facilitate efficient encoding of the sensory data. To this end, actions of the sense organs that improve the encoding of the sensory data trigger an internally generated reinforcement signal. Here it is argued that the same IML mechanism might also support the development of imitation when general actions beyond those of the sense organs are considered: The learner first observes a tutor performing a behavior and learns a model of the behavior's sensory consequences. The learner then acts itself and receives an internally generated reinforcement signal reflecting how well the sensory consequences of its own behavior are encoded by the sensory model. Actions that are more similar to those of the tutor will lead to sensory signals that are easier to encode and produce a higher reinforcement signal. Through this, the learner's behavior is progressively tuned to make the sensory consequences of its actions match the learned sensory model. I discuss this mechanism in the context of human language acquisition and bird song learning, where similar ideas have been proposed. The suggested mechanism also offers an account for the development of mirror neurons and makes a number of predictions. Overall, it establishes a connection between principles of efficient coding, intrinsic motivations and imitation.

  5. Imitation Learning Based on an Intrinsic Motivation Mechanism for Efficient Coding

    Directory of Open Access Journals (Sweden)

    Jochen eTriesch

    2013-11-01

    Full Text Available A hypothesis regarding the development of imitation learning is presented that is rooted in intrinsic motivations. It is derived from a recently proposed form of intrinsically motivated learning (IML) for efficient coding in active perception, wherein an agent learns to perform actions with its sense organs to facilitate efficient encoding of the sensory data. To this end, actions of the sense organs that improve the encoding of the sensory data trigger an internally generated reinforcement signal. Here it is argued that the same IML mechanism might also support the development of imitation when general actions beyond those of the sense organs are considered: The learner first observes a tutor performing a behavior and learns a model of the behavior's sensory consequences. The learner then acts itself and receives an internally generated reinforcement signal reflecting how well the sensory consequences of its own behavior are encoded by the sensory model. Actions that are more similar to those of the tutor will lead to sensory signals that are easier to encode and produce a higher reinforcement signal. Through this, the learner's behavior is progressively tuned to make the sensory consequences of its actions match the learned sensory model. I discuss this mechanism in the context of human language acquisition and bird song learning, where similar ideas have been proposed. The suggested mechanism also offers an account for the development of mirror neurons and makes a number of predictions. Overall, it establishes a connection between principles of efficient coding, intrinsic motivations and imitation.

  6. Cross-domain expression recognition based on sparse coding and transfer learning

    Science.gov (United States)

    Yang, Yong; Zhang, Weiyi; Huang, Yong

    2017-05-01

    Traditional facial expression recognition methods usually assume that the training set and the test set are independent and identically distributed. However, in actual expression recognition applications, the conditions of independent and identical distribution are hardly satisfied for the training set and test set because of differences in lighting, shade, race and so on. In order to solve this problem and improve the performance of expression recognition in practical applications, a novel method based on transfer learning and sparse coding is applied to facial expression recognition. First, a common primitive model, that is, a dictionary, is learned. Then, based on the idea of transfer learning, the learned primitive pattern is transferred to facial expression and the corresponding feature representation is obtained by sparse coding. The experimental results on the CK+, JAFFE and NVIE databases show that the transfer-learning-based sparse coding method can effectively improve the expression recognition rate in the cross-domain expression recognition task and is suitable for practical facial expression recognition applications.

  7. Accuracy comparison among different machine learning techniques for detecting malicious codes

    Science.gov (United States)

    Narang, Komal

    2016-03-01

    In this paper, a machine-learning-based model for malware detection is proposed. It can detect newly released malware, i.e. zero-day attacks, by analyzing operation codes on the Android operating system. The accuracy of Naïve Bayes, Support Vector Machine (SVM) and Neural Network classifiers for detecting malicious code has been compared for the proposed model. In the experiment, 400 benign files, 100 system files and 500 malicious files have been used to construct the model. The model yields its best accuracy, 88.9%, when a neural network is used as the classifier, achieving a sensitivity of 95% and a specificity of 82.8%.
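
    The comparison described here can be reproduced in outline with scikit-learn. The block below trains the three classifier families on synthetic opcode-frequency features and reports accuracy, sensitivity and specificity; the data are random placeholders, so the numbers will not match the paper's.

        import numpy as np
        from sklearn.naive_bayes import GaussianNB
        from sklearn.svm import SVC
        from sklearn.neural_network import MLPClassifier
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import confusion_matrix

        # Hypothetical opcode-frequency features: rows = files, columns = opcode counts
        rng = np.random.default_rng(1)
        X = rng.poisson(3, size=(1000, 50)).astype(float)
        y = rng.integers(0, 2, size=1000)            # 1 = malicious, 0 = benign (synthetic labels)
        Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=1)

        for name, clf in [("Naive Bayes", GaussianNB()),
                          ("SVM", SVC()),
                          ("Neural network", MLPClassifier(max_iter=500))]:
            pred = clf.fit(Xtr, ytr).predict(Xte)
            tn, fp, fn, tp = confusion_matrix(yte, pred).ravel()
            sensitivity, specificity = tp / (tp + fn), tn / (tn + fp)
            print(name, (tp + tn) / len(yte), sensitivity, specificity)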

  8. Paired associate learning of Morse code and Braille letter names by dyslexic and normal children.

    Science.gov (United States)

    Rudel, R G; Denckla, M B; Spalten, E

    1976-03-01

    Twenty dyslexic and twenty normal children, matched for age and sex and with the same mean I.Q., were tested on their ability to learn letter names for Braille configurations presented visually or tactually and for Morse Code signals presented aurally. The dyslexic Ss learned fewer letters in all three modalities, although for both groups the visual-verbal method was easiest. The deficits were not attributable to specific modality dysfunction nor to a failure of intersensory integration. More general encoding and retrieval difficulties appear to be implicated.

  9. Neural coding of basic reward terms of animal learning theory, game theory, microeconomics and behavioural ecology.

    Science.gov (United States)

    Schultz, Wolfram

    2004-04-01

    Neurons in a small number of brain structures detect rewards and reward-predicting stimuli and are active during the expectation of predictable food and liquid rewards. These neurons code the reward information according to basic terms of various behavioural theories that seek to explain reward-directed learning, approach behaviour and decision-making. The involved brain structures include groups of dopamine neurons, the striatum including the nucleus accumbens, the orbitofrontal cortex and the amygdala. The reward information is fed to brain structures involved in decision-making and organisation of behaviour, such as the dorsolateral prefrontal cortex and possibly the parietal cortex. The neural coding of basic reward terms derived from formal theories puts the neurophysiological investigation of reward mechanisms on firm conceptual grounds and provides neural correlates for the function of rewards in learning, approach behaviour and decision-making.

  10. A new method for species identification via protein-coding and non-coding DNA barcodes by combining machine learning with bioinformatic methods.

    Directory of Open Access Journals (Sweden)

    Ai-bing Zhang

    Full Text Available Species identification via DNA barcodes is contributing greatly to current bioinventory efforts. The initial, and widely accepted, proposal was to use the protein-coding cytochrome c oxidase subunit I (COI) region as the standard barcode for animals, but recently non-coding internal transcribed spacer (ITS) genes have been proposed as candidate barcodes for both animals and plants. However, achieving a robust alignment for non-coding regions can be problematic. Here we propose two new methods (DV-RBF and FJ-RBF) to address this issue for species assignment by both coding and non-coding sequences that take advantage of the power of machine learning and bioinformatics. We demonstrate the value of the new methods with four empirical datasets, two representing typical protein-coding COI barcode datasets (neotropical bats and marine fish) and two representing non-coding ITS barcodes (rust fungi and brown algae). Using two random sub-sampling approaches, we demonstrate that the new methods significantly outperformed existing Neighbor-joining (NJ) and Maximum likelihood (ML) methods for both coding and non-coding barcodes when there was complete species coverage in the reference dataset. The new methods also outperformed NJ and ML methods for non-coding sequences in circumstances of potentially incomplete species coverage, although then the NJ and ML methods performed slightly better than the new methods for protein-coding barcodes. A 100% success rate of species identification was achieved with the two new methods for 4,122 bat queries and 5,134 fish queries using COI barcodes, with 95% confidence intervals (CI) of 99.75-100%. The new methods also obtained a 96.29% success rate (95% CI: 91.62-98.40%) for 484 rust fungi queries and a 98.50% success rate (95% CI: 96.60-99.37%) for 1,094 brown algae queries, both using ITS barcodes.
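
    The methods in this record (DV-RBF and FJ-RBF) combine sequence features with RBF-based machine learning; the details are in the paper. As a generic, alignment-free illustration of the same species-assignment idea, the sketch below builds k-mer frequency profiles and trains an RBF-kernel SVM on a handful of invented reference barcodes.

        from itertools import product
        import numpy as np
        from sklearn.svm import SVC

        def kmer_profile(seq, k=3):
            """Normalised frequency vector over all 4**k DNA k-mers (alignment-free,
            so it applies to non-coding ITS barcodes as well as COI)."""
            kmers = ["".join(p) for p in product("ACGT", repeat=k)]
            index = {m: i for i, m in enumerate(kmers)}
            counts = np.zeros(len(kmers))
            for i in range(len(seq) - k + 1):
                mer = seq[i:i + k]
                if mer in index:
                    counts[index[mer]] += 1
            return counts / max(counts.sum(), 1.0)

        # Invented reference barcodes with known species labels
        refs = ["ACGTACGTGGCA", "ACGTACGTGGCC", "TTTTGGGGCCCC", "TTTTGGGGCCCA"]
        labels = ["species_A", "species_A", "species_B", "species_B"]
        clf = SVC(kernel="rbf", gamma="scale").fit([kmer_profile(s) for s in refs], labels)
        print(clf.predict([kmer_profile("ACGTACGTGGCA")]))  # query assigned to species_A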

  11. Teaching the computer to code frames in news: comparing two supervised machine learning approaches to frame analysis

    NARCIS (Netherlands)

    Burscher, B.; Odijk, D.; Vliegenthart, R.; de Rijke, M.; de Vreese, C.H.

    2014-01-01

    We explore the application of supervised machine learning (SML) to frame coding. By automating the coding of frames in news, SML facilitates the incorporation of large-scale content analysis into framing research, even if financial resources are scarce. This furthers a more integrated investigation

  12. A computational study on altered theta-gamma coupling during learning and phase coding.

    Directory of Open Access Journals (Sweden)

    Xuejuan Zhang

    Full Text Available There is considerable interest in the role of coupling between theta and gamma oscillations in the brain in the context of learning and memory. Here we have used a neural network model capable of producing coupling of theta phase to gamma amplitude, firstly to explore its ability to reproduce reported learning changes and secondly to examine memory-span and phase-coding effects. The spiking neural network incorporates two kinetically different GABA(A) receptor-mediated currents to generate both theta and gamma rhythms, and we have found that by selective alteration of both NMDA receptors and GABA(A,slow) receptors it can reproduce learning-related changes in the strength of coupling between theta and gamma either with or without coincident changes in theta amplitude. When the model was used to explore the relationship between theta and gamma oscillations, working memory capacity and phase coding, it showed that the potential storage capacity of short-term memories, in terms of nested gamma subcycles, coincides with the maximal theta power. Increasing theta power is also related to the precision of theta phase, which functions as a potential timing clock for neuronal firing in the cortex or hippocampus.
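
    Theta-gamma coupling of the kind simulated here is usually quantified with a phase-amplitude modulation index. The sketch below computes a Tort-style, Kullback-Leibler-based index on a synthetic coupled signal; the frequency bands, filter order and test signal are placeholder choices, not the model's parameters.

        import numpy as np
        from scipy.signal import butter, sosfiltfilt, hilbert

        def modulation_index(signal, fs, theta=(4, 8), gamma=(30, 80), n_bins=18):
            """KL-based index of theta-phase / gamma-amplitude coupling on a 1-D signal."""
            def band(x, lo, hi):
                sos = butter(4, [lo, hi], btype="band", fs=fs, output="sos")
                return sosfiltfilt(sos, x)
            phase = np.angle(hilbert(band(signal, *theta)))
            amp = np.abs(hilbert(band(signal, *gamma)))
            edges = np.linspace(-np.pi, np.pi, n_bins + 1)
            bins = np.clip(np.digitize(phase, edges) - 1, 0, n_bins - 1)
            mean_amp = np.array([amp[bins == i].mean() for i in range(n_bins)])
            p = mean_amp / mean_amp.sum()
            return (np.log(n_bins) + np.sum(p * np.log(p))) / np.log(n_bins)

        fs = 1000.0
        t = np.arange(0, 5, 1 / fs)
        theta_wave = np.sin(2 * np.pi * 6 * t)
        coupled = theta_wave + (1 + theta_wave) * 0.3 * np.sin(2 * np.pi * 50 * t)
        print(modulation_index(coupled, fs))   # larger values indicate stronger coupling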

  13. Code to Learn: Where Does It Belong in the K-12 Curriculum?

    Directory of Open Access Journals (Sweden)

    Jesús Moreno León

    2016-06-01

    Full Text Available The introduction of computer programming in K-12 has become mainstream in recent years, as countries around the world are making coding part of their curriculum. Nevertheless, there is a lack of empirical studies that investigate how learning to program at an early age affects other school subjects. In this regard, this paper compares three quasi-experimental research designs conducted in three different schools (n=129 students from 2nd and 6th grade), in order to assess the impact of introducing programming with Scratch at different stages and in several subjects. While both 6th grade experimental groups working with coding activities showed a statistically significant improvement in terms of academic performance, this was not the case in the 2nd grade classroom. Notable disparity was also found regarding the subject in which the programming activities were included, as in social studies the effect size was double that in mathematics.

  14. QR Codes as Mobile Learning Tools for Labor Room Nurses at the San Pablo Colleges Medical Center

    Science.gov (United States)

    Del Rosario-Raymundo, Maria Rowena

    2017-01-01

    Purpose: The purpose of this paper is to explore the use of QR codes as mobile learning tools and examine factors that impact on their usefulness, acceptability and feasibility in assisting the nurses' learning. Design/Methodology/Approach: Study participants consisted of 14 regular, full-time, board-certified LR nurses. Over a two-week period,…

  15. Learning about Probability from Text and Tables: Do Color Coding and Labeling through an Interactive-User Interface Help?

    Science.gov (United States)

    Clinton, Virginia; Morsanyi, Kinga; Alibali, Martha W.; Nathan, Mitchell J.

    2016-01-01

    Learning from visual representations is enhanced when learners appropriately integrate corresponding visual and verbal information. This study examined the effects of two methods of promoting integration, color coding and labeling, on learning about probabilistic reasoning from a table and text. Undergraduate students (N = 98) were randomly…

  16. [Transposition errors during learning to reproduce a sequence by the right- and the left-hand movements: simulation of positional and movement coding].

    Science.gov (United States)

    Liakhovetskiĭ, V A; Bobrova, E V; Skopin, G N

    2012-01-01

    Transposition errors during the reproduction of a hand movement sequence make it possible to obtain important information on the internal representation of this sequence in the motor working memory. Analysis of such errors showed that learning to reproduce sequences of left-hand movements improves the system of positional coding (coding of positions), while learning of right-hand movements improves the system of vector coding (coding of movements). Learning of right-hand movements after left-hand performance involved the system of positional coding "imposed" by the left hand. Learning of left-hand movements after right-hand performance activated the system of vector coding. Transposition errors during learning to reproduce movement sequences can be explained by a neural network using either vector coding or both vector and positional coding.

  17. Segmentation of MR images via discriminative dictionary learning and sparse coding: application to hippocampus labeling.

    Science.gov (United States)

    Tong, Tong; Wolz, Robin; Coupé, Pierrick; Hajnal, Joseph V; Rueckert, Daniel

    2013-08-01

    We propose a novel method for the automatic segmentation of brain MR images using discriminative dictionary learning and sparse coding techniques. In the proposed method, dictionaries and classifiers are learned simultaneously from a set of brain atlases, which can then be used for the reconstruction and segmentation of an unseen target image. The proposed segmentation strategy is based on image reconstruction, which is in contrast to most existing atlas-based labeling approaches that rely on comparing image similarities between atlases and target images. In addition, we propose a Fixed Discriminative Dictionary Learning for Segmentation (F-DDLS) strategy, which can learn dictionaries offline and perform segmentations online, enabling a significant speed-up in the segmentation stage. The proposed method has been evaluated for the hippocampus segmentation of 80 healthy ICBM subjects and 202 ADNI images. The robustness of the proposed method, especially of our F-DDLS strategy, was validated by training and testing on different subject groups in the ADNI database. The influence of different parameters was studied and the performance of the proposed method was also compared with that of the nonlocal patch-based approach. The proposed method achieved a median Dice coefficient of 0.879 on 202 ADNI images and 0.890 on 80 ICBM subjects, which is competitive compared with state-of-the-art methods. Copyright © 2013 Elsevier Inc. All rights reserved.
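
    The Dice coefficient used as the evaluation measure in this record is straightforward to compute from two binary masks; a minimal sketch on toy masks follows (the masks are invented, merely standing in for a predicted and a reference hippocampus label).

        import numpy as np

        def dice(seg, ref):
            """Dice overlap between a predicted and a reference binary segmentation mask."""
            seg, ref = np.asarray(seg, bool), np.asarray(ref, bool)
            denom = seg.sum() + ref.sum()
            return 2.0 * np.logical_and(seg, ref).sum() / denom if denom else 1.0

        # Toy 2D masks standing in for a segmentation result and its reference label
        a = np.zeros((10, 10), int); a[2:7, 2:7] = 1
        b = np.zeros((10, 10), int); b[3:8, 3:8] = 1
        print(dice(a, b))   # 0.64 for these toy masks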

  18. Improvement of Self-regulated Learning in Mathematics through a Hypermedia Application: Differences based on Academic Performance and Previous Knowledge.

    Science.gov (United States)

    Cueli, Marisol; Rodríguez, Celestino; Areces, Débora; García, Trinidad; González-Castro, Paloma

    2017-12-04

    Self-regulation on behalf of the student is crucial in learning Mathematics through hypermedia applications and is an even greater challenge in these IT environments. Two aims are formulated. First, to analyze the effectiveness of a hypermedia tool in improving perceived knowledge of self-regulatory strategies and the perceived usage of the planning, executing and assessment strategy on behalf of students with low, medium and high levels of academic performance. Second, to analyze the effectiveness of the hypermedia tool in improving perceived usage of the strategy for planning, monitoring and evaluating on behalf of students with a perceived knowledge (low, medium and high). Participants were 624 students (aged 10-13), classified into a treatment group (TG; 391) and a comparative group (CG; 233). They completed a questionnaire on perceived knowledge (Perceived Knowledge of Self-Regulatory Strategies) and another one on perceived usage of the strategy for planning, performing and evaluating (Inventory of Self-regulatory Learning Processes). Univariate covariance analyses (ANCOVAs) and Student-t tests were used. ANCOVA results were not statistically significant. However, the linear contrast indicated a significant improvement in perceived knowledge of strategies among the TG with low, medium and high academic performance (p ≤ .001). Results are discussed in the light of past and future research.

  19. Uncertainty Quantification and Learning in Geophysical Modeling: How Information is Coded into Dynamical Models

    Science.gov (United States)

    Gupta, H. V.

    2014-12-01

    There is a clear need for comprehensive quantification of simulation uncertainty when using geophysical models to support and inform decision-making. Further, it is clear that the nature of such uncertainty depends on the quality of information in (a) the forcing data (driver information), (b) the model code (prior information), and (c) the specific values of inferred model components that localize the model to the system of interest (inferred information). Of course, the relative quality of each varies with geophysical discipline and specific application. In this talk I will discuss a structured approach to characterizing how 'Information', and hence 'Uncertainty', is coded into the structures of physics-based geophysical models. I propose that a better understanding of what is meant by "Information", and how it is embodied in models and data, can offer a structured (less ad-hoc), robust and insightful basis for diagnostic learning through the model-data juxtaposition. In some fields, a natural consequence may be to emphasize the a priori role of System Architecture (Process Modeling) over that of the selection of System Parameterization, thereby emphasizing the more creative aspect of scientific investigation - the use of models for Discovery and Learning.

  20. Analysis of image content recognition algorithm based on sparse coding and machine learning

    Science.gov (United States)

    Xiao, Yu

    2017-03-01

    This paper presents an image classification algorithm based on a spatial sparse coding model and a random forest. First, SIFT features are extracted from the image; sparse coding is then used to generate a visual vocabulary from the SIFT features and to encode each feature as a sparse vector; by combining regional pooling with the spatial sparse vectors, a fixed-dimension sparse vector representing the whole image is obtained; finally, a random forest classifier is trained and tested on these image sparse vectors, using the standard Caltech-101 and Scene-15 data sets. The experimental results show that the proposed algorithm can effectively represent the features of the image and improve the classification accuracy. In addition, we propose an image recognition algorithm based on image segmentation, sparse coding and multi-instance learning. This algorithm treats the image as a multi-instance bag whose instances are SIFT-based sparse feature transformations of image regions; the sparse coding model generates a visual vocabulary that serves as the feature space, onto which each bag is mapped through statistics over its instances; a 1-norm SVM is then used to classify images and to generate sample weights that select important image features.
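
    An outline of the first pipeline (local descriptors, a sparse-coding visual vocabulary, pooling into a fixed-length image vector, and a random forest) is sketched below. To stay self-contained it replaces SIFT with raw grid patches and uses random images, so it illustrates the structure of the method rather than its reported accuracy.

        import numpy as np
        from sklearn.decomposition import MiniBatchDictionaryLearning, sparse_encode
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)

        def descriptors(image, patch=8):
            """Stand-in for SIFT: raw vectorised patches sampled on a grid."""
            h, w = image.shape
            return np.array([image[i:i + patch, j:j + patch].ravel()
                             for i in range(0, h - patch, patch)
                             for j in range(0, w - patch, patch)])

        images = [rng.standard_normal((32, 32)) for _ in range(20)]   # toy "dataset"
        labels = rng.integers(0, 2, size=20)                          # toy class labels

        # Visual vocabulary: a dictionary learned over all local descriptors
        all_desc = np.vstack([descriptors(im) for im in images])
        vocab = MiniBatchDictionaryLearning(n_components=32, random_state=0).fit(all_desc)

        def image_vector(im):
            codes = sparse_encode(descriptors(im), vocab.components_,
                                  algorithm="omp", n_nonzero_coefs=3)
            return np.abs(codes).max(axis=0)          # max-pooling over the image region

        X = np.array([image_vector(im) for im in images])
        clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)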

  1. Single-image super-resolution reconstruction via learned geometric dictionaries and clustered sparse coding.

    Science.gov (United States)

    Yang, Shuyuan; Wang, Min; Chen, Yiguang; Sun, Yaxin

    2012-09-01

    Recently, single image super-resolution reconstruction (SISR) via sparse coding has attracted increasing interest. In this paper, we proposed a multiple-geometric-dictionaries-based clustered sparse coding scheme for SISR. Firstly, a large number of high-resolution (HR) image patches are randomly extracted from a set of example training images and clustered into several groups of "geometric patches," from which the corresponding "geometric dictionaries" are learned to further sparsely code each local patch in a low-resolution image. A clustering aggregation is performed on the HR patches recovered by different dictionaries, followed by a subsequent patch aggregation to estimate the HR image. Considering that there are often many repetitive image structures in an image, we add a self-similarity constraint on the recovered image in patch aggregation to reveal new features and details. Finally, the HR residual image is estimated by the proposed recovery method and compensated to better preserve the subtle details of the images. Some experiments test the proposed method on natural images, and the results show that the proposed method outperforms its counterparts in both visual fidelity and numerical measures.

  2. Learning dictionaries of sparse codes of 3D movements of body joints for real-time human activity understanding.

    Science.gov (United States)

    Qi, Jin; Yang, Zhiyong

    2014-01-01

    Real-time human activity recognition is essential for human-robot interactions for assisted healthy independent living. Most previous work in this area is performed on traditional two-dimensional (2D) videos, and both global and local methods have been used. Since 2D videos are sensitive to changes in lighting condition, view angle, and scale, researchers have begun to explore applications of 3D information to human activity understanding in recent years. Unfortunately, features that work well on 2D videos usually don't perform well on 3D videos, and there is no consensus on what 3D features should be used. Here we propose a model of human activity recognition based on 3D movements of body joints. Our method has three steps: learning dictionaries of sparse codes of 3D movements of joints, sparse coding, and classification. In the first step, space-time volumes of 3D movements of body joints are obtained via dense sampling, and independent component analysis is then performed to construct a dictionary of sparse codes for each activity. In the second step, the space-time volumes are projected onto the dictionaries and a set of sparse histograms of the projection coefficients are constructed as feature representations of the activities. Finally, the sparse histograms are used as inputs to a support vector machine to recognize human activities. We tested this model on three databases of human activities and found that it outperforms the state-of-the-art algorithms. Thus, this model can be used for real-time human activity recognition in many applications.

  3. Learning dictionaries of sparse codes of 3D movements of body joints for real-time human activity understanding.

    Directory of Open Access Journals (Sweden)

    Jin Qi

    Full Text Available Real-time human activity recognition is essential for human-robot interactions for assisted healthy independent living. Most previous work in this area is performed on traditional two-dimensional (2D) videos, and both global and local methods have been used. Since 2D videos are sensitive to changes in lighting condition, view angle, and scale, researchers have begun to explore applications of 3D information to human activity understanding in recent years. Unfortunately, features that work well on 2D videos usually don't perform well on 3D videos, and there is no consensus on what 3D features should be used. Here we propose a model of human activity recognition based on 3D movements of body joints. Our method has three steps: learning dictionaries of sparse codes of 3D movements of joints, sparse coding, and classification. In the first step, space-time volumes of 3D movements of body joints are obtained via dense sampling, and independent component analysis is then performed to construct a dictionary of sparse codes for each activity. In the second step, the space-time volumes are projected onto the dictionaries and a set of sparse histograms of the projection coefficients are constructed as feature representations of the activities. Finally, the sparse histograms are used as inputs to a support vector machine to recognize human activities. We tested this model on three databases of human activities and found that it outperforms the state-of-the-art algorithms. Thus, this model can be used for real-time human activity recognition in many applications.

  4. Effects of Mode of Target Task Selection on Learning about Plants in a Mobile Learning Environment: Effortful Manual Selection versus Effortless QR-Code Selection

    Science.gov (United States)

    Gao, Yuan; Liu, Tzu-Chien; Paas, Fred

    2016-01-01

    This study compared the effects of effortless selection of target plants using quick respond (QR) code technology to effortful manual search and selection of target plants on learning about plants in a mobile device supported learning environment. In addition, it was investigated whether the effectiveness of the 2 selection methods was…

  5. Shared acoustic codes underlie emotional communication in music and speech-Evidence from deep transfer learning.

    Directory of Open Access Journals (Sweden)

    Eduardo Coutinho

    Full Text Available Music and speech exhibit striking similarities in the communication of emotions in the acoustic domain, in such a way that the communication of specific emotions is achieved, at least to a certain extent, by means of shared acoustic patterns. From an Affective Sciences point of view, determining the degree of overlap between both domains is fundamental to understanding the shared mechanisms underlying this phenomenon. From a Machine learning perspective, the overlap between acoustic codes for emotional expression in music and speech opens new possibilities to enlarge the amount of data available to develop music and speech emotion recognition systems. In this article, we investigate time-continuous predictions of emotion (Arousal and Valence) in music and speech, and the Transfer Learning between these domains. We establish a comparative framework including intra-domain (i.e., models trained and tested on the same modality, either music or speech) and cross-domain experiments (i.e., models trained in one modality and tested on the other). In the cross-domain context, we evaluated two strategies: the direct transfer between domains, and the contribution of Transfer Learning techniques (feature-representation-transfer based on Denoising Auto Encoders) for reducing the gap in the feature space distributions. Our results demonstrate an excellent cross-domain generalisation performance with and without feature representation transfer in both directions. In the case of music, cross-domain approaches outperformed intra-domain models for Valence estimation, whereas for Speech intra-domain models achieved the best performance. This is the first demonstration of shared acoustic codes for emotional expression in music and speech in the time-continuous domain.

  6. Shared acoustic codes underlie emotional communication in music and speech-Evidence from deep transfer learning.

    Science.gov (United States)

    Coutinho, Eduardo; Schuller, Björn

    2017-01-01

    Music and speech exhibit striking similarities in the communication of emotions in the acoustic domain, in such a way that the communication of specific emotions is achieved, at least to a certain extent, by means of shared acoustic patterns. From an Affective Sciences point of view, determining the degree of overlap between both domains is fundamental to understanding the shared mechanisms underlying this phenomenon. From a Machine learning perspective, the overlap between acoustic codes for emotional expression in music and speech opens new possibilities to enlarge the amount of data available to develop music and speech emotion recognition systems. In this article, we investigate time-continuous predictions of emotion (Arousal and Valence) in music and speech, and the Transfer Learning between these domains. We establish a comparative framework including intra-domain (i.e., models trained and tested on the same modality, either music or speech) and cross-domain experiments (i.e., models trained in one modality and tested on the other). In the cross-domain context, we evaluated two strategies: the direct transfer between domains, and the contribution of Transfer Learning techniques (feature-representation-transfer based on Denoising Auto Encoders) for reducing the gap in the feature space distributions. Our results demonstrate an excellent cross-domain generalisation performance with and without feature representation transfer in both directions. In the case of music, cross-domain approaches outperformed intra-domain models for Valence estimation, whereas for Speech intra-domain models achieved the best performance. This is the first demonstration of shared acoustic codes for emotional expression in music and speech in the time-continuous domain.

  7. Shared acoustic codes underlie emotional communication in music and speech—Evidence from deep transfer learning

    Science.gov (United States)

    Schuller, Björn

    2017-01-01

    Music and speech exhibit striking similarities in the communication of emotions in the acoustic domain, in such a way that the communication of specific emotions is achieved, at least to a certain extent, by means of shared acoustic patterns. From an Affective Sciences point of view, determining the degree of overlap between both domains is fundamental to understanding the shared mechanisms underlying this phenomenon. From a Machine learning perspective, the overlap between acoustic codes for emotional expression in music and speech opens new possibilities to enlarge the amount of data available to develop music and speech emotion recognition systems. In this article, we investigate time-continuous predictions of emotion (Arousal and Valence) in music and speech, and the Transfer Learning between these domains. We establish a comparative framework including intra-domain (i.e., models trained and tested on the same modality, either music or speech) and cross-domain experiments (i.e., models trained in one modality and tested on the other). In the cross-domain context, we evaluated two strategies: the direct transfer between domains, and the contribution of Transfer Learning techniques (feature-representation-transfer based on Denoising Auto Encoders) for reducing the gap in the feature space distributions. Our results demonstrate an excellent cross-domain generalisation performance with and without feature representation transfer in both directions. In the case of music, cross-domain approaches outperformed intra-domain models for Valence estimation, whereas for Speech intra-domain models achieved the best performance. This is the first demonstration of shared acoustic codes for emotional expression in music and speech in the time-continuous domain. PMID:28658285
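
    The feature-representation-transfer strategy evaluated in these three records relies on denoising auto-encoders; a minimal PyTorch sketch of such a model is given below. The feature dimensionality, noise level and training loop are placeholders, and the real systems are trained on large music and speech corpora rather than random vectors.

        import torch
        import torch.nn as nn

        # A tiny denoising auto-encoder: it learns to reconstruct clean acoustic feature
        # vectors from corrupted ones; its encoder can then be reused on the other domain.
        dim, hidden = 40, 16                            # placeholder feature / code sizes
        encoder = nn.Sequential(nn.Linear(dim, hidden), nn.Tanh())
        decoder = nn.Sequential(nn.Linear(hidden, dim))
        opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)

        features = torch.randn(256, dim)                # stand-in for frame-level acoustic features
        for _ in range(100):
            noisy = features + 0.1 * torch.randn_like(features)
            loss = nn.functional.mse_loss(decoder(encoder(noisy)), features)
            opt.zero_grad(); loss.backward(); opt.step()

        # After training on one domain (say, music), encoder(x) gives the shared
        # representation onto which features from the other domain can also be projected.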

  8. Impact of School Uniforms on Student Discipline and the Learning Climate: A Comparative Case Study of Two Middle Schools with Uniform Dress Codes and Two Middle Schools without Uniform Dress Codes

    Science.gov (United States)

    Dulin, Charles Dewitt

    2016-01-01

    The purpose of this research is to evaluate the impact of uniform dress codes on a school's climate for student behavior and learning in four middle schools in North Carolina. The research will compare the perceptions of parents, teachers, and administrators in schools with uniform dress codes against schools without uniform dress codes. This…

  9. Artificial Intelligence Learning Semantics via External Resources for Classifying Diagnosis Codes in Discharge Notes.

    Science.gov (United States)

    Lin, Chin; Hsu, Chia-Jung; Lou, Yu-Sheng; Yeh, Shih-Jen; Lee, Chia-Cheng; Su, Sui-Lung; Chen, Hsiang-Cheng

    2017-11-06

    Automated disease code classification using free-text medical information is important for public health surveillance. However, traditional natural language processing (NLP) pipelines are limited, so we propose a method combining word embedding with a convolutional neural network (CNN). Our objective was to compare the performance of traditional pipelines (NLP plus supervised machine learning models) with that of word embedding combined with a CNN in conducting a classification task identifying International Classification of Diseases, Tenth Revision, Clinical Modification (ICD-10-CM) diagnosis codes in discharge notes. We used 2 classification methods: (1) extracting from discharge notes some features (terms, n-gram phrases, and SNOMED CT categories) that we used to train a set of supervised machine learning models (support vector machine, random forests, and gradient boosting machine), and (2) building a feature matrix, by a pretrained word embedding model, that we used to train a CNN. We used these methods to identify the chapter-level ICD-10-CM diagnosis codes in a set of discharge notes. We conducted the evaluation using 103,390 discharge notes covering patients hospitalized from June 1, 2015 to January 31, 2017 in the Tri-Service General Hospital in Taipei, Taiwan. We used the receiver operating characteristic curve as an evaluation measure, and calculated the area under the curve (AUC) and F-measure as the global measure of effectiveness. In 5-fold cross-validation tests, our method had a higher testing accuracy (mean AUC 0.9696; mean F-measure 0.9086) than traditional NLP-based approaches (mean AUC range 0.8183-0.9571; mean F-measure range 0.5050-0.8739). A real-world simulation that split the training sample and the testing sample by date verified this result (mean AUC 0.9645; mean F-measure 0.9003 using the proposed method). Further analysis showed that the convolutional layers of the CNN effectively identified a large number of keywords and automatically
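
    The second classification method in this record, a word-embedding layer followed by a convolutional network, can be outlined as follows (PyTorch). Vocabulary size, note length, number of chapters and the randomly initialised embedding are placeholders; the study uses a pretrained word-embedding model and real discharge notes.

        import torch
        import torch.nn as nn

        vocab, emb, note_len, n_chapters = 5000, 100, 200, 22   # placeholder sizes

        class NoteCNN(nn.Module):
            def __init__(self):
                super().__init__()
                self.emb = nn.Embedding(vocab, emb)
                self.conv = nn.Conv1d(emb, 64, kernel_size=5)
                self.fc = nn.Linear(64, n_chapters)
            def forward(self, tokens):                           # tokens: (batch, note_len)
                x = self.emb(tokens).transpose(1, 2)             # (batch, emb, note_len)
                x = torch.relu(self.conv(x)).max(dim=2).values   # max-pool over positions
                return self.fc(x)                                # one logit per chapter

        model = NoteCNN()
        tokens = torch.randint(0, vocab, (8, note_len))          # fake tokenised notes
        labels = torch.randint(0, 2, (8, n_chapters)).float()    # fake chapter labels
        loss = nn.BCEWithLogitsLoss()(model(tokens), labels)     # multi-label chapter prediction
        loss.backward()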

  10. Learning to Act Like a Lawyer: A Model Code of Professional Responsibility for Law Students

    Directory of Open Access Journals (Sweden)

    David M. Tanovich

    2009-02-01

    Full Text Available Law students are the future of the legal profession. How well prepared are they when they leave law school to assume the professional and ethical obligations that they owe themselves, the profession and the public? This question has led to a growing interest in Canada in the teaching of legal ethics. It has also led to a greater emphasis on the development of clinical and experiential learning as exemplified in the scholarship and teaching of Professor Rose Voyvodic. Less attention, however, has been placed on identifying the general ethical responsibilities of law students when not working in a clinic or other legal context. This can be seen in the presence of very few Canadian articles exploring the issue, and more significantly, in the paucity of law school discipline policies or codes of conduct that set out the professional obligations owed by law students. This article develops an idea that Professor Voyvodic and I talked about on a number of occasions. It argues that all law schools should have a code of conduct which is separate and distinct from their general University code and which resembles, with appropriate modifications, the relevant set of rules of professional responsibility law students will be bound by when called to the Bar. A student code of conduct which educates law students about their professional obligations is an important step in deterring unethical conduct while in law school and preparing students for ethical practice. The idea of a law school code of professional responsibility raises a number of questions. Why is it necessary for law schools to have their own student code of conduct? The article provides a threefold response. First, law students are members of the legal profession and a code of conduct should reflect this. Second, it must be relevant and comprehensive in order to ensure that it can inspire students to be ethical lawyers. And, third, as a practical matter, the last few years have witnessed a number of

  11. Code-switching as a communication, learning, and social negotiation stategy in first-year learners of Danish

    DEFF Research Database (Denmark)

    Arnfast, Juni Søderberg; Jørgensen, Jens Normann

    2003-01-01

    The term code-switching is used in two related, yet different fields of linguistics: Second Language Acquisition and bilingual studies. In the former, code-switching is analyzed in terms of learning strategies, whereas the latter applies the competence view. The present paper intends to detect...... the borderline between the two concepts by investigating the use of code-switching in first-year learners of Danish. Our study points out that code-switching appears as a skill used in early attempts at playing with the languages involved in the conversation (Danish/English and Danish/Polish/German/English). Thus...... we have to acknowledge code-switching as an increasingly sophisticated language skill even at an early stage of SLA....

  12. LeARN: a platform for detecting, clustering and annotating non-coding RNAs

    Directory of Open Access Journals (Sweden)

    Schiex Thomas

    2008-01-01

    Full Text Available Abstract Background In the last decade, sequencing projects have led to the development of a number of annotation systems dedicated to the structural and functional annotation of protein-coding genes. These annotation systems manage the annotation of the non-protein coding genes (ncRNAs) in a very crude way, allowing neither the editing of secondary structures nor the clustering of ncRNA genes into families, which are crucial for appropriate annotation of these molecules. Results LeARN is a flexible software package which handles the complete process of ncRNA annotation by integrating the layers of automatic detection and human curation. Conclusion This software provides the infrastructure to deal properly with ncRNAs in the framework of any annotation project. It fills the gap between existing prediction software, which detects independent ncRNA occurrences, and public ncRNA repositories, which do not offer the flexibility and interactivity required for annotation projects. The software is freely available from the download section of the website http://bioinfo.genopole-toulouse.prd.fr/LeARN

  13. The Role of Architectural and Learning Constraints in Neural Network Models: A Case Study on Visual Space Coding.

    Science.gov (United States)

    Testolin, Alberto; De Filippo De Grazia, Michele; Zorzi, Marco

    2017-01-01

    The recent "deep learning revolution" in artificial neural networks had strong impact and widespread deployment for engineering applications, but the use of deep learning for neurocomputational modeling has been so far limited. In this article we argue that unsupervised deep learning represents an important step forward for improving neurocomputational models of perception and cognition, because it emphasizes the role of generative learning as opposed to discriminative (supervised) learning. As a case study, we present a series of simulations investigating the emergence of neural coding of visual space for sensorimotor transformations. We compare different network architectures commonly used as building blocks for unsupervised deep learning by systematically testing the type of receptive fields and gain modulation developed by the hidden neurons. In particular, we compare Restricted Boltzmann Machines (RBMs), which are stochastic, generative networks with bidirectional connections trained using contrastive divergence, with autoencoders, which are deterministic networks trained using error backpropagation. For both learning architectures we also explore the role of sparse coding, which has been identified as a fundamental principle of neural computation. The unsupervised models are then compared with supervised, feed-forward networks that learn an explicit mapping between different spatial reference frames. Our simulations show that both architectural and learning constraints strongly influenced the emergent coding of visual space in terms of distribution of tuning functions at the level of single neurons. Unsupervised models, and particularly RBMs, were found to more closely adhere to neurophysiological data from single-cell recordings in the primate parietal cortex. These results provide new insights into how basic properties of artificial neural networks might be relevant for modeling neural information processing in biological systems.

  14. Learning Concepts, Language, and Literacy in Hybrid Linguistic Codes: The Multilingual Maze of Urban Grade 1 Classrooms in South Africa

    Science.gov (United States)

    Henning, Elizabeth

    2012-01-01

    From the field of developmental psycholinguistics and from conceptual development theory there is evidence that excessive linguistic "code-switching" in early school education may pose some hazards for the learning of young multilingual children. In this article the author addresses the issue, invoking post-Piagetian and neo-Vygotskian…

  15. Basic Conceptual Systems (BCSs)--Tools for Analytic Coding, Thinking and Learning: A Concept Teaching Curriculum in Norway

    Science.gov (United States)

    Hansen, Andreas

    2009-01-01

    The role of basic conceptual systems (for example, colour, shape, size, position, direction, number, pattern, etc.) as psychological tools for analytic coding, thinking, learning is emphasised, and a proposal for a teaching order of BCSs in kindergarten and primary school is introduced. The first part of this article explains briefly main aspects…

  16. Learning binary code via PCA of angle projection for image retrieval

    Science.gov (United States)

    Yang, Fumeng; Ye, Zhiqiang; Wei, Xueqi; Wu, Congzhong

    2018-01-01

    With the benefits of low storage costs and high query speeds, binary code representation methods are widely researched for efficiently retrieving large-scale data. In image hashing methods, learning a hashing function to embed high-dimensional features into Hamming space is a key step for accurate retrieval. The principal component analysis (PCA) technique is widely used in compact hashing methods; most of these methods adopt PCA projection functions to project the original data onto several dimensions of real values, and each projected dimension is then quantized into one bit by thresholding. The variances of the projected dimensions differ, and the real-valued projection introduces additional quantization error. To avoid real-valued projection with large quantization error, in this paper we propose to use a cosine similarity projection for each dimension; the angle projection preserves the original structure and yields more compact codes from the cosine values. We combined our method with the ITQ hashing algorithm, and extensive experiments on the public CIFAR-10 and Caltech-256 datasets validate the effectiveness of the proposed method.
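
    The PCA-plus-thresholding baseline that the paper starts from can be sketched in a few lines (an illustration only, not the paper's angle-projection method; the feature matrix and code length below are invented):

        # Sketch of a PCA hashing baseline: project features onto principal
        # components and quantize each projected dimension to one bit.
        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(0)
        X = rng.normal(size=(1000, 128))            # stand-in image features

        n_bits = 32
        projected = PCA(n_components=n_bits).fit_transform(X)
        codes = (projected > 0).astype(np.uint8)    # one bit per projected dimension

        # Hamming distances from the first item to all items, as used for retrieval.
        hamming = (codes[0] != codes).sum(axis=1)
        print(hamming[:5])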

  17. Learning Joint-Sparse Codes for Calibration-Free Parallel MR Imaging.

    Science.gov (United States)

    Wang, Shanshan; Tan, Sha; Gao, Yuan; Liu, Qiegen; Ying, Leslie; Xiao, Taohui; Liu, Yuanyuan; Liu, Xin; Zheng, Hairong; Liang, Dong

    2018-01-01

    The integration of compressed sensing and parallel imaging (CS-PI) has gained increasing popularity in recent years as a way to accelerate magnetic resonance (MR) imaging. Among these methods, calibration-free techniques have shown encouraging performance due to their capability to handle the sensitivity information robustly. Unfortunately, existing calibration-free methods have only explored joint sparsity with direct analysis transform projections. To further exploit joint sparsity and improve reconstruction accuracy, this paper proposes to Learn joINt-sparse coDes for caliBration-free parallEl mR imaGing (LINDBERG) by modeling the parallel MR imaging problem as a minimization objective in which a norm constrains data fidelity, a Frobenius norm penalizes the sparse representation error, and a mixed norm triggers joint sparsity across channels. A corresponding algorithm has been developed to alternately update the sparse representation, the sensitivity-encoded images and the k-space data. The final image is then produced as the square root of the sum of squares of all channel images. Experimental results on both physical phantom and in vivo data sets show that the proposed method is comparable and even superior to state-of-the-art CS-PI reconstruction approaches. Specifically, LINDBERG shows strong capability in suppressing noise and artifacts while reconstructing MR images from highly undersampled multichannel measurements.
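
    The final combination step mentioned above is simple enough to sketch directly (array sizes are invented; the reconstruction itself is the paper's algorithm and is not reproduced here):

        # Root-sum-of-squares combination of per-channel images into one image.
        import numpy as np

        rng = np.random.default_rng(0)
        channel_images = rng.normal(size=(8, 256, 256))   # stand-in: 8 coil channels

        combined = np.sqrt(np.sum(np.abs(channel_images) ** 2, axis=0))
        print(combined.shape)  # (256, 256)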

  18. TERRESTRIAL LASER SCANNER DATA DENOISING BY DICTIONARY LEARNING OF SPARSE CODING

    Directory of Open Access Journals (Sweden)

    E. Smigiel

    2013-07-01

    Full Text Available Point cloud processing is basically a signal processing issue. The huge amounts of data collected with Terrestrial Laser Scanners or photogrammetry techniques face the classical questions of signal and image processing. Among others, denoising and compression have to be addressed in this context. That is why one has to turn to signal theory, which can guide good practice and inspire new ideas from the latest developments in the field. The literature has shown for decades how strong and dynamic the theoretical field is and how efficient the derived algorithms have become. For about ten years, a new technique has been available: known as compressive sensing or compressive sampling, it is based first on sparsity, which is an interesting characteristic of many natural signals. Based on this concept, many denoising and compression techniques have demonstrated their efficiency. Sparsity can also be seen as redundancy removal from natural signals. Combined with incoherent measurements, compressive sensing uses the idea that redundancy can be removed at the very early stage of sampling. Hence, instead of sampling the signal at a high rate and removing redundancy in a second stage, the acquisition stage itself may be run with redundancy removal. This paper first gives some theoretical aspects of these ideas with simple mathematics. Then, the idea of compressive sensing for a Terrestrial Laser Scanner is examined as a potential research question and, finally, a denoising scheme based on dictionary learning of sparse coding is tested. Both the theoretical discussion and the obtained results show that it is worth staying close to signal processing theory and its community to benefit from its latest developments.
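
    The general dictionary-learning idea can be illustrated on a one-dimensional signal (a hedged sketch only, not the paper's point-cloud pipeline; the synthetic signal, patch length and sparsity level are invented): a dictionary is learned on overlapping patches of a noisy signal, and reconstructing each patch from a sparse code suppresses the incoherent noise.

        # Dictionary learning and sparse coding for denoising, using scikit-learn.
        import numpy as np
        from sklearn.decomposition import MiniBatchDictionaryLearning

        rng = np.random.default_rng(0)
        t = np.linspace(0, 8 * np.pi, 4000)
        clean = np.sin(t) + 0.5 * np.sin(3 * t)
        noisy = clean + 0.2 * rng.normal(size=t.size)

        # Overlapping patches of length 32 serve as training samples.
        patch_len, step = 32, 4
        patches = np.array([noisy[i:i + patch_len]
                            for i in range(0, noisy.size - patch_len, step)])

        dico = MiniBatchDictionaryLearning(n_components=64, alpha=1.0,
                                           transform_algorithm="omp",
                                           transform_n_nonzero_coefs=3,
                                           random_state=0)
        codes = dico.fit_transform(patches)             # sparse code per patch
        denoised_patches = codes @ dico.components_     # reconstruction from the codes
        print(denoised_patches.shape)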

  19. Unsupervised learning of generative and discriminative weights encoding elementary image components in a predictive coding model of cortical function.

    Science.gov (United States)

    Spratling, M W

    2012-01-01

    A method is presented for learning the reciprocal feedforward and feedback connections required by the predictive coding model of cortical function. When this method is used, feedforward and feedback connections are learned simultaneously and independently in a biologically plausible manner. The performance of the proposed algorithm is evaluated by applying it to learning the elementary components of artificial and natural images. For artificial images, the bars problem is employed, and the proposed algorithm is shown to produce state-of-the-art performance on this task. For natural images, components resembling Gabor functions are learned in the first processing stage, and neurons responsive to corners are learned in the second processing stage. The properties of these learned representations are in good agreement with neurophysiological data from V1 and V2. The proposed algorithm demonstrates for the first time that a single computational theory can explain the formation of cortical receptive fields (RFs) and also the response properties of cortical neurons once those RFs have been learned.

  20. Learning Morse Code Alters Microstructural Properties in the Inferior Longitudinal Fasciculus : A DTI Study

    NARCIS (Netherlands)

    Schlaffke, LV; Leemans, Alexander; Schweizer, Lauren M; Ocklenburg, Sebastian; Schmidt-Wilcke, Tobias

    2017-01-01

    Learning relies on neuroplasticity, which has mainly been studied in gray matter (GM). However, there is mounting evidence indicating a critical role of white matter changes involved in learning processes. One of the most important learning processes in human development is language acquisition.

  1. A National Report to Share the Experiences and Lessons Learned in the Implementation of the Code of Conduct on the Safety and Security of Radioactive Sources: The Philippines

    International Nuclear Information System (INIS)

    Borras, Alan M.; Palattao, Maria V.B.; Seguis, Julietta E.; Leonin, Teofilo V. Jr.; Rosa, Alumanda M. dela

    2015-01-01

    This national report aims to present the Philippines’ experiences and lessons learned by the Philippine Nuclear Research Institute (PNRI) in the implementation of the International Atomic Energy Agency (IAEA) “Code of Conduct on the Safety and Security of Radioactive Sources”. The successes and difficulties experienced by the Institute following its commitment to the Code of Conduct are presented.

  2. Criminal procedure code and charter of criminal proceedings in terrorism investigation: learning from the past

    OpenAIRE

    Makarov M.A.; Smakhtin E.V.

    2014-01-01

    Basic procedural institutions of terrorism investigation are studied by comparing the provisions of the Charter of criminal proceedings of the Russian Empire in 1864 and the current procedural law. The comparative solutions to the following investigation problems by Charter and the Code are shown: 1) terrorists confess to less serious crimes, representing themselves as accomplices (articles 208, 222 of the RF Criminal Code) to avoid punishment for terrorism (article 205 of the RF Criminal Cod...

  3. Lessons learned from new construction utility demand side management programs and their implications for implementing building energy codes

    Energy Technology Data Exchange (ETDEWEB)

    Wise, B.K.; Hughes, K.R.; Danko, S.L.; Gilbride, T.L.

    1994-07-01

    This report was prepared for the US Department of Energy (DOE) Office of Codes and Standards by the Pacific Northwest Laboratory (PNL) through its Building Energy Standards Program (BESP). The purpose of this task was to identify demand-side management (DSM) strategies for new construction that utilities have adopted or developed to promote energy-efficient design and construction. PNL conducted a survey of utilities and used the information gathered to extrapolate lessons learned and to identify evolving trends in utility new-construction DSM programs. The ultimate goal of the task is to identify opportunities where states might work collaboratively with utilities to promote the adoption, implementation, and enforcement of energy-efficient building energy codes.

  4. Predictive Coding Accelerates Word Recognition and Learning in the Early Stages of Language Development

    Science.gov (United States)

    Ylinen, Sari; Bosseler, Alexis; Junttila, Katja; Huotilainen, Minna

    2017-01-01

    The ability to predict future events in the environment and learn from them is a fundamental component of adaptive behavior across species. Here we propose that inferring predictions facilitates speech processing and word learning in the early stages of language development. Twelve- and 24-month olds' electrophysiological brain responses to heard…

  5. Stitching Codeable Circuits: High School Students' Learning about Circuitry and Coding with Electronic Textiles

    Science.gov (United States)

    Litts, Breanne K.; Kafai, Yasmin B.; Lui, Debora A.; Walker, Justice T.; Widman, Sari A.

    2017-01-01

    Learning about circuitry by connecting a battery, light bulb, and wires is a common activity in many science classrooms. In this paper, we expand students' learning about circuitry with electronic textiles, which use conductive thread instead of wires and sewable LEDs instead of lightbulbs, by integrating programming sensor inputs and light…

  6. Instance-based Policy Learning by Real-coded Genetic Algorithms and Its Application to Control of Nonholonomic Systems

    Science.gov (United States)

    Miyamae, Atsushi; Sakuma, Jun; Ono, Isao; Kobayashi, Shigenobu

    The stabilization control of nonholonomic systems has been extensively studied because it is essential for nonholonomic robot control problems. The difficulty is that a theoretical derivation of the control policy is not guaranteed to be achievable. In this paper, we present a reinforcement learning (RL) method with an instance-based policy (IBP) representation, in which control policies for this class of systems are optimized with respect to user-defined cost functions. Direct policy search (DPS) is one approach to RL: the policy is represented by a parametric model and the model parameters are searched directly by optimization techniques, including genetic algorithms (GAs). In the IBP representation an instance consists of a state-action pair, and a policy consists of a set of instances. Several DPS methods with IBP have previously been proposed, but they sometimes fail to obtain optimal control policies when the state-action variables are continuous. In this paper, we present a real-coded GA for DPS with IBP, specifically designed for continuous domains. Optimization of an IBP faces three difficulties: high dimensionality, epistasis, and multi-modality; our solution is designed to overcome them. Policy search with the IBP representation appears to be a high-dimensional optimization problem; however, the instances that can improve the fitness are often limited to the active instances (those used during evaluation), and their number is small. We therefore treat the search as a low-dimensional problem by restricting the search variables to the active instances. It is well known that functions with epistasis can be optimized efficiently with crossovers that satisfy the inheritance of statistics. For an efficient search of IBPs, we propose an extended crossover-like mutation (extended XLM) which generates a new instance around an existing instance while satisfying the inheritance of statistics. For overcoming multi-modality, we
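
    A generic real-coded GA over instance-based policies can be sketched as follows (an illustration of the overall setup only, not the paper's extended XLM operator or its benchmark problems; the fitness function and dimensions are invented):

        # Toy real-coded GA: each individual is a set of (state, action) instances,
        # and offspring are Gaussian perturbations of selected parents.
        import numpy as np

        rng = np.random.default_rng(0)
        N_INSTANCES, STATE_DIM, ACTION_DIM = 10, 2, 1

        def fitness(policy):
            # Placeholder cost: prefer instances with small states and small actions.
            states, actions = policy[:, :STATE_DIM], policy[:, STATE_DIM:]
            return -np.mean(states ** 2) - np.mean(actions ** 2)

        population = [rng.normal(size=(N_INSTANCES, STATE_DIM + ACTION_DIM))
                      for _ in range(20)]

        for generation in range(50):
            parents = sorted(population, key=fitness, reverse=True)[:5]   # truncation selection
            population = parents + [p + 0.1 * rng.normal(size=p.shape)
                                    for p in parents for _ in range(3)]

        print("best fitness:", fitness(max(population, key=fitness)))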

  7. Game E-Learning Code Master Dengan Konsep Mmorpg Menggunakan Adobe Flex 3

    Directory of Open Access Journals (Sweden)

    Fredy Purnomo

    2010-12-01

    Full Text Available The research objective is to design a web-based e-learning game that can serve as a learning facility for C language programming and, as an online game, can be enjoyed by everybody easily over the internet. Flex is used in this online game to implement the RIA (Rich Internet Application) concept, so that the e-learning process is expected to be more interesting and interactive. The e-learning game is also designed around the MMORPG (Massively Multiplayer Online Role Playing Game) concept. The research methods used are analysis and design. Analysis was done through literature study, user analysis, and analysis of similar games, while design covers screen display, gameplay, and system design. The conclusion of this research is that the game provides an interesting learning medium for the C programming language, in line with the classroom subject material, and is easy to use through the website.

  8. The neural coding of expected and unexpected monetary performance outcomes: dissociations between active and observational learning.

    Science.gov (United States)

    Bellebaum, C; Jokisch, D; Gizewski, E R; Forsting, M; Daum, I

    2012-02-01

    Successful adaptation to the environment requires the learning of stimulus-response-outcome associations. Such associations can be learned actively by trial and error or by observing the behaviour and accompanying outcomes in other persons. The present study investigated similarities and differences in the neural mechanisms of active and observational learning from monetary feedback using functional magnetic resonance imaging. Two groups of 15 subjects each - active and observational learners - participated in the experiment. On every trial, active learners chose between two stimuli and received monetary feedback. Each observational learner observed the choices and outcomes of one active learner. Learning performance as assessed via active test trials without feedback was comparable between groups. Different activation patterns were observed for the processing of unexpected vs. expected monetary feedback in active and observational learners, particularly for positive outcomes. Activity for unexpected vs. expected reward was stronger in the right striatum in active learning, while activity in the hippocampus was bilaterally enhanced in observational and reduced in active learning. Modulation of activity by prediction error (PE) magnitude was observed in the right putamen in both types of learning, whereas PE related activations in the right anterior caudate nucleus and in the medial orbitofrontal cortex were stronger for active learning. The striatum and orbitofrontal cortex thus appear to link reward stimuli to own behavioural reactions and are less strongly involved when the behavioural outcome refers to another person's action. Alternative explanations such as differences in reward value between active and observational learning are also discussed. Copyright © 2011 Elsevier B.V. All rights reserved.

  9. The non-coding RNA BC1 regulates experience-dependent structural plasticity and learning

    NARCIS (Netherlands)

    Briz, Victor; Restivo, Leonardo; Pasciuto, Emanuela; Juczewski, Konrad; Mercaldo, Valentina; Lo, Adrian C; Baatsen, Pieter; Gounko, Natalia V; Borreca, Antonella; Girardi, Tiziana; Luca, Rossella; Nys, Julie; Poorthuis, Rogier B; Mansvelder, Huibert D; Fisone, Gilberto; Ammassari-Teule, Martine; Arckens, Lutgarde; Krieger, Patrik; Meredith, Rhiannon; Bagni, Claudia

    2017-01-01

    The brain cytoplasmic (BC1) RNA is a non-coding RNA (ncRNA) involved in neuronal translational control. Absence of BC1 is associated with altered glutamatergic transmission and maladaptive behavior. Here, we show that pyramidal neurons in the barrel cortex of BC1 knock out (KO) mice display larger

  10. Sound Synthesis and Bar-Code Technology to Develop Learning Environments for Blind Children.

    Science.gov (United States)

    Burger, D.; And Others

    1990-01-01

    An interactive, computerized sound machine was designed, incorporating bar-code technology in the user interface. The system was used in a classroom of nine blind elementary level children to teach sound awareness, logic, metalinguistics, and technological literacy and was found to have pedagogical relevance. (Author/JDD)

  11. Stroop-like effects in a new-code learning task: A cognitive load theory perspective.

    Science.gov (United States)

    Hazan-Liran, Batel; Miller, Paul

    2017-09-01

    To determine whether and how learning is biased by competing task-irrelevant information that creates extraneous cognitive load, we assessed the efficiency of university students with a learning paradigm in two experiments. The paradigm asked participants to learn associations between eight words and eight digits. We manipulated congruity of the digits' ink colour with the words' semantics. In Experiment 1 word stimuli were colour words (e.g., blue, yellow) and in Experiment 2 colour-related word concepts (e.g., sky, banana). Marked benefits and costs on learning due to variation in extraneous cognitive load originating from processing task-irrelevant information were evident. Implications for cognitive load theory and schooling are discussed.

  12. Writing Strengthens Orthography and Alphabetic-Coding Strengthens Phonology in Learning to Read Chinese

    NARCIS (Netherlands)

    Guan, C.Q.; Liu, Y.; Chan, D.H.L.; Ye, F.F.; Perfetti, C.A.

    2011-01-01

    Learning to write words may strengthen orthographic representations and thus support word-specific recognition processes. This hypothesis applies especially to Chinese because its writing system encourages character-specific recognition that depends on accurate representation of orthographic form.

  13. Geoethics: what can we learn from existing bio-, ecological, and engineering ethics codes?

    Science.gov (United States)

    Kieffer, Susan W.; Palka, John

    2014-05-01

    Many scientific disciplines are concerned about ethics, and codes of ethics for these professions exist, generally through the professional scientific societies such as the American Geophysical Union (AGU), American Geological Institute (AGI), American Association of Petroleum Engineers (AAPE), National Society of Professional Engineers (NSPE), Ecological Society of America (ESA), and many others worldwide. These vary considerably in depth and specificity. In this poster, we review existing codes with the goal of extracting fundamentals that should/can be broadly applied to all geo-disciplines. Most of these codes elucidate a set of principles that cover practical issues such as avoiding conflict of interest, avoiding plagiarism, not permitting illegitimate use of intellectual products, enhancing the prestige of the profession, acknowledging an obligation to perform services only in areas of competence, issuing public statements only in an objective manner, holding paramount the welfare of the public, and in general conducting oneself honorably, responsibly, and lawfully. It is striking that, given that the work of these societies and their members is relevant to the future of the earth, few discuss in any detail ethical obligations regarding our relation to the planet itself. The AGU code, for example, only states that "Members have an ethical obligation to weigh the societal benefits of their research against the costs and risks to human and animal welfare and impacts on the environment and society." The NSPE and AGI codes go somewhat further: "Engineers are encouraged to adhere to the principles of sustainable development in order to protect the environment for future generations," and "Geoscientists should strive to protect our natural environment. They should understand and anticipate the environmental consequences of their work and should disclose the consequences of recommended actions. They should acknowledge that resource extraction and use are necessary

  14. Pupil dilation indicates the coding of past prediction errors: Evidence for attentional learning theory.

    Science.gov (United States)

    Koenig, Stephan; Uengoer, Metin; Lachnit, Harald

    2018-04-01

    The attentional learning theory of Pearce and Hall predicts more attention to uncertain cues that have caused a high prediction error in the past. We examined how the cue-elicited pupil dilation during associative learning was linked to such error-driven attentional processes. In three experiments, participants were trained to acquire associations between different cues and their appetitive (Experiment 1), motor (Experiment 2), or aversive (Experiment 3) outcomes. All experiments were designed to examine differences in the processing of continuously reinforced cues (consistently followed by the outcome) versus partially reinforced, uncertain cues (randomly followed by the outcome). We measured the pupil dilation elicited by the cues in anticipation of the outcome and analyzed how this conditioned pupil response changed over the course of learning. In all experiments, changes in pupil size complied with the same basic pattern: During early learning, consistently reinforced cues elicited greater pupil dilation than uncertain, randomly reinforced cues, but this effect gradually reversed to yield a greater pupil dilation for uncertain cues toward the end of learning. The pattern of data accords with the changes in prediction error and error-driven attention formalized by the Pearce-Hall theory. © 2017 The Authors. Psychophysiology published by Wiley Periodicals, Inc. on behalf of Society for Psychophysiological Research.
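
    One common simplified formulation of the Pearce-Hall idea, sketched from general knowledge rather than from this paper (the parameter values and the associative-strength update are illustrative assumptions), makes a cue's associability track the absolute prediction error of past trials:

        # Associability alpha rises after surprising outcomes and decays otherwise.
        def pearce_hall_trial(V, alpha, outcome, S=0.5, gamma=0.5):
            V_new = V + S * alpha * (outcome - V)                 # associative strength
            alpha_new = gamma * abs(outcome - V) + (1 - gamma) * alpha
            return V_new, alpha_new

        V, alpha = 0.0, 1.0
        for outcome in [1, 1, 1, 0, 1, 0, 1]:                     # partially reinforced cue
            V, alpha = pearce_hall_trial(V, alpha, outcome)
            print(round(V, 3), round(alpha, 3))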

  15. Criminal procedure code and charter of criminal proceedings in terrorism investigation: learning from the past

    Directory of Open Access Journals (Sweden)

    Makarov M.A.

    2014-12-01

    Full Text Available Basic procedural institutions of terrorism investigation are studied by comparing the provisions of the Charter of criminal proceedings of the Russian Empire of 1864 and the current procedural law. The comparative solutions offered by the Charter and the Code to the following investigation problems are shown: 1) terrorists confess to less serious crimes, representing themselves as accomplices (articles 208, 222 of the RF Criminal Code) to avoid punishment for terrorism (article 205 of the RF Criminal Code); 2) mass absence of prosecution witnesses at the hearing, giving reason to doubt the objectivity and the admissibility of evidence; 3) low efficiency of overt procedural activities and the need for covert operations based not on a particular fact but on crime detection actions against members of terrorist organizations; 4) the use of force during arrest, causing the terrorists' death, excludes the achievement of the goals of criminal law and criminal justice and also leads to the loss of evidence (needed to prove the guilt of the surviving terrorists); 5) a significant amount of time passes between the alleged crime and the passing of the sentence, the minimum duration of a terrorism investigation being 12 months, as a result of which higher courts stop the prosecution (the RF Supreme Court changed the sentence for the terrorists Atgeriev, Alkhazurov and Gaysumov in April 2002 because more than five years had passed between the day the crimes were committed and the passing of the sentence). The authors come to the paradoxical conclusion that the procedural law of the XIX century was much more effective than the modern one.

  16. Coding for dummies

    CERN Document Server

    Abraham, Nikhil

    2015-01-01

    Hands-on exercises help you learn to code like a pro. No coding experience is required for Coding For Dummies, your one-stop guide to building a foundation of knowledge in writing computer code for web, application, and software development. It doesn't matter if you've dabbled in coding or never written a line of code, this book guides you through the basics. Using foundational web development languages like HTML, CSS, and JavaScript, it explains in plain English how coding works and why it's needed. Online exercises developed by Codecademy, a leading online code training site, help hone coding skill

  17. Software Quality and Security in Teachers' and Students' Codes When Learning a New Programming Language

    Science.gov (United States)

    Boutnaru, Shlomi; Hershkovitz, Arnon

    2015-01-01

    In recent years, schools (as well as universities) have added cyber security to their computer science curricula. This topic is still new for most of the current teachers, who would normally have a standard computer science background. The teachers are therefore trained first and then teach their students what they have just learned. In order to…

  18. Feedback-related negativity codes outcome valence, but not outcome expectancy, during reversal learning

    NARCIS (Netherlands)

    Borries, A.K.L. von; Verkes, R.J.; Bulten, B.H.; Cools, R.; Bruijn, E.R.A. de

    2013-01-01

    Optimal behavior depends on the ability to assess the predictive value of events and to adjust behavior accordingly. Outcome processing can be studied by using its electrophysiological signatures-that is, the feedback-related negativity (FRN) and the P300. A prominent reinforcement-learning model

  19. Feedback-related negativity codes outcome valence, but not outcome expectancy, during reversal learning

    NARCIS (Netherlands)

    Borries, A.K.L. von; Verkes, R.J.; Bulten, B.H.; Cools, R.

    2013-01-01

    Optimal behavior depends on the ability to assess the predictive value of events and to adjust behavior accordingly. Outcome processing can be studied by using its electrophysiological signatures--that is, the feedback-related negativity (FRN) and the P300. A prominent reinforcement-learning model

  20. Effects of Dual Coded Multimedia Instruction Employing Image Morphing on Learning a Logographic Language

    Science.gov (United States)

    Wang, Ling; Blackwell, Aleka Akoyunoglou

    2015-01-01

    Native speakers of alphabetic languages, which use letters governed by grapheme-phoneme correspondence rules, often find it particularly challenging to learn a logographic language whose writing system employs symbols with no direct sound-to-spelling connection but links to the visual and semantic information. The visuospatial properties of…

  1. Color-Coded Graphic Organizers for Teaching Writing to Students with Learning Disabilities

    Science.gov (United States)

    Ewoldt, Kathy B.; Morgan, Joseph John

    2017-01-01

    A commonly used method for supporting the writing of students with learning disabilities (LD), graphic organizers have been shown to effectively support instruction for students with LD in a variety of content areas (Dexter & Hughes, 2011). Students with LD often struggle with the process of developing their ideas into organized sentences; the…

  2. Synaptic learning rules and sparse coding in a model sensory system.

    Directory of Open Access Journals (Sweden)

    Luca A Finelli

    2008-04-01

    Full Text Available Neural circuits exploit numerous strategies for encoding information. Although the functional significance of individual coding mechanisms has been investigated, ways in which multiple mechanisms interact and integrate are not well understood. The locust olfactory system, in which dense, transiently synchronized spike trains across ensembles of antennal lobe (AL) neurons are transformed into a sparse representation in the mushroom body (MB, a region associated with memory), provides a well-studied preparation for investigating the interaction of multiple coding mechanisms. Recordings made in vivo from the insect MB demonstrated highly specific responses to odors in Kenyon cells (KCs). Typically, only a few KCs from the recorded population of neurons responded reliably when a specific odor was presented. Different odors induced responses in different KCs. Here, we explored with a biologically plausible model the possibility that a form of plasticity may control and tune synaptic weights of inputs to the mushroom body to ensure the specificity of KCs' responses to familiar or meaningful odors. We found that plasticity at the synapses between the AL and the MB efficiently regulated the delicate tuning necessary to selectively filter the intense AL oscillatory output and condense it to a sparse representation in the MB. Activity-dependent plasticity drove the observed specificity, reliability, and expected persistence of odor representations, suggesting a role for plasticity in information processing and making a testable prediction about synaptic plasticity at AL-MB synapses.

  3. Speaking Code

    DEFF Research Database (Denmark)

    Cox, Geoff

    Speaking Code begins by invoking the “Hello World” convention used by programmers when learning a new language, helping to establish the interplay of text and code that runs through the book. Interweaving the voice of critical writing from the humanities with the tradition of computing and softwa...... expression in the public realm. The book’s line of argument defends language against its invasion by economics, arguing that speech continues to underscore the human condition, however paradoxical this may seem in an era of pervasive computing....

  4. A numerical similarity approach for using retired Current Procedural Terminology (CPT) codes for electronic phenotyping in the Scalable Collaborative Infrastructure for a Learning Health System (SCILHS).

    Science.gov (United States)

    Klann, Jeffrey G; Phillips, Lori C; Turchin, Alexander; Weiler, Sarah; Mandl, Kenneth D; Murphy, Shawn N

    2015-12-11

    Interoperable phenotyping algorithms, needed to identify patient cohorts meeting eligibility criteria for observational studies or clinical trials, require medical data in a consistent, structured, coded format. Data heterogeneity limits such algorithms' applicability. Existing approaches are often not widely interoperable, or have low sensitivity due to reliance on the lowest common denominator (ICD-9 diagnoses). In the Scalable Collaborative Infrastructure for a Learning Healthcare System (SCILHS) we endeavor to use the widely available Current Procedural Terminology (CPT) procedure codes together with ICD-9. Unfortunately, CPT changes drastically from year to year - codes are retired and replaced. Longitudinal analysis therefore requires grouping retired and current codes. BioPortal provides a navigable CPT hierarchy, which we imported into the Informatics for Integrating Biology and the Bedside (i2b2) data warehouse and analytics platform. However, this hierarchy does not include retired codes. We compared BioPortal's 2014AA CPT hierarchy with Partners Healthcare's SCILHS datamart, comprising three million patients' data over 15 years. 573 CPT codes were not present in 2014AA (6.5 million occurrences). No existing terminology provided hierarchical linkages for these missing codes, so we developed a method that automatically places each missing code in the most specific "grouper" category, using the numerical similarity of CPT codes. Two informaticians reviewed the results. We incorporated the final table into our i2b2 SCILHS/PCORnet ontology, deployed it at seven sites, and performed a gap analysis and an evaluation against several phenotyping algorithms. The reviewers found the method placed the code correctly with 97% precision when considering only miscategorizations ("correctness precision") and 52% precision using a gold standard of optimal placement ("optimality precision"). High correctness precision meant that codes were placed in a reasonable hierarchical position that a reviewer
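
    The numerical-similarity placement can be illustrated with a toy example (a hypothetical sketch of the general idea only, not SCILHS's actual algorithm; the grouper names and code ranges below are invented):

        # Place a retired numeric code under the current grouper whose numeric
        # range is closest (distance zero if the code falls inside the range).
        current_groupers = {
            "Surgery grouper A": (10000, 19999),
            "Radiology grouper B": (70000, 79999),
            "Medicine grouper C": (90000, 99999),
        }

        def place_retired_code(code):
            def distance(bounds):
                lo, hi = bounds
                return 0 if lo <= code <= hi else min(abs(code - lo), abs(code - hi))
            return min(current_groupers, key=lambda name: distance(current_groupers[name]))

        print(place_retired_code(78890))  # falls inside "Radiology grouper B"
        print(place_retired_code(69990))  # nearest range is "Radiology grouper B"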

  5. Conceptual aspects: analyses law, ethical, human, technical, social factors of development ICT, e-learning and intercultural development in different countries setting out the previous new theoretical model and preliminary findings

    NARCIS (Netherlands)

    Kommers, Petrus A.M.; Smyrnova-Trybulska, Eugenia; Morze, Natalia; Issa, Tomayess; Issa, Theodora

    2015-01-01

    This paper, prepared by an international team of authors, focuses on the conceptual aspects: analyses of legal, ethical, human, technical and social factors of ICT development, e-learning and intercultural development in different countries, setting out the previous and new theoretical model and preliminary

  6. Learning from a provisioning site: code of conduct compliance and behaviour of whale sharks in Oslob, Cebu, Philippines

    Directory of Open Access Journals (Sweden)

    Anna Schleimer

    2015-11-01

    Full Text Available While shark-based tourism is a rapidly growing global industry, there is ongoing controversy about the effects of provisioning on the target species. This study investigated the effect of feeding on whale sharks (Rhincodon typus) at a provisioning site in Oslob, Cebu, in terms of arrival time, avoidance and feeding behaviour using photo-identification and focal follows. Additionally, compliance to the code of conduct in place was monitored to assess tourism pressure on the whale sharks. Newly identified sharks gradually arrived earlier to the provisioning site after their initial sighting, indicating that the animals learn to associate the site with food rewards. Whale sharks with a long resighting history showed anticipatory behaviour and were recorded at the site on average 5 min after the arrival of feeder boats. Results from a generalised linear mixed model indicated that animals with a longer resighting history were less likely to show avoidance behaviour to touches or boat contact. Similarly, sequential data on feeding behaviour was modelled using a generalised estimating equations approach, which suggested that experienced whale sharks were more likely to display vertical feeding behaviour. It was proposed that the continuous source of food provides a strong incentive for the modification of behaviours, i.e., learning, through conditioning. Whale sharks are large opportunistic filter feeders in a mainly oligotrophic environment, where the ability to use novel food sources by modifying their behaviour could be of great advantage. Non-compliance to the code of conduct in terms of minimum distance to the shark (2 m) increased from 79% in 2012 to 97% in 2014, suggesting a high tourism pressure on the whale sharks in Oslob. The long-term effects of the observed behavioural modifications along with the high tourism pressure remain unknown. However, management plans are traditionally based on the precautionary principle, which aims to take

  7. Learning from a provisioning site: code of conduct compliance and behaviour of whale sharks in Oslob, Cebu, Philippines.

    Science.gov (United States)

    Schleimer, Anna; Araujo, Gonzalo; Penketh, Luke; Heath, Anna; McCoy, Emer; Labaja, Jessica; Lucey, Anna; Ponzo, Alessandro

    2015-01-01

    While shark-based tourism is a rapidly growing global industry, there is ongoing controversy about the effects of provisioning on the target species. This study investigated the effect of feeding on whale sharks (Rhincodon typus) at a provisioning site in Oslob, Cebu, in terms of arrival time, avoidance and feeding behaviour using photo-identification and focal follows. Additionally, compliance to the code of conduct in place was monitored to assess tourism pressure on the whale sharks. Newly identified sharks gradually arrived earlier to the provisioning site after their initial sighting, indicating that the animals learn to associate the site with food rewards. Whale sharks with a long resighting history showed anticipatory behaviour and were recorded at the site on average 5 min after the arrival of feeder boats. Results from a generalised linear mixed model indicated that animals with a longer resighting history were less likely to show avoidance behaviour to touches or boat contact. Similarly, sequential data on feeding behaviour was modelled using a generalised estimating equations approach, which suggested that experienced whale sharks were more likely to display vertical feeding behaviour. It was proposed that the continuous source of food provides a strong incentive for the modification of behaviours, i.e., learning, through conditioning. Whale sharks are large opportunistic filter feeders in a mainly oligotrophic environment, where the ability to use novel food sources by modifying their behaviour could be of great advantage. Non-compliance to the code of conduct in terms of minimum distance to the shark (2 m) increased from 79% in 2012 to 97% in 2014, suggesting a high tourism pressure on the whale sharks in Oslob. The long-term effects of the observed behavioural modifications along with the high tourism pressure remain unknown. However, management plans are traditionally based on the precautionary principle, which aims to take preventive actions even

  8. Distributed Video Coding: Iterative Improvements

    DEFF Research Database (Denmark)

    Luong, Huynh Van

    at the decoder side offering such benefits for these applications. Although there have been some advanced improvement techniques, improving the DVC coding efficiency is still challenging. The thesis addresses this challenge by proposing several iterative algorithms at different working levels, e.g. bitplane...... and noise modeling and also learn from the previously decoded Wyner-Ziv (WZ) frames, side information and noise learning (SING) is proposed. The SING scheme introduces an optical flow technique to compensate for the weaknesses of block-based SI generation and also utilizes clustering of DCT blocks to capture....

  9. Morse Code Activity Packet.

    Science.gov (United States)

    Clinton, Janeen S.

    This activity packet offers simple directions for setting up a Morse Code system appropriate to interfacing with any of several personal computer systems. Worksheets are also included to facilitate teaching Morse Code to persons with visual or other disabilities including blindness, as it is argued that the code is best learned auditorily. (PB)

  10. Analysis of current research addressing complementary use of life-cycle assessment and risk assessment for engineered nanomaterials: have lessons been learned from previous experience with chemicals?

    DEFF Research Database (Denmark)

    Grieger, Khara Deanne; Laurent, Alexis; Miseljic, Mirko

    2012-01-01

    While it is generally agreed that successful strategies to address the health and environmental impacts of engineered nanomaterials (NM) should consider the well-established frameworks for conducting life-cycle assessment (LCA) and risk assessment (RA), scientific research, and specific guidance...... on how to practically apply these methods are still very much under development. This paper evaluates how research efforts have applied LCA and RA together for NM, particularly reflecting on previous experiences with applying these methods to chemicals. Through a literature review and a separate analysis...... of research focused on applying LCA and RA together for NM, it appears that current research efforts have taken into account some key "lessons learned" from previous experience with chemicals while many key challenges remain for practically applying these methods to NM. We identified two main approaches......

  11. Analysis of current research addressing complementary use of life-cycle assessment and risk assessment for engineered nanomaterials: have lessons been learned from previous experience with chemicals?

    International Nuclear Information System (INIS)

    Grieger, Khara D.; Laurent, Alexis; Miseljic, Mirko; Christensen, Frans; Baun, Anders; Olsen, Stig I.

    2012-01-01

    While it is generally agreed that successful strategies to address the health and environmental impacts of engineered nanomaterials (NM) should consider the well-established frameworks for conducting life-cycle assessment (LCA) and risk assessment (RA), scientific research, and specific guidance on how to practically apply these methods are still very much under development. This paper evaluates how research efforts have applied LCA and RA together for NM, particularly reflecting on previous experiences with applying these methods to chemicals. Through a literature review and a separate analysis of research focused on applying LCA and RA together for NM, it appears that current research efforts have taken into account some key “lessons learned” from previous experience with chemicals while many key challenges remain for practically applying these methods to NM. We identified two main approaches for using these methods together for NM: “LC-based RA” (traditional RA applied in a life-cycle perspective) and “RA-complemented LCA” (conventional LCA supplemented by RA in specific life-cycle steps). Hence, the latter is the only identified approach which genuinely combines LC- and RA-based methods for NM-risk research efforts to date as the former is rather a continuation of normal RA according to standard assessment procedures (e.g., REACH). Both these approaches along with recommendations for using LCA and RA together for NM are similar to those made previously for chemicals, and thus, there does not appear to be much progress made specific for NM. We have identified one issue in particular that may be specific for NM when applying LCA and RA at this time: the need to establish proper dose metrics within both methods.

  12. Accurate discrimination of conserved coding and non-coding regions through multiple indicators of evolutionary dynamics

    Directory of Open Access Journals (Sweden)

    Pesole Graziano

    2009-09-01

    Full Text Available Abstract Background The conservation of sequences between related genomes has long been recognised as an indication of functional significance and recognition of sequence homology is one of the principal approaches used in the annotation of newly sequenced genomes. In the context of recent findings that the number of non-coding transcripts in higher organisms is likely to be much higher than previously imagined, discrimination between conserved coding and non-coding sequences is a topic of considerable interest. Additionally, it should be considered desirable to discriminate between coding and non-coding conserved sequences without recourse to the use of sequence similarity searches of protein databases, as such approaches exclude the identification of novel conserved proteins without characterized homologs and may be influenced by the presence in databases of sequences which are erroneously annotated as coding. Results Here we present a machine learning-based approach for the discrimination of conserved coding sequences. Our method calculates various statistics related to the evolutionary dynamics of two aligned sequences. These features are considered by a Support Vector Machine which designates the alignment as coding or non-coding with an associated probability score. Conclusion We show that our approach is both sensitive and accurate with respect to comparable methods and illustrate several situations in which it may be applied, including the identification of conserved coding regions in genome sequences and the discrimination of coding from non-coding cDNA sequences.
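
    The classification step can be sketched with scikit-learn (a hedged illustration only; the feature columns and labels below are synthetic stand-ins, not the paper's evolutionary statistics or data):

        # SVM labelling an alignment as coding (1) or non-coding (0) with a probability.
        import numpy as np
        from sklearn.svm import SVC
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        X = rng.normal(size=(400, 4))   # stand-ins for per-alignment evolutionary statistics
        y = (X[:, 1] + 0.5 * X[:, 2] + 0.3 * rng.normal(size=400) > 0).astype(int)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
        clf = SVC(kernel="rbf", probability=True, random_state=0).fit(X_tr, y_tr)

        print("accuracy:", clf.score(X_te, y_te))
        print("P(coding) for first test alignment:", clf.predict_proba(X_te)[0, 1])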

  13. PREVIOUS SECOND TRIMESTER ABORTION

    African Journals Online (AJOL)

    PNLC

    PREVIOUS SECOND TRIMESTER ABORTION: A risk factor for third trimester uterine rupture in three ... for accurate diagnosis of uterine rupture. KEY WORDS: Induced second trimester abortion - Previous uterine surgery - Uterine rupture. ..... scarred uterus during second trimester misoprostol-induced labour for a missed ...

  14. Explaining Research Utilization Among 4-H Faculty, Staff, and Volunteers: The Role of Self-Efficacy, Learning Goal Orientation, Training, and Previous Experience

    Directory of Open Access Journals (Sweden)

    Julianne Tillman

    2014-06-01

    Full Text Available An investigation of factors that facilitate the utilization of research evidence among faculty, staff, and volunteers in the 4-H Youth Development Program is presented in this paper. Participants (N = 368; 86 4-H faculty, 153 staff, and 129 volunteers) represented 35 states; structural equation modeling was utilized in the analyses. Results of the path analysis explained 56% of variance in research utilization and 28% in research utilization self-efficacy. Among the factors impacting research utilization, self-efficacy played the most important role. In turn, self-efficacy for research utilization was positively influenced by participants’ learning goal orientation, frequency of 4-H training during the last 12 months, education in research-related areas, and investigative career interests. In addition, 4-H staff who were exposed to research at higher levels reported higher research utilization self-efficacy. The findings reinforce the importance of fostering research utilization self-efficacy among 4-H faculty, staff, and volunteers. Among the suggestions presented are regular 4-H training opportunities and on-going exposure to program evaluation and program improvement experiences.

  15. The relationship between phonological codes on memory and spelling tasks for students with and without learning disabilities.

    Science.gov (United States)

    Swanson, H L; Ramalgia, J M

    1992-01-01

    The purpose of the study was to determine the degree to which 31 (23 boys and 8 girls) 13-year-old children with learning disabilities from Grades 7, 8, and 9 were comparable to younger (9-year-old) reading- and spelling-matched controls in (a) phonological similarity effects, (b) phonetically based misspellings, and (c) relationships between memory and spelling performance. Children with reading disabilities and reading-recognition-matched controls, subgrouped by spelling ability, were compared on their memory for phonetically similar and dissimilar word lists and types of spelling errors. The results indicate that children with reading disabilities who are matched to younger children on both reading recognition and spelling ability exhibit normal phonological effects on memory and spelling measures. Within each reading group, low spellers produced more semiphonetic errors than high spellers, and high spellers produced more phonetic errors than low spellers. Significant correlations between memory and spelling error measures were more frequent for children with reading disabilities when compared to controls matched on reading and spelling ability. It was concluded that the phonological performance of reading/spelling-matched children with reading disabilities is characterized by an overreliance on phonological codes, whereas their counterparts' performance reflects independent and less generalizable use of phonological substrates across tasks.

  16. Code of Conduct for wind-power projects - Phases 1 and 2; Code of Conduct fuer windkraftprojekte. Phase 1 und 2 - Systemanalyse, Lessons Learned und Bewertung bestehender Instrumente

    Energy Technology Data Exchange (ETDEWEB)

    Strub, P. [Pierre Strub, freelance consultant, Binningen (Switzerland)]; Ziegler, Ch. [Inter Act, Basel (Switzerland)]

    2008-08-15

    This paper discusses the results of the first two phases of a project concerning wind-power projects. The paper deals with the results of a system analysis, takes a look at lessons learned and presents an appraisal of existing instruments. A system analysis of wind-power projects is presented with emphasis on social factors and the role of stakeholders. The success factors concerning social acceptance of wind-power projects and their special characteristics are discussed. Lessons learned are examined. Instruments for the sustainable implementation of projects are looked at, in particular with a focus on social acceptance.

  17. A mouse model of visual perceptual learning reveals alterations in neuronal coding and dendritic spine density in the visual cortex

    Directory of Open Access Journals (Sweden)

    Yan eWang

    2016-03-01

    Full Text Available Visual perceptual learning (VPL) can improve spatial vision in normally sighted and visually impaired individuals. Although previous studies of humans and large animals have explored the neural basis of VPL, elucidation of the underlying cellular and molecular mechanisms remains a challenge. Owing to the advantages of molecular genetic and optogenetic manipulations, the mouse is a promising model for providing a mechanistic understanding of VPL. Here, we thoroughly evaluated the effects and properties of VPL on spatial vision in C57BL/6J mice using a two-alternative, forced-choice visual water task. Briefly, the mice underwent prolonged training at near the individual threshold of contrast or spatial frequency (SF) for pattern discrimination or visual detection for 35 consecutive days. Following training, the contrast-threshold trained mice showed an 87% improvement in contrast sensitivity (CS) and a 55% gain in visual acuity (VA). Similarly, the SF-threshold trained mice exhibited comparable and long-lasting improvements in VA and significant gains in CS over a wide range of SFs. Furthermore, learning largely transferred across eyes and stimulus orientations. Interestingly, learning could transfer from a pattern discrimination task to a visual detection task, but not vice versa. We validated that this VPL fully restored VA in adult amblyopic mice and old mice. Taken together, these data indicate that mice, as a species, exhibit reliable VPL. Intrinsic signal optical imaging revealed that mice with perceptual training had higher cut-off SFs in primary visual cortex (V1) than those without perceptual training. Moreover, perceptual training induced an increase in the dendritic spine density in layer 2/3 pyramidal neurons of V1. These results indicated functional and structural alterations in V1 during VPL. Overall, our VPL mouse model will provide a platform for investigating the neurobiological basis of VPL.

  18. Supervised Transfer Sparse Coding

    KAUST Repository

    Al-Shedivat, Maruan

    2014-07-27

    A combination of the sparse coding and transfer learning techniques was shown to be accurate and robust in classification tasks where training and testing objects have a shared feature space but are sampled from different underlying distributions, i.e., belong to different domains. The key assumption in such case is that in spite of the domain disparity, samples from different domains share some common hidden factors. Previous methods often assumed that all the objects in the target domain are unlabeled, and thus the training set solely comprised objects from the source domain. However, in real world applications, the target domain often has some labeled objects, or one can always manually label a small number of them. In this paper, we explore such possibility and show how a small number of labeled data in the target domain can significantly leverage classification accuracy of the state-of-the-art transfer sparse coding methods. We further propose a unified framework named supervised transfer sparse coding (STSC) which simultaneously optimizes sparse representation, domain transfer and classification. Experimental results on three applications demonstrate that a little manual labeling and then learning the model in a supervised fashion can significantly improve classification accuracy.

  19. Laparoscopy After Previous Laparotomy

    Directory of Open Access Journals (Sweden)

    Zulfo Godinjak

    2006-11-01

    Full Text Available Following abdominal surgery, extensive adhesions often occur and can cause difficulties during laparoscopic operations. However, previous laparotomy is not considered a contraindication for laparoscopy. The aim of this study is to show that insertion of a Veres needle in the region of the umbilicus is a safe method for creating a pneumoperitoneum for laparoscopic operations after previous laparotomy. In the last three years, we have performed 144 laparoscopic operations in patients who had previously undergone one or two laparotomies. Pathology of the digestive system, genital organs, Cesarean section or abdominal war injuries were the most common causes of previous laparotomy. During these operations, or while entering the abdominal cavity, we did not experience any complications, while in 7 patients we performed conversion to laparotomy following the diagnostic laparoscopy. In all patients the Veres needle and trocar were inserted in the umbilical region, i.e. the technique of closed laparoscopy. In no patient were adhesions found in the region of the umbilicus, and no abdominal organs were injured.

  20. Code-switching as a communication, learning, and social negotiation strategy in first-year learners of Danish

    DEFF Research Database (Denmark)

    Arnfast, Juni Søderberg; Jørgensen, Jens Normann

    2003-01-01

    the borderline between the two concepts by investigating the use of code-switching in first-year learners of Danish. Our study points out that code-switching appears as a skill used in early attempts at playing with the languages involved in the conversation (Danish/English and Danish/Polish/German/English). Thus...... we have to acknowledge code-switching as an increasingly sophisticated language skill even at an early stage of SLA....

  1. The International Code of Marketing of Breast-milk Substitutes: lessons learned and implications for the regulation of marketing of foods and beverages to children.

    Science.gov (United States)

    Lutter, Chessa K

    2013-10-01

    To identify lessons learned from 30 years of implementing the International Code of Marketing of Breast-milk Substitutes (‘the Code’) and identify lessons learned for the regulation of marketing foods and beverages to children. Historical analysis of 30 years of implementing the Code. Latin America and the Caribbean. None. Legislation to restrict marketing of breast-milk substitutes is necessary but not sufficient; equally important are the promulgation of implementing regulations, effective enforcement and public monitoring of compliance. A system of funding for regular monitoring of compliance with legislation should be explicitly developed and funded from the beginning. Economic sanctions, while important, are likely to be less effective than reports that affect a company’s public image negatively. Non-governmental organizations play a critical role in leveraging public opinion and galvanizing consumer pressure to ensure that governments adopt regulations and companies adhere to them. Continual clinical, epidemiological and policy research showing the link between marketing and health outcomes and between policy and better health is essential. Implementation of the Code has not come easily as it places the interests of underfinanced national governments and international and non-governmental organizations promoting breast-feeding against those of multinational corporations that make hundreds of millions of dollars annually marketing infant formulas. Efforts to protect, promote and support breast-feeding have been successful with indicators of breast-feeding practices increasing globally. The lessons learned can inform current efforts to regulate the marketing of foods and beverages to children.

  2. Expression of c-Fos in the rat retrosplenial cortex during instrumental re-learning of appetitive bar-pressing depends on the number of stages of previous training

    Directory of Open Access Journals (Sweden)

    Olga E. Svarnik

    2013-07-01

    Full Text Available Learning is known to be accompanied by induction of c-Fos expression in cortical neurons. However, not all neurons are involved in this process. What the c-Fos expression pattern depends on is still unknown. In the present work we studied whether, and to what degree, the animals' previous experience with Task 1 influenced neuronal c-Fos expression in the retrosplenial cortex during acquisition of Task 2. Animals were progressively shaped across days to bar-press for food at the left side of the experimental chamber (Task 1). This appetitive bar-pressing behavior was shaped in nine stages ("9 stages" group), five stages ("5 stages" group) or one intermediate stage ("1 stage" group). After all animals had acquired the first skill and practiced it for five days, the bar and feeder on the left, familiar side of the chamber were inactivated, and the animals were allowed to learn a similar instrumental task at the opposite side of the chamber using another bar and feeder (Task 2). The highest number of c-Fos positive neurons was found in the retrosplenial cortex of the "1 stage" animals as compared to the other groups. The number of c-Fos positive neurons in the "5 stages" group was significantly lower than in the "1 stage" animals and significantly higher than in the "9 stages" animals. The number of c-Fos positive neurons in the cortex of the "9 stages" animals was significantly higher than in home-caged control animals. At the same time, there were no significant differences between groups in such behavioral variables as the number of entries into the feeder or bar zones during Task 2 learning. Our results suggest that c-Fos expression in the retrosplenial cortex during Task 2 acquisition was influenced by the previous learning history.

  3. An Extension to the Constructivist Coding Hypothesis as a Learning Model for Selective Feedback when the Base Rate Is High

    Science.gov (United States)

    Ghaffarzadegan, Navid; Stewart, Thomas R.

    2011-01-01

    Elwin, Juslin, Olsson, and Enkvist (2007) and Henriksson, Elwin, and Juslin (2010) offered the constructivist coding hypothesis to describe how people code the outcomes of their decisions when availability of feedback is conditional on the decision. They provided empirical evidence only for the 0.5 base rate condition. This commentary argues that…

  4. Optical coding theory with Prime

    CERN Document Server

    Kwong, Wing C

    2013-01-01

    Although several books cover the coding theory of wireless communications and the hardware technologies and coding techniques of optical CDMA, no book has been specifically dedicated to optical coding theory-until now. Written by renowned authorities in the field, Optical Coding Theory with Prime gathers together in one volume the fundamentals and developments of optical coding theory, with a focus on families of prime codes, supplemented with several families of non-prime codes. The book also explores potential applications to coding-based optical systems and networks. Learn How to Construct

  5. Classification of multispectral or hyperspectral satellite imagery using clustering of sparse approximations on sparse representations in learned dictionaries obtained using efficient convolutional sparse coding

    Energy Technology Data Exchange (ETDEWEB)

    Moody, Daniela; Wohlberg, Brendt

    2018-01-02

    An approach for land cover classification, seasonal and yearly change detection and monitoring, and identification of changes in man-made features may use a clustering of sparse approximations (CoSA) on sparse representations in learned dictionaries. The learned dictionaries may be derived using efficient convolutional sparse coding to build multispectral or hyperspectral, multiresolution dictionaries that are adapted to regional satellite image data. Sparse image representations of images over the learned dictionaries may be used to perform unsupervised k-means clustering into land cover categories. The clustering process behaves as a classifier in detecting real variability. This approach may combine spectral and spatial textural characteristics to detect geologic, vegetative, hydrologic, and man-made features, as well as changes in these features over time.
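
    A rough sketch of the pipeline described above, assuming scikit-learn: patch-level dictionary learning, sparse coding of the patches, then unsupervised k-means clustering of the codes into land-cover categories. The convolutional, multiresolution dictionaries of the record are not reproduced, and the patch size, band count, dictionary size and cluster count are arbitrary assumptions.

```python
# Minimal CoSA-style sketch (not the patented pipeline): dictionary learning +
# sparse coding with scikit-learn, then k-means clustering of the sparse codes.
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
patches = rng.random((5000, 8 * 8 * 4))   # hypothetical 8x8 patches with 4 spectral bands

dico = MiniBatchDictionaryLearning(n_components=64, alpha=1.0,
                                   transform_algorithm="omp",
                                   transform_n_nonzero_coefs=5, random_state=0)
codes = dico.fit(patches).transform(patches)          # sparse representations of the patches

land_cover = KMeans(n_clusters=6, random_state=0).fit_predict(codes)  # unsupervised categories
print(np.bincount(land_cover))                        # patches per cluster
```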

  6. QR codes for dummies

    CERN Document Server

    Waters, Joe

    2012-01-01

    Find out how to effectively create, use, and track QR codes QR (Quick Response) codes are popping up everywhere, and businesses are reaping the rewards. Get in on the action with the no-nonsense advice in this streamlined, portable guide. You'll find out how to get started, plan your strategy, and actually create the codes. Then you'll learn to link codes to mobile-friendly content, track your results, and develop ways to give your customers value that will keep them coming back. It's all presented in the straightforward style you've come to know and love, with a dash of humor thrown

  7. Supervised Convolutional Sparse Coding

    KAUST Repository

    Affara, Lama Ahmed

    2018-04-08

    Convolutional Sparse Coding (CSC) is a well-established image representation model especially suited for image restoration tasks. In this work, we extend the applicability of this model by proposing a supervised approach to convolutional sparse coding, which aims at learning discriminative dictionaries instead of purely reconstructive ones. We incorporate a supervised regularization term into the traditional unsupervised CSC objective to encourage the final dictionary elements to be discriminative. Experimental results show that using supervised convolutional learning results in two key advantages. First, we learn more semantically relevant filters in the dictionary and second, we achieve improved image reconstruction on unseen data.
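
    A minimal sketch of the general shape of such an objective, assuming a plain (non-convolutional) dictionary and a multinomial logistic loss as the supervised term; the exact regularizer and optimization of the cited work are not reproduced.

```python
# Sketch of a supervised sparse-coding objective: reconstruction error + sparsity
# penalty + a classification loss on the codes. Illustrative only.
import numpy as np

def objective(X, D, A, W, y, lam=0.1, gamma=0.5):
    """X: data (d x n), D: dictionary (d x k), A: codes (k x n),
    W: classifier weights (c x k), y: integer labels (n,)."""
    recon = 0.5 * np.linalg.norm(X - D @ A, "fro") ** 2
    sparsity = lam * np.abs(A).sum()
    scores = W @ A                                    # class scores per sample
    scores = scores - scores.max(axis=0, keepdims=True)   # numerical stability
    logp = scores - np.log(np.exp(scores).sum(axis=0, keepdims=True))
    classif = -gamma * logp[y, np.arange(len(y))].sum()   # multinomial logistic loss
    return recon + sparsity + classif
```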

  8. A machine-learning approach to coding book reviews as quality indicators: Toward a theory of megacitation

    NARCIS (Netherlands)

    Zuccala, A.; van Someren, M.; van Bellen, M.

    2014-01-01

    A theory of “megacitation” is introduced and used in an experiment to demonstrate how a qualitative scholarly book review can be converted into a weighted bibliometric indicator. We employ a manual human-coding approach to classify book reviews in the field of history based on reviewers' assessments

  9. Coding Partitions

    Directory of Open Access Journals (Sweden)

    Fabio Burderi

    2007-05-01

    Full Text Available Motivated by the study of decipherability conditions for codes weaker than Unique Decipherability (UD), we introduce the notion of coding partition. Such a notion generalizes that of UD code and, for codes that are not UD, allows one to recover the ``unique decipherability" at the level of the classes of the partition. By taking into account the natural order between the partitions, we define the characteristic partition of a code X as the finest coding partition of X. This leads us to introduce the canonical decomposition of a code into at most one unambiguous component and other (if any) totally ambiguous components. In the case where the code is finite, we give an algorithm for computing its canonical partition. This, in particular, makes it possible to decide whether a given partition of a finite code X is a coding partition. This last problem is then approached in the case where the code is a rational set. We prove its decidability under the hypothesis that the partition contains a finite number of classes and each class is a rational set. Moreover, we conjecture that the canonical partition satisfies such a hypothesis. Finally, we also consider some relationships between coding partitions and varieties of codes.
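
    Since the notion builds on unique decipherability, a compact point of reference is the classical Sardinas-Patterson test sketched below, which decides whether a given finite code is UD; the two example codes are illustrative.

```python
# Sardinas-Patterson test: a finite code is uniquely decipherable (UD) iff no
# set of "dangling suffixes" generated from it ever contains a codeword.
def is_uniquely_decipherable(code):
    code = set(code)

    def dangling(prefixes, words):
        # suffixes left over when an element of `prefixes` is a proper prefix of a word
        return {w[len(p):] for p in prefixes for w in words
                if w.startswith(p) and len(w) > len(p)}

    current, seen = dangling(code, code), set()
    while current:
        if current & code:          # a dangling suffix equals a codeword: ambiguity exists
            return False
        if current <= seen:         # no new suffixes can ever appear: the code is UD
            return True
        seen |= current
        current = dangling(code, current) | dangling(current, code)
    return True

print(is_uniquely_decipherable({"0", "01", "11"}))                       # True
print(is_uniquely_decipherable({"1", "011", "01110", "1110", "10011"}))  # False
```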

  10. What Is the Moral Imperative of Workplace Learning: Unlocking the DaVinci Code of Human Resource Development?

    Science.gov (United States)

    Short, Tom

    2006-01-01

    In the course of the author's doctoral study, he is exploring the strategic linkages between learning activities in the modern workplace and the long-term success they bring to organisations. For many years, this challenge has been the Holy Grail of human resource (HR) development practitioners, who invest heavily on training and professional…

  11. Variation of the gene coding for DARPP-32 (PPP1R1B) and brain connectivity during associative emotional learning

    NARCIS (Netherlands)

    Curcic-Blake, Branislava; Swart, Marte; Ter Horst, Gert J.; Langers, Dave R. M.; Kema, Ido P.; Aleman, Andre

    2012-01-01

    Associative emotional learning, which is important for the social emotional functioning of individuals and is often impaired in psychiatric illnesses, is in part mediated by dopamine and glutamate pathways in the brain. The protein DARPP-32 is involved in the regulation of dopaminergic and

  12. Coding Class

    DEFF Research Database (Denmark)

    Ejsing-Duun, Stine; Hansbøl, Mikala

    Summary of the key points from the main report, Dokumentation og evaluering af Coding Class (Documentation and Evaluation of Coding Class).

  13. Decomposing experience-driven attention: opposite attentional effects of previously predictive cues

    Science.gov (United States)

    Lin, Zhicheng; Lu, Zhong-Lin; He, Sheng

    2016-01-01

    A central function of the brain is to track the dynamic statistical regularities in the environment, such as what predicts what over time. How does this statistical learning process alter sensory and attentional processes? Drawing upon animal conditioning and predictive coding, we developed a learning procedure that revealed two distinct components through which prior learning experience controls attention. During learning, a visual search task was used in which the target randomly appeared at one of several locations but always inside an enclosure of a particular color; the learned color served to direct attention to the target location. During test, the color no longer predicted the target location. When the same search task was used in the subsequent test, we found that the learned color continued to attract attention despite the behavior being counterproductive for the task and despite the presence of a completely predictive cue. However, when tested with a flanker task that had minimal location uncertainty (the target was at fixation, surrounded by a distractor), participants were better at ignoring distractors in the learned color than in other colors. Evidently, previously predictive cues capture attention in the same search task but can be better suppressed in a flanker task. These results demonstrate opposing components, capture and inhibition, in experience-driven attention, with their manifestations crucially dependent on task context. We conclude that associative learning enhances context-sensitive top-down modulation while reducing bottom-up sensory drive and facilitating suppression, supporting a learning-based predictive coding account. PMID:27068051

  14. An algebraic approach to graph codes

    DEFF Research Database (Denmark)

    Pinero, Fernando

    theory as evaluation codes. Chapter three consists of the introduction to graph based codes, such as Tanner codes and graph codes. In Chapter four, we compute the dimension of some graph based codes with a result combining graph based codes and subfield subcodes. Moreover, some codes in chapter four...... are optimal or best known for their parameters. In chapter five we study some graph codes with Reed–Solomon component codes. The underlying graph is well known and widely used for its good characteristics. This helps us to compute the dimension of the graph codes. We also introduce a combinatorial concept...... related to the iterative encoding of graph codes with MDS component code. The last chapter deals with affine Grassmann codes and Grassmann codes. We begin with some previously known codes and prove that they are also Tanner codes of the incidence graph of the point–line partial geometry...

  15. Learning Heroku Postgres

    CERN Document Server

    Espake, Patrick

    2015-01-01

    Learning Heroku Postgres is targeted at developers and database admins. Even if you're new to Heroku Postgres, you'll be able to master both its basic and its advanced features. Since Heroku Postgres is incredibly user-friendly, no previous experience in computer coding or programming is required.

  16. Greater striatopallidal adaptive coding during cue-reward learning and food reward habituation predict future weight gain.

    Science.gov (United States)

    Burger, Kyle S; Stice, Eric

    2014-10-01

    Animal experiments indicate that after repeated pairings of palatable food receipt and cues that predict palatable food receipt, dopamine signaling increases in response to predictive cues, but decreases in response to food receipt. Using functional MRI and mixed effects growth curve models with 35 females (M age=15.5±0.9; M BMI=24.5±5.4), we documented an increase in BOLD response in the caudate (r=.42) during exposure to cues predicting impending milkshake receipt over repeated exposures, demonstrating a direct measure of in vivo cue-reward learning in humans. Further, we observed a simultaneous decrease in putamen (r=-.33) and ventral pallidum (r=-.45) response during milkshake receipt over repeated exposures, putatively reflecting food reward habituation. We then tested whether cue-reward learning and habituation slopes predicted future weight over 2-year follow-up. Those who exhibited the greatest escalation in ventral pallidum responsivity to cues and the greatest decrease in caudate response to milkshake receipt showed significantly larger increases in BMI (r=.39 and -.69 respectively). Interestingly, cue-reward learning propensity and food reward habituation were not correlated, implying that these factors may constitute qualitatively distinct vulnerability pathways to excess weight gain. These two individual difference factors may provide insight as to why certain people have shown obesity onset in response to the current obesogenic environment in western cultures, whereas others have not. Copyright © 2014 Elsevier Inc. All rights reserved.

  17. The Languages of Neurons: An Analysis of Coding Mechanisms by Which Neurons Communicate, Learn and Store Information

    Directory of Open Access Journals (Sweden)

    Morris H. Baslow

    2009-11-01

    Full Text Available In this paper evidence is provided that individual neurons possess language, and that the basic unit for communication consists of two neurons and their entire field of interacting dendritic and synaptic connections. While information processing in the brain is highly complex, each neuron uses a simple mechanism for transmitting information. This is in the form of temporal electrophysiological action potentials or spikes (S) operating on a millisecond timescale that, along with pauses (P) between spikes, constitute a two-letter “alphabet” that generates meaningful frequency-encoded signals or neuronal S/P “words” in a primary language. However, when a word from an afferent neuron enters the dendritic-synaptic-dendritic field between two neurons, it is translated into a new frequency-encoded word with the same meaning, but in a different spike-pause language, that is delivered to and understood by the efferent neuron. It is suggested that this unidirectional inter-neuronal language-based word translation step is of utmost importance to brain function in that it allows for variations in meaning to occur. Thus, structural or biochemical changes in dendrites or synapses can produce novel words in the second language that have changed meanings, allowing for a specific signaling experience, either external or internal, to modify the meaning of an original word (learning), and store the learned information of that experience (memory) in the form of an altered dendritic-synaptic-dendritic field.

  18. Network Coding

    Indian Academy of Sciences (India)

    message symbols downstream, network coding achieves vast performance gains by permitting intermediate nodes to carry out algebraic operations on the incoming data. In this article we present a tutorial introduction to network coding as well as an application to the efficient operation of distributed data-storage networks.
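
    The standard textbook illustration of such algebraic operations is the two-source butterfly network, sketched here with single bits and XOR as the coding operation at the bottleneck node.

```python
# Butterfly-network illustration: the bottleneck node forwards a XOR b instead of
# queuing a and b separately, and both sinks still recover both source bits.
a, b = 1, 0                  # source bits, each wanted by both sinks
coded = a ^ b                # algebraic operation performed at the intermediate node
sink1 = (a, coded ^ a)       # sink 1 hears a directly plus the coded bit
sink2 = (coded ^ b, b)       # sink 2 hears b directly plus the coded bit
assert sink1 == (a, b) and sink2 == (a, b)
print("both sinks decoded:", sink1, sink2)
```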

  19. Semi-supervised sparse coding

    KAUST Repository

    Wang, Jim Jing-Yan

    2014-07-06

    Sparse coding approximates a data sample as a sparse linear combination of some basic codewords and uses the sparse codes as new representations. In this paper, we investigate learning discriminative sparse codes by sparse coding in a semi-supervised manner, where only a few training samples are labeled. By using the manifold structure spanned by the data set of both labeled and unlabeled samples and the constraints provided by the labels of the labeled samples, we learn the variable class labels for all the samples. Furthermore, to improve the discriminative ability of the learned sparse codes, we assume that the class labels can be predicted from the sparse codes directly using a linear classifier. By solving for the codebook, sparse codes, class labels and classifier parameters simultaneously in a unified objective function, we develop a semi-supervised sparse coding algorithm. Experiments on two real-world pattern recognition problems demonstrate the advantage of the proposed methods over supervised sparse coding methods on partially labeled data sets.
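
    The method above solves for the codebook, codes, labels and classifier jointly; the sketch below is only a simplified two-stage stand-in using scikit-learn (codes first, then a linear classifier on the few labeled codes), with arbitrary data shapes and label split.

```python
# Simplified stand-in: unsupervised sparse coding followed by a linear classifier
# trained on the labeled subset of the codes (not the joint objective of the paper).
import numpy as np
from sklearn.decomposition import DictionaryLearning
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
X = rng.random((200, 30))                      # 200 samples; only the first 20 are labeled
y_labeled = np.r_[np.zeros(10), np.ones(10)]

codes = DictionaryLearning(n_components=20, alpha=1.0,
                           random_state=0).fit(X).transform(X)
clf = LogisticRegression().fit(codes[:20], y_labeled)   # linear classifier on sparse codes
predicted = clf.predict(codes[20:])                      # labels inferred for unlabeled samples
print(predicted[:10])
```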

  20. Striatal and Tegmental Neurons Code Critical Signals for Temporal-Difference Learning of State Value in Domestic Chicks

    Directory of Open Access Journals (Sweden)

    Chentao Wen

    2016-11-01

    Full Text Available To ensure survival, animals must update the internal representations of their environment in a trial-and-error fashion. Psychological studies of associative learning and neurophysiological analyses of dopaminergic neurons have suggested that this updating process involves the temporal-difference (TD method in the basal ganglia network. However, the way in which the component variables of the TD method are implemented at the neuronal level is unclear. To investigate the underlying neural mechanisms, we trained domestic chicks to associate color cues with food rewards. We recorded neuronal activities from the medial striatum or tegmentum in a freely behaving condition and examined how reward omission changed neuronal firing. To compare neuronal activities with the signals assumed in the TD method, we simulated the behavioral task in the form of a finite sequence composed of discrete steps of time. The three signals assumed in the simulated task were the prediction signal, the target signal for updating, and the TD-error signal. In both the medial striatum and tegmentum, the majority of recorded neurons were categorized into three types according to their fitness for three models, though these neurons tended to form a continuum spectrum without distinct differences in the firing rate. Specifically, two types of striatal neurons successfully mimicked the target signal and the prediction signal. A linear summation of these two types of striatum neurons was a good fit for the activity of one type of tegmental neurons mimicking the TD-error signal. The present study thus demonstrates that the striatum and tegmentum can convey the signals critically required for the TD method. Based on the theoretical and neurophysiological studies, together with tract-tracing data, we propose a novel model to explain how the convergence of signals represented in the striatum could lead to the computation of TD error in tegmental dopaminergic neurons.

  1. Striatal and Tegmental Neurons Code Critical Signals for Temporal-Difference Learning of State Value in Domestic Chicks.

    Science.gov (United States)

    Wen, Chentao; Ogura, Yukiko; Matsushima, Toshiya

    2016-01-01

    To ensure survival, animals must update the internal representations of their environment in a trial-and-error fashion. Psychological studies of associative learning and neurophysiological analyses of dopaminergic neurons have suggested that this updating process involves the temporal-difference (TD) method in the basal ganglia network. However, the way in which the component variables of the TD method are implemented at the neuronal level is unclear. To investigate the underlying neural mechanisms, we trained domestic chicks to associate color cues with food rewards. We recorded neuronal activities from the medial striatum or tegmentum in a freely behaving condition and examined how reward omission changed neuronal firing. To compare neuronal activities with the signals assumed in the TD method, we simulated the behavioral task in the form of a finite sequence composed of discrete steps of time. The three signals assumed in the simulated task were the prediction signal, the target signal for updating, and the TD-error signal. In both the medial striatum and tegmentum, the majority of recorded neurons were categorized into three types according to their fitness for three models, though these neurons tended to form a continuum spectrum without distinct differences in the firing rate. Specifically, two types of striatal neurons successfully mimicked the target signal and the prediction signal. A linear summation of these two types of striatum neurons was a good fit for the activity of one type of tegmental neurons mimicking the TD-error signal. The present study thus demonstrates that the striatum and tegmentum can convey the signals critically required for the TD method. Based on the theoretical and neurophysiological studies, together with tract-tracing data, we propose a novel model to explain how the convergence of signals represented in the striatum could lead to the computation of TD error in tegmental dopaminergic neurons.
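
    For orientation, the three signals named above are exactly the quantities appearing in the tabular TD(0) update sketched below; the cue-delay-food state chain and the 20% omission rate are hypothetical stand-ins for the behavioral task.

```python
# Tabular TD(0) sketch: V[s] is the prediction signal, r + gamma*V[s'] the target
# signal for updating, and their difference the TD-error signal.
import random

states = ["cue", "delay", "food"]
V = {s: 0.0 for s in states}
alpha, gamma = 0.1, 0.9

for episode in range(500):
    omit = random.random() < 0.2                 # occasional reward omission
    for s, s_next in zip(states, states[1:] + [None]):
        r = (0.0 if omit else 1.0) if s_next == "food" else 0.0
        v_next = V[s_next] if s_next is not None else 0.0
        target = r + gamma * v_next              # target signal
        td_error = target - V[s]                 # TD-error signal
        V[s] += alpha * td_error                 # update the prediction signal
print({s: round(v, 2) for s, v in V.items()})    # cue and delay come to predict the reward
```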

  2. Coding Class

    DEFF Research Database (Denmark)

    Ejsing-Duun, Stine; Hansbøl, Mikala

    This report contains the evaluation and documentation of the Coding Class project. The Coding Class project was launched in the 2016/2017 school year by IT-Branchen in collaboration with a number of member companies, the City of Copenhagen, Vejle Municipality, the Danish Agency for IT and Learning (STIL) and the volunteer association......, design thinking and design pedagogy, Stine Ejsing-Duun from Forskningslab: It og Læringsdesign (ILD-LAB) at the Department of Communication and Psychology, Aalborg University in Copenhagen. We followed and carried out the evaluation and documentation of the Coding Class project in the period from November 2016 to May 2017.... The Coding Class project is a pilot project in which a number of schools in the municipalities of Copenhagen and Vejle have launched teaching activities focusing on coding and programming in school. The evaluation and documentation of the project comprise qualitative case studies of selected teaching interventions in the autumn of...

  3. Coding Labour

    Directory of Open Access Journals (Sweden)

    Anthony McCosker

    2014-03-01

    Full Text Available As well as introducing the Coding Labour section, the authors explore the diffusion of code across the material contexts of everyday life, through the objects and tools of mediation, the systems and practices of cultural production and organisational management, and in the material conditions of labour. Taking code beyond computation and software, their specific focus is on the increasingly familiar connections between code and labour with a focus on the codification and modulation of affect through technologies and practices of management within the contemporary work organisation. In the grey literature of spreadsheets, minutes, workload models, email and the like they identify a violence of forms through which workplace affect, in its constant flux of crisis and ‘prodromal’ modes, is regulated and governed.

  4. 2 CFR 1.215 - Relationship to previous issuances.

    Science.gov (United States)

    2010-01-01

    ... 2 Grants and Agreements 1 2010-01-01 2010-01-01 false Relationship to previous issuances. 1.215 Section 1.215 Grants and Agreements ABOUT TITLE 2 OF THE CODE OF FEDERAL REGULATIONS AND SUBTITLE A Introduction to Subtitle A § 1.215 Relationship to previous issuances. Although some of the guidance was...

  5. Using Coding Apps to Support Literacy Instruction and Develop Coding Literacy

    Science.gov (United States)

    Hutchison, Amy; Nadolny, Larysa; Estapa, Anne

    2016-01-01

    In this article the authors present the concept of Coding Literacy and describe the ways in which coding apps can support the development of Coding Literacy and disciplinary and digital literacy skills. Through detailed examples, we describe how coding apps can be integrated into literacy instruction to support learning of the Common Core English…

  6. Speech coding

    Energy Technology Data Exchange (ETDEWEB)

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

    Speech is the predominant means of communication between human beings and, since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage of the speech signal getting corrupted by noise, cross-talk and distortion. Long-haul transmissions, which use repeaters to compensate for the loss in signal strength on transmission links, also increase the associated noise and distortion. On the other hand, digital transmission is relatively immune to noise, cross-talk and distortion, primarily because of the capability to faithfully regenerate the digital signal at each repeater purely on the basis of a binary decision. Hence the end-to-end performance of the digital link essentially becomes independent of the length and operating frequency bands of the link. From a transmission point of view, digital transmission has therefore been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service provision point of view as well. Modern requirements have introduced the need for robust, flexible and secure services that can carry a multitude of signal types (such as voice, data and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term Speech Coding usually refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters obtained by analyzing the speech signal. In either case, the codes are transmitted to the distant end where speech is reconstructed or synthesized using the received set of codes. A more generic term for these techniques, often used interchangeably with speech coding, is voice coding. This term is more generic in the sense that the
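
    As one concrete, classical ingredient of coding speech digitally as a waveform, the sketch below applies an idealized (non-segmented) form of the mu-law companding used in G.711 PCM, with the conventional mu = 255 and 8-bit codes.

```python
# Mu-law companding: compress amplitude before uniform quantization so that quiet
# speech gets finer quantization steps than loud speech.
import numpy as np

def mu_law_encode(x, mu=255.0, bits=8):
    """x in [-1, 1] -> integer codes in [0, 2**bits - 1]."""
    y = np.sign(x) * np.log1p(mu * np.abs(x)) / np.log1p(mu)
    return np.round((y + 1) / 2 * (2**bits - 1)).astype(int)

def mu_law_decode(codes, mu=255.0, bits=8):
    y = codes / (2**bits - 1) * 2 - 1
    return np.sign(y) * ((1 + mu) ** np.abs(y) - 1) / mu

samples = np.array([-0.9, -0.05, 0.0, 0.02, 0.8])
print(mu_law_decode(mu_law_encode(samples)))     # close to the original samples
```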

  7. NSURE code

    International Nuclear Information System (INIS)

    Rattan, D.S.

    1993-11-01

    NSURE stands for Near-Surface Repository code. NSURE is a performance assessment code. developed for the safety assessment of near-surface disposal facilities for low-level radioactive waste (LLRW). Part one of this report documents the NSURE model, governing equations and formulation of the mathematical models, and their implementation under the SYVAC3 executive. The NSURE model simulates the release of nuclides from an engineered vault, their subsequent transport via the groundwater and surface water pathways tot he biosphere, and predicts the resulting dose rate to a critical individual. Part two of this report consists of a User's manual, describing simulation procedures, input data preparation, output and example test cases

  8. Code quality issues in student programs

    NARCIS (Netherlands)

    Keuning, H.W.; Heeren, B.J.; Jeuring, J.T.

    2017-01-01

    Because low quality code can cause serious problems in software systems, students learning to program should pay attention to code quality early. Although many studies have investigated mistakes that students make during programming, we do not know much about the quality of their code. This study

  9. The Aster code; Code Aster

    Energy Technology Data Exchange (ETDEWEB)

    Delbecq, J.M

    1999-07-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R and D direction of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (materials behaviour, big deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures), specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results etc..); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  10. ANIMAL code

    Energy Technology Data Exchange (ETDEWEB)

    Lindemuth, I.R.

    1979-02-28

    This report describes ANIMAL, a two-dimensional Eulerian magnetohydrodynamic computer code. ANIMAL's physical model also appears. Formulated are temporal and spatial finite-difference equations in a manner that facilitates implementation of the algorithm. Outlined are the functions of the algorithm's FORTRAN subroutines and variables.

  11. Network Coding

    Indian Academy of Sciences (India)

    Network Coding. K V Rashmi, Nihar B Shah, P Vijay Kumar. General Article, Resonance – Journal of Science Education, Volume 15, Issue 7, July 2010, pp. 604-621. Permanent link: https://www.ias.ac.in/article/fulltext/reso/015/07/0604-0621

  12. ANIMAL code

    International Nuclear Information System (INIS)

    Lindemuth, I.R.

    1979-01-01

    This report describes ANIMAL, a two-dimensional Eulerian magnetohydrodynamic computer code. ANIMAL's physical model also appears. Formulated are temporal and spatial finite-difference equations in a manner that facilitates implementation of the algorithm. Outlined are the functions of the algorithm's FORTRAN subroutines and variables

  13. Expander Codes

    Indian Academy of Sciences (India)

    Codes and Channels. A noisy communication channel is illustrated in Fig- ... nication channel. Suppose we want to transmit a message over the unreliable communication channel so that even if the channel corrupts some of the bits we are able to recover ..... is d-regular, meaning thereby that every vertex has degree d.

  14. Expander Codes

    Indian Academy of Sciences (India)

    Expander Codes - The Sipser–Spielman Construction. Priti Shankar. General Article, Resonance – Journal of Science Education, Volume 10, Issue 1. Author affiliation: Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India.

  15. Network Coding

    Indian Academy of Sciences (India)

    Network coding is a technique to increase the amount of information flow in a network by making the key observation that information flow is fundamentally different from commodity flow. Whereas, under traditional methods of operation of data networks, intermediate nodes are restricted to simply forwarding their incoming.

  16. Research and Trends in the Field of Technology-Enhanced Learning from 2006 to 2011: A Content Analysis of Quick Response Code (QR-Code) and Its Application in Selected Studies

    Science.gov (United States)

    Hau, Goh Bak; Siraj, Saedah; Alias, Norlidah; Rauf, Rose Amnah Abd.; Zakaria, Abd. Razak; Darusalam, Ghazali

    2013-01-01

    This study provides a content analysis of selected articles in the field of QR code and its application in educational context that were published in journals and proceedings of international conferences and workshops from 2006 to 2011. These articles were cross analysed by published years, journal, and research topics. Further analysis was…

  17. Computer codes in nuclear safety, radiation transport and dosimetry

    International Nuclear Information System (INIS)

    Bordy, J.M.; Kodeli, I.; Menard, St.; Bouchet, J.L.; Renard, F.; Martin, E.; Blazy, L.; Voros, S.; Bochud, F.; Laedermann, J.P.; Beaugelin, K.; Makovicka, L.; Quiot, A.; Vermeersch, F.; Roche, H.; Perrin, M.C.; Laye, F.; Bardies, M.; Struelens, L.; Vanhavere, F.; Gschwind, R.; Fernandez, F.; Quesne, B.; Fritsch, P.; Lamart, St.; Crovisier, Ph.; Leservot, A.; Antoni, R.; Huet, Ch.; Thiam, Ch.; Donadille, L.; Monfort, M.; Diop, Ch.; Ricard, M.

    2006-01-01

    The purpose of this conference was to describe the present state of computer codes dedicated to radiation transport, radiation source assessment or dosimetry. The presentations were divided into 2 sessions: 1) methodology and 2) uses in industrial, medical or research domains. It appears that 2 different calculation strategies are prevailing, both based on preliminary Monte-Carlo calculations with data storage: first, quick simulations made from a database of particle histories built through a previous Monte-Carlo simulation; and secondly, a neuronal approach involving a learning platform generated through a previous Monte-Carlo simulation. This document gathers the slides of the presentations

  18. Panda code

    International Nuclear Information System (INIS)

    Altomare, S.; Minton, G.

    1975-02-01

    PANDA is a new two-group one-dimensional (slab/cylinder) neutron diffusion code designed to replace and extend the FAB series. PANDA allows for the nonlinear effects of xenon, enthalpy and Doppler. Fuel depletion is allowed. PANDA has a completely general search facility which will seek criticality, maximize reactivity, or minimize peaking. Any single parameter may be varied in a search. PANDA is written in FORTRAN IV, and as such is nearly machine independent. However, PANDA has been written with the present limitations of the Westinghouse CDC-6600 system in mind. Most computation loops are very short, and the code is less than half the useful 6600 memory size so that two jobs can reside in the core at once. (auth)

  19. Computer codes in nuclear safety, radiation transport and dosimetry; Les codes de calcul en radioprotection, radiophysique et dosimetrie

    Energy Technology Data Exchange (ETDEWEB)

    Bordy, J.M.; Kodeli, I.; Menard, St.; Bouchet, J.L.; Renard, F.; Martin, E.; Blazy, L.; Voros, S.; Bochud, F.; Laedermann, J.P.; Beaugelin, K.; Makovicka, L.; Quiot, A.; Vermeersch, F.; Roche, H.; Perrin, M.C.; Laye, F.; Bardies, M.; Struelens, L.; Vanhavere, F.; Gschwind, R.; Fernandez, F.; Quesne, B.; Fritsch, P.; Lamart, St.; Crovisier, Ph.; Leservot, A.; Antoni, R.; Huet, Ch.; Thiam, Ch.; Donadille, L.; Monfort, M.; Diop, Ch.; Ricard, M

    2006-07-01

    The purpose of this conference was to describe the present state of computer codes dedicated to radiation transport, radiation source assessment or dosimetry. The presentations were divided into 2 sessions: 1) methodology and 2) uses in industrial, medical or research domains. It appears that 2 different calculation strategies are prevailing, both based on preliminary Monte-Carlo calculations with data storage: first, quick simulations made from a database of particle histories built through a previous Monte-Carlo simulation; and secondly, a neuronal approach involving a learning platform generated through a previous Monte-Carlo simulation. This document gathers the slides of the presentations.

  20. Abiding by codes of ethics and codes of conduct imposed on members of learned and professional geoscience institutions and - a tiresome formality or a win-win for scientific and professional integrity and protection of the public?

    Science.gov (United States)

    Allington, Ruth; Fernandez, Isabel

    2015-04-01

    In 2012, the International Union of Geological Sciences (IUGS) formed the Task Group on Global Geoscience Professionalism ("TG-GGP") to bring together the expanding network of organizations around the world whose primary purpose is self-regulation of geoscience practice. An important part of TG-GGP's mission is to foster a shared understanding of aspects of professionalism relevant to individual scientists and applied practitioners working in one or more sectors of the wider geoscience profession (e.g. research, teaching, industry, geoscience communication and government service). These may be summarised as competence, ethical practice, and professional, technical and scientific accountability. Legal regimes for the oversight of registered or licensed professionals differ around the world and in many jurisdictions there is no registration or licensure with the force of law. However, principles of peer-based self-regulation universally apply. This makes professional geoscience organisations ideal settings within which geoscientists can debate and agree what society should expect of us in the range of roles we fulfil. They can provide the structures needed to best determine what expectations, in the public interest, are appropriate for us collectively to impose on each other. They can also provide the structures for the development of associated procedures necessary to identify and discipline those who do not live up to the expected standards of behaviour established by consensus between peers. Codes of Ethics (sometimes referred to as Codes of Conduct), to which all members of all major professional and/or scientific geoscience organizations are bound (whether or not they are registered or hold professional qualifications awarded by those organisations), incorporate such traditional tenets as: safeguarding the health and safety of the public, scientific integrity, and fairness. Codes also increasingly include obligations concerning welfare of the environment and

  1. Biased ART: a neural architecture that shifts attention toward previously disregarded features following an incorrect prediction.

    Science.gov (United States)

    Carpenter, Gail A; Gaddam, Sai Chaitanya

    2010-04-01

    Memories in Adaptive Resonance Theory (ART) networks are based on matched patterns that focus attention on those portions of bottom-up inputs that match active top-down expectations. While this learning strategy has proved successful for both brain models and applications, computational examples show that attention to early critical features may later distort memory representations during online fast learning. For supervised learning, biased ARTMAP (bARTMAP) solves the problem of over-emphasis on early critical features by directing attention away from previously attended features after the system makes a predictive error. Small-scale, hand-computed analog and binary examples illustrate key model dynamics. Two-dimensional simulation examples demonstrate the evolution of bARTMAP memories as they are learned online. Benchmark simulations show that featural biasing also improves performance on large-scale examples. One example, which predicts movie genres and is based, in part, on the Netflix Prize database, was developed for this project. Both first principles and consistent performance improvements on all simulation studies suggest that featural biasing should be incorporated by default in all ARTMAP systems. Benchmark datasets and bARTMAP code are available from the CNS Technology Lab Website: http://techlab.bu.edu/bART/. Copyright 2009 Elsevier Ltd. All rights reserved.

  2. LiveCode mobile development

    CERN Document Server

    Lavieri, Edward D

    2013-01-01

    A practical guide written in a tutorial style, "LiveCode Mobile Development Hotshot" walks you step-by-step through 10 individual projects. Every project is divided into subtasks to make learning more organized and easy to follow, with explanations, diagrams, screenshots, and downloadable material. This book is great for anyone who wants to develop mobile applications using LiveCode. You should be familiar with LiveCode and have access to a smartphone. You are not expected to know how to create graphics or audio clips.

  3. Low-Rank Sparse Coding for Image Classification

    KAUST Repository

    Zhang, Tianzhu

    2013-12-01

    In this paper, we propose a low-rank sparse coding (LRSC) method that exploits local structure information among features in an image for the purpose of image-level classification. LRSC represents densely sampled SIFT descriptors, in a spatial neighborhood, collectively as low-rank, sparse linear combinations of code words. As such, it casts the feature coding problem as a low-rank matrix learning problem, which is different from previous methods that encode features independently. This LRSC has a number of attractive properties. (1) It encourages sparsity in feature codes, locality in codebook construction, and low-rankness for spatial consistency. (2) LRSC encodes local features jointly by considering their low-rank structure information, and is computationally attractive. We evaluate the LRSC by comparing its performance on a set of challenging benchmarks with that of 7 popular coding and other state-of-the-art methods. Our experiments show that by representing local features jointly, LRSC not only outperforms the state-of-the-art in classification accuracy but also improves the time complexity of methods that use a similar sparse linear representation model for feature coding.

  4. Efficient convolutional sparse coding

    Energy Technology Data Exchange (ETDEWEB)

    Wohlberg, Brendt

    2017-06-20

    Computationally efficient algorithms may be applied for fast dictionary learning solving the convolutional sparse coding problem in the Fourier domain. More specifically, efficient convolutional sparse coding may be derived within an alternating direction method of multipliers (ADMM) framework that utilizes fast Fourier transforms (FFT) to solve the main linear system in the frequency domain. Such algorithms may enable a significant reduction in computational cost over conventional approaches by implementing a linear solver for the most critical and computationally expensive component of the conventional iterative algorithm. The theoretical computational cost of the algorithm may be reduced from O(M^3 N) to O(MN log N), where N is the dimensionality of the data and M is the number of elements in the dictionary. This significant improvement in efficiency may greatly increase the range of problems that can practically be addressed via convolutional sparse representations.
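
    The core of the frequency-domain trick, that the sum of convolutions in the CSC model can be evaluated (and, inside the ADMM solver, inverted) via FFTs, can be checked with a few lines of numpy; the sizes below are arbitrary and the ADMM machinery itself is not shown.

```python
# Check that sum_m d_m (*) x_m (circular convolution) equals an inverse FFT of the
# summed products of the FFTs, the identity the O(MN log N) solver builds on.
import numpy as np

N, M = 256, 8
rng = np.random.default_rng(0)
d = rng.standard_normal((M, N))      # dictionary filters (zero-padded to length N)
x = rng.standard_normal((M, N))      # coefficient maps

recon_fft = np.fft.ifft((np.fft.fft(d, axis=1) * np.fft.fft(x, axis=1)).sum(axis=0)).real

recon_direct = np.zeros(N)           # the same sum by direct circular convolution
for m in range(M):
    for n in range(N):
        recon_direct[n] += np.dot(d[m], x[m][(n - np.arange(N)) % N])

assert np.allclose(recon_fft, recon_direct)
```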

  5. From concatenated codes to graph codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Høholdt, Tom

    2004-01-01

    We consider codes based on simple bipartite expander graphs. These codes may be seen as the first step leading from product type concatenated codes to more complex graph codes. We emphasize constructions of specific codes of realistic lengths, and study the details of decoding by message passing...

  6. Stakeholders' Opinions on the use of Code Switching/ Code Mixing ...

    African Journals Online (AJOL)

    English-Kiswahili code switching is employed intensively in the classrooms by both teachers and learners, as a coping strategy to attain meaningful learning. This practice is not permitted officially in Tanzania, even though it may be the only possible strategy at the moment to move away from the difficulty faced in using English ...

  7. Enforcing the International Code of Marketing of Breast-milk Substitutes for Better Promotion of Exclusive Breastfeeding: Can Lessons Be Learned?

    Science.gov (United States)

    Barennes, Hubert; Slesak, Guenther; Goyet, Sophie; Aaron, Percy; Srour, Leila M

    2016-02-01

    Exclusive breastfeeding, one of the best natural resources, needs protection and promotion. The International Code of Marketing of Breast-milk Substitutes (the Code), which aims to prevent the undermining of breastfeeding by formula advertising, faces implementation challenges. We reviewed frequently overlooked challenges and obstacles that the Code is facing worldwide, but particularly in Southeast Asia. Drawing lessons from various countries where we work, and following the example of successful public health interventions, we discussed legislation, enforcement, and experiences that are needed to successfully implement the Code. Successful holistic approaches that have strengthened the Code need to be scaled up. Community-based actions and peer-to-peer promotions have proved successful. Legislation without stringent enforcement and sufficient penalties is ineffective. The public needs education about the benefits and ways and means to support breastfeeding. It is crucial to combine strong political commitment and leadership with strict national regulations, definitions, and enforcement. National breastfeeding committees, with the authority to improve regulations, investigate violations, and enforce the laws, must be established. Systematic monitoring and reporting are needed to identify companies, individuals, intermediaries, and practices that infringe on the Code. Penalizing violators is crucial. Managers of multinational companies must be held accountable for international violations, and international legislative enforcement needs to be established. Further measures should include improved regulations to protect the breastfeeding mother: large-scale education campaigns; strong penalties for Code violators; exclusion of the formula industry from nutrition, education, and policy roles; supportive legal networks; and independent research of interventions supporting breastfeeding. © The Author(s) 2015.

  8. QR CODES IN EDUCATION AND COMMUNICATION

    Directory of Open Access Journals (Sweden)

    Gurhan DURAK

    2016-04-01

    Full Text Available Technological advances have brought applications of innovations to education. Conventional education increasingly flourishes with new technologies accompanied by more learner-active environments. In this continuum, there are learners preferring self-learning. Traditional learning materials are yielding to attractive, motivating and technologically enhanced learning materials. The QR (Quick Response) Codes are one of these innovations. The aim of this study is to redesign a lesson unit supported with QR Codes and to get the learners' views about the redesigned material. For this purpose, the redesigned lesson unit was delivered to 15 learners at Balıkesir University in the academic year of 2013-2014. The learners were asked to study the material. Learners who had smart phones and Internet access were chosen for the study. To provide sectional diversity, three groups were created. The group members were from the Faculty of Education, the Faculty of Science and Literature and the Faculty of Engineering. In the semi-structured interviews, the learners were asked about their prior knowledge of QR Codes, the QR Codes' contribution to learning, difficulties in using QR Codes, and design issues. Descriptive data analysis was used in the study. The findings were interpreted on the basis of the Theory of Diffusion of Innovations and the Theory of Uses and Gratifications. After the research, the themes found were awareness of QR Codes, types of QR Codes and applications, contributions to learning, and proliferation of QR Codes. Generally, the learners participating in the study reported that they were aware of QR Codes, that they could use them, and that using QR Codes in education was useful. They also expressed that such features as visual elements, attractiveness and direct routing had a positive impact on learning. In addition, they generally mentioned that they did not have any difficulty using QR Codes, that they liked the design, and that the content should
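
    For readers who want to produce similar material, a QR code routing learners to a lesson page can be generated in a couple of lines, for instance with the third-party Python package qrcode (an assumption of this sketch; the URL is a placeholder).

```python
# Generate a QR code image that routes a learner's phone to supplementary content.
# Requires the third-party package: pip install qrcode[pil]
import qrcode

img = qrcode.make("https://example.org/lesson-unit-3")  # placeholder lesson URL
img.save("lesson_unit_3_qr.png")
```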

  9. Code Modernization of VPIC

    Science.gov (United States)

    Bird, Robert; Nystrom, David; Albright, Brian

    2017-10-01

    The ability of scientific simulations to effectively deliver performant computation is increasingly being challenged by successive generations of high-performance computing architectures. Code development to support efficient computation on these modern architectures is both expensive and highly complex; if it is approached without due care, it may also not be directly transferable between subsequent hardware generations. Previous works have discussed techniques to support the process of adapting a legacy code for modern hardware generations, but despite the breakthroughs in the areas of mini-app development, portable performance, and cache-oblivious algorithms the problem still remains largely unsolved. In this work we demonstrate how a focus on platform-agnostic modern code development can be applied to Particle-in-Cell (PIC) simulations to facilitate effective scientific delivery. This work builds directly on our previous work optimizing VPIC, in which we replaced intrinsics-based vectorization with compiler-generated auto-vectorization to improve the performance and portability of VPIC. In this work we present the use of a specialized SIMD queue for processing some particle operations, and also preview a GPU-capable OpenMP variant of VPIC. Finally, we include lessons learned. Work performed under the auspices of the U.S. Dept. of Energy by the Los Alamos National Security, LLC Los Alamos National Laboratory under contract DE-AC52-06NA25396 and supported by the LANL LDRD program.

  10. Consensus Convolutional Sparse Coding

    KAUST Repository

    Choudhury, Biswarup

    2017-04-11

    Convolutional sparse coding (CSC) is a promising direction for unsupervised learning in computer vision. In contrast to recent supervised methods, CSC allows for convolutional image representations to be learned that are equally useful for high-level vision tasks and low-level image reconstruction and can be applied to a wide range of tasks without problem-specific retraining. Due to their extreme memory requirements, however, existing CSC solvers have so far been limited to low-dimensional problems and datasets using a handful of low-resolution example images at a time. In this paper, we propose a new approach to solving CSC as a consensus optimization problem, which lifts these limitations. By learning CSC features from large-scale image datasets for the first time, we achieve significant quality improvements in a number of imaging tasks. Moreover, the proposed method enables new applications in high dimensional feature learning that has been intractable using existing CSC methods. This is demonstrated for a variety of reconstruction problems across diverse problem domains, including 3D multispectral demosaicking and 4D light field view synthesis.

  11. Consensus Convolutional Sparse Coding

    KAUST Repository

    Choudhury, Biswarup

    2017-12-01

    Convolutional sparse coding (CSC) is a promising direction for unsupervised learning in computer vision. In contrast to recent supervised methods, CSC allows for convolutional image representations to be learned that are equally useful for high-level vision tasks and low-level image reconstruction and can be applied to a wide range of tasks without problem-specific retraining. Due to their extreme memory requirements, however, existing CSC solvers have so far been limited to low-dimensional problems and datasets using a handful of low-resolution example images at a time. In this paper, we propose a new approach to solving CSC as a consensus optimization problem, which lifts these limitations. By learning CSC features from large-scale image datasets for the first time, we achieve significant quality improvements in a number of imaging tasks. Moreover, the proposed method enables new applications in high-dimensional feature learning that has been intractable using existing CSC methods. This is demonstrated for a variety of reconstruction problems across diverse problem domains, including 3D multispectral demosaicing and 4D light field view synthesis.

  12. Coding isotropic images

    Science.gov (United States)

    Oneal, J. B., Jr.; Natarajan, T. R.

    1976-01-01

    Rate distortion functions for two-dimensional homogeneous isotropic images are compared with the performance of 5 source encoders designed for such images. Both unweighted and frequency-weighted mean square error distortion measures are considered. The coders considered are differential PCM (DPCM) using six previous samples in the prediction, herein called 6 pel (picture element) DPCM; simple DPCM using single-sample prediction; 6 pel DPCM followed by entropy coding; an 8 x 8 discrete cosine transform coder; and a 4 x 4 Hadamard transform coder. Other transform coders were studied and found to have about the same performance as the two transform coders above. With the mean square error distortion measure, DPCM with entropy coding performed best. The relative performance of the coders changes slightly when the distortion measure is frequency-weighted mean square error. The performance of all the coders was separated by only about 4 dB.
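
    A minimal sketch of the simplest coder in the comparison, single-sample-prediction DPCM, is given below; the step size and the sample row are illustrative.

```python
# Single-sample-prediction DPCM: predict each sample by the previous reconstructed
# sample and transmit only the quantized prediction residual.
def dpcm_encode(samples, step=4):
    prev, codes = 0, []
    for s in samples:
        q = int(round((s - prev) / step))   # quantized prediction residual
        codes.append(q)
        prev = prev + q * step              # track the decoder-side reconstruction
    return codes

def dpcm_decode(codes, step=4):
    prev, out = 0, []
    for q in codes:
        prev = prev + q * step
        out.append(prev)
    return out

row = [100, 102, 107, 110, 109, 95]
print(dpcm_decode(dpcm_encode(row)))        # reconstruction within +/- step/2 of each sample
```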

  13. Automatic coding method of the ACR Code

    International Nuclear Information System (INIS)

    Park, Kwi Ae; Ihm, Jong Sool; Ahn, Woo Hyun; Baik, Seung Kook; Choi, Han Yong; Kim, Bong Gi

    1993-01-01

    The authors developed a computer program for automatic coding of the ACR (American College of Radiology) code. Automatic coding of the ACR code is essential for computerization of the data in the department of radiology. This program was written in the FoxBASE language and has been used for automatic coding of diagnoses in the Department of Radiology, Wallace Memorial Baptist, since May 1992. The ACR dictionary files consisted of 11 files, one for the organ codes and the others for the pathology codes. The organ code was obtained by typing the organ name or the code number itself among the upper- and lower-level codes of the selected one that were simultaneously displayed on the screen. According to the first digit of the selected organ code, the corresponding pathology code file was chosen automatically. In a similar fashion to the organ code selection, the proper pathology code was obtained. An example of an obtained ACR code is '131.3661'. This procedure was reproducible regardless of the number of fields of data. Because this program was written in 'User's Defined Function' form, decoding of the stored ACR code was achieved by the same program, and incorporation of this program into other data processing programs was possible. This program had the merits of simple operation, accurate and detailed coding, and easy adjustment for other programs. Therefore, this program can be used for automation of routine work in the department of radiology
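
    The lookup flow described above might be sketched as follows; the organ and pathology tables are placeholders rather than the actual ACR dictionary files, and the original program was written in FoxBASE, not Python.

```python
# Hypothetical two-step lookup: pick an organ code, then pick a pathology code from
# the file selected by the organ code's first digit, and join them with a dot.
ORGANS = {"131": "example organ entry"}
PATHOLOGY_FILES = {"1": {"3661": "example pathology entry"}}   # keyed by first organ digit

def build_acr_code(organ_code, pathology_code):
    assert organ_code in ORGANS
    assert pathology_code in PATHOLOGY_FILES[organ_code[0]]
    return f"{organ_code}.{pathology_code}"

print(build_acr_code("131", "3661"))   # -> 131.3661, the format of the example above
```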

  14. Fundamentals of convolutional coding

    CERN Document Server

    Johannesson, Rolf

    2015-01-01

    Fundamentals of Convolutional Coding, Second Edition, regarded as a bible of convolutional coding, brings you a clear and comprehensive discussion of the basic principles of this field * Two new chapters on low-density parity-check (LDPC) convolutional codes and iterative coding * Viterbi, BCJR, BEAST, list, and sequential decoding of convolutional codes * Distance properties of convolutional codes * Includes a downloadable solutions manual
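
    As a concrete reference point for the kind of code the book's decoders operate on, the sketch below encodes a bit stream with the familiar rate-1/2, constraint-length-3 convolutional code with generators (7, 5) in octal.

```python
# Rate-1/2 convolutional encoder, constraint length 3, generators g1=7, g2=5 (octal):
# each input bit produces two parity bits computed over the current and two past bits.
def conv_encode(bits, g1=0b111, g2=0b101):
    state, out = 0, []
    for b in bits:
        state = ((state << 1) | b) & 0b111            # 3-bit shift register
        out.append(bin(state & g1).count("1") % 2)    # parity tap for generator 1
        out.append(bin(state & g2).count("1") % 2)    # parity tap for generator 2
    return out

print(conv_encode([1, 0, 1, 1]))   # [1, 1, 1, 0, 0, 0, 0, 1]
```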

  15. The Flutter Shutter Code Calculator

    Directory of Open Access Journals (Sweden)

    Yohann Tendero

    2015-08-01

    Full Text Available The goal of the flutter shutter is to make uniform motion blur invertible, by a "fluttering" shutter that opens and closes on a sequence of well chosen sub-intervals of the exposure time interval. In other words, the photon flux is modulated according to a well chosen sequence called the flutter shutter code. This article provides a numerical method that computes optimal flutter shutter codes in terms of mean square error (MSE). We assume that the observed objects follow a known (or learned) random velocity distribution. In this paper, Gaussian and uniform velocity distributions are considered. Snapshots are also optimized taking the velocity distribution into account. For each velocity distribution, the gain of the optimal flutter shutter code with respect to the optimal snapshot in terms of MSE is computed. This symmetric optimization of the flutter shutter and of the snapshot allows one to compare both solutions, i.e. camera designs, on an equal footing. Optimal flutter shutter codes are demonstrated to improve the MSE substantially compared to classic (patented or not) codes. A numerical method that makes it possible to perform a reverse engineering of any existing (patented or not) flutter shutter code is also described and an implementation is given. In this case we give the underlying velocity distribution from which a given optimal flutter shutter code comes. The combination of these two numerical methods furnishes a comprehensive study of the optimization of a flutter shutter that includes a forward and a backward numerical solution.
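
    The reason a fluttered code can make motion blur invertible can be checked numerically: the blur kernel induced by the shutter sequence needs a Fourier transform with no (near-)zeros. The sketch below compares an always-open shutter with an arbitrary pseudo-random 52-chip sequence; the latter is illustrative only, not one of the optimal codes computed in the article.

```python
# Compare the spectral magnitude of the blur kernel for a conventional shutter
# (a box, whose spectrum dips to near zero, so deblurring is ill-posed) and for a
# random binary flutter code (whose spectrum typically stays bounded away from zero).
import numpy as np

rng = np.random.default_rng(3)
n = 52
box = np.ones(n)                                   # shutter open for the whole exposure
flutter = rng.integers(0, 2, n).astype(float)      # example open/close flutter sequence

for name, kernel in [("box", box), ("flutter", flutter)]:
    spectrum = np.abs(np.fft.rfft(kernel, 1024))
    print(f"{name}: min spectral magnitude = {spectrum.min():.4f}")
```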

  16. Low Rank Sparse Coding for Image Classification

    Science.gov (United States)

    2013-12-08

    correlations among different local features to obtain better coding results than learning each feature individually. To the best of our knowledge ... same image. However, it only does so to constrain codebook selection. It fails to directly enforce consistency on the codes themselves. To the best ... has a time complexity of O(m²nd), which is significantly slower than our coding method. In practice, we observe that LRSC is usually about 4 times

  17. Cinder begin creative coding

    CERN Document Server

    Rijnieks, Krisjanis

    2013-01-01

    Presented in an easy to follow, tutorial-style format, this book will lead you step-by-step through the multi-faceted uses of Cinder.""Cinder: Begin Creative Coding"" is for people who already have experience in programming. It can serve as a transition from a previous background in Processing, Java in general, JavaScript, openFrameworks, C++ in general or ActionScript to the framework covered in this book, namely Cinder. If you like quick and easy to follow tutorials that will let you see progress in less than an hour - this book is for you. If you are searching for a book that will explain al

  18. Discriminative sparse coding on multi-manifolds

    KAUST Repository

    Wang, J.J.-Y.

    2013-09-26

    Sparse coding has been popularly used as an effective data representation method in various applications, such as computer vision, medical imaging and bioinformatics. However, the conventional sparse coding algorithms and their manifold-regularized variants (graph sparse coding and Laplacian sparse coding) learn codebooks and codes in an unsupervised manner and neglect class information that is available in the training set. To address this problem, we propose a novel discriminative sparse coding method based on multi-manifolds that learns discriminative class-conditioned codebooks and sparse codes from both data feature spaces and class labels. First, the entire training set is partitioned into multiple manifolds according to the class labels. Then, we formulate the sparse coding as a manifold-manifold matching problem and learn class-conditioned codebooks and codes to maximize the manifold margins of different classes. Lastly, we present a data sample-manifold matching-based strategy to classify the unlabeled data samples. Experimental results on somatic mutations identification and breast tumor classification based on ultrasonic images demonstrate the efficacy of the proposed data representation and classification approach. © 2013 The Authors. All rights reserved.
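
    A much-simplified, hedged sketch of the general idea of class-conditioned codebooks (not the authors' multi-manifold margin formulation): learn one dictionary per class and assign a test sample to the class whose dictionary reconstructs it with the smallest error. The dataset, dictionary size and sparsity penalty below are arbitrary choices for illustration.

      import numpy as np
      from sklearn.datasets import load_digits
      from sklearn.decomposition import DictionaryLearning

      X, y = load_digits(return_X_y=True)
      X = X / 16.0
      train, test = slice(0, 1400), slice(1400, 1450)

      # One codebook (dictionary) per class, learned only from that class's samples.
      dictionaries = {}
      for c in np.unique(y[train]):
          dl = DictionaryLearning(n_components=20, alpha=1.0, max_iter=20,
                                  transform_algorithm="lasso_lars", random_state=0)
          dictionaries[c] = dl.fit(X[train][y[train] == c])

      def classify(x):
          """Assign x to the class whose codebook gives the lowest reconstruction error."""
          errors = {}
          for c, dl in dictionaries.items():
              code = dl.transform(x.reshape(1, -1))
              errors[c] = np.linalg.norm(x - code @ dl.components_)
          return min(errors, key=errors.get)

      preds = np.array([classify(x) for x in X[test]])
      print("accuracy on held-out digits:", (preds == y[test]).mean())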

  19. Orthogonal spectral coding of entangled photons.

    Science.gov (United States)

    Lukens, Joseph M; Dezfooliyan, Amir; Langrock, Carsten; Fejer, Martin M; Leaird, Daniel E; Weiner, Andrew M

    2014-04-04

    We extend orthogonal optical coding, previously applied to multiuser classical communication networks, to entangled photons. Using a pulse shaper and sum-frequency generation for ultrafast coincidence detection, we demonstrate encoding and decoding of biphoton wave packets. Applying one code to the signal photon spreads the wave packet in time and creates a null at zero delay; filtering the idler with the matched code recovers a narrow correlation peak, whereas applying any other code leaves the wave packet spread. Our results could prove useful in the development of code-based quantum communication networks.

  20. On Block Security of Regenerating Codes at the MBR Point for Distributed Storage Systems

    OpenAIRE

    Dau, Son Hoang; Song, Wentu; Yuen, Chau

    2013-01-01

    A passive adversary can eavesdrop stored content or downloaded content of some storage nodes, in order to learn illegally about the file stored across a distributed storage system (DSS). Previous work in the literature focuses on code constructions that trade storage capacity for perfect security. In other words, by decreasing the amount of original data that it can store, the system can guarantee that the adversary, which eavesdrops up to a certain number of storage nodes, obtains no informa...

  1. Prospective Coding by Spiking Neurons.

    Directory of Open Access Journals (Sweden)

    Johanni Brea

    2016-06-01

    Full Text Available Animals learn to make predictions, such as associating the sound of a bell with upcoming feeding or predicting a movement that a motor command is eliciting. How predictions are realized on the neuronal level and what plasticity rule underlies their learning is not well understood. Here we propose a biologically plausible synaptic plasticity rule to learn predictions on a single neuron level on a timescale of seconds. The learning rule allows a spiking two-compartment neuron to match its current firing rate to its own expected future discounted firing rate. For instance, if an originally neutral event is repeatedly followed by an event that elevates the firing rate of a neuron, the originally neutral event will eventually also elevate the neuron's firing rate. The plasticity rule is a form of spike-timing-dependent plasticity in which a presynaptic spike followed by a postsynaptic spike leads to potentiation. Even if the plasticity window has a width of 20 milliseconds, associations on the time scale of seconds can be learned. We illustrate prospective coding with three examples: learning to predict a time-varying input, learning to predict the next stimulus in a delayed paired-associate task and learning with a recurrent network to reproduce a temporally compressed version of a sequence. We discuss the potential role of the learning mechanism in classical trace conditioning. In the special case that the signal to be predicted encodes reward, the neuron learns to predict the discounted future reward and learning is closely related to the temporal difference learning algorithm TD(λ).
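
    The closing remark about TD(λ) can be made concrete with a minimal, hedged sketch: a tabular TD(0) learner (the simplest member of the TD(λ) family) whose value at the cue converges to the discounted prediction of a later event. This toy replaces the spiking two-compartment neuron and its plasticity rule with a plain table of predictions; the numbers are arbitrary.

      import numpy as np

      gamma, lr, T = 0.9, 0.1, 10      # discount factor, learning rate, episode length (arbitrary)
      v = np.zeros(T)                  # predicted discounted future input, one entry per time step

      for _ in range(2000):            # repeatedly experience: cue at t=0, salient event at t=5
          for t in range(T - 1):
              r = 1.0 if t + 1 == 5 else 0.0
              delta = r + gamma * v[t + 1] - v[t]     # temporal-difference error
              v[t] += lr * delta

      print("learned prediction at the cue (t=0):", v[0])
      print("expected discounted value gamma**4 :", gamma ** 4)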

  2. Impacts of Model Building Energy Codes

    Energy Technology Data Exchange (ETDEWEB)

    Athalye, Rahul A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sivaraman, Deepak [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Elliott, Douglas B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Liu, Bing [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Bartlett, Rosemarie [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-10-31

    The U.S. Department of Energy (DOE) Building Energy Codes Program (BECP) periodically evaluates national and state-level impacts associated with energy codes in residential and commercial buildings. Pacific Northwest National Laboratory (PNNL), funded by DOE, conducted an assessment of the prospective impacts of national model building energy codes from 2010 through 2040. A previous PNNL study evaluated the impact of the Building Energy Codes Program; this study looked more broadly at overall code impacts. This report describes the methodology used for the assessment and presents the impacts in terms of energy savings, consumer cost savings, and reduced CO2 emissions at the state level and at aggregated levels. This analysis does not represent all potential savings from energy codes in the U.S. because it excludes several states which have codes which are fundamentally different from the national model energy codes or which do not have state-wide codes. Energy codes follow a three-phase cycle that starts with the development of a new model code, proceeds with the adoption of the new code by states and local jurisdictions, and finishes when buildings comply with the code. The development of new model code editions creates the potential for increased energy savings. After a new model code is adopted, potential savings are realized in the field when new buildings (or additions and alterations) are constructed to comply with the new code. Delayed adoption of a model code and incomplete compliance with the code’s requirements erode potential savings. The contributions of all three phases are crucial to the overall impact of codes, and are considered in this assessment.

  3. Model Children's Code.

    Science.gov (United States)

    New Mexico Univ., Albuquerque. American Indian Law Center.

    The Model Children's Code was developed to provide a legally correct model code that American Indian tribes can use to enact children's codes that fulfill their legal, cultural and economic needs. Code sections cover the court system, jurisdiction, juvenile offender procedures, minor-in-need-of-care, and termination. Almost every Code section is…

  4. Affine Grassmann codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Beelen, Peter; Ghorpade, Sudhir Ramakant

    2010-01-01

    We consider a new class of linear codes, called affine Grassmann codes. These can be viewed as a variant of generalized Reed-Muller codes and are closely related to Grassmann codes. We determine the length, dimension, and the minimum distance of any affine Grassmann code. Moreover, we show that af...

  5. Dopamine Modulates Adaptive Prediction Error Coding in the Human Midbrain and Striatum.

    Science.gov (United States)

    Diederen, Kelly M J; Ziauddeen, Hisham; Vestergaard, Martin D; Spencer, Tom; Schultz, Wolfram; Fletcher, Paul C

    2017-02-15

    Learning to optimally predict rewards requires agents to account for fluctuations in reward value. Recent work suggests that individuals can efficiently learn about variable rewards through adaptation of the learning rate, and coding of prediction errors relative to reward variability. Such adaptive coding has been linked to midbrain dopamine neurons in nonhuman primates, and evidence in support for a similar role of the dopaminergic system in humans is emerging from fMRI data. Here, we sought to investigate the effect of dopaminergic perturbations on adaptive prediction error coding in humans, using a between-subject, placebo-controlled pharmacological fMRI study with a dopaminergic agonist (bromocriptine) and antagonist (sulpiride). Participants performed a previously validated task in which they predicted the magnitude of upcoming rewards drawn from distributions with varying SDs. After each prediction, participants received a reward, yielding trial-by-trial prediction errors. Under placebo, we replicated previous observations of adaptive coding in the midbrain and ventral striatum. Treatment with sulpiride attenuated adaptive coding in both midbrain and ventral striatum, and was associated with a decrease in performance, whereas bromocriptine did not have a significant impact. Although we observed no differential effect of SD on performance between the groups, computational modeling suggested decreased behavioral adaptation in the sulpiride group. These results suggest that normal dopaminergic function is critical for adaptive prediction error coding, a key property of the brain thought to facilitate efficient learning in variable environments. Crucially, these results also offer potential insights for understanding the impact of disrupted dopamine function in mental illness. SIGNIFICANCE STATEMENT To choose optimally, we have to learn what to expect. Humans dampen learning when there is a great deal of variability in reward outcome, and two brain regions that
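
    A hedged toy illustration of adaptive prediction error coding (not the authors' task, pharmacology or fMRI analysis): prediction errors are divided by a running estimate of reward variability, so the resulting error signal occupies a similar range whether rewards come from a narrow or a wide distribution. All constants are arbitrary.

      import numpy as np

      rng = np.random.default_rng(0)

      def adaptively_coded_error_sd(reward_sd, n=2000, lr=0.05):
          """SD of prediction errors after scaling by a running variability estimate."""
          mean_est, var_est = 0.0, 1.0
          scaled = []
          for _ in range(n):
              r = rng.normal(10.0, reward_sd)          # reward drawn from the current distribution
              pe = r - mean_est                        # raw prediction error
              scaled.append(pe / np.sqrt(var_est))     # adaptively coded prediction error
              mean_est += lr * pe                      # update the expected reward
              var_est += lr * (pe ** 2 - var_est)      # update the variability estimate
          return np.std(scaled[200:])                  # discard the burn-in period

      for sd in (1.0, 5.0, 15.0):
          print(f"reward SD {sd:5.1f} -> SD of adaptively coded errors {adaptively_coded_error_sd(sd):.2f}")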

  6. Mean-based neural coding of voices.

    Science.gov (United States)

    Andics, Attila; McQueen, James M; Petersson, Karl Magnus

    2013-10-01

    The social significance of recognizing the person who talks to us is obvious, but the neural mechanisms that mediate talker identification are unclear. Regions along the bilateral superior temporal sulcus (STS) and the inferior frontal cortex (IFC) of the human brain are selective for voices, and they are sensitive to rapid voice changes. Although it has been proposed that voice recognition is supported by prototype-centered voice representations, the involvement of these category-selective cortical regions in the neural coding of such "mean voices" has not previously been demonstrated. Using fMRI in combination with a voice identity learning paradigm, we show that voice-selective regions are involved in the mean-based coding of voice identities. Voice typicality is encoded on a supra-individual level in the right STS along a stimulus-dependent, identity-independent (i.e., voice-acoustic) dimension, and on an intra-individual level in the right IFC along a stimulus-independent, identity-dependent (i.e., voice identity) dimension. Voice recognition therefore entails at least two anatomically separable stages, each characterized by neural mechanisms that reference the central tendencies of voice categories. Copyright © 2013 Elsevier Inc. All rights reserved.

  7. Generalized concatenated quantum codes

    International Nuclear Information System (INIS)

    Grassl, Markus; Shor, Peter; Smith, Graeme; Smolin, John; Zeng Bei

    2009-01-01

    We discuss the concept of generalized concatenated quantum codes. This generalized concatenation method provides a systematic way of constructing good quantum codes, both stabilizer codes and nonadditive codes. Using this method, we construct families of single-error-correcting nonadditive quantum codes, in both binary and nonbinary cases, which not only outperform any stabilizer codes for finite block length but also asymptotically meet the quantum Hamming bound for large block length.

  8. Rateless feedback codes

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Koike-Akino, Toshiaki; Orlik, Philip

    2012-01-01

    This paper proposes a concept called rateless feedback coding. We redesign the existing LT and Raptor codes, by introducing new degree distributions for the case when a few feedback opportunities are available. We show that incorporating feedback to LT codes can significantly decrease both...... the coding overhead and the encoding/decoding complexity. Moreover, we show that, at the price of a slight increase in the coding overhead, linear complexity is achieved with Raptor feedback coding....
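
    For background, a minimal hedged sketch of how a single LT-coded symbol is produced (generic fountain-code encoding, not the feedback-aware degree distributions proposed in the paper): draw a degree from the distribution, pick that many source blocks, and XOR them.

      import random

      def lt_encode_symbol(source_blocks, degree_dist, rng=random):
          """Return (chosen block indices, XOR of those blocks) for one coded symbol."""
          degrees, probs = zip(*degree_dist)
          d = rng.choices(degrees, weights=probs, k=1)[0]
          chosen = rng.sample(range(len(source_blocks)), d)
          value = 0
          for i in chosen:
              value ^= source_blocks[i]
          return chosen, value

      blocks = [0b1011, 0b0110, 0b1110, 0b0001]          # toy source blocks
      dist = [(1, 0.2), (2, 0.5), (3, 0.2), (4, 0.1)]    # illustrative degree distribution
      print(lt_encode_symbol(blocks, dist))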

  9. Advanced video coding systems

    CERN Document Server

    Gao, Wen

    2015-01-01

    This comprehensive and accessible text/reference presents an overview of the state of the art in video coding technology. Specifically, the book introduces the tools of the AVS2 standard, describing how AVS2 can help to achieve a significant improvement in coding efficiency for future video networks and applications by incorporating smarter coding tools such as scene video coding. Topics and features: introduces the basic concepts in video coding, and presents a short history of video coding technology and standards; reviews the coding framework, main coding tools, and syntax structure of AV

  10. Expressing Youth Voice through Video Games and Coding

    Science.gov (United States)

    Martin, Crystle

    2017-01-01

    A growing body of research focuses on the impact of video games and coding on learning. The research often elevates learning the technical skills associated with video games and coding or the importance of problem solving and computational thinking, which are, of course, necessary and relevant. However, the literature less often explores how young…

  11. 76 FR 57795 - Agency Request for Renewal of a Previously Approved Collection; Disclosure of Code Sharing...

    Science.gov (United States)

    2011-09-16

    ..., schedule, or itinerary and unless they know the identity of the airline on which they will be flying. The... corporate name and any other name under which that service is held out to the public. Respondents: All U.S...

  12. Locally orderless registration code

    DEFF Research Database (Denmark)

    2012-01-01

    This is the code for the TPAMI paper "Locally Orderless Registration". The code requires Intel Threading Building Blocks to be installed and is provided for 64-bit on Mac, Linux, and Windows.

  13. Error Correcting Codes

    Indian Academy of Sciences (India)

    sound quality is, in essence, obtained by accurate waveform coding and decoding of the audio signals. In addition, the coded audio information is protected against disc errors by the use of a Cross Interleaved Reed-Solomon Code (CIRC). Reed-Solomon codes were discovered by Irving Reed and Gus Solomon in 1960.

  14. Placental complications after a previous cesarean section

    OpenAIRE

    Milošević Jelena; Lilić Vekoslav; Tasić Marija; Radović-Janošević Dragana; Stefanović Milan; Antić Vladimir

    2009-01-01

    Introduction The incidence of cesarean section has been rising in the past 50 years. With the increased number of cesarean sections, the number of pregnancies with the previous cesarean section rises as well. The aim of this study was to establish the influence of the previous cesarean section on the development of placental complications: placenta previa, placental abruption and placenta accreta, as well as to determine the influence of the number of previous cesarean sections on the complic...

  15. Network Coding Taxonomy

    OpenAIRE

    Adamson, Brian; Adjih, Cédric; Bilbao, Josu; Firoiu, Victor; Fitzek, Frank; Ghanem, Samah; Lochin, Emmanuel; Masucci, Antonia; Montpetit, Marie-Jose; Pedersen, Morten V.; Peralta, Goiuri; Roca, Vincent; Saxena, Paresh; Sivakumar, Senthil

    2017-01-01

    Internet Research Task Force - Working document of the Network Coding Research Group (NWCRG), draft-irtf-nwcrg-network-coding-taxonomy-05 (work in progress), https://datatracker.ietf.org/doc/draft-irtf-nwcrg-network-coding-taxonomy/; This document summarizes a recommended terminology for Network Coding concepts and constructs. It provides a comprehensive set of terms with unique names in order to avoid ambiguities in future Network Coding IRTF and IETF documents. This document is intended to ...

  16. QR Codes 101

    Science.gov (United States)

    Crompton, Helen; LaFrance, Jason; van 't Hooft, Mark

    2012-01-01

    A QR (quick-response) code is a two-dimensional scannable code, similar in function to a traditional bar code that one might find on a product at the supermarket. The main difference between the two is that, while a traditional bar code can hold a maximum of only 20 digits, a QR code can hold up to 7,089 characters, so it can contain much more…

  17. Efficient Coding of Information: Huffman Coding -RE ...

    Indian Academy of Sciences (India)

    1. Introduction. Shannon's landmark paper 'A Mathematical Theory of Communication' [1] laid the foundation for communication ... coding theory, codes over graphs and iterative techniques, and information theory. ... An important consequence of independence is that if {X1, X2, ..., Xn} are independent random variables, each.
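
    Since only fragments of this record survive, a standard textbook Huffman construction is sketched below for context (it is generic material, not text from the article): symbols are merged greedily by frequency, yielding a prefix-free code whose expected length approaches the source entropy.

      import heapq
      from collections import Counter

      def huffman_code(text):
          """Build a prefix-free Huffman code from the symbol frequencies in `text`."""
          heap = [[freq, i, {sym: ""}] for i, (sym, freq) in enumerate(Counter(text).items())]
          heapq.heapify(heap)
          tiebreak = len(heap)
          while len(heap) > 1:
              f1, _, c1 = heapq.heappop(heap)      # the two least frequent subtrees
              f2, _, c2 = heapq.heappop(heap)
              merged = {s: "0" + w for s, w in c1.items()}
              merged.update({s: "1" + w for s, w in c2.items()})
              heapq.heappush(heap, [f1 + f2, tiebreak, merged])
              tiebreak += 1
          return heap[0][2]

      code = huffman_code("abracadabra")
      print(code)                                      # e.g. {'a': '0', 'r': '110', ...}
      print("".join(code[s] for s in "abracadabra"))   # encoded bit string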

  18. Turbo Codes Extended with Outer BCH Code

    DEFF Research Database (Denmark)

    Andersen, Jakob Dahl

    1996-01-01

    The "error floor" observed in several simulations with the turbo codes is verified by calculation of an upper bound to the bit error rate for the ensemble of all interleavers. Also an easy way to calculate the weight enumerator used in this bound is presented. An extended coding scheme is proposed...

  19. Talking about Code: Integrating Pedagogical Code Reviews into Early Computing Courses

    Science.gov (United States)

    Hundhausen, Christopher D.; Agrawal, Anukrati; Agarwal, Pawan

    2013-01-01

    Given the increasing importance of soft skills in the computing profession, there is good reason to provide students with more opportunities to learn and practice those skills in undergraduate computing courses. Toward that end, we have developed an active learning approach for computing education called the "Pedagogical Code Review"…

  20. Preoperative screening: value of previous tests.

    Science.gov (United States)

    Macpherson, D S; Snow, R; Lofgren, R P

    1990-12-15

    To determine the frequency of tests done in the year before elective surgery that might substitute for preoperative screening tests and to determine the frequency of test results that change from a normal value to a value likely to alter perioperative management. Retrospective cohort analysis of computerized laboratory data (complete blood count, sodium, potassium, and creatinine levels, prothrombin time, and partial thromboplastin time). Urban tertiary care Veterans Affairs Hospital. Consecutive sample of 1109 patients who had elective surgery in 1988. At admission, 7549 preoperative tests were done, 47% of which duplicated tests performed in the previous year. Of 3096 previous results that were normal as defined by the hospital reference range and done closest to, but before, admission (median interval, 2 months), 13 (0.4%; 95% CI, 0.2% to 0.7%) repeat values were outside a range considered acceptable for surgery. Most of the abnormalities were predictable from the patient's history, and most were not noted in the medical record. Of 461 previous tests that were abnormal, 78 (17%; CI, 13% to 20%) repeat values at admission were outside a range considered acceptable for surgery (P < 0.001 for the comparison of the frequency of clinically important abnormalities between patients with normal and with abnormal previous results). Physicians evaluating patients preoperatively could safely substitute the previous test results analyzed in this study for preoperative screening tests if the previous tests are normal and no obvious indication for retesting is present.

  1. Automatic electromagnetic valve for previous vacuum

    International Nuclear Information System (INIS)

    Granados, C. E.; Martin, F.

    1959-01-01

    A valve is described which maintains the vacuum of an installation when the electric current fails. It also lets air into the previous-vacuum (backing) pump to prevent the oil from rising into the vacuum tubes. (Author)

  2. SEVERO code - user's manual

    International Nuclear Information System (INIS)

    Sacramento, A.M. do.

    1989-01-01

    This user's manual contains all the necessary information concerning the use of the SEVERO code. This computer code deals with the statistics of extremes: extreme winds, extreme precipitation, and flood hazard risk analysis. (A.C.A.S.)

  3. Model and code development

    International Nuclear Information System (INIS)

    Anon.

    1977-01-01

    Progress in model and code development for reactor physics calculations is summarized. The codes included CINDER-10, PHROG, RAFFLE GAPP, DCFMR, RELAP/4, PARET, and KENO. Kinetics models for the PBF were developed

  4. Coding for optical channels

    CERN Document Server

    Djordjevic, Ivan; Vasic, Bane

    2010-01-01

    This unique book provides a coherent and comprehensive introduction to the fundamentals of optical communications, signal processing and coding for optical channels. It is the first to integrate the fundamentals of coding theory and optical communication.

  5. Enhancing QR Code Security

    OpenAIRE

    Zhang, Linfan; Zheng, Shuang

    2015-01-01

    Quick Response codes open the possibility of conveying data in a unique way, yet insufficient prevention and protection might lead to QR codes being exploited by attackers. This thesis starts by presenting a general introduction of the background and stating two problems regarding QR code security, which is followed by comprehensive research on both the QR code itself and related issues. From the research, a solution taking advantage of the cloud and cryptography, together with an implementation, come af...

  6. Stylize Aesthetic QR Code

    OpenAIRE

    Xu, Mingliang; Su, Hao; Li, Yafei; Li, Xi; Liao, Jing; Niu, Jianwei; Lv, Pei; Zhou, Bing

    2018-01-01

    With the continued proliferation of smart mobile devices, Quick Response (QR) code has become one of the most-used types of two-dimensional code in the world. Aiming at beautifying the appearance of QR codes, existing works have developed a series of techniques to make the QR code more visual-pleasant. However, these works still leave much to be desired, such as visual diversity, aesthetic quality, flexibility, universal property, and robustness. To address these issues, in this paper, we pro...

  7. Concomitant and previous osteoporotic vertebral fractures.

    Science.gov (United States)

    Lenski, Markus; Büser, Natalie; Scherer, Michael

    2017-04-01

    Background and purpose - Patients with osteoporosis who present with an acute onset of back pain often have multiple fractures on plain radiographs. Differentiation of an acute osteoporotic vertebral fracture (AOVF) from previous fractures is difficult. The aim of this study was to investigate the incidence of concomitant AOVFs and previous OVFs in patients with symptomatic AOVFs, and to identify risk factors for concomitant AOVFs. Patients and methods - This was a prospective epidemiological study based on the Registry of Pathological Osteoporotic Vertebral Fractures (REPAPORA) with 1,005 patients and 2,874 osteoporotic vertebral fractures, which has been running since February 1, 2006. Concomitant fractures are defined as at least 2 acute short-tau inversion recovery (STIR-) positive vertebral fractures that occur concomitantly. A previous fracture is a STIR-negative fracture at the time of initial diagnostics. Logistic regression was used to examine the influence of various variables on the incidence of concomitant fractures. Results - More than 99% of osteoporotic vertebral fractures occurred in the thoracic and lumbar spine. The incidence of concomitant fractures at the time of first patient contact was 26% and that of previous fractures was 60%. The odds ratio (OR) for concomitant fractures decreased with a higher number of previous fractures (OR = 0.86; p = 0.03) and a higher dual-energy X-ray absorptiometry T-score (OR = 0.72; p = 0.003). Interpretation - Concomitant and previous osteoporotic vertebral fractures are common. Risk factors for concomitant fractures are a low T-score and a low number of previous vertebral fractures in cases of osteoporotic vertebral fracture. An MRI scan of the complete thoracic and lumbar spine with a STIR sequence reduces the risk of under-diagnosis and under-treatment.

  8. Hello Ruby adventures in coding

    CERN Document Server

    Liukas, Linda

    2015-01-01

    "Code is the 21st century literacy and the need for people to speak the ABCs of Programming is imminent." --Linda Liukas Meet Ruby--a small girl with a huge imagination. In Ruby's world anything is possible if you put your mind to it. When her dad asks her to find five hidden gems Ruby is determined to solve the puzzle with the help of her new friends, including the Wise Snow Leopard, the Friendly Foxes, and the Messy Robots. As Ruby stomps around her world kids will be introduced to the basic concepts behind coding and programming through storytelling. Learn how to break big problems into small problems, repeat tasks, look for patterns, create step-by-step plans, and think outside the box. With hands-on activities included in every chapter, future coders will be thrilled to put their own imaginations to work.

  9. ARC Code TI: CODE Software Framework

    Data.gov (United States)

    National Aeronautics and Space Administration — CODE is a software framework for control and observation in distributed environments. The basic functionality of the framework allows a user to observe a distributed...

  10. ARC Code TI: ROC Curve Code Augmentation

    Data.gov (United States)

    National Aeronautics and Space Administration — ROC (Receiver Operating Characteristic) curve Code Augmentation was written by Rodney Martin and John Stutz at NASA Ames Research Center and is a modification of ROC...

  11. Refactoring test code

    NARCIS (Netherlands)

    A. van Deursen (Arie); L.M.F. Moonen (Leon); A. van den Bergh; G. Kok

    2001-01-01

    Two key aspects of extreme programming (XP) are unit testing and merciless refactoring. Given the fact that the ideal test code / production code ratio approaches 1:1, it is not surprising that unit tests are being refactored. We found that refactoring test code is different from

  12. Error Correcting Codes -34 ...

    Indian Academy of Sciences (India)

    the reading of data from memory is the receiving process. Protecting data in computer memories was one of the earliest applications of Hamming codes. We now describe the clever scheme invented by Hamming in 1948. To keep things simple, we describe the binary length 7 Hamming code. Encoding in the Hamming Code.
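
    For context, a hedged sketch of the binary (7,4) Hamming code that the fragment refers to: four data bits are extended with three parity bits, and the three parity checks at the receiver form a syndrome that points directly at any single flipped bit. The particular generator/parity-check convention below is one common textbook choice.

      import numpy as np

      G = np.array([[1,0,0,0,1,1,0],      # generator matrix [I4 | P]
                    [0,1,0,0,1,0,1],
                    [0,0,1,0,0,1,1],
                    [0,0,0,1,1,1,1]])
      H = np.array([[1,1,0,1,1,0,0],      # parity-check matrix [P^T | I3]
                    [1,0,1,1,0,1,0],
                    [0,1,1,1,0,0,1]])

      def encode(data4):
          return (np.array(data4) @ G) % 2

      def correct(received7):
          """Correct at most one flipped bit using the syndrome."""
          r = np.array(received7).copy()
          syndrome = (H @ r) % 2
          if syndrome.any():
              pos = int(np.where((H.T == syndrome).all(axis=1))[0][0])  # matching column of H
              r[pos] ^= 1
          return r

      word = encode([1, 0, 1, 1])
      word[2] ^= 1                      # flip one bit "in the channel"
      print(correct(word)[:4])          # -> [1 0 1 1], the original data bits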

  13. Performance analysis of WS-EWC coded optical CDMA networks with/without LDPC codes

    Science.gov (United States)

    Huang, Chun-Ming; Huang, Jen-Fa; Yang, Chao-Chin

    2010-10-01

    One extended Welch-Costas (EWC) code family for the wavelength-division-multiplexing/spectral-amplitude coding (WDM/SAC; WS) optical code-division multiple-access (OCDMA) networks is proposed. This system has a superior performance as compared to the previous modified quadratic congruence (MQC) coded OCDMA networks. However, since the performance of such a network is unsatisfactory when the data bit rate is higher, one class of quasi-cyclic low-density parity-check (QC-LDPC) code is adopted to improve that. Simulation results show that the performance of the high-speed WS-EWC coded OCDMA network can be greatly improved by using the LDPC codes.

  14. Software Certification - Coding, Code, and Coders

    Science.gov (United States)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  15. Factores socioacadémicos, estilo de aprendizaje, nivel intelectual y su relación con el rendimiento académico previo de médicos internos de pregrado Socioacademic factors, style of learning, intellectual level and their relationship with the previous academic yield of medical students

    Directory of Open Access Journals (Sweden)

    J.L. Padierna-Luna

    2009-06-01

    Full Text Available Introduction. Learning is a complex activity in which individual, social-cultural and academic factors intervene. Aim. To describe socioacademic factors, learning styles and intellectual level, and their relation with the previous academic performance (grade average) of pre-degree medical interns (MIP). Subjects and methods. A cross-sectional analytic survey of the interns was carried out using three questionnaires: socioacademic data, the CHAEA questionnaire (learning styles) and the Raven test for adults (intellectual level). The sample included 174 students from nine universities, three private (n = 43; 24.7%) and six public (n = 131; 75.29%). Descriptive statistics and multiple regression were used to establish associations between variables. Results and conclusions. The female gender predominated, with 59.2% (n = 103), compared with 40.8% (n = 71) male. The average age was 23.63 years, with a range of 21 to 33. There were no significant gender differences in previous performance (8.21 versus 8.25). The relation of the socioacademic factors, learning styles and intellectual level with academic performance was measured with a 95% confidence interval. Of the socioacademic data, only age was inversely related to performance, with r = 0.2 and a statistically significant p value.

  16. Uterine rupture without previous caesarean delivery

    DEFF Research Database (Denmark)

    Thisted, Dorthe L. A.; H. Mortensen, Laust; Krebs, Lone

    2015-01-01

    OBJECTIVE: To determine incidence and patient characteristics of women with uterine rupture during singleton births at term without a previous caesarean delivery. STUDY DESIGN: Population based cohort study. Women with term singleton birth, no record of previous caesarean delivery and planned...... vaginal delivery (n=611,803) were identified in the Danish Medical Birth Registry (1997-2008). Medical records from women recorded with uterine rupture during labour were reviewed to ascertain events of complete uterine rupture. Relative Risk (RR) and adjusted Relative Risk Ratio (aRR) of complete uterine...... rupture with 95% confidence intervals (95% CI) were ascertained according to characteristics of the women and of the delivery. RESULTS: We identified 20 cases with complete uterine rupture. The incidence of complete uterine rupture among women without previous caesarean delivery was about 3...

  17. An Evaluation of Automated Code Generation with the PetriCode Approach

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge

    2014-01-01

    Automated code generation is an important element of model driven development methodologies. We have previously proposed an approach for code generation based on Coloured Petri Net models annotated with textual pragmatics for the network protocol domain. In this paper, we present and evaluate three...

  18. The network code

    International Nuclear Information System (INIS)

    1997-01-01

    The Network Code defines the rights and responsibilities of all users of the natural gas transportation system in the liberalised gas industry in the United Kingdom. This report describes the operation of the Code, what it means, how it works and its implications for the various participants in the industry. The topics covered are: development of the competitive gas market in the UK; key points in the Code; gas transportation charging; impact of the Code on producers upstream; impact on shippers; gas storage; supply point administration; impact of the Code on end users; the future. (20 tables; 33 figures) (UK)

  19. The materiality of Code

    DEFF Research Database (Denmark)

    Soon, Winnie

    2014-01-01

    , Twitter and Facebook). The focus is not to investigate the functionalities and efficiencies of the code, but to study and interpret the program level of code in order to trace the use of various technological methods such as third-party libraries and platforms’ interfaces. These are important...... to understand the socio-technical side of a changing network environment. Through the study of code, including but not limited to source code, technical specifications and other materials in relation to the artwork production, I would like to explore the materiality of code that goes beyond technical...

  20. Coding for Electronic Mail

    Science.gov (United States)

    Rice, R. F.; Lee, J. J.

    1986-01-01

    Scheme for coding facsimile messages promises to reduce data transmission requirements to one-tenth current level. Coding scheme paves way for true electronic mail in which handwritten, typed, or printed messages or diagrams sent virtually instantaneously - between buildings or between continents. Scheme, called Universal System for Efficient Electronic Mail (USEEM), uses unsupervised character recognition and adaptive noiseless coding of text. Image quality of resulting delivered messages improved over messages transmitted by conventional coding. Coding scheme compatible with direct-entry electronic mail as well as facsimile reproduction. Text transmitted in this scheme automatically translated to word-processor form.

  1. On affine variety codes from the Klein quartic

    DEFF Research Database (Denmark)

    Geil, Hans Olav; Ozbudak, Ferruh

    2018-01-01

    We study a family of primary affine variety codes defined from the Klein quartic. The duals of these codes have previously been treated in Kolluru et al., (Appl. Algebra Engrg. Comm. Comput. 10(6):433–464, 2000, Ex. 3.2). Among the codes that we construct almost all have parameters as good as the...

  2. EquiFACS: The Equine Facial Action Coding System.

    Directory of Open Access Journals (Sweden)

    Jen Wathan

    Full Text Available Although previous studies of horses have investigated their facial expressions in specific contexts, e.g. pain, until now there has been no methodology available that documents all the possible facial movements of the horse and provides a way to record all potential facial configurations. This is essential for an objective description of horse facial expressions across a range of contexts that reflect different emotional states. Facial Action Coding Systems (FACS) provide a systematic methodology of identifying and coding facial expressions on the basis of underlying facial musculature and muscle movement. FACS are anatomically based and document all possible facial movements rather than a configuration of movements associated with a particular situation. Consequently, FACS can be applied as a tool for a wide range of research questions. We developed FACS for the domestic horse (Equus caballus) through anatomical investigation of the underlying musculature and subsequent analysis of naturally occurring behaviour captured on high quality video. Discrete facial movements were identified and described in terms of the underlying muscle contractions, in correspondence with previous FACS systems. The reliability of others to be able to learn this system (EquiFACS) and consistently code behavioural sequences was high--and this included people with no previous experience of horses. A wide range of facial movements were identified, including many that are also seen in primates and other domestic animals (dogs and cats). EquiFACS provides a method that can now be used to document the facial movements associated with different social contexts and thus to address questions relevant to understanding social cognition and comparative psychology, as well as informing current veterinary and animal welfare practices.

  3. XSOR codes users manual

    International Nuclear Information System (INIS)

    Jow, Hong-Nian; Murfin, W.B.; Johnson, J.D.

    1993-11-01

    This report describes the source term estimation codes, XSORs. The codes are written for three pressurized water reactors (Surry, Sequoyah, and Zion) and two boiling water reactors (Peach Bottom and Grand Gulf). The ensemble of codes has been named ''XSOR''. The purpose of the XSOR codes is to estimate the source terms which would be released to the atmosphere in severe accidents. A source term includes the release fractions of several radionuclide groups, the timing and duration of releases, the rates of energy release, and the elevation of releases. The codes have been developed by Sandia National Laboratories for the US Nuclear Regulatory Commission (NRC) in support of the NUREG-1150 program. The XSOR codes are fast-running parametric codes and are used as surrogates for detailed mechanistic codes. The XSOR codes also provide the capability to explore the phenomena and their uncertainty which are not currently modeled by the mechanistic codes. The uncertainty distributions of input parameters may be used by an XSOR code to estimate the uncertainty of source terms.

  4. DLLExternalCode

    Energy Technology Data Exchange (ETDEWEB)

    2014-05-14

    DLLExternalCode is a general dynamic-link library (DLL) interface for linking GoldSim (www.goldsim.com) with external codes. The overall concept is to use GoldSim as the top-level modeling software with interfaces to external codes for specific calculations. The DLLExternalCode DLL that performs the linking function is designed to take a list of code inputs from GoldSim, create an input file for the external application, run the external code, and return a list of outputs, read from files created by the external application, back to GoldSim. Instructions for creating the input file, running the external code, and reading the output are contained in an instructions file that is read and interpreted by the DLL.
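
    A hedged, language-agnostic sketch of the control flow described above (the real component is a DLL called from GoldSim; the file names, formats and executable below are invented placeholders): write the inputs handed over by the caller to a file, run the external application, and read its output file back as a list of results.

      import subprocess
      from pathlib import Path

      def run_external_code(inputs, workdir="run", exe="./external_model"):
          """Illustrative wrapper: inputs in, external code run, outputs back to the caller."""
          work = Path(workdir)
          work.mkdir(exist_ok=True)

          # 1. Create the input file expected by the (hypothetical) external application.
          (work / "model.in").write_text("\n".join(f"{v:.12g}" for v in inputs))

          # 2. Run the external code; fail loudly if it reports an error.
          subprocess.run([exe, "model.in", "model.out"], cwd=work, check=True)

          # 3. Read the outputs back for the caller (GoldSim, in the original design).
          return [float(line) for line in (work / "model.out").read_text().split()]

      # Example call (requires an actual executable reachable from the working directory):
      # outputs = run_external_code([1.0, 2.5, 3.7])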

  5. INTRODUCTION Previous reports have documented a high ...

    African Journals Online (AJOL)

    pregnancy if they were married, educated, had dental insurance, previously used dental services when not pregnant, or had knowledge about the possible connection between oral health and pregnancy outcome [8]. The purpose of this study was to explore the factors determining good oral hygiene among pregnant women ...

  6. Empowerment perceptions of educational managers from previously ...

    African Journals Online (AJOL)

    The perceptions of educational managers from previously disadvantaged primary and high schools in the Nelson Mandela Metropole regarding the issue of empowerment are outlined, and the perceptions of educational managers in terms of various aspects of empowerment at different levels are reflected. A literature study ...

  7. Management of choledocholithiasis after previous gastrectomy.

    Science.gov (United States)

    Anwer, S; Egan, R; Cross, N; Guru Naidu, S; Somasekar, K

    2017-09-01

    Common bile duct stones in patients with a previous gastrectomy can be a technical challenge because of the altered anatomy. This paper presents the successful management of two such patients using non-traditional techniques as conventional endoscopic retrograde cholangiopancreatography was not possible.

  8. Laboratory Grouping Based on Previous Courses.

    Science.gov (United States)

    Doemling, Donald B.; Bowman, Douglas C.

    1981-01-01

    In a five-year study, second-year human physiology students were grouped for laboratory according to previous physiology and laboratory experience. No significant differences in course or board examination performance were found, though correlations were found between predental grade-point averages and grouping. (MSE)

  9. Previously unknown organomagnesium compounds in astrochemical context

    OpenAIRE

    Ruf, Alexander

    2018-01-01

    We describe the detection of dihydroxymagnesium carboxylates (CHOMg) in astrochemical context. CHOMg was detected in meteorites via ultrahigh-resolving chemical analytics and represents a novel, previously unreported chemical class. Thus, chemical stability was probed via quantum chemical computations, in combination with experimental fragmentation techniques. Results propose the putative formation of green-chemical OH-Grignard-type molecules and triggered fundamental questions within chemica...

  10. [Placental complications after a previous cesarean section].

    Science.gov (United States)

    Milosević, Jelena; Lilić, Vekoslav; Tasić, Marija; Radović-Janosević, Dragana; Stefanović, Milan; Antić, Vladimir

    2009-01-01

    The incidence of cesarean section has been rising in the past 50 years. With the increased number of cesarean sections, the number of pregnancies with a previous cesarean section rises as well. The aim of this study was to establish the influence of a previous cesarean section on the development of placental complications: placenta previa, placental abruption and placenta accreta, as well as to determine the influence of the number of previous cesarean sections on the development of these complications. The research was conducted at the Clinic of Gynecology and Obstetrics in Nis, covering a 10-year period (from 1995 to 2005) with 32,358 deliveries, 1,280 deliveries after a previous cesarean section, 131 cases of placenta previa and 118 cases of placental abruption. The experimental group comprised the cases of placenta previa or placental abruption with a prior cesarean section in the obstetric history; the control group had the same conditions but no cesarean section in the medical history. The incidence of placenta previa in the control group was 0.33%, as opposed to a significantly higher incidence of 1.86% after one cesarean section, rising further with additional cesarean sections and reaching 14.28% after three cesarean sections in the obstetric history. Placental abruption was recorded as a placental complication in 0.33% of pregnancies in the control group, while its incidence was significantly higher at 1.02% after one cesarean section and increased further after repeated cesarean sections. The difference in the incidence of intrapartum hysterectomy between the group with a prior cesarean section (0.86%) and the group without (0.006%) was highly statistically significant. A previous cesarean section is thus an important risk factor for the development of placental complications.

  11. Developing an Online "Code of Conduct"

    Science.gov (United States)

    Summerville, Jennifer

    2005-01-01

    There are an increasing number of classes being offered via the World Wide Web. Although much of the information that we review regarding online learning seems positive, difficulties can arise. In particular, the anonymity that a web course can provide can be a blessing and a curse. In this article, the author suggests developing an online code of…

  12. Australasian code for reporting of mineral resources and ore reserves (the JORC code)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-06-01

    The latest revision of the Code first published in 1989 becomes effective in September 1999. It was prepared by the Joint Ores Reserves Committee of the Australasian Institute of Mining and Metallurgy, Australian Institute of Geoscientists and Minerals Council of Australia (JORC). It sets out minimum standards, recommendations and guidelines for public reporting of exploration results, mineral resources and ore reserves in Australasia. In this edition, the guidelines, which were previously separated from the Code, have been placed after the respective Code clauses. The Code is applicable to all solid minerals, including diamonds, other gemstones and coal for which public reporting is required by the Australian and New Zealand Stock Exchanges.

  13. An Outline of the New Norwegian Criminal Code

    Directory of Open Access Journals (Sweden)

    Jørn Jacobsen

    2015-12-01

    Full Text Available This article gives an overview of the new criminal code, its background and content. It maps out the code’s background, the legislative process and central ideas. Furthermore, the article gives an outline of the general criteria for criminal responsibility according to the code, the offences and forms of punishment and other reactions. The article emphasises the most important changes from the previous code of 1902. To some degree, strengths and weaknesses of the new code are addressed.

  14. Women with learning disabilities and access to cervical screening: retrospective cohort study using case control methods

    Science.gov (United States)

    Reynolds, Fiona; Stanistreet, Debbi; Elton, Peter

    2008-01-01

    Background Several studies in the UK have suggested that women with learning disabilities may be less likely to receive cervical screening tests, and a previous local study had found that GPs considered screening unnecessary for women with learning disabilities. This study set out to ascertain whether women with learning disabilities are more likely to be ceased from a cervical screening programme than women without, and to examine the reasons given for ceasing women with learning disabilities. It was carried out in Bury, Heywood-and-Middleton and Rochdale. Methods Using retrospective cohort study methods, women with learning disabilities were identified by Read code, and their cervical screening records were compared with the Call-and-Recall records of women without learning disabilities in order to examine their screening histories. Analysis was carried out using case-control methods – 1:2 (women with learning disabilities : women without learning disabilities), calculating odds ratios. Results 267 women's records were compared with the records of 534 women without learning disabilities. Women with learning disabilities had an odds ratio (OR) of 0.48 (confidence interval (CI) 0.38–0.58; χ² = 72.227; p < 0.001) compared with women without learning disabilities. Conclusion The reasons given for ceasing and/or not screening suggest that merely being coded as having a learning disability is not the sole reason for these actions. There are training needs among smear takers regarding appropriate reasons not to screen and regarding the provision of screening for women with learning disabilities. PMID:18218106
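
    For readers unfamiliar with the case-control arithmetic reported above, a small hedged sketch of computing an odds ratio with a 95% confidence interval from a 2x2 table; the counts below are invented for illustration and are not the study's data.

      import math

      def odds_ratio(a, b, c, d):
          """Odds ratio and 95% CI comparing group 1 (a events, b non-events)
          with group 2 (c events, d non-events)."""
          or_ = (a / b) / (c / d)
          se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
          lo = math.exp(math.log(or_) - 1.96 * se_log_or)
          hi = math.exp(math.log(or_) + 1.96 * se_log_or)
          return or_, (lo, hi)

      # Hypothetical counts: screened / not screened in two groups of women.
      print(odds_ratio(120, 147, 380, 223))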

  15. The Aesthetics of Coding

    DEFF Research Database (Denmark)

    Andersen, Christian Ulrik

    2007-01-01

    Computer art is often associated with computer-generated expressions (digitally manipulated audio/images in music, video, stage design, media facades, etc.). In recent computer art, however, the code-text itself – not the generated output – has become the artwork (Perl Poetry, ASCII Art, obfuscated...... code, etc.). The presentation relates this artistic fascination of code to a media critique expressed by Florian Cramer, claiming that the graphical interface represents a media separation (of text/code and image) causing alienation to the computer’s materiality. Cramer is thus the voice of a new ‘code...... avant-garde’. In line with Cramer, the artists Alex McLean and Adrian Ward (aka Slub) declare: “art-oriented programming needs to acknowledge the conditions of its own making – its poesis.” By analysing the Live Coding performances of Slub (where they program computer music live), the presentation...

  16. The aeroelastic code FLEXLAST

    Energy Technology Data Exchange (ETDEWEB)

    Visser, B. [Stork Product Eng., Amsterdam (Netherlands)

    1996-09-01

    To support the discussion on aeroelastic codes, a description of the code FLEXLAST was given and experiences within benchmarks and measurement programmes were summarized. The code FLEXLAST has been developed since 1982 at Stork Product Engineering (SPE). Since 1992 FLEXLAST has been used by Dutch industries for wind turbine and rotor design. Based on the comparison with measurements, it can be concluded that the main shortcomings of wind turbine modelling lie in the field of aerodynamics, wind field and wake modelling. (au)

  17. Gauge color codes

    DEFF Research Database (Denmark)

    Bombin Palomo, Hector

    2015-01-01

    Color codes are topological stabilizer codes with unusual transversality properties. Here I show that their group of transversal gates is optimal and only depends on the spatial dimension, not the local geometry. I also introduce a generalized, subsystem version of color codes. In 3D they allow t...... the transversal implementation of a universal set of gates by gauge fixing, while error-detecting measurements involve only four or six qubits....

  18. Doubled Color Codes

    Science.gov (United States)

    Bravyi, Sergey

    Combining protection from noise and computational universality is one of the biggest challenges in the fault-tolerant quantum computing. Topological stabilizer codes such as the 2D surface code can tolerate a high level of noise but implementing logical gates, especially non-Clifford ones, requires a prohibitively large overhead due to the need of state distillation. In this talk I will describe a new family of 2D quantum error correcting codes that enable a transversal implementation of all logical gates required for the universal quantum computing. Transversal logical gates (TLG) are encoded operations that can be realized by applying some single-qubit rotation to each physical qubit. TLG are highly desirable since they introduce no overhead and do not spread errors. It has been known before that a quantum code can have only a finite number of TLGs which rules out computational universality. Our scheme circumvents this no-go result by combining TLGs of two different quantum codes using the gauge-fixing method pioneered by Paetznick and Reichardt. The first code, closely related to the 2D color code, enables a transversal implementation of all single-qubit Clifford gates such as the Hadamard gate and the π / 2 phase shift. The second code that we call a doubled color code provides a transversal T-gate, where T is the π / 4 phase shift. The Clifford+T gate set is known to be computationally universal. The two codes can be laid out on the honeycomb lattice with two qubits per site such that the code conversion requires parity measurements for six-qubit Pauli operators supported on faces of the lattice. I will also describe numerical simulations of logical Clifford+T circuits encoded by the distance-3 doubled color code. Based on a joint work with Andrew Cross.

  19. Phonological coding during reading

    Science.gov (United States)

    Leinenger, Mallorie

    2014-01-01

    The exact role that phonological coding (the recoding of written, orthographic information into a sound-based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early (pre-lexical) or that phonological codes come online late (post-lexical)) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eyetracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model (Van Orden, 1987), dual-route model (e.g., Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001), parallel distributed processing model (Seidenberg & McClelland, 1989)) are discussed. PMID:25150679

  20. MORSE Monte Carlo code

    Energy Technology Data Exchange (ETDEWEB)

    Cramer, S.N.

    1984-01-01

    The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described.

  1. Bar Code Labels

    Science.gov (United States)

    1988-01-01

    American Bar Codes, Inc. developed special bar code labels for inventory control of space shuttle parts and other space system components. ABC labels are made in a company-developed anodizing aluminum process and consecutively marked with bar code symbology and human readable numbers. They offer extreme abrasion resistance and indefinite resistance to ultraviolet radiation, capable of withstanding 700 degree temperatures without deterioration and up to 1400 degrees with special designs. They offer high resistance to salt spray, cleaning fluids and mild acids. ABC is now producing these bar code labels commercially for industrial customers who also need labels to resist harsh environments.

  2. MORSE Monte Carlo code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described

  3. Tokamak Systems Code

    International Nuclear Information System (INIS)

    Reid, R.L.; Barrett, R.J.; Brown, T.G.

    1985-03-01

    The FEDC Tokamak Systems Code calculates tokamak performance, cost, and configuration as a function of plasma engineering parameters. This version of the code models experimental tokamaks. It does not currently consider tokamak configurations that generate electrical power or incorporate breeding blankets. The code has a modular (or subroutine) structure to allow independent modeling for each major tokamak component or system. A primary benefit of modularization is that a component module may be updated without disturbing the remainder of the systems code as long as the input to or output from the module remains unchanged.

  4. ARC Code TI: ACCEPT

    Data.gov (United States)

    National Aeronautics and Space Administration — ACCEPT consists of an overall software infrastructure framework and two main software components. The software infrastructure framework consists of code written to...

  5. Mutual Information, Fisher Information, and Efficient Coding.

    Science.gov (United States)

    Wei, Xue-Xin; Stocker, Alan A

    2016-02-01

    Fisher information is generally believed to represent a lower bound on mutual information (Brunel & Nadal, 1998), a result that is frequently used in the assessment of neural coding efficiency. However, we demonstrate that the relation between these two quantities is more nuanced than previously thought. For example, we find that in the small noise regime, Fisher information actually provides an upper bound on mutual information. Generally our results show that it is more appropriate to consider Fisher information as an approximation rather than a bound on mutual information. We analytically derive the correspondence between the two quantities and the conditions under which the approximation is good. Our results have implications for neural coding theories and the link between neural population coding and psychophysically measurable behavior. Specifically, they allow us to formulate the efficient coding problem of maximizing mutual information between a stimulus variable and the response of a neural population in terms of Fisher information. We derive a signature of efficient coding expressed as the correspondence between the population Fisher information and the distribution of the stimulus variable. The signature is more general than previously proposed solutions that rely on specific assumptions about the neural tuning characteristics. We demonstrate that it can explain measured tuning characteristics of cortical neural populations that do not agree with previous models of efficient coding.
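
    As a purely illustrative companion to the result summarized above, the following Python sketch compares the exact mutual information of a simple Gaussian channel with the Fisher-information-based expression; the Gaussian setting, the variances, and all variable names are assumptions made here for illustration and are not taken from the paper.

      # Illustrative sketch (not from the paper): compare the exact mutual
      # information with the Fisher-information-based expression for a Gaussian
      # channel y = theta + noise, theta ~ N(0, s^2), noise ~ N(0, sigma^2).
      import numpy as np

      def true_mutual_information(s2, sigma2):
          # Exact MI of the Gaussian channel, in bits.
          return 0.5 * np.log2(1.0 + s2 / sigma2)

      def fisher_expression(s2, sigma2):
          # I_F = H(theta) - E[0.5 * log(2*pi*e / J(theta))] with J = 1/sigma^2,
          # which reduces to 0.5 * log2(s2 / sigma2) for this channel.
          return 0.5 * np.log2(s2 / sigma2)

      s2 = 1.0  # stimulus variance (assumed)
      for sigma2 in [0.5, 0.1, 0.01, 0.001]:  # shrinking noise
          # The two quantities converge as the noise becomes small.
          print(sigma2, true_mutual_information(s2, sigma2), fisher_expression(s2, sigma2))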

  6. On {\\sigma}-LCD codes

    OpenAIRE

    Carlet, Claude; Mesnager, Sihem; Tang, Chunming; Qi, Yanfeng

    2017-01-01

    Linear complementary pairs (LCP) of codes play an important role in armoring implementations against side-channel attacks and fault injection attacks. One of the most common ways to construct LCP of codes is to use Euclidean linear complementary dual (LCD) codes. In this paper, we first introduce the concept of linear codes with $\\sigma$ complementary dual ($\\sigma$-LCD), which includes known Euclidean LCD codes, Hermitian LCD codes, and Galois LCD codes. As Euclidean LCD codes, $\\sigma$-LCD ...

  7. Locality-Constrained Discriminative Learning and Coding

    Science.gov (United States)

    2015-06-12

    methods, i.e. FDDL [27], DLRD [16], D2L2R2 [12] and DPL [7]. In each experiment, we keep all the steps the same as those of the baselines except for the ... As the percentage of corruption increases, our algorithm performs the best constantly. The performance of FDDL as well as DPL, LRC and LDA drops rapidly, by contrast ... Recognition rates for different numbers of training samples per class (5 training images): DPL [7] 75.17±1.86, D2L2R2 [12] 75.96±1.20, DLRD [16] 76.17, ...

  8. Online Dictionary Learning for Sparse Coding

    Science.gov (United States)

    2009-04-01

    following Bottou (1998), the convergence of quasi-martingales (Fisk, 1965). Our analysis is limited to the basic version of the algorithm, although it ... is a quasi-martingale. To do so, denoting by F_t the filtration of the past information, a theorem by Fisk (1965) states that if the positive sum ∑_{t=1}^∞ E[max(E[u_{t+1} − u_t | F_t], 0)] converges, then u_t is a quasi-martingale which converges with probability one. Using some results on empirical pro...
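
    A minimal sketch of online (mini-batch) dictionary learning for sparse coding, using scikit-learn's MiniBatchDictionaryLearning as an off-the-shelf stand-in rather than the authors' implementation; the synthetic data, dictionary size, and penalty are assumed values chosen only for illustration.

      # Minimal sketch (scikit-learn stand-in, not the authors' implementation):
      # online (mini-batch) dictionary learning with a sparse-coding step per batch.
      import numpy as np
      from sklearn.decomposition import MiniBatchDictionaryLearning

      rng = np.random.default_rng(0)
      X = rng.standard_normal((500, 64))           # 500 synthetic signals of dimension 64

      dico = MiniBatchDictionaryLearning(n_components=32,   # number of dictionary atoms
                                         alpha=1.0,         # sparsity penalty
                                         batch_size=10,
                                         random_state=0)
      D = dico.fit(X).components_                  # learned dictionary, shape (32, 64)
      codes = dico.transform(X)                    # sparse codes for each signal

      print(D.shape, codes.shape, np.mean(codes != 0))   # dictionary, codes, fraction of nonzeros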

  9. Reading Comprehension Instruction for Students with Learning Disabilities, 1995-2006: A Meta-Analysis

    Science.gov (United States)

    Berkeley, Sheri; Scruggs, Thomas E.; Mastropieri, Margo A.

    2010-01-01

    Meta-analysis procedures were employed to synthesize findings of research for improving reading comprehension of students with learning disabilities published in the decade following previous meta-analytic investigations. Forty studies, published between 1995 and 2006, were identified and coded. Nearly 2,000 students served as participants.…

  10. On Coding Non-Contiguous Letter Combinations

    Directory of Open Access Journals (Sweden)

    Frédéric eDandurand

    2011-06-01

    Full Text Available Starting from the hypothesis that printed word identification initially involves the parallel mapping of visual features onto location-specific letter identities, we analyze the type of information that would be involved in optimally mapping this location-specific orthographic code onto a location-invariant lexical code. We assume that some intermediate level of coding exists between individual letters and whole words, and that this involves the representation of letter combinations. We then investigate the nature of this intermediate level of coding given the constraints of optimality. This intermediate level of coding is expected to compress data while retaining as much information as possible about word identity. Information conveyed by letters is a function of how much they constrain word identity and how visible they are. Optimization of this coding is a combination of minimizing resources (using the most compact representations and maximizing information. We show that in a large proportion of cases, non-contiguous letter sequences contain more information than contiguous sequences, while at the same time requiring less precise coding. Moreover, we found that the best predictor of human performance in orthographic priming experiments was within-word ranking of conditional probabilities, rather than average conditional probabilities. We conclude that from an optimality perspective, readers learn to select certain contiguous and non-contiguous letter combinations as information that provides the best cue to word identity.
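
    The following toy Python sketch illustrates the kind of calculation described above: it scores contiguous and non-contiguous ordered letter pairs of a word by how strongly they constrain word identity within a small lexicon. The lexicon, the uniform prior, and the simple bits-of-information measure are assumptions made here for illustration, not the authors' corpus or exact metric.

      # Toy sketch (illustrative assumptions, not the authors' corpus or measure):
      # score ordered letter pairs by how strongly they constrain word identity.
      import math
      from itertools import combinations

      lexicon = ["trail", "trial", "train", "brain", "braid", "tread"]  # toy lexicon

      def ordered_pairs(word):
          # All ordered letter pairs of a word, with the gap between the letters.
          return [((word[i], word[j]), j - i - 1)
                  for i, j in combinations(range(len(word)), 2)]

      def matches(cue):
          # Words containing the two letters of the cue in the same order (any gap).
          return [w for w in lexicon
                  if any(pair == cue for pair, _ in ordered_pairs(w))]

      def information_bits(cue):
          # Reduction of uncertainty about word identity under a uniform prior.
          return math.log2(len(lexicon) / len(matches(cue)))

      for cue, gap in ordered_pairs("trail"):
          kind = "contiguous" if gap == 0 else f"gap {gap}"
          print(cue, kind, round(information_bits(cue), 2), "bits")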

  11. Books average previous decade of economic misery.

    Science.gov (United States)

    Bentley, R Alexander; Acerbi, Alberto; Ormerod, Paul; Lampos, Vasileios

    2014-01-01

    For the 20th century since the Depression, we find a strong correlation between a 'literary misery index' derived from English language books and a moving average of the previous decade of the annual U.S. economic misery index, which is the sum of inflation and unemployment rates. We find a peak in the goodness of fit at 11 years for the moving average. The fit between the two misery indices holds when using different techniques to measure the literary misery index, and this fit is significantly better than other possible correlations with different emotion indices. To check the robustness of the results, we also analysed books written in the German language and obtained very similar correlations with the German economic misery index. The results suggest that millions of books published every year average the authors' shared economic experiences over the past decade.
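
    A minimal pandas sketch of the correlation procedure described above (misery index as inflation plus unemployment, an 11-year trailing moving average, and a correlation with a literary misery series); the series used here are synthetic placeholders, since the actual book and economic data are not reproduced in this record.

      # Sketch of the described procedure with synthetic data (the real literary
      # and economic series are not reproduced here).
      import numpy as np
      import pandas as pd

      rng = np.random.default_rng(1)
      years = pd.RangeIndex(1900, 2001, name="year")
      inflation = pd.Series(rng.normal(3, 2, len(years)), index=years)
      unemployment = pd.Series(rng.normal(6, 2, len(years)), index=years)
      literary_misery = pd.Series(rng.normal(0, 1, len(years)), index=years)

      economic_misery = inflation + unemployment             # classic misery index
      trailing = economic_misery.rolling(window=11).mean()   # previous-decade average

      # Correlate the literary index with the trailing average of economic misery.
      print(literary_misery.corr(trailing))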

  12. Induced vaginal birth after previous caesarean section

    Directory of Open Access Journals (Sweden)

    Akylbek Tussupkaliyev

    2016-11-01

    Full Text Available Introduction The rate of operative birth by Caesarean section is constantly rising. In Kazakhstan, it reaches 27 per cent. Research data confirm that the percentage of successful vaginal births after previous Caesarean section is 50–70 per cent. It remains unclear how safe the induction of vaginal birth after Caesarean (VBAC) is. Methodology The studied techniques of labour induction were amniotomy of the foetal bladder with the vulsellum ramus, intravaginal administration of E1 prostaglandin (Misoprostol), and intravenous infusion of Oxytocin-Richter. The assessment of readiness of the birth canal was conducted by Bishop's score; the labour course was assessed by a partogram. The effectiveness of labour induction techniques was assessed by the number of administered doses, the time of onset of regular labour, the course of labour and the postpartum period and the presence of complications, and the course of the early neonatal period, which implied the assessment of the child's condition, described in the newborn development record. The foetus was assessed by medical ultrasound and antenatal and intranatal cardiotocography (CTG). Obtained results were analysed with SAS statistical processing software. Results The overall percentage of successful births with intravaginal administration of Misoprostol was 93 per cent (83 cases). This percentage was higher than in the amniotomy group (relative risk (RR) 11.7) and was similar to the oxytocin group (RR 0.83). Amniotomy was effective in 54 per cent of cases (39), when it induced regular labour. Intravenous oxytocin infusion was effective in 94 per cent of cases (89). This percentage was higher than that with amniotomy (RR 12.5). Conclusions The success of vaginal delivery after previous Caesarean section can be achieved in almost 70 per cent of cases. Moreover, labour induction does not decrease this indicator, which remains within population boundaries.

  13. Opening up codings?

    DEFF Research Database (Denmark)

    Steensig, Jakob; Heinemann, Trine

    2015-01-01

    We welcome Tanya Stivers’s discussion (Stivers, 2015/this issue) of coding social interaction and find that her descriptions of the processes of coding open up important avenues for discussion, among other things of the precise ad hoc considerations that researchers need to bear in mind, both when...

  14. Error Correcting Codes -34 ...

    Indian Academy of Sciences (India)

    Priti Shankar, Department of Computer Science and Automation, Indian Institute of Science, Bangalore. Her interests are in Theoretical Computer Science. Series Article: Error Correcting Codes – 2. The Hamming Codes. In the first article of this series we showed how redundancy introduced into a message transmitted over a noisy channel could improve the reliability of transmission.
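
    To make the redundancy idea concrete, here is a standard textbook Hamming(7,4) encoder/decoder in Python; this is a generic construction, not code taken from the article.

      # Textbook Hamming(7,4) sketch (not code from the article): 4 data bits are
      # protected by 3 parity bits; any single bit error can be located and fixed.
      import numpy as np

      G = np.array([[1, 0, 0, 0, 1, 1, 0],   # generator matrix (systematic form)
                    [0, 1, 0, 0, 1, 0, 1],
                    [0, 0, 1, 0, 0, 1, 1],
                    [0, 0, 0, 1, 1, 1, 1]])
      H = np.array([[1, 1, 0, 1, 1, 0, 0],   # parity-check matrix
                    [1, 0, 1, 1, 0, 1, 0],
                    [0, 1, 1, 1, 0, 0, 1]])

      def encode(bits):
          return bits @ G % 2

      def decode(received):
          syndrome = H @ received % 2
          if syndrome.any():                              # locate the flipped bit
              error_pos = next(i for i in range(7)
                               if np.array_equal(H[:, i], syndrome))
              received = received.copy()
              received[error_pos] ^= 1
          return received[:4]                             # systematic data bits

      codeword = encode(np.array([1, 0, 1, 1]))
      corrupted = codeword.copy()
      corrupted[5] ^= 1                                   # flip one bit in transit
      print(decode(corrupted))                            # recovers [1 0 1 1]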

  15. Decoding Codes on Graphs

    Indian Academy of Sciences (India)

    set up a well defined goal - that of achieving a performance bound set by the noisy channel coding theorem, proved in the paper. Whereas the goal appeared elusive twenty five years ago, today, there are practical codes and decoding algorithms that come close to achieving it. It is interesting to note that all known...

  16. Error Correcting Codes

    Indian Academy of Sciences (India)

    Error Correcting Codes - Reed Solomon Codes. Priti Shankar. Series Article, Resonance – Journal of Science Education, Volume 2, Issue 3, March 1997, pp 33-47. Permanent link: http://www.ias.ac.in/article/fulltext/reso/002/03/0033-0047

  17. Insurance billing and coding.

    Science.gov (United States)

    Napier, Rebecca H; Bruelheide, Lori S; Demann, Eric T K; Haug, Richard H

    2008-07-01

    The purpose of this article is to highlight the importance of understanding various numeric and alpha-numeric codes for accurately billing dental and medically related services to private pay or third-party insurance carriers. In the United States, common dental terminology (CDT) codes are most commonly used by dentists to submit claims, whereas current procedural terminology (CPT) and International Classification of Diseases, Ninth Revision, Clinical Modification (ICD.9.CM) codes are more commonly used by physicians to bill for their services. The CPT and ICD.9.CM coding systems complement each other in that CPT codes provide the procedure and service information and ICD.9.CM codes provide the reason or rationale for a particular procedure or service. These codes are more commonly used for "medical necessity" determinations, and general dentists and specialists who routinely perform care, including trauma-related care, biopsies, and dental treatment as a result of or in anticipation of a cancer-related treatment, are likely to use these codes. Claim submissions for care provided can be completed electronically or by means of paper forms.

  18. Codes of Conduct

    Science.gov (United States)

    Million, June

    2004-01-01

    Most schools have a code of conduct, pledge, or behavioral standards, set by the district or school board with the school community. In this article, the author features some schools that created a new vision of instilling codes of conduct in students based on work quality, respect, safety and courtesy. She suggests that communicating the code…

  19. Decoding Codes on Graphs

    Indian Academy of Sciences (India)

    Department of Computer Science and Automation, IISc. Their research addresses ... The fifty-five-year-old history of error correcting codes began with Claude Shannon's path-breaking paper entitled 'A ... given the limited computing power available then, Gallager's codes were not considered practical. A landmark...

  20. Error Correcting Codes

    Indian Academy of Sciences (India)

    Error Correcting Codes - Reed Solomon Codes. Priti Shankar. Series Article, Resonance – Journal of Science Education, Volume 2, Issue 3, March ... Author Affiliations: Priti Shankar, Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India ...

  1. Decoding Codes on Graphs

    Indian Academy of Sciences (India)

    Decoding Codes on Graphs - Low Density Parity Check Codes. A S Madhu and Aditya Nori. General Article, Resonance – Journal of Science Education, Volume 8, Issue 9, September 2003, pp 49-59.

  2. READING A NEURAL CODE

    NARCIS (Netherlands)

    BIALEK, W; RIEKE, F; VANSTEVENINCK, RRD; WARLAND, D

    1991-01-01

    Traditional approaches to neural coding characterize the encoding of known stimuli in average neural responses. Organisms face nearly the opposite task - extracting information about an unknown time-dependent stimulus from short segments of a spike train. Here the neural code was characterized from

  3. Fracture flow code

    International Nuclear Information System (INIS)

    Dershowitz, W; Herbert, A.; Long, J.

    1989-03-01

    The hydrology of the SCV site will be modelled utilizing discrete fracture flow models. These models are complex and cannot be fully verified by comparison to analytical solutions. The best approach for verification of these codes is therefore cross-verification between different codes. This is complicated by the variation in assumptions and solution techniques utilized in different codes. Cross-verification procedures are defined which allow comparison of the codes developed by Harwell Laboratory, Lawrence Berkeley Laboratory, and Golder Associates Inc. Six cross-verification datasets are defined for deterministic and stochastic verification of geometric and flow features of the codes. Additional datasets for verification of transport features will be documented in a future report. (13 figs., 7 tabs., 10 refs.) (authors)

  4. Validation of thermalhydraulic codes

    International Nuclear Information System (INIS)

    Wilkie, D.

    1992-01-01

    Thermalhydraulic codes need to be validated against experimental data collected over a wide range of situations if they are to be relied upon. A good example is provided by the nuclear industry, where codes are used for safety studies and for determining operating conditions. Errors in the codes could lead to financial penalties, to the incorrect estimation of the consequences of accidents and even to the accidents themselves. Comparison between prediction and experiment is often described qualitatively or in approximate terms, e.g. "agreement is within 10%". A quantitative method is preferable, especially when several competing codes are available. The codes can then be ranked in order of merit. Such a method is described. (Author)

  5. Greedy vs. L1 convex optimization in sparse coding

    DEFF Research Database (Denmark)

    Ren, Huamin; Pan, Hong; Olsen, Søren Ingvor

    2015-01-01

    Sparse representation has been applied successfully in many image analysis applications, including abnormal event detection, in which a baseline is to learn a dictionary from the training data and detect anomalies from its sparse codes. During this procedure, sparse codes which can be achieved...... their performance from various aspects to better understand their applicability, including computation time, reconstruction error, sparsity, detection...
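
    A small illustrative comparison of greedy and L1-based sparse coding over the same dictionary, using scikit-learn's SparseCoder as a stand-in for the solvers compared in the paper; the random dictionary, signals, and parameter values are assumptions made here for illustration.

      # Illustrative comparison (scikit-learn stand-ins, not the paper's code):
      # greedy OMP vs. L1 (LARS-based lasso) sparse coding over one dictionary.
      import numpy as np
      from sklearn.decomposition import SparseCoder

      rng = np.random.default_rng(0)
      D = rng.standard_normal((50, 64))                  # 50 atoms of dimension 64
      D /= np.linalg.norm(D, axis=1, keepdims=True)      # unit-norm atoms
      X = rng.standard_normal((20, 64))                  # signals to encode

      for algo, kwargs in [("omp", {"transform_n_nonzero_coefs": 5}),
                           ("lasso_lars", {"transform_alpha": 0.1})]:
          coder = SparseCoder(dictionary=D, transform_algorithm=algo, **kwargs)
          codes = coder.transform(X)
          err = np.linalg.norm(X - codes @ D) / np.linalg.norm(X)
          print(algo, "sparsity:", np.mean(codes != 0), "relative error:", round(err, 3))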

  6. APPLYING COGNITIVE CODE TOWARDS INDONESIAN EFL LEARNERS’ WRITING COMPETENCE IMPROVEMENT

    Directory of Open Access Journals (Sweden)

    Ita Juita

    2014-06-01

    Full Text Available This classroom action research (CAR) addresses students' problems in a writing class using two cycles of the Kemmis and McTaggart model. Three crucial instruments were used: the students' learning journals, which reveal the students' thinking in relation to the cognitive code and the writing material; the researcher's journal; and a questionnaire. The students' problems with the writing subject occur in one class of the English Department of the University of Kuningan, West Java, Indonesia, where learners find it difficult to turn words into sentences. Applying the cognitive code in this CAR is the strategy used to find out what the students need, by asking them to use tools such as the learning journal so that they are able to describe their difficulties based on their learning experiences in class. The cognitive-code approach views students as thinking beings who learn on the basis of their learning experience. The students' writing competence at the beginning of this research is 40, whereas after applying the cognitive code as the method in the teaching and learning process the class average reaches 64.5 on the post-test. The normalized gain used to measure the students' writing development is 0.7, which means the students' writing improvement is moderate. The students' attitude toward the cognitive code, taken from rating scales, is 82%. Based on these data, it can be concluded that the cognitive code is an effective method in the teaching of writing.

  7. Independent code assessment: Sandia-proposed accuracy quantification methodology

    International Nuclear Information System (INIS)

    Kmetyk, L.N.; Elrick, M.G.; Byers, R.K.; Buxton, L.D.

    1986-08-01

    Code assessment is performed to obtain a judgement of the accuracy and validity of any given code over its range of applicability. To achieve this, the codes are exercised against a matrix of experiments, resulting in comparisons between code predictions and measured data. Thus far, conclusions on code accuracy drawn from previous independent assessment studies have appeared to be mostly phenomenological and/or qualitative. Therefore, there is an increasing emphasis within the NRC code assessment effort to formulate more coherent, quantitative conclusions on the capabilities and accuracies of the codes. This can be done by using statistically based methods to evaluate the overall accuracy of the codes with respect to particular key phenomena, based on the same code analyses of individual experiments. One possibility for a statistically based code accuracy quantification methodology is presented in this report. This method yields nested accuracy estimates for any given time period, including the code accuracy bias and the average variation in accuracy for the overall behavior. Quantifying assessment results using a common method will allow the results of a number of independent code assessors to be combined to provide broad-based information on code accuracy for applications to regulatory needs and other power plant studies and to define further code development needs and priorities.

  8. Report number codes

    International Nuclear Information System (INIS)

    Nelson, R.N.

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name

  9. Report number codes

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, R.N. (ed.)

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.

  10. Advanced Code for Photocathode Design

    Energy Technology Data Exchange (ETDEWEB)

    Ives, Robert Lawrence [Calabazas Creek Research, Inc., San Mateo, CA (United States); Jensen, Kevin [Naval Research Lab. (NRL), Washington, DC (United States); Montgomery, Eric [Univ. of Maryland, College Park, MD (United States); Bui, Thuc [Calabazas Creek Research, Inc., San Mateo, CA (United States)

    2015-12-15

    The Phase I activity demonstrated that PhotoQE could be upgraded and modified to allow input using a graphical user interface. Specific calls to platform-dependent (e.g. IMSL) functions were removed, and Fortran77 components were rewritten for Fortran95 compliance. The subroutines, specifically the common block structures and shared data parameters, were reworked to allow the GUI to update material parameter data, and the system was targeted for desktop personal computer operation. The new structure overcomes the previously rigid and unmodifiable library structures by implementing new materials library data sets and repositioning the library values in external files. Material data may originate from published literature or experimental measurements. Further optimization and restructuring would allow custom and specific emission models for beam codes that rely on parameterized photoemission algorithms. These would be based on simplified and parametric representations updated and extended from previous versions (e.g., Modified Fowler-DuBridge, Modified Three-Step, etc.).

  11. Laser propagation code study

    OpenAIRE

    Rockower, Edward B.

    1985-01-01

    A number of laser propagation codes have been assessed as to their suitability for modeling Army High Energy Laser (HEL) weapons used in an anti-sensor mode. We identify a number of areas in which systems analysis HEL codes are deficient. Most notably, available HEL scaling law codes model the laser aperture as circular, possibly with a fixed (e.g. 10%) obscuration. However, most HELs have rectangular apertures with up to 30% obscuration. We present a beam-quality/aperture shape scaling rela...

  12. Transport theory and codes

    International Nuclear Information System (INIS)

    Clancy, B.E.

    1986-01-01

    This chapter begins with a neutron transport equation which includes the one dimensional plane geometry problems, the one dimensional spherical geometry problems, and numerical solutions. The section on the ANISN code and its look-alikes covers problems which can be solved; eigenvalue problems; outer iteration loop; inner iteration loop; and finite difference solution procedures. The input and output data for ANISN are also discussed. Two dimensional problems, such as those solved by the DOT code, are then given. Finally, an overview of Monte Carlo methods and codes is presented.

  13. Gravity inversion code

    International Nuclear Information System (INIS)

    Burkhard, N.R.

    1979-01-01

    The gravity inversion code applies stabilized linear inverse theory to determine the topography of a subsurface density anomaly from Bouguer gravity data. The gravity inversion program consists of four source codes: SEARCH, TREND, INVERT, and AVERAGE. TREND and INVERT are used iteratively to converge on a solution. SEARCH forms the input gravity data files for Nevada Test Site data. AVERAGE performs a covariance analysis on the solution. This document describes the necessary input files and the proper operation of the code. 2 figures, 2 tables
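
    As a generic illustration of stabilized linear inversion (not the SEARCH/TREND/INVERT source code), the following numpy sketch solves a damped least-squares problem of the kind described above; the forward operator, data, and damping parameter are synthetic assumptions.

      # Generic sketch of stabilized (Tikhonov-regularized) linear inversion;
      # not the INVERT source code, and the kernel and data here are synthetic.
      import numpy as np

      rng = np.random.default_rng(0)
      G = rng.standard_normal((40, 25))                  # linearized forward operator (assumed)
      m_true = rng.standard_normal(25)                   # "true" model (e.g. interface depths)
      d = G @ m_true + 0.05 * rng.standard_normal(40)    # noisy Bouguer-like data

      lam = 0.1                                          # stabilization (damping) parameter
      m_est = np.linalg.solve(G.T @ G + lam * np.eye(25), G.T @ d)

      print("data misfit:", np.linalg.norm(G @ m_est - d))
      print("model error:", np.linalg.norm(m_est - m_true))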

  14. Decoding the productivity code

    DEFF Research Database (Denmark)

    Hansen, David

    , that is, the productivity code of the 21st century, is dissolved. Today, organizations are pressured for operational efficiency, often in terms of productivity, due to increased global competition, demographical changes, and use of natural resources. Taylor’s principles for rationalization founded...... that swing between rationalization and employee development. The productivity code is the lack of alternatives to this ineffective approach. This thesis decodes the productivity code based on the results from a 3-year action research study at a medium-sized manufacturing facility. During the project period...

  15. CALIPSOS code report

    International Nuclear Information System (INIS)

    Fanselau, R.W.; Thakkar, J.G.; Hiestand, J.W.; Cassell, D.S.

    1980-04-01

    CALIPSOS is a steady-state three-dimensional flow distribution code which predicts the fluid dynamics and heat transfer interactions of the secondary two-phase flow in a steam generator. The mathematical formulation is sufficiently general to accommodate two fluid models described by separate gas and liquid momentum equations. However, if the user selects the homogeneous flow option, the code automatically equates the gas and liquid phase velocities (thereby reducing the number of momentum equations solved to three) and utilizes a homogeneous density mixture. This report presents the basic features of the CALIPSOS code and includes assumptions, equations solved, the finite-difference grid, and highlights of the solution procedure

  16. Cryptography cracking codes

    CERN Document Server

    2014-01-01

    While cracking a code might seem like something few of us would encounter in our daily lives, it is actually far more prevalent than we may realize. Anyone who has had personal information taken because of a hacked email account can understand the need for cryptography and the importance of encryption-essentially the need to code information to keep it safe. This detailed volume examines the logic and science behind various ciphers, their real world uses, how codes can be broken, and the use of technology in this oft-overlooked field.

  17. Optimal patch code design via device characterization

    Science.gov (United States)

    Wu, Wencheng; Dalal, Edul N.

    2012-01-01

    In many color measurement applications, such as those for color calibration and profiling, "patch code" has been used successfully for job identification and automation to reduce operator errors. A patch code is similar to a barcode, but is intended primarily for use in measurement devices that cannot read barcodes due to limited spatial resolution, such as spectrophotometers. There is an inherent tradeoff between decoding robustness and the number of code levels available for encoding. Previous methods have attempted to address this tradeoff, but those solutions have been sub-optimal. In this paper, we propose a method to design optimal patch codes via device characterization. The tradeoff between decoding robustness and the number of available code levels is optimized in terms of printing and measurement efforts, and decoding robustness against noises from the printing and measurement devices. Effort is drastically reduced relative to previous methods because print-and-measure is minimized through modeling and the use of existing printer profiles. Decoding robustness is improved by distributing the code levels in CIE Lab space rather than in CMYK space.

  18. Interactive QR code beautification with full background image embedding

    Science.gov (United States)

    Lin, Lijian; Wu, Song; Liu, Sijiang; Jiang, Bo

    2017-06-01

    QR (Quick Response) code is a kind of two dimensional barcode that was first developed in the automotive industry. Nowadays, QR codes have been widely used in commercial applications like product promotion, mobile payment, product information management, etc. Traditional QR codes in accordance with the international standard are reliable and fast to decode, but lack the aesthetic appearance needed to present visual information to customers. In this work, we present a novel interactive method to generate aesthetic QR codes. Given the information to be encoded and an image to be used as the full QR code background, our method accepts users' interactive strokes as hints for removing undesired parts of QR code modules, based on the support of the QR code error correction mechanism and background color thresholds. Compared to previous approaches, our method follows the intention of the QR code designer and can thus achieve a more pleasing result, while keeping high machine readability.
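
    The error-correction headroom that such beautification methods rely on can be illustrated with the Python qrcode package: generating a code at the highest error-correction level (level H, which can recover roughly 30% of damaged codewords) leaves capacity that module removal or restyling can spend. This sketch is not the authors' tool; the payload and file name are placeholders.

      # Sketch using the `qrcode` package (not the authors' tool): generate a code
      # at the highest error-correction level, whose recovery headroom is what
      # beautification methods spend when removing or restyling modules.
      import qrcode

      qr = qrcode.QRCode(error_correction=qrcode.constants.ERROR_CORRECT_H,
                         box_size=10, border=4)
      qr.add_data("https://example.com/product-page")   # placeholder payload
      qr.make(fit=True)

      img = qr.make_image(fill_color="black", back_color="white")
      img.save("qr_high_ecc.png")                       # placeholder file name
      print(qr.version, len(qr.modules), "modules per side")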

  19. On Identifying which Intermediate Nodes Should Code in Multicast Networks

    DEFF Research Database (Denmark)

    Pinto, Tiago; Roetter, Daniel Enrique Lucani; Médard, Muriel

    2013-01-01

    Network coding has the potential to enhance energy efficiency of multicast sessions by providing optimal communication subgraphs for the transmission of the data. However, the coding requirement at intermediate nodes may introduce additional complexity and energy consumption in order to code the data packets. Previous work has shown that in lossless wireline networks, the performance of tree-packing mechanisms is comparable to network coding, albeit with added complexity at the time of computing the trees. This means that most nodes in the network need not code. Thus, mechanisms that identify intermediate nodes that do require coding are instrumental for the efficient operation of coded networks and can have a significant impact on overall energy consumption. We present a distributed, low complexity algorithm that allows every node to identify if it should code and, if so, through what output link...

  20. The fast code

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, L.N.; Wilson, R.E. [Oregon State Univ., Dept. of Mechanical Engineering, Corvallis, OR (United States)

    1996-09-01

    The FAST Code which is capable of determining structural loads on a flexible, teetering, horizontal axis wind turbine is described and comparisons of calculated loads with test data are given at two wind speeds for the ESI-80. The FAST Code models a two-bladed HAWT with degrees of freedom for blade bending, teeter, drive train flexibility, yaw, and windwise and crosswind tower motion. The code allows blade dimensions, stiffnesses, and weights to differ and models tower shadow, wind shear, and turbulence. Additionally, dynamic stall is included as are delta-3 and an underslung rotor. Load comparisons are made with ESI-80 test data in the form of power spectral density, rainflow counting, occurrence histograms, and azimuth averaged bin plots. It is concluded that agreement between the FAST Code and test results is good. (au)

  1. Fulcrum Network Codes

    DEFF Research Database (Denmark)

    2015-01-01

    Fulcrum network codes, which are a network coding framework, achieve three objectives: (i) to reduce the overhead per coded packet to almost 1 bit per source packet; (ii) to operate the network using only low field size operations at intermediate nodes, dramatically reducing complexity...... in the network; and (iii) to deliver an end-to-end performance that is close to that of a high field size network coding system for high-end receivers while simultaneously catering to low-end ones that can only decode in a lower field size. Sources may encode using a high field size expansion to increase...... the number of dimensions seen by the network using a linear mapping. Receivers can tradeoff computational effort with network delay, decoding in the high field size, the low field size, or a combination thereof....

  2. Code Disentanglement: Initial Plan

    Energy Technology Data Exchange (ETDEWEB)

    Wohlbier, John Greaton [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Kelley, Timothy M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rockefeller, Gabriel M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Calef, Matthew Thomas [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-01-27

    The first step to making more ambitious changes in the EAP code base is to disentangle the code into a set of independent, levelized packages. We define a package as a collection of code, most often across a set of files, that provides a defined set of functionality; a package a) can be built and tested as an entity and b) fits within an overall levelization design. Each package contributes one or more libraries, or an application that uses the other libraries. A package set is levelized if the relationships between packages form a directed, acyclic graph and each package uses only packages at lower levels of the diagram (in Fortran this relationship is often describable by the use relationship between modules). Independent packages permit independent, and therefore parallel, development. The packages form separable units for the purposes of development and testing. This is a proven path for enabling finer-grained changes to a complex code.

  3. Coded Random Access

    DEFF Research Database (Denmark)

    Paolini, Enrico; Stefanovic, Cedomir; Liva, Gianluigi

    2015-01-01

    The rise of machine-to-machine communications has rekindled the interest in random access protocols as a support for a massive number of uncoordinatedly transmitting devices. The legacy ALOHA approach is developed under a collision model, where slots containing collided packets are considered...... as waste. However, if the common receiver (e.g., base station) is capable to store the collision slots and use them in a transmission recovery process based on successive interference cancellation, the design space for access protocols is radically expanded. We present the paradigm of coded random access......, in which the structure of the access protocol can be mapped to a structure of an erasure-correcting code defined on graph. This opens the possibility to use coding theory and tools for designing efficient random access protocols, offering markedly better performance than ALOHA. Several instances of coded...

  4. VT ZIP Code Areas

    Data.gov (United States)

    Vermont Center for Geographic Information — (Link to Metadata) A ZIP Code Tabulation Area (ZCTA) is a statistical geographic entity that approximates the delivery area for a U.S. Postal Service five-digit...

  5. Induction technology optimization code

    International Nuclear Information System (INIS)

    Caporaso, G.J.; Brooks, A.L.; Kirbie, H.C.

    1992-01-01

    A code has been developed to evaluate relative costs of induction accelerator driver systems for relativistic klystrons. The code incorporates beam generation, transport and pulsed power system constraints to provide an integrated design tool. The code generates an injector/accelerator combination which satisfies the top level requirements and all system constraints once a small number of design choices have been specified (rise time of the injector voltage and aspect ratio of the ferrite induction cores, for example). The code calculates dimensions of accelerator mechanical assemblies and values of all electrical components. Cost factors for machined parts, raw materials and components are applied to yield a total system cost. These costs are then plotted as a function of the two design choices to enable selection of an optimum design based on various criteria. (Author) 11 refs., 3 figs

  6. MELCOR Accident Consequence Code System (MACCS)

    International Nuclear Information System (INIS)

    Jow, H.N.; Sprung, J.L.; Ritchie, L.T.; Rollstin, J.A.; Chanin, D.I.

    1990-02-01

    This report describes the MACCS computer code. The purpose of this code is to simulate the impact of severe accidents at nuclear power plants on the surrounding environment. MACCS has been developed for the US Nuclear Regulatory Commission to replace the previously used CRAC2 code, and it incorporates many improvements in modeling flexibility in comparison to CRAC2. The principal phenomena considered in MACCS are atmospheric transport, mitigative actions based on dose projection, dose accumulation by a number of pathways including food and water ingestion, early and latent health effects, and economic costs. The MACCS code can be used for a variety of applications. These include (1) probabilistic risk assessment (PRA) of nuclear power plants and other nuclear facilities, (2) sensitivity studies to gain a better understanding of the parameters important to PRA, and (3) cost-benefit analysis. This report is composed of three volumes. Volume I, the User's Guide, describes the input data requirements of the MACCS code and provides directions for its use as illustrated by three sample problems. Volume II, the Model Description, describes the underlying models that are implemented in the code, and Volume III, the Programmer's Reference Manual, describes the code's structure and database management. 59 refs., 14 figs., 15 tabs

  7. MELCOR Accident Consequence Code System (MACCS)

    Energy Technology Data Exchange (ETDEWEB)

    Jow, H.N.; Sprung, J.L.; Ritchie, L.T. (Sandia National Labs., Albuquerque, NM (USA)); Rollstin, J.A. (GRAM, Inc., Albuquerque, NM (USA)); Chanin, D.I. (Technadyne Engineering Consultants, Inc., Albuquerque, NM (USA))

    1990-02-01

    This report describes the MACCS computer code. The purpose of this code is to simulate the impact of severe accidents at nuclear power plants on the surrounding environment. MACCS has been developed for the US Nuclear Regulatory Commission to replace the previously used CRAC2 code, and it incorporates many improvements in modeling flexibility in comparison to CRAC2. The principal phenomena considered in MACCS are atmospheric transport, mitigative actions based on dose projection, dose accumulation by a number of pathways including food and water ingestion, early and latent health effects, and economic costs. The MACCS code can be used for a variety of applications. These include (1) probabilistic risk assessment (PRA) of nuclear power plants and other nuclear facilities, (2) sensitivity studies to gain a better understanding of the parameters important to PRA, and (3) cost-benefit analysis. This report is composed of three volumes. Volume I, the User's Guide, describes the input data requirements of the MACCS code and provides directions for its use as illustrated by three sample problems. Volume II, the Model Description, describes the underlying models that are implemented in the code, and Volume III, the Programmer's Reference Manual, describes the code's structure and database management. 59 refs., 14 figs., 15 tabs.

  8. Parallelization of Subchannel Analysis Code MATRA

    International Nuclear Information System (INIS)

    Kim, Seongjin; Hwang, Daehyun; Kwon, Hyouk

    2014-01-01

    A stand-alone calculation with the MATRA code takes a reasonable amount of computing time for thermal margin calculations, while a relatively considerable time is needed to solve whole-core pin-by-pin problems. In addition, the computation speed of the MATRA code must be improved to satisfy the overall performance requirements of the multi-physics coupling calculations. Therefore, a parallel approach to improve and optimize the computational performance of the MATRA code is proposed and verified in this study. The parallel algorithm is embodied in the MATRA code using the MPI communication method, and modification of the previous code structure was minimized. The improvement is confirmed by comparing the results between the single and multiple processor algorithms. The speedup and efficiency are also evaluated when increasing the number of processors. The parallel algorithm was implemented in the subchannel code MATRA using MPI. The performance of the parallel algorithm was verified by comparing the results with those from MATRA with a single processor. It is also noted that the performance of the MATRA code was greatly improved by implementing the parallel algorithm for the 1/8 core and whole core problems.
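
    A minimal mpi4py sketch of the kind of decomposition described above: subchannels are split across MPI ranks, solved locally, and gathered on the root rank. This is not MATRA source code; the problem size and the per-channel computation are placeholders.

      # Minimal mpi4py sketch of the decomposition described (split subchannels
      # across MPI ranks, solve locally, gather); not MATRA source code.
      # Run with, e.g.: mpiexec -n 4 python subchannel_mpi_sketch.py
      import numpy as np
      from mpi4py import MPI

      comm = MPI.COMM_WORLD
      rank, size = comm.Get_rank(), comm.Get_size()

      n_subchannels = 1000                                 # assumed problem size
      my_channels = np.array_split(np.arange(n_subchannels), size)[rank]

      # Placeholder for the per-channel thermal-hydraulic solve.
      my_results = np.array([ch * 0.001 for ch in my_channels])

      all_results = comm.gather(my_results, root=0)        # collect on rank 0
      if rank == 0:
          merged = np.concatenate(all_results)
          print("solved", merged.size, "subchannels on", size, "ranks")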

  9. MELCOR Accident Consequence Code System (MACCS)

    International Nuclear Information System (INIS)

    Rollstin, J.A.; Chanin, D.I.; Jow, H.N.

    1990-02-01

    This report describes the MACCS computer code. The purpose of this code is to simulate the impact of severe accidents at nuclear power plants on the surrounding environment. MACCS has been developed for the US Nuclear Regulatory Commission to replace the previously used CRAC2 code, and it incorporates many improvements in modeling flexibility in comparison to CRAC2. The principal phenomena considered in MACCS are atmospheric transport, mitigative actions based on dose projections, dose accumulation by a number of pathways including food and water ingestion, early and latent health effects, and economic costs. The MACCS code can be used for a variety of applications. These include (1) probabilistic risk assessment (PRA) of nuclear power plants and other nuclear facilities, (2) sensitivity studies to gain a better understanding of the parameters important to PRA, and (3) cost-benefit analysis. This report is composed of three volumes. Volume I, the User's Guide, describes the input data requirements of the MACCS code and provides directions for its use as illustrated by three sample problems. Volume II, the Model Description, describes the underlying models that are implemented in the code, and Volume III, the Programmer's Reference Manual, describes the code's structure and database management

  10. Code de conduite

    International Development Research Centre (IDRC) Digital Library (Canada)

    irocca

    respect for such standards. In doing so, we contribute to the good reputation and integrity of the Centre and act in keeping with the Government of Canada's Values and Ethics Code for the Public Sector. I invite you to read this new version of the Code of Conduct and to apply its principles ...

  11. Towards advanced code simulators

    International Nuclear Information System (INIS)

    Scriven, A.H.

    1990-01-01

    The Central Electricity Generating Board (CEGB) uses advanced thermohydraulic codes extensively to support PWR safety analyses. A system has been developed to allow fully interactive execution of any code with graphical simulation of the operator desk and mimic display. The system operates in a virtual machine environment, with the thermohydraulic code executing in one virtual machine, communicating via interrupts with any number of other virtual machines each running other programs and graphics drivers. The driver code itself does not have to be modified from its normal batch form. Shortly following the release of RELAP5 MOD1 in IBM compatible form in 1983, this code was used as the driver for this system. When RELAP5 MOD2 became available, it was adopted with no changes needed in the basic system. Overall the system has been used for some 5 years for the analysis of LOBI tests, full scale plant studies and for simple what-if studies. For gaining rapid understanding of system dependencies it has proved invaluable. The graphical mimic system, being independent of the driver code, has also been used with other codes to study core rewetting, to replay results obtained from batch jobs on a CRAY2 computer system and to display suitably processed experimental results from the LOBI facility to aid interpretation. For the above work real-time execution was not necessary. Current work now centers on implementing the RELAP 5 code on a true parallel architecture machine. Marconi Simulation have been contracted to investigate the feasibility of using upwards of 100 processors, each capable of a peak of 30 MIPS to run a highly detailed RELAP5 model in real time, complete with specially written 3D core neutronics and balance of plant models. This paper describes the experience of using RELAP5 as an analyzer/simulator, and outlines the proposed methods and problems associated with parallel execution of RELAP5

  12. Aphasia for Morse code.

    Science.gov (United States)

    Wyler, A R; Ray, M W

    1986-03-01

    The ability to communicate by Morse code at high speed has, to our knowledge, not been localized within the cerebral cortex, but might be suspected as residing within the left (dominant) hemisphere. We report a case of a 54-year-old male who suffered a left temporal tip intracerebral hematoma and who temporarily lost his ability to communicate in Morse code, but who was minimally aphasic.

  13. PEAR code review

    International Nuclear Information System (INIS)

    De Wit, R.; Jamieson, T.; Lord, M.; Lafortune, J.F.

    1997-07-01

    As a necessary component in the continuous improvement and refinement of methodologies employed in the nuclear industry, regulatory agencies need to periodically evaluate these processes to improve confidence in results and ensure appropriate levels of safety are being achieved. The independent and objective review of industry-standard computer codes forms an essential part of this program. To this end, this work undertakes an in-depth review of the computer code PEAR (Public Exposures from Accidental Releases), developed by Atomic Energy of Canada Limited (AECL) to assess accidental releases from CANDU reactors. PEAR is based largely on the models contained in the Canadian Standards Association (CSA) N288.2-M91. This report presents the results of a detailed technical review of the PEAR code to identify any variations from the CSA standard and other supporting documentation, verify the source code, assess the quality of numerical models and results, and identify general strengths and weaknesses of the code. The version of the code employed in this review is the one which AECL intends to use for CANDU 9 safety analyses. (author)

  14. Rate-adaptive BCH codes for distributed source coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Larsen, Knud J.; Forchhammer, Søren

    2013-01-01

    This paper considers Bose-Chaudhuri-Hocquenghem (BCH) codes for distributed source coding. A feedback channel is employed to adapt the rate of the code during the decoding process. The focus is on codes with short block lengths for independently coding a binary source X and decoding it given its...... correlated side information Y. The proposed codes have been analyzed in a high-correlation scenario, where the marginal probability of each symbol, Xi in X, given Y is highly skewed (unbalanced). Rate-adaptive BCH codes are presented and applied to distributed source coding. Adaptive and fixed checking...

  15. Polynomial weights and code constructions

    DEFF Research Database (Denmark)

    Massey, J; Costello, D; Justesen, Jørn

    1973-01-01

    polynomial included. This fundamental property is then used as the key to a variety of code constructions including 1) a simplified derivation of the binary Reed-Muller codes and, for any prime p greater than 2, a new extensive class of p-ary "Reed-Muller codes," 2) a new class of "repeated-root" cyclic codes that are subcodes of the binary Reed-Muller codes and can be very simply instrumented, 3) a new class of constacyclic codes that are subcodes of the p-ary "Reed-Muller codes," 4) two new classes of binary convolutional codes with large "free distance" derived from known binary cyclic codes, 5) two new classes of long constraint length binary convolutional codes derived from 2^r-ary Reed-Solomon codes, and 6) a new class of q-ary "repeated-root" constacyclic codes with an algebraic decoding algorithm....

  16. SPECTRAL AMPLITUDE CODING OCDMA SYSTEMS USING ENHANCED DOUBLE WEIGHT CODE

    Directory of Open Access Journals (Sweden)

    F.N. HASOON

    2006-12-01

    Full Text Available A new code structure for spectral amplitude coding optical code division multiple access systems based on double weight (DW) code families is proposed. The DW code has a fixed weight of two. The enhanced double-weight (EDW) code is another variation of the DW code family that can have a variable weight greater than one. The EDW code possesses ideal cross-correlation properties and exists for every natural number n. Much better performance can be provided by using the EDW code compared to existing codes such as the Hadamard and Modified Frequency-Hopping (MFH) codes. Theoretical analysis and simulation show that the EDW code gives much better performance than the Hadamard and MFH codes.

  17. Some new ternary linear codes

    Directory of Open Access Journals (Sweden)

    Rumen Daskalov

    2017-07-01

    Full Text Available Let an $[n,k,d]_q$ code be a linear code of length $n$, dimension $k$ and minimum Hamming distance $d$ over $GF(q)$. One of the most important problems in coding theory is to construct codes with optimal minimum distances. In this paper 22 new ternary linear codes are presented. Two of them are optimal. All new codes improve the respective lower bounds in [11].

  18. Neural nets for radio Morse code recognizing

    Science.gov (United States)

    Fu, Hsin-Chia; Lin, Y. Y.; Pao, Hsiao-Tien

    1993-09-01

    This paper proposes a neural network recognition system for hand-keyed radio Morse code. The system has been trained and tested on real-world data recorded from amateur radio Morse transmissions. The overall recognition process can be partitioned into three major parts: preprocessing, feature extraction, and character decoding. The whole operation can be performed in real time on a PC/486 system. Self-Organizing Maps are used intensively in the recognition system to adaptively learn the variation of the Morse code signal. The recognition system achieves an average performance of about 96.4% with a rejection rate of 6.5%. It is hoped that many of these techniques will be applicable to a wide range of DSP and recognition tasks.

  19. CBP Phase I Code Integration

    International Nuclear Information System (INIS)

    Smith, F.; Brown, K.; Flach, G.; Sarkar, S.

    2011-01-01

    The goal of the Cementitious Barriers Partnership (CBP) is to develop a reasonable and credible set of software tools to predict the structural, hydraulic, and chemical performance of cement barriers used in nuclear applications over extended time frames (greater than 100 years for operating facilities and greater than 1000 years for waste management). The simulation tools will be used to evaluate and predict the behavior of cementitious barriers used in near surface engineered waste disposal systems including waste forms, containment structures, entombments, and environmental remediation. These cementitious materials are exposed to dynamic environmental conditions that cause changes in material properties via (i) aging, (ii) chloride attack, (iii) sulfate attack, (iv) carbonation, (v) oxidation, and (vi) primary constituent leaching. A set of state-of-the-art software tools has been selected as a starting point to capture these important aging and degradation phenomena. Integration of existing software developed by the CBP partner organizations was determined to be the quickest method of meeting the CBP goal of providing a computational tool that improves the prediction of the long-term behavior of cementitious materials. These partner codes were selected based on their maturity and ability to address the problems outlined above. The GoldSim Monte Carlo simulation program (GTG 2010a, GTG 2010b) was chosen as the code integration platform (Brown and Flach 2009b). GoldSim (current Version 10.5) is a Windows based graphical object-oriented computer program that provides a flexible environment for model development (Brown and Flach 2009b). The linking of GoldSim to external codes has previously been successfully demonstrated (Eary 2007, Mattie et al. 2007). GoldSim is capable of performing deterministic and probabilistic simulations and of modeling radioactive decay and constituent transport. As part of the CBP project, a general Dynamic Link Library (DLL) interface

  20. CBP PHASE I CODE INTEGRATION

    Energy Technology Data Exchange (ETDEWEB)

    Smith, F.; Brown, K.; Flach, G.; Sarkar, S.

    2011-09-30

    The goal of the Cementitious Barriers Partnership (CBP) is to develop a reasonable and credible set of software tools to predict the structural, hydraulic, and chemical performance of cement barriers used in nuclear applications over extended time frames (greater than 100 years for operating facilities and greater than 1000 years for waste management). The simulation tools will be used to evaluate and predict the behavior of cementitious barriers used in near surface engineered waste disposal systems including waste forms, containment structures, entombments, and environmental remediation. These cementitious materials are exposed to dynamic environmental conditions that cause changes in material properties via (i) aging, (ii) chloride attack, (iii) sulfate attack, (iv) carbonation, (v) oxidation, and (vi) primary constituent leaching. A set of state-of-the-art software tools has been selected as a starting point to capture these important aging and degradation phenomena. Integration of existing software developed by the CBP partner organizations was determined to be the quickest method of meeting the CBP goal of providing a computational tool that improves the prediction of the long-term behavior of cementitious materials. These partner codes were selected based on their maturity and ability to address the problems outlined above. The GoldSim Monte Carlo simulation program (GTG 2010a, GTG 2010b) was chosen as the code integration platform (Brown & Flach 2009b). GoldSim (current Version 10.5) is a Windows based graphical object-oriented computer program that provides a flexible environment for model development (Brown & Flach 2009b). The linking of GoldSim to external codes has previously been successfully demonstrated (Eary 2007, Mattie et al. 2007). GoldSim is capable of performing deterministic and probabilistic simulations and of modeling radioactive decay and constituent transport. As part of the CBP project, a general Dynamic Link Library (DLL) interface was

  1. Assessing the Formation of Experience-Based Gender Expectations in an Implicit Learning Scenario

    Directory of Open Access Journals (Sweden)

    Anton Öttl

    2017-09-01

    Full Text Available The present study investigates the formation of new word-referent associations in an implicit learning scenario, using a gender-coded artificial language with spoken words and visual referents. Previous research has shown that when participants are explicitly instructed about the gender-coding system underlying an artificial lexicon, they monitor the frequency of exposure to male vs. female referents within this lexicon, and subsequently use this probabilistic information to predict the gender of an upcoming referent. In an explicit learning scenario, the auditory and visual gender cues are necessarily highlighted prior to acquisition, and the effects previously observed may therefore depend on participants' overt awareness of these cues. To assess whether the formation of experience-based expectations is dependent on explicit awareness of the underlying coding system, we present data from an experiment in which gender-coding was acquired implicitly, thereby reducing the likelihood that visual and auditory gender cues are used strategically during acquisition. Results show that even if the gender coding system was not perfectly mastered (as reflected in the number of gender coding errors), participants develop frequency-based expectations comparable to those previously observed in an explicit learning scenario. In line with previous findings, participants are quicker at recognizing a referent whose gender is consistent with an induced expectation than one whose gender is inconsistent with an induced expectation. At the same time, however, eye-tracking data suggest that these expectations may surface earlier in an implicit learning scenario. These findings suggest that experience-based expectations are robust against manner of acquisition, and contribute to understanding why similar expectations observed in the activation of stereotypes during the processing of natural language stimuli are difficult or impossible to suppress.

  2. Rates of induced abortion in Denmark according to age, previous births and previous abortions

    Directory of Open Access Journals (Sweden)

    Marie-Louise H. Hansen

    2009-11-01

    Full Text Available Background: Whereas the effects of various socio-demographic determinants on a woman's risk of having an abortion are relatively well-documented, less attention has been given to the effect of previous abortions and births. Objective: To study the effect of previous abortions and births on Danish women's risk of an abortion, in addition to a number of demographic and personal characteristics. Data and methods: From the Fertility of Women and Couples Dataset we obtained data on the number of live births and induced abortions by year (1981-2001), age (16-39), county of residence and marital status. Logistic regression analysis was used to estimate the influence of the explanatory variables on the probability of having an abortion in a relevant year. Main findings and conclusion: A woman's risk of having an abortion increases with the number of previous births and previous abortions. Some interactions were found in the way a woman's risk of abortion varies with calendar year, age and parity. The risk of an abortion for women with no children decreases while the risk of an abortion for women with children increases over time. Furthermore, the risk of an abortion decreases with age, but relatively more so for women with children compared to childless women. Trends for teenagers are discussed in a separate section.
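
    A minimal sketch (not the authors' code) of how such a logistic regression can be set up, using synthetic data and hypothetical variable names in place of the Fertility of Women and Couples Dataset:

```python
# Illustrative only: synthetic stand-in data; the variables and coefficients are
# hypothetical, not values from the study described above.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
age = rng.integers(16, 40, n)                 # ages 16-39, mirroring the study design
parity = rng.poisson(1.0, n)                  # number of previous births
prev_abortions = rng.poisson(0.3, n)          # number of previous abortions

# Synthetic outcome: log-odds rise with previous births/abortions, fall with age.
logit = -2.0 + 0.3 * parity + 0.5 * prev_abortions - 0.05 * (age - 16)
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = np.column_stack([age, parity, prev_abortions])
model = LogisticRegression().fit(X, y)
print(dict(zip(["age", "parity", "prev_abortions"], model.coef_[0])))
```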

  3. ACE - Manufacturer Identification Code (MID)

    Data.gov (United States)

    Department of Homeland Security — The ACE Manufacturer Identification Code (MID) application is used to track and control identifications codes for manufacturers. A manufacturer is identified on an...

  4. Algebraic and stochastic coding theory

    CERN Document Server

    Kythe, Dave K

    2012-01-01

    Using a simple yet rigorous approach, Algebraic and Stochastic Coding Theory makes the subject of coding theory easy to understand for readers with a thorough knowledge of digital arithmetic, Boolean and modern algebra, and probability theory. It explains the underlying principles of coding theory and offers a clear, detailed description of each code. More advanced readers will appreciate its coverage of recent developments in coding theory and stochastic processes. After a brief review of coding history and Boolean algebra, the book introduces linear codes, including Hamming and Golay codes.
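
    As a small illustration of the linear block codes the book introduces, here is a sketch of (7,4) Hamming encoding using one common choice of systematic generator matrix; it is not an excerpt from the book:

```python
# Linear block encoding: message bits times a generator matrix, arithmetic mod 2.
import numpy as np

# Systematic generator matrix G = [I | P] for a (7,4) Hamming code (one common choice).
G = np.array([
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
])

def encode(message_bits):
    """Encode 4 message bits into a 7-bit Hamming codeword."""
    return (np.array(message_bits) @ G) % 2

print(encode([1, 0, 1, 1]))  # -> [1 0 1 1 0 1 0]
```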

  5. Blind Signal Classification via Sparse Coding

    Science.gov (United States)

    2016-04-10

    ...intelligent decision making that can enhance resiliency against a hostile, fiercely competing radio environment. There has been significant... feature domain. In the second scenario, our approach is based on the semi-supervised learning framework. Sparse Coding Setup: our view on sparse... We propose a novel RF signal...

  6. Speech coding code- excited linear prediction

    CERN Document Server

    Bäckström, Tom

    2017-01-01

    This book provides scientific understanding of the most central techniques used in speech coding, both for advanced students and for professionals with a background in speech, audio, and/or digital signal processing. It provides a clear connection between the whys, hows, and whats, thus enabling a clear view of the necessity, purpose, and solutions provided by various tools, as well as their strengths and weaknesses in each respect. Equivalently, this book sheds light on the following perspectives for each technology presented. Objective: What do we want to achieve, and especially why is this goal important? Resource Information: What information is available and how can it be useful? Resource Platform: What kind of platforms are we working with and what are their capabilities and restrictions? This includes computational, memory, and acoustic properties and the transmission capacity of devices used. The book goes on to address Solutions: Which solutions have been proposed and how can they be used to reach the stated goals and ...

  7. Spatially coded backscatter radiography

    International Nuclear Information System (INIS)

    Thangavelu, S.; Hussein, E.M.A.

    2007-01-01

    Conventional radiography requires access to two opposite sides of an object, which makes it unsuitable for the inspection of extended and/or thick structures (airframes, bridges, floors etc.). Backscatter imaging can overcome this problem, but the indications obtained are difficult to interpret. This paper applies the coded aperture technique to gamma-ray backscatter-radiography in order to enhance the detectability of flaws. This spatial coding method involves the positioning of a mask with closed and open holes to selectively permit or block the passage of radiation. The obtained coded-aperture indications are then mathematically decoded to detect the presence of anomalies. Indications obtained from Monte Carlo calculations were utilized in this work to simulate radiation scattering measurements. These simulated measurements were used to investigate the applicability of this technique to the detection of flaws by backscatter radiography

  8. Aztheca Code; Codigo Aztheca

    Energy Technology Data Exchange (ETDEWEB)

    Quezada G, S.; Espinosa P, G. [Universidad Autonoma Metropolitana, Unidad Iztapalapa, San Rafael Atlixco No. 186, Col. Vicentina, 09340 Ciudad de Mexico (Mexico); Centeno P, J.; Sanchez M, H., E-mail: sequga@gmail.com [UNAM, Facultad de Ingenieria, Ciudad Universitaria, Circuito Exterior s/n, 04510 Ciudad de Mexico (Mexico)

    2017-09-15

    This paper presents the Aztheca code, which is formed by the mathematical models of neutron kinetics, power generation, heat transfer, core thermo-hydraulics, recirculation systems, dynamic pressure and level models and control system. The Aztheca code is validated with plant data, as well as with predictions from the manufacturer when the reactor operates in a stationary state. On the other hand, to demonstrate that the model is applicable during a transient, an event that occurred in a nuclear power plant with a BWR reactor is selected. The plant data are compared with the results obtained with RELAP-5 and the Aztheca model. The results show that both RELAP-5 and the Aztheca code have the ability to adequately predict the behavior of the reactor. (Author)

  9. Code query by example

    Science.gov (United States)

    Vaucouleur, Sebastien

    2011-02-01

    We introduce code query by example for customisation of evolvable software products in general and of enterprise resource planning systems (ERPs) in particular. The concept is based on an initial empirical study on practices around ERP systems. We motivate our design choices based on those empirical results, and we show how the proposed solution helps with respect to the infamous upgrade problem: the conflict between the need for customisation and the need for upgrade of ERP systems. We further show how code query by example can be used as a form of lightweight static analysis, to detect automatically potential defects in large software products. Code query by example as a form of lightweight static analysis is particularly interesting in the context of ERP systems: it is often the case that programmers working in this field are not computer science specialists but more of domain experts. Hence, they require a simple language to express custom rules.

  10. Coded Splitting Tree Protocols

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Stefanovic, Cedomir; Popovski, Petar

    2013-01-01

    This paper presents a novel approach to multiple access control called coded splitting tree protocol. The approach builds on the known tree splitting protocols, code structure and successive interference cancellation (SIC). Several instances of the tree splitting protocol are initiated, each...... instance is terminated prematurely and subsequently iterated. The combined set of leaves from all the tree instances can then be viewed as a graph code, which is decodable using belief propagation. The main design problem is determining the order of splitting, which enables successful decoding as early...... as possible. Evaluations show that the proposed protocol provides considerable gains over the standard tree splitting protocol applying SIC. The improvement comes at the expense of an increased feedback and receiver complexity....

  11. Revised SRAC code system

    International Nuclear Information System (INIS)

    Tsuchihashi, Keichiro; Ishiguro, Yukio; Kaneko, Kunio; Ido, Masaru.

    1986-09-01

    Since the publication of JAERI-1285 in 1983 for the preliminary version of the SRAC code system, a number of additions and modifications to the functions have been made to establish an overall neutronics code system. Major points are (1) addition of JENDL-2 version of data library, (2) a direct treatment of doubly heterogeneous effect on resonance absorption, (3) a generalized Dancoff factor, (4) a cell calculation based on the fixed boundary source problem, (5) the corresponding edit required for experimental analysis and reactor design, (6) a perturbation theory calculation for reactivity change, (7) an auxiliary code for core burnup and fuel management, etc. This report is a revision of the users manual which consists of the general description, input data requirements and their explanation, detailed information on usage, mathematics, contents of libraries and sample I/O. (author)

  12. Re-estimation of motion and reconstruction for distributed video coding.

    Science.gov (United States)

    Van Luong, Huynh; Rakêt, Lars Lau; Forchhammer, Søren

    2014-07-01

    Transform domain Wyner-Ziv (TDWZ) video coding is an efficient approach to distributed video coding (DVC), which provides low complexity encoding by exploiting the source statistics at the decoder side. The DVC coding efficiency depends mainly on side information and noise modeling. This paper proposes a motion re-estimation technique based on optical flow to improve side information and noise residual frames by taking partially decoded information into account. To improve noise modeling, a noise residual motion re-estimation technique is proposed. Residual motion compensation with motion updating is used to estimate a current residue based on previously decoded frames and correlation between estimated side information frames. In addition, a generalized reconstruction algorithm to optimize a multihypothesis reconstruction is proposed. The proposed techniques using motion and reconstruction re-estimation (MORE) are integrated in the SING TDWZ codec, which uses side information and noise learning. For Wyner-Ziv frames using GOP size 2, the MORE codec significantly improves the TDWZ coding efficiency with an average (Bjøntegaard) PSNR improvement of 2.5 dB and up to 6 dB improvement compared with DISCOVER.

  13. Graph Codes with Reed-Solomon Component Codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Justesen, Jørn

    2006-01-01

    We treat a specific case of codes based on bipartite expander graphs coming from finite geometries. The code symbols are associated with the branches and the symbols connected to a given node are restricted to be codewords in a Reed-Solomon code. We give results on the parameters of the codes...

  14. Visualizing code and coverage changes for code review

    NARCIS (Netherlands)

    Oosterwaal, Sebastiaan; van Deursen, A.; De Souza Coelho, R.; Sawant, A.A.; Bacchelli, A.

    2016-01-01

    One of the tasks of reviewers is to verify that code modifications are well tested. However, current tools offer little support in understanding precisely how changes to the code relate to changes to the tests. In particular, it is hard to see whether (modified) test code covers the changed code.

  15. Congruency sequence effects are driven by previous-trial congruency, not previous-trial response conflict

    OpenAIRE

    Weissman, Daniel H.; Carp, Joshua

    2013-01-01

    Congruency effects in distracter interference tasks are often smaller after incongruent trials than after congruent trials. However, the sources of such congruency sequence effects (CSEs) are controversial. The conflict monitoring model of cognitive control links CSEs to the detection and resolution of response conflict. In contrast, competing theories attribute CSEs to attentional or affective processes that vary with previous-trial congruency (incongruent vs. congruent). The present study s...

  16. Principles of speech coding

    CERN Document Server

    Ogunfunmi, Tokunbo

    2010-01-01

    It is becoming increasingly apparent that all forms of communication, including voice, will be transmitted through packet-switched networks based on the Internet Protocol (IP). Therefore, the design of modern devices that rely on speech interfaces, such as cell phones and PDAs, requires a complete and up-to-date understanding of the basics of speech coding. Outlines key signal processing algorithms used to mitigate impairments to speech quality in VoIP networks. Offering a detailed yet easily accessible introduction to the field, Principles of Speech Coding provides an in-depth examination of the

  17. Scrum Code Camps

    DEFF Research Database (Denmark)

    Pries-Heje, Jan; Pries-Heje, Lene; Dahlgaard, Bente

    2013-01-01

    A classic way to choose a supplier is through a bidding process where tenders from competing companies are evaluated in relation to the customer’s requirements. If the customer wants to hire an agile software developing team instead of buying a software product, a new approach for comparing tenders...... is required. In this paper we present the design of such a new approach, the Scrum Code Camp, which can be used to assess agile team capability in a transparent and consistent way. A design science research approach is used to analyze properties of two instances of the Scrum Code Camp where seven agile teams...

  18. CONCEPT computer code

    International Nuclear Information System (INIS)

    Delene, J.

    1984-01-01

    CONCEPT is a computer code that will provide conceptual capital investment cost estimates for nuclear and coal-fired power plants. The code can develop an estimate for construction at any point in time. Any unit size within the range of about 400 to 1300 MW electric may be selected. Any of 23 reference site locations across the United States and Canada may be selected. PWR, BWR, and coal-fired plants burning high-sulfur and low-sulfur coal can be estimated. Multiple-unit plants can be estimated. Costs due to escalation/inflation and interest during construction are calculated

  19. Learning by Doing: Twenty Successful Active Learning Exercises for Information Systems Courses

    Directory of Open Access Journals (Sweden)

    Alanah Mitchell

    2017-01-01

    Full Text Available Aim/Purpose: This paper provides a review of previously published work related to active learning in information systems (IS) courses. Background: There are a rising number of strategies in higher education that offer promise in regards to getting students’ attention and helping them learn, such as flipped classrooms and offering courses online. These learning strategies are part of the pedagogical technique known as active learning. Active learning is a strategy that became popular in the early 1990s and has proven itself as a valid tool for helping students to be engaged with learning. Methodology: This work follows a systematic method for identifying and coding previous research based on an aspect of interest. The authors identified and assessed research through a search of ABI/Inform scholarly journal abstracts and keywords, as well as additional research databases, using the search terms “active learning” and “information systems” from 2000 through June 2016. Contribution: This synthesis of active learning exercises provides guidance for information technology faculty looking to implement active learning strategies in their classroom by demonstrating how IS faculty might begin to introduce more active learning techniques in their teaching as well as by presenting a sample teaching agenda for a class that uses a mix of active and passive learning techniques to engage student learning. Findings: Twenty successful types of active learning exercises in IS courses are presented. Recommendations for Practitioners: This paper offers a “how to” resource of successful active learning strategies for IS faculty interested in implementing active learning in the classroom. Recommendation for Researchers: This work provides an example of a systematic literature review as a means to assess successful implementations of active learning in IS. Impact on Society: An updated definition of active learning is presented as well as a meaningful

  20. Balanced and sparse Tamo-Barg codes

    KAUST Repository

    Halbawi, Wael

    2017-08-29

    We construct balanced and sparse generator matrices for Tamo and Barg's Locally Recoverable Codes (LRCs). More specifically, for a cyclic Tamo-Barg code of length n, dimension k and locality r, we show how to deterministically construct a generator matrix where the number of nonzeros in any two columns differs by at most one, and where the weight of every row is d + r - 1, where d is the minimum distance of the code. Since LRCs are designed mainly for distributed storage systems, the results presented in this work provide a computationally balanced and efficient encoding scheme for these codes. The balanced property ensures that the computational effort exerted by any storage node is essentially the same, whilst the sparse property ensures that this effort is minimal. The work presented in this paper extends a similar result previously established for Reed-Solomon (RS) codes, where it is now known that any cyclic RS code possesses a generator matrix that is balanced as described, but is sparsest, meaning that each row has d nonzeros.

  1. Greedy vs. L1 Convex Optimization in Sparse Coding

    DEFF Research Database (Denmark)

    Ren, Huamin; Pan, Hong; Olsen, Søren Ingvor

    Sparse representation has been applied successfully in many image analysis applications, including abnormal event detection, in which a baseline is to learn a dictionary from the training data and detect anomalies from its sparse codes. During this procedure, sparse codes which can be achieved...... and action recognition, a comparative study of codes in abnormal event detection is less studied and hence no conclusion is gained on the effect of codes in detecting abnormalities. We constrict our comparison in two types of the above L0-norm solutions: greedy algorithms and convex L1-norm solutions....... Considering the property of abnormal event detection, i.e., only normal videos are used as training data due to practical reasons, effective codes in classification application may not perform well in abnormality detection. Therefore, we compare the sparse codes and comprehensively evaluate their performance...
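
    A minimal sketch contrasting the two solver families compared in the paper, greedy (Orthogonal Matching Pursuit) versus convex L1 (Lasso), run here on a random synthetic dictionary rather than on the video data used in the study:

```python
# Illustrative comparison of greedy vs. L1 sparse coding on synthetic data.
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit, Lasso

rng = np.random.default_rng(1)
D = rng.standard_normal((64, 256))            # dictionary: 64-dim signals, 256 atoms
D /= np.linalg.norm(D, axis=0)                # unit-norm atoms
x_true = np.zeros(256)
x_true[rng.choice(256, 5, replace=False)] = rng.standard_normal(5)
y = D @ x_true                                # synthetic signal with a 5-sparse code

greedy = OrthogonalMatchingPursuit(n_nonzero_coefs=5).fit(D, y)
convex = Lasso(alpha=0.01, max_iter=10000).fit(D, y)

print("OMP nonzeros:  ", np.count_nonzero(greedy.coef_))
print("Lasso nonzeros:", np.count_nonzero(convex.coef_))
```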

  2. The Role of Code-Switching in Bilingual Creativity

    Science.gov (United States)

    Kharkhurin, Anatoliy V.; Wei, Li

    2015-01-01

    This study further explores the theme of bilingual creativity with the present focus on code-switching. Specifically, it investigates whether code-switching practice has an impact on creativity. In line with the previous research, selective attention was proposed as a potential cognitive mechanism, which on the one hand would benefit from…

  3. Toward a Code of Conduct for the Presidency

    Science.gov (United States)

    Fleming, J. Christopher

    2012-01-01

    A presidential code of conduct is needed more today than ever before. College and university presidents are being required to do more without the proper training to succeed. Presidents from outside the academy enter academia with normative patterns and codes of conduct that served them well in their previous occupations but now have the potential…

  4. Multimedia distribution using network coding on the iphone platform

    DEFF Research Database (Denmark)

    Vingelmann, Peter; Pedersen, Morten Videbæk; Fitzek, Frank

    2010-01-01

    This paper looks into the implementation details of random linear network coding on the Apple iPhone and iPod Touch mobile platforms for multimedia distribution. Previous implementations of network coding on this platform failed to achieve a throughput which is sufficient to saturate the WLAN...

  5. Interactive Visual Mechanisms for Exploring Source Code Evolution

    NARCIS (Netherlands)

    Telea, Alexandru; Voinea, Lucian

    2005-01-01

    The Visual Code Navigator (VCN) is an ongoing effort to build a visual environment for interactive visualization of large source code bases. We present two techniques that extend the previous work done on the VCN. We propose an efficient and effective mechanism for specifying and visualizing queries

  6. Writing robust C++ code for critical applications

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    **C++** is one of the most **complex**, expressive and powerful languages out there. However, its complexity makes it hard to write **robust** code. When using C++ to code **critical** applications, ensuring **reliability** is one of the key topics. Testing, debugging and profiling are all a major part of this kind of work. In the BE department we use C++ to write a big part of the controls system for beam operation, which implies putting a big focus on system stability and ensuring smooth operation. This talk will try to: - Highlight potential problems when writing C++ code, giving guidelines on writing defensive code that could have avoided such issues - Explain how to avoid common pitfalls (both in writing C++ code and at the debugging & profiling phase) - Showcase some tools and tricks useful to C++ development The attendees' proficiency in C++ should not be a concern. Anyone is free to join, even people that do not know C++, if only to learn the pitfalls a language may have. This may benefit f...

  7. Identifying people with a learning disability: an advanced search for general practice.

    Science.gov (United States)

    Russell, Amy M; Bryant, Louise; House, Allan

    2017-12-01

    People with learning disabilities (LD) have poor physical and mental health when compared with the general population. They are also likely to find it more difficult than others to describe their symptoms adequately. It is therefore harder for healthcare workers to identify the health needs of those with learning disabilities, with the danger of some problems being left unrecognised. Practice registers record only a proportion of those who are eligible, making it difficult to target improvements in their health care. To test a Read Code search supporting the identification of people with a mild-to-moderate learning disability who are not currently on the learning disability register. An observational study in primary care in West Yorkshire. Read Code searches were created to identify individuals with a learning disability not on the LD register; they were field tested and further refined before testing in general practice. Diagnostic codes identified small numbers of individuals who should have been on the LD register. Functional and service use codes often created large numbers of false-positive results. The specific descriptive codes 'Learning difficulties' and 'Referral to learning disability team' needed follow-up review, and then identified some individuals with LD who were not on the register. The Read Code search supported practices to populate their registers and was quick to run and review, making it a viable choice to support register revalidation. However, it did not find large numbers of people eligible for the LD register who were previously unidentified by their practice, suggesting that additional complementary methods are required to support practices to validate their registers. © British Journal of General Practice 2017.

  8. Decoding Codes on Graphs

    Indian Academy of Sciences (India)

    ...following function is maximized. This kind of decoding strategy is called the maximum a posteriori probability (MAP) decoding strategy as it attempts to estimate each symbol of the codeword that ... mitigate the effects of packet loss over digital networks. Undoubtedly other applications will use these codes in the years to come.
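
    A toy illustration of the symbol-wise MAP decoding strategy described above, for a 3-fold repetition code over a binary symmetric channel; the crossover probability and prior are made-up values:

```python
# Symbol-wise MAP decoding of a repetition code over a BSC (illustrative values).
p = 0.1  # channel crossover probability

def map_decode_bit(received, prior_one=0.5):
    """Return the a posteriori more probable transmitted bit."""
    flips_if_one = sum(1 for r in received if r == 0)   # errors assuming '1' was sent
    flips_if_zero = sum(1 for r in received if r == 1)  # errors assuming '0' was sent
    like_one = (p ** flips_if_one) * ((1 - p) ** (len(received) - flips_if_one))
    like_zero = (p ** flips_if_zero) * ((1 - p) ** (len(received) - flips_if_zero))
    return 1 if like_one * prior_one > like_zero * (1 - prior_one) else 0

print(map_decode_bit([1, 0, 1]))  # -> 1 (reduces to majority vote with a uniform prior)
```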

  9. New code of conduct

    CERN Multimedia

    Laëtitia Pedroso

    2010-01-01

    During his talk to the staff at the beginning of the year, the Director-General mentioned that a new code of conduct was being drawn up. What exactly is it and what is its purpose? Anne-Sylvie Catherin, Head of the Human Resources (HR) Department, talked to us about the whys and wherefores of the project.   Drawing by Georges Boixader from the cartoon strip “The World of Particles” by Brian Southworth. A code of conduct is a general framework laying down the behaviour expected of all members of an organisation's personnel. “CERN is one of the very few international organisations that don’t yet have one", explains Anne-Sylvie Catherin. “We have been thinking about introducing a code of conduct for a long time but lacked the necessary resources until now”. The call for a code of conduct has come from different sources within the Laboratory. “The Equal Opportunities Advisory Panel (read also the "Equal opportuni...

  10. CERN Code of Conduct

    CERN Document Server

    Department, HR

    2010-01-01

    The Code is intended as a guide in helping us, as CERN contributors, to understand how to conduct ourselves, treat others and expect to be treated. It is based around the five core values of the Organization. We should all become familiar with it and try to incorporate it into our daily life at CERN.

  11. Error Correcting Codes

    Indian Academy of Sciences (India)

    focused pictures of Triton, Neptune's largest moon. This great feat was in no small measure due to the fact that the sophisticated communication system on Voyager had an elaborate error correcting scheme built into it. At Jupiter and Saturn, a convolutional code was used to enhance the reliability of transmission, and at ...

  12. Nuclear safety code study

    Energy Technology Data Exchange (ETDEWEB)

    Hu, H.H.; Ford, D.; Le, H.; Park, S.; Cooke, K.L.; Bleakney, T.; Spanier, J.; Wilburn, N.P.; O' Reilly, B.; Carmichael, B.

    1981-01-01

    The objective is to analyze an overpower accident in an LMFBR. A simplified model of the primary coolant loop was developed in order to understand the instabilities encountered with the MELT III and SAS codes. The computer programs were translated for switching to the IBM 4331. Numerical methods were investigated for solving the neutron kinetics equations; the Adams and Gear methods were compared. (DLC)

  13. Student Dress Codes.

    Science.gov (United States)

    Uerling, Donald F.

    School officials see a need for regulations that prohibit disruptive and inappropriate forms of expression and attire; students see these regulations as unwanted restrictions on their freedom. This paper reviews court litigation involving constitutional limitations on school authority, dress and hair codes, state law constraints, and school…

  14. Differential pulse code modulation

    Science.gov (United States)

    Herman, C. F. (Inventor)

    1976-01-01

    A differential pulse code modulation (DPCM) encoding and decoding method is described along with an apparatus which is capable of transmission with minimum bandwidth. The apparatus is not affected by data transition density, requires no direct current (DC) response of the transmission link, and suffers from minimal ambiguity in resolution of the digital data.
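
    A minimal sketch of the DPCM principle rather than of the patented apparatus itself: each sample is predicted from the previously reconstructed sample and only the quantised prediction error is transmitted. The step size and signal values are illustrative:

```python
# Toy DPCM encoder/decoder with a fixed quantisation step (illustrative only).
def dpcm_encode(samples, step=4):
    pred, codes = 0, []
    for s in samples:
        q = int(round((s - pred) / step))   # quantised prediction error
        codes.append(q)
        pred += q * step                    # decoder-side reconstruction, kept in sync
    return codes

def dpcm_decode(codes, step=4):
    pred, out = 0, []
    for q in codes:
        pred += q * step
        out.append(pred)
    return out

signal = [0, 3, 9, 14, 12, 7, 2]
codes = dpcm_encode(signal)
print(codes, dpcm_decode(codes))
```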

  15. Error Correcting Codes

    Indian Academy of Sciences (India)

    syndrome is an indicator of underlying disease. Here too, a non-zero syndrome is an indication that something has gone wrong during transmission. The first matrix on the left hand side is called the parity check matrix H. Thus every codeword c satisfies the equation Hc^T = 0. Therefore the code can ...
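
    A small numeric check of the parity-check relation Hc^T = 0 discussed above, using one common parity-check matrix of the (7,4) Hamming code; the specific codeword is illustrative:

```python
# Syndrome computation: zero for a valid codeword, nonzero after a bit flip.
import numpy as np

H = np.array([
    [1, 1, 0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 1],
])

codeword = np.array([1, 0, 1, 1, 0, 1, 0])   # a valid codeword for this H
corrupted = codeword.copy()
corrupted[2] ^= 1                            # flip one bit

print((H @ codeword) % 2)    # -> [0 0 0], zero syndrome
print((H @ corrupted) % 2)   # -> nonzero syndrome flags the error
```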

  16. Focusing Automatic Code Inspections

    NARCIS (Netherlands)

    Boogerd, C.J.

    2010-01-01

    Automatic Code Inspection tools help developers in early detection of defects in software. A well-known drawback of many automatic inspection approaches is that they yield too many warnings and require a clearer focus. In this thesis, we provide such focus by proposing two methods to prioritize

  17. Reliability and code level

    NARCIS (Netherlands)

    Kasperski, M.; Geurts, C.P.W.

    2005-01-01

    The paper describes the work of the IAWE Working Group WBG - Reliability and Code Level, one of the International Codification Working Groups set up at ICWE10 in Copenhagen. The following topics are covered: sources of uncertainties in the design wind load, appropriate design target values for the

  18. Broadcast Coded Slotted ALOHA

    DEFF Research Database (Denmark)

    Ivanov, Mikhail; Brännström, Frederik; Graell i Amat, Alexandre

    2016-01-01

    We propose an uncoordinated medium access control (MAC) protocol, called all-to-all broadcast coded slotted ALOHA (B-CSA) for reliable all-to-all broadcast with strict latency constraints. In B-CSA, each user acts as both transmitter and receiver in a half-duplex mode. The half-duplex mode gives...

  19. Ready, steady… Code!

    CERN Multimedia

    Anaïs Schaeffer

    2013-01-01

    This summer, CERN took part in the Google Summer of Code programme for the third year in succession. Open to students from all over the world, this programme leads to very successful collaborations for open source software projects.   Image: GSoC 2013. Google Summer of Code (GSoC) is a global programme that offers student developers grants to write code for open-source software projects. Since its creation in 2005, the programme has brought together some 6,000 students from over 100 countries worldwide. The students selected by Google are paired with a mentor from one of the participating projects, which can be led by institutes, organisations, companies, etc. This year, CERN PH Department’s SFT (Software Development for Experiments) Group took part in the GSoC programme for the third time, submitting 15 open-source projects. “Once published on the Google Summer of Code website (in April), the projects are open to applications,” says Jakob Blomer, one of the o...

  20. (Almost) practical tree codes

    KAUST Repository

    Khina, Anatoly

    2016-08-15

    We consider the problem of stabilizing an unstable plant driven by bounded noise over a digital noisy communication link, a scenario at the heart of networked control. To stabilize such a plant, one needs real-time encoding and decoding with an error probability profile that decays exponentially with the decoding delay. The works of Schulman and Sahai over the past two decades have developed the notions of tree codes and anytime capacity, and provided the theoretical framework for studying such problems. Nonetheless, there has been little practical progress in this area due to the absence of explicit constructions of tree codes with efficient encoding and decoding algorithms. Recently, linear time-invariant tree codes were proposed to achieve the desired result under maximum-likelihood decoding. In this work, we take one more step towards practicality, by showing that these codes can be efficiently decoded using sequential decoding algorithms, up to some loss in performance (and with some practical complexity caveats). We supplement our theoretical results with numerical simulations that demonstrate the effectiveness of the decoder in a control system setting.

  1. Physical layer network coding

    DEFF Research Database (Denmark)

    Fukui, Hironori; Popovski, Petar; Yomo, Hiroyuki

    2014-01-01

    Physical layer network coding (PLNC) has been proposed to improve throughput of the two-way relay channel, where two nodes communicate with each other, being assisted by a relay node. Most of the works related to PLNC are focused on a simple three-node model and they do not take into account...

  2. Corporate governance through codes

    NARCIS (Netherlands)

    Haxhi, I.; Aguilera, R.V.; Vodosek, M.; den Hartog, D.; McNett, J.M.

    2014-01-01

    The UK's 1992 Cadbury Report defines corporate governance (CG) as the system by which businesses are directed and controlled. CG codes are a set of best practices designed to address deficiencies in the formal contracts and institutions by suggesting prescriptions on the preferred role and

  3. Ptolemy Coding Style

    Science.gov (United States)

    2014-09-05

    because this would combine Ptolemy II with the GPL’d code and thus encumber Ptolemy II with the GPL. Another GNU license is the GNU Library General... permission on the source.eecs.berkeley.edu repositories, then use your local repository. bash-3.2$ svn co svn+ssh://source.eecs.berkeley.edu/chess

  4. SCALE Code System

    Energy Technology Data Exchange (ETDEWEB)

    Jessee, Matthew Anderson [ORNL

    2016-04-01

    The SCALE Code System is a widely-used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor and lattice physics, radiation shielding, spent fuel and radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including three deterministic and three Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE’s graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results. SCALE 6.2 provides many new capabilities and significant improvements of existing features. New capabilities include: ENDF/B-VII.1 nuclear data libraries (CE and MG) with enhanced group structures; neutron covariance data based on ENDF/B-VII.1 and supplemented with ORNL data; covariance data for fission product yields and decay constants; stochastic uncertainty and correlation quantification for any SCALE sequence with Sampler; parallel calculations with KENO; problem-dependent temperature corrections for CE calculations; CE shielding and criticality accident alarm system analysis with MAVRIC; CE

  5. Code subspaces for LLM geometries

    Science.gov (United States)

    Berenstein, David; Miller, Alexandra

    2018-03-01

    We consider effective field theory around classical background geometries with a gauge theory dual, specifically those in the class of LLM geometries. These are dual to half-BPS states of N = 4 SYM. We find that the language of code subspaces is natural for discussing the set of nearby states, which are built by acting with effective fields on these backgrounds. This work extends our previous work by going beyond the strict infinite N limit. We further discuss how one can extract the topology of the state beyond N → ∞ and find that, as before, uncertainty and entanglement entropy calculations provide a useful tool to do so. Finally, we discuss obstructions to writing down a globally defined metric operator. We find that the answer depends on the choice of reference state that one starts with. Therefore, within this setup, there is ambiguity in trying to write an operator that describes the metric globally.

  6. RITA, a promising Monte Carlo code for recoil implantation

    International Nuclear Information System (INIS)

    Desalvo, A.; Rosa, R.

    1982-01-01

    A computer code previously set up to simulate ion penetration in amorphous solids has been extended to handle recoil phenomena. Preliminary results are compared with existing experimental data. (author)

  7. Accumulate Repeat Accumulate Coded Modulation

    Science.gov (United States)

    Abbasfar, Aliazam; Divsalar, Dariush; Yao, Kung

    2004-01-01

    In this paper we propose an innovative coded modulation scheme called 'Accumulate Repeat Accumulate Coded Modulation' (ARA coded modulation). This class of codes can be viewed as serial turbo-like codes, or as a subclass of Low Density Parity Check (LDPC) codes that are combined with high level modulation. Thus at the decoder belief propagation can be used for iterative decoding of ARA coded modulation on a graph, provided a demapper transforms the received in-phase and quadrature samples to reliability of the bits.

  8. Causation, constructors and codes.

    Science.gov (United States)

    Hofmeyr, Jan-Hendrik S

    2018-02-01

    Relational biology relies heavily on the enriched understanding of causal entailment that Robert Rosen's formalisation of Aristotle's four causes has made possible, although to date efficient causes and the rehabilitation of final cause have been its main focus. Formal cause has been paid rather scant attention, but, as this paper demonstrates, is crucial to our understanding of many types of processes, not necessarily biological. The graph-theoretic relational diagram of a mapping has played a key role in relational biology, and the first part of the paper is devoted to developing an explicit representation of formal cause in the diagram and how it acts in combination with efficient cause to form a mapping. I then use these representations to show how Von Neumann's universal constructor can be cast into a relational diagram in a way that avoids the logical paradox that Rosen detected in his own representation of the constructor in terms of sets and mappings. One aspect that was absent from both Von Neumann's and Rosen's treatments was the necessity of a code to translate the description (the formal cause) of the automaton to be constructed into the construction process itself. A formal definition of codes in general, and organic codes in particular, allows the relational diagram to be extended so as to capture this translation of formal cause into process. The extended relational diagram is used to exemplify causal entailment in a diverse range of processes, such as enzyme action, construction of automata, communication through the Morse code, and ribosomal polypeptide synthesis through the genetic code. Copyright © 2017 Elsevier B.V. All rights reserved.
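
    As a toy illustration of a code in the paper's sense, a mapping that translates signs in one domain into signs in another, here is a fragment of the Morse code mentioned above (the subset of letters is arbitrary):

```python
# A code as an explicit mapping between two sign domains (illustrative subset only).
MORSE = {"S": "...", "O": "---", "C": "-.-.", "D": "-.."}

def encode(text):
    return " ".join(MORSE[ch] for ch in text.upper() if ch in MORSE)

print(encode("SOS"))  # -> "... --- ..."
```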

  9. GAPCON-THERMAL-3 code description

    Energy Technology Data Exchange (ETDEWEB)

    Lanning, D.D.; Mohr, C.L.; Panisko, F.E.; Stewart, K.B.

    1978-01-01

    GAPCON-3 is a computer program that predicts the thermal and mechanical behavior of an operating fuel rod during its normal lifetime. The code calculates temperatures, dimensions, stresses, and strains for the fuel and the cladding in both the radial and axial directions for each step of the user specified power history. The method of weighted residuals is used for the steady state temperature calculation, and is combined with a finite difference approximation of the time derivative for transient conditions. The stress strain analysis employs an iterative axisymmetric finite element procedure that includes plasticity and creep for normal and pellet-clad mechanical interaction loads. GAPCON-3 can solve steady state and operational transient problems. Comparisons of GAPCON-3 predictions to both closed form analytical solutions and actual inpile instrumented fuel rod data have demonstrated the ability of the code to calculate fuel rod behavior. GAPCON-3 features a restart capability and an associated plot package unavailable in previous GAPCON series codes.

  10. Entanglement-assisted quantum MDS codes constructed from negacyclic codes

    Science.gov (United States)

    Chen, Jianzhang; Huang, Yuanyuan; Feng, Chunhui; Chen, Riqing

    2017-12-01

    Recently, entanglement-assisted quantum codes have been constructed from cyclic codes by some scholars. However, it is not an easy task to determine the number of shared pairs required to construct entanglement-assisted quantum codes. In this paper, we propose a decomposition of the defining set of negacyclic codes. Based on this method, four families of entanglement-assisted quantum codes constructed in this paper satisfy the entanglement-assisted quantum Singleton bound, where the minimum distance satisfies q+1 ≤ d ≤ (n+2)/2. Furthermore, we construct two families of entanglement-assisted quantum codes with maximal entanglement.

  11. Systematic Design of Space-Time Trellis Codes for Diversity and Coding Advantages

    Directory of Open Access Journals (Sweden)

    Zoltan Safar

    2002-03-01

    Full Text Available The emerging need for high data rate wireless services has raised considerable interest in space-time coding. In this work, we propose a systematic code construction method that jointly considers diversity advantage and coding advantage for an arbitrary number of transmit antennas and any memoryless constellation. Our approach is to directly assign channel symbols to transmit antennas at different states by exploiting the properties of the state transitions in the trellis. The code construction problem is reduced to a combinatorial optimization problem and a computationally efficient suboptimal solution is proposed. The flexibility of the method is demonstrated by designing space-time trellis codes for QPSK, 8PSK, 16PSK, asymmetric QPSK and 4ASK constellations. Space-time code construction for a large number of transmit antennas (6, 8, and 10 is also considered. The simulations show that our design procedure results in codes that outperform the ones constructed by previously existing methods. The achievable performance gain is governed by the distance structure of the chosen constellation.

  12. Recent developments in the CONTAIN-LMR code

    International Nuclear Information System (INIS)

    Murata, K.K.

    1990-01-01

    Through an international collaborative effort, a special version of the CONTAIN code is being developed for integrated mechanistic analysis of the conditions in liquid metal reactor (LMR) containments during severe accidents. The capabilities of the most recent code version, CONTAIN LMR/1B-Mod.1, are discussed. These include new models for the treatment of two condensables, sodium condensation on aerosols, chemical reactions, hygroscopic aerosols, and concrete outgassing. This code version also incorporates all of the previously released LMR model enhancements. The results of an integral demonstration calculation of a severe core-melt accident scenario are given to illustrate the features of this code version. 11 refs., 7 figs., 1 tab

  13. Local stabilizer codes in three dimensions without string logical operators

    OpenAIRE

    Haah, Jeongwan

    2011-01-01

    We suggest concrete models for self-correcting quantum memory by reporting examples of local stabilizer codes in 3D that have no string logical operators. Previously known local stabilizer codes in 3D all have string-like logical operators, which make the codes non-self-correcting. We introduce a notion of "logical string segments" to avoid difficulties in defining one dimensional objects in discrete lattices. We prove that every string-like logical operator of our code can be deformed to a d...

  14. Codes of Good Governance

    DEFF Research Database (Denmark)

    Beck Jørgensen, Torben; Sørensen, Ditte-Lene

    2013-01-01

    Good governance is a broad concept used by many international organizations to spell out how states or countries should be governed. Definitions vary, but there is a clear core of common public values, such as transparency, accountability, effectiveness, and the rule of law. It is quite likely......, however, that national views of good governance reflect different political cultures and institutional heritages. Fourteen national codes of conduct are analyzed. The findings suggest that public values converge and that they match model codes from the United Nations and the European Council as well...... as conceptions of good governance from other international organizations. While values converge, they are balanced and communicated differently, and seem to some extent to be translated into the national cultures. The set of global public values derived from this analysis include public interest, regime dignity...

  15. Suture Coding: A Novel Educational Guide for Suture Patterns.

    Science.gov (United States)

    Gaber, Mohamed; Abdel-Wahed, Ramadan

    2015-01-01

    This study aims to provide a helpful guide to perform tissue suturing successfully using suture coding: a method for identification of suture patterns and techniques by giving full information about the method of application of each pattern using numbers and symbols. Suture coding helps construct an infrastructure for surgical suture science. It facilitates the easy understanding and learning of suturing techniques and patterns as well as detects the relationship between the different patterns. Guide points are fixed on both edges of the wound to act as a guideline to help practice suture pattern techniques. The arrangement is fixed as 1-3-5-7 and a-c-e-g on one side (whether right or left) and as 2-4-6-8 and b-d-f-h on the other side. Needle placement must start from number 1 or letter "a" and continue to follow the code till the end of the stitching. Some rules are created to be adopted for the application of suture coding. A suture trainer containing guide points that simulate the coding process is used to facilitate the learning of the coding method. (120) is the code of the simple interrupted suture pattern; (ab210) is the code of the vertical mattress suture pattern, and (013465)²/3 is the code of the Cushing suture pattern. (0A1) is suggested as a surgical suture language that gives the name and type of the suture pattern used to facilitate its identification. All suture patterns known in the world should start with (0), (A), or (1). There is a relationship between 2 or more surgical patterns according to their codes. It can be concluded that every suture pattern has its own code that helps in the identification of its type, structure, and method of application. The combination of numbers and symbols helps in understanding suture techniques easily without complication. There are specific relationships that can be identified between different suture patterns. Coding methods facilitate the suture pattern learning process. The use of suture coding can be a good

  16. Computer code FIT

    International Nuclear Information System (INIS)

    Rohmann, D.; Koehler, T.

    1987-02-01

    This is a description of the computer code FIT, written in FORTRAN-77 for a PDP 11/34. FIT is an interactive program to deduce position, width and intensity of lines of X-ray spectra (max. length of 4K channels). The lines (max. 30 lines per fit) may have Gauss- or Voigt-profile, as well as exponential tails. Spectrum and fit can be displayed on a Tektronix terminal. (orig.) [de
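
    FIT itself is a FORTRAN-77 program; as a rough modern analogue of its core task, the following sketch fits a single Gaussian line to a noisy synthetic spectrum with SciPy (Voigt profiles and exponential tails are omitted):

```python
# Fit one Gaussian line to a synthetic spectrum; data and starting values are illustrative.
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amplitude, centre, sigma):
    return amplitude * np.exp(-0.5 * ((x - centre) / sigma) ** 2)

channels = np.arange(200)
rng = np.random.default_rng(2)
spectrum = gaussian(channels, 150.0, 90.0, 6.0) + rng.normal(0.0, 3.0, channels.size)

params, _ = curve_fit(gaussian, channels, spectrum, p0=[100.0, 80.0, 5.0])
print("intensity, position, width:", params)
```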

  17. Status of MARS Code

    Energy Technology Data Exchange (ETDEWEB)

    N.V. Mokhov

    2003-04-09

    Status and recent developments of the MARS 14 Monte Carlo code system for simulation of hadronic and electromagnetic cascades in shielding, accelerator and detector components in the energy range from a fraction of an electronvolt up to 100 TeV are described. These include physics models both in strong and electromagnetic interaction sectors, variance reduction techniques, residual dose, geometry, tracking, histogramming, the MAD-MARS Beam Line Builder and a graphical user interface.

  18. Cracking the Gender Codes

    DEFF Research Database (Denmark)

    Rennison, Betina Wolfgang

    2016-01-01

    Why do men continue to fill most of the senior executive positions and seats in the board of directors in Western corporations? Almost everyone agrees that diversity is good, many women are coming down the pipeline, and companies, states and international organizations and institutions have done...... in leadership management, we must become more aware and take advantage of this complexity. We must crack the codes in order to crack the curve....

  19. Hydra Code Release

    OpenAIRE

    Couchman, H. M. P.; Pearce, F. R.; Thomas, P. A.

    1996-01-01

    Comment: A new version of the AP3M-SPH code, Hydra, is now available as a tar file from the following sites; http://coho.astro.uwo.ca/pub/hydra/hydra.html , http://star-www.maps.susx.ac.uk/~pat/hydra/hydra.html . The release now also contains a cosmological initial conditions generator, documentation, an installation guide and installation tests. A LaTex version of the documentation is included here

  20. Tokamak simulation code manual

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Moon Kyoo; Oh, Byung Hoon; Hong, Bong Keun; Lee, Kwang Won [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-01-01

    The method to use TSC (Tokamak Simulation Code) developed by Princeton Plasma Physics Laboratory is illustrated. In the KT-2 tokamak, time dependent simulation of axisymmetric toroidal plasma and vertical stability have to be taken into account in the design phase using TSC. In this report the physical modelling of TSC is described and examples of application in JAERI and SERI are illustrated, which will be useful when TSC is installed on the KAERI computer system. (Author) 15 refs., 6 figs., 3 tabs.

  1. Tokamak simulation code manual

    International Nuclear Information System (INIS)

    Chung, Moon Kyoo; Oh, Byung Hoon; Hong, Bong Keun; Lee, Kwang Won

    1995-01-01

    The method to use TSC (Tokamak Simulation Code) developed by Princeton Plasma Physics Laboratory is illustrated. In the KT-2 tokamak, time dependent simulation of axisymmetric toroidal plasma and vertical stability have to be taken into account in the design phase using TSC. In this report the physical modelling of TSC is described and examples of application in JAERI and SERI are illustrated, which will be useful when TSC is installed on the KAERI computer system. (Author) 15 refs., 6 figs., 3 tabs

  2. Learning scikit-learn machine learning in Python

    CERN Document Server

    Garreta, Raúl

    2013-01-01

    The book adopts a tutorial-based approach to introduce the user to Scikit-learn. If you are a programmer who wants to explore machine learning and data-based methods to build intelligent applications and enhance your programming skills, this is the book for you. No previous experience with machine-learning algorithms is required.
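
    A minimal, self-contained example of the tutorial style the book adopts, assuming only scikit-learn's bundled iris dataset; it is not an excerpt from the book:

```python
# Fit a classifier on a bundled dataset and report held-out accuracy.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```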

  3. Interval Coded Scoring: a toolbox for interpretable scoring systems

    Directory of Open Access Journals (Sweden)

    Lieven Billiet

    2018-04-01

    Full Text Available Over the last decades, clinical decision support systems have been gaining importance. They help clinicians to make effective use of the overload of available information to obtain correct diagnoses and appropriate treatments. However, their power often comes at the cost of a black box model which cannot be interpreted easily. This interpretability is of paramount importance in a medical setting with regard to trust and (legal) responsibility. In contrast, existing medical scoring systems are easy to understand and use, but they are often a simplified rule-of-thumb summary of previous medical experience rather than a well-founded system based on available data. Interval Coded Scoring (ICS) connects these two approaches, exploiting the power of sparse optimization to derive scoring systems from training data. The presented toolbox interface makes this theory easily applicable to both small and large datasets. It contains two possible problem formulations based on linear programming or elastic net. Both allow to construct a model for a binary classification problem and establish risk profiles that can be used for future diagnosis. All of this requires only a few lines of code. ICS differs from standard machine learning through its model consisting of interpretable main effects and interactions. Furthermore, insertion of expert knowledge is possible because the training can be semi-automatic. This allows end users to make a trade-off between complexity and performance based on cross-validation results and expert knowledge. Additionally, the toolbox offers an accessible way to assess classification performance via accuracy and the ROC curve, whereas the calibration of the risk profile can be evaluated via a calibration curve. Finally, the colour-coded model visualization has particular appeal if one wants to apply ICS manually on new observations, as well as for validation by experts in the specific application domains. The validity and applicability
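
    A minimal sketch of the two evaluation steps the toolbox exposes, an ROC-based discrimination check and a calibration curve for the risk profile, reproduced here with scikit-learn on a synthetic problem rather than with the ICS toolbox itself:

```python
# Evaluate a risk score's discrimination (AUC) and calibration on synthetic data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.calibration import calibration_curve

X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

risk = LogisticRegression(max_iter=1000).fit(X_train, y_train).predict_proba(X_test)[:, 1]
print("AUC:", roc_auc_score(y_test, risk))
frac_pos, mean_pred = calibration_curve(y_test, risk, n_bins=10)
print(list(zip(mean_pred.round(2), frac_pos.round(2))))  # predicted vs. observed risk per bin
```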

  4. MELCOR computer code manuals

    Energy Technology Data Exchange (ETDEWEB)

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.; Stuart, D.S.; Thompson, S.L. [Sandia National Labs., Albuquerque, NM (United States); Hodge, S.A.; Hyman, C.R.; Sanders, R.L. [Oak Ridge National Lab., TN (United States)

    1995-03-01

    MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August, 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package.

  5. Bar coded retroreflective target

    Science.gov (United States)

    Vann, Charles S.

    2000-01-01

    This small, inexpensive, non-contact laser sensor can detect the location of a retroreflective target in a relatively large volume and up to six degrees of position. The tracker's laser beam is formed into a plane of light which is swept across the space of interest. When the beam illuminates the retroreflector, some of the light returns to the tracker. The intensity, angle, and time of the return beam are measured to calculate the three-dimensional location of the target. With three retroreflectors on the target, the locations of three points on the target are measured, enabling the calculation of all six degrees of target position. Until now, devices for three-dimensional tracking of objects in a large volume have been heavy, large, and very expensive. Because of the simplicity and unique characteristics of this tracker, it is capable of three-dimensional tracking of one to several objects in a large volume, yet it is compact, light-weight, and relatively inexpensive. Alternatively, a tracker produces a diverging laser beam which is directed towards a fixed position, and senses when a retroreflective target enters the fixed field of view. An optically bar coded target can be read by the tracker to provide information about the target. The target can be formed of a ball lens with a bar code on one end. As the target moves through the field, the ball lens causes the laser beam to scan across the bar code.

  6. MELCOR computer code manuals

    International Nuclear Information System (INIS)

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.; Stuart, D.S.; Thompson, S.L.; Hodge, S.A.; Hyman, C.R.; Sanders, R.L.

    1995-03-01

    MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August, 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package

  7. Development of PARASOL code

    Energy Technology Data Exchange (ETDEWEB)

    Hosokawa, Masanari [Research Organization for Information Science and Technology, Tokai, Ibaraki (Japan); Takizuka, Tomonori [Japan Atomic Energy Research Inst., Naka, Ibaraki (Japan). Naka Fusion Research Establishment

    2000-05-01

    The divertor is expected to play key roles in tokamak reactors, such as ITER, for the heat removal, ash exhaust, and impurity shielding. Its performance is being predicted by using comprehensive simulation codes with the fluid model. In the fluid model for scrape-off layer (SOL) and divertor plasmas, various physics models are introduced. A kinetic approach is required to examine the validity of such physics models. One of the most powerful kinetic models is the particle simulation. Therefore a particle code PARASOL has been developed, and is being used for the simulation study of SOL and divertor plasmas. The PARASOL code treats the plasma bounded by two divertor plates, in which motions of ions and electrons are traced by using an electrostatic PIC method. Effects of Coulomb collisions are simulated by using a Monte-Carlo-method binary collision model. Motions of neutral particles are traced simultaneously with charged particles. In this report, we describe the physics model of PARASOL, the numerical methods, the configuration of the program, input parameters, output formats, samples of simulation results, and the parallel computing method. The efficiency of the parallel computing with Paragon XP/S15-256 is demonstrated. (author)

  8. A novel neutron energy spectrum unfolding code using particle swarm optimization

    International Nuclear Information System (INIS)

    Shahabinejad, H.; Sohrabpour, M.

    2017-01-01

    A novel neutron Spectrum Deconvolution using Particle Swarm Optimization (SDPSO) code has been developed to unfold the neutron spectrum from a pulse height distribution and a response matrix. Particle Swarm Optimization (PSO) imitates the social behavior of bird flocks to solve complex optimization problems. The results of the SDPSO code have been compared with those of the standard spectra and the recently published Two-steps Genetic Algorithm Spectrum Unfolding (TGASU) code. The TGASU code has been previously compared with other codes such as MAXED, GRAVEL, FERDOR and GAMCD and shown to be more accurate than those codes. The results of the SDPSO code have been demonstrated to match well with those of the TGASU code for both under-determined and over-determined problems. In addition, the SDPSO has been shown to be nearly two times faster than the TGASU code. - Highlights: • Introducing a novel method for neutron spectrum unfolding. • Implementation of a particle swarm optimization code for neutron unfolding. • Comparing results of the PSO code with those of the recently published TGASU code. • Results of the PSO code match those of the TGASU code. • Greater convergence rate of the implemented PSO code than the TGASU code.
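
    The underlying idea can be sketched as a generic particle-swarm minimization of the residual between a response-folded spectrum and a measured pulse-height distribution. This is not the SDPSO code; the response matrix, measurement and swarm parameters below are synthetic placeholders.

```python
# Generic particle swarm optimization applied to spectrum unfolding:
# find a non-negative spectrum phi minimizing ||R @ phi - measured||.
import numpy as np

rng = np.random.default_rng(1)
n_channels, n_bins = 30, 10
R = rng.uniform(0, 1, size=(n_channels, n_bins))     # hypothetical response matrix
true_phi = rng.uniform(0, 1, size=n_bins)
measured = R @ true_phi                              # synthetic pulse-height data

def cost(phi):
    return np.linalg.norm(R @ np.clip(phi, 0, None) - measured)

n_particles, n_iter = 40, 300
w, c1, c2 = 0.7, 1.5, 1.5                            # inertia and acceleration weights
pos = rng.uniform(0, 1, size=(n_particles, n_bins))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_cost = np.array([cost(p) for p in pos])
gbest = pbest[pbest_cost.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    costs = np.array([cost(p) for p in pos])
    improved = costs < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
    gbest = pbest[pbest_cost.argmin()].copy()

print("final residual:", cost(gbest))
```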

  9. Fast convolutional sparse coding using matrix inversion lemma

    Czech Academy of Sciences Publication Activity Database

    Šorel, Michal; Šroubek, Filip

    2016-01-01

    Roč. 55, č. 1 (2016), s. 44-51 ISSN 1051-2004 R&D Projects: GA ČR GA13-29225S Institutional support: RVO:67985556 Keywords : Convolutional sparse coding * Feature learning * Deconvolution networks * Shift-invariant sparse coding Subject RIV: JD - Computer Applications, Robotics Impact factor: 2.337, year: 2016 http://library.utia.cas.cz/separaty/2016/ZOI/sorel-0459332.pdf

  10. Quality Improvement of MARS Code and Establishment of Code Coupling

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Jeong, Jae Jun; Kim, Kyung Doo

    2010-04-01

    The improvement of MARS code quality and its coupling with the regulatory auditing code have been accomplished to establish a self-reliable, technology-based regulatory auditing system. The unified auditing system code was also realized by implementing the CANDU-specific models and correlations. As a part of the quality assurance activities, various QA reports were published through the code assessments. The code manuals were updated, and a new manual was published describing the new models and correlations. The code coupling methods were verified through the exercise of plant application. The education-training seminar and technology transfer were performed for the code users. The developed MARS-KS is utilized as a reliable auditing tool for resolving safety issues and for other regulatory calculations. The code can be utilized as a base technology for GEN IV reactor applications.

  11. On Some Ternary LCD Codes

    OpenAIRE

    Darkunde, Nitin S.; Patil, Arunkumar R.

    2018-01-01

    The main aim of this paper is to study $LCD$ codes. Linear codes with complementary dual ($LCD$) are those codes whose intersection with their dual code is $\{0\}$. In this paper we give an alternative proof of Massey's theorem [8], which is one of the most important characterizations of $LCD$ codes. Let $LCD[n,k]_3$ denote the maximum of the possible values of $d$ among $[n,k,d]$ ternary $LCD$ codes. In [4], the authors have given an upper bound on $LCD[n,k]_2$ and extended th...
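
    Massey's characterization can be checked numerically: a linear code with generator matrix G is LCD exactly when G·Gᵀ is nonsingular over the field. A minimal sketch over GF(3) follows; the example generator matrix is an arbitrary illustration, not taken from the paper.

```python
# Sketch of Massey's criterion for LCD codes over GF(p), here p = 3.
import numpy as np

def rank_gf(M, p):
    """Rank of an integer matrix M over GF(p) via Gaussian elimination."""
    M = M.copy() % p
    rank, rows, cols = 0, M.shape[0], M.shape[1]
    for col in range(cols):
        pivot = next((r for r in range(rank, rows) if M[r, col]), None)
        if pivot is None:
            continue
        M[[rank, pivot]] = M[[pivot, rank]]
        M[rank] = (M[rank] * pow(int(M[rank, col]), -1, p)) % p   # normalize pivot row
        for r in range(rows):
            if r != rank and M[r, col]:
                M[r] = (M[r] - M[r, col] * M[rank]) % p
        rank += 1
    return rank

def is_lcd(G, p=3):
    # The code generated by G is LCD iff G @ G.T is invertible over GF(p).
    GGt = (G @ G.T) % p
    return rank_gf(GGt, p) == G.shape[0]

G = np.array([[1, 0, 1, 0],
              [0, 1, 0, 1]])          # hypothetical [4, 2] ternary code
print("LCD:", is_lcd(G, p=3))         # expected: True for this example
```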

  12. Design of convolutional tornado code

    Science.gov (United States)

    Zhou, Hui; Yang, Yao; Gao, Hongmin; Tan, Lu

    2017-09-01

    As a linear block code, the traditional tornado (tTN) code is inefficient in burst-erasure environment and its multi-level structure may lead to high encoding/decoding complexity. This paper presents a convolutional tornado (cTN) code which is able to improve the burst-erasure protection capability by applying the convolution property to the tTN code, and reduce computational complexity by abrogating the multi-level structure. The simulation results show that cTN code can provide a better packet loss protection performance with lower computation complexity than tTN code.

  13. Random linear codes in steganography

    Directory of Open Access Journals (Sweden)

    Kamil Kaczyński

    2016-12-01

    Full Text Available Syndrome coding using linear codes is a technique that allows improvement of the steganographic algorithm's parameters. The use of random linear codes gives great flexibility in choosing the parameters of the linear code. In parallel, it offers easy generation of the parity-check matrix. In this paper, a modification of the LSB algorithm is presented. A random linear [8, 2] code was used as a base for the algorithm modification. The proposed algorithm was implemented, and its parameters were evaluated in practice on test images. Keywords: steganography, random linear codes, RLC, LSB
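
    The syndrome-coding idea can be sketched as matrix embedding with a small linear code: the message is carried by the syndrome of the cover LSBs, and only a low-weight change is applied. The [8, 2] parameters follow the abstract, but the parity-check matrix, cover bits and message below are placeholders, not the paper's construction.

```python
# Sketch of syndrome-coding (matrix-embedding) LSB steganography.
import itertools
import numpy as np

rng = np.random.default_rng(2)
n, k = 8, 2
# Systematic parity-check matrix H = [I | A] guarantees every syndrome is reachable.
H = np.hstack([np.eye(n - k, dtype=int), rng.integers(0, 2, size=(n - k, k))])
cover = rng.integers(0, 2, size=n)           # LSBs of 8 cover pixels
message = rng.integers(0, 2, size=n - k)     # 6 message bits to hide

# Find the lowest-weight modification e with H @ (cover + e) = message (mod 2).
target = (message - H @ cover) % 2
best = None
for weight in range(n + 1):
    for idx in itertools.combinations(range(n), weight):
        e = np.zeros(n, dtype=int)
        e[list(idx)] = 1
        if np.array_equal(H @ e % 2, target):
            best = e
            break
    if best is not None:
        break

stego = (cover + best) % 2
print("changed pixels:", int(best.sum()), "of", n)
print("extracted ok:", np.array_equal(H @ stego % 2, message))
```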

  14. Containment Code Validation Matrix

    International Nuclear Information System (INIS)

    Chin, Yu-Shan; Mathew, P.M.; Glowa, Glenn; Dickson, Ray; Liang, Zhe; Leitch, Brian; Barber, Duncan; Vasic, Aleks; Bentaib, Ahmed; Journeau, Christophe; Malet, Jeanne; Studer, Etienne; Meynet, Nicolas; Piluso, Pascal; Gelain, Thomas; Michielsen, Nathalie; Peillon, Samuel; Porcheron, Emmanuel; Albiol, Thierry; Clement, Bernard; Sonnenkalb, Martin; Klein-Hessling, Walter; Arndt, Siegfried; Weber, Gunter; Yanez, Jorge; Kotchourko, Alexei; Kuznetsov, Mike; Sangiorgi, Marco; Fontanet, Joan; Herranz, Luis; Garcia De La Rua, Carmen; Santiago, Aleza Enciso; Andreani, Michele; Paladino, Domenico; Dreier, Joerg; Lee, Richard; Amri, Abdallah

    2014-01-01

    The Committee on the Safety of Nuclear Installations (CSNI) formed the CCVM (Containment Code Validation Matrix) task group in 2002. The objective of this group was to define a basic set of available experiments for code validation, covering the range of containment (ex-vessel) phenomena expected in the course of light and heavy water reactor design basis accidents and beyond design basis accidents/severe accidents. It was to consider phenomena relevant to pressurised heavy water reactor (PHWR), pressurised water reactor (PWR) and boiling water reactor (BWR) designs of Western origin as well as of Eastern European VVER types. This work would complement the two existing CSNI validation matrices for thermal hydraulic code validation (NEA/CSNI/R(1993)14) and In-vessel core degradation (NEA/CSNI/R(2001)21). The report initially provides a brief overview of the main features of a PWR, BWR, CANDU and VVER reactors. It also provides an overview of the ex-vessel corium retention (core catcher). It then provides a general overview of the accident progression for light water and heavy water reactors. The main focus is to capture most of the phenomena and safety systems employed in these reactor types and to highlight the differences. This CCVM contains a description of 127 phenomena, broken down into 6 categories: - Containment Thermal-hydraulics Phenomena; - Hydrogen Behaviour (Combustion, Mitigation and Generation) Phenomena; - Aerosol and Fission Product Behaviour Phenomena; - Iodine Chemistry Phenomena; - Core Melt Distribution and Behaviour in Containment Phenomena; - Systems Phenomena. A synopsis is provided for each phenomenon, including a description, references for further information, significance for DBA and SA/BDBA and a list of experiments that may be used for code validation. The report identified 213 experiments, broken down into the same six categories (as done for the phenomena). An experiment synopsis is provided for each test. Along with a test description

  15. MELCOR Accident Consequence Code System (MACCS)

    International Nuclear Information System (INIS)

    Chanin, D.I.; Sprung, J.L.; Ritchie, L.T.; Jow, Hong-Nian

    1990-02-01

    This report describes the MACCS computer code. The purpose of this code is to simulate the impact of severe accidents at nuclear power plants on the surrounding environment. MACCS has been developed for the US Nuclear Regulatory Commission to replace the previous CRAC2 code, and it incorporates many improvements in modeling flexibility in comparison to CRAC2. The principal phenomena considered in MACCS are atmospheric transport, mitigative actions based on dose projection, dose accumulation by a number of pathways including food and water ingestion, early and latent health effects, and economic costs. The MACCS code can be used for a variety of applications. These include (1) probabilistic risk assessment (PRA) of nuclear power plants and other nuclear facilities, (2) sensitivity studies to gain a better understanding of the parameters important to PRA, and (3) cost-benefit analysis. This report is composed of three volumes. This document, Volume 1, the User's Guide, describes the input data requirements of the MACCS code and provides directions for its use as illustrated by three sample problems.

  16. The best bits in an iris code.

    Science.gov (United States)

    Hollingsworth, Karen P; Bowyer, Kevin W; Flynn, Patrick J

    2009-06-01

    Iris biometric systems apply filters to iris images to extract information about iris texture. Daugman's approach maps the filter output to a binary iris code. The fractional Hamming distance between two iris codes is computed and decisions about the identity of a person are based on the computed distance. The fractional Hamming distance weights all bits in an iris code equally. However, not all the bits in an iris code are equally useful. Our research is the first to present experiments documenting that some bits are more consistent than others. Different regions of the iris are compared to evaluate their relative consistency, and contrary to some previous research, we find that the middle bands of the iris are more consistent than the inner bands. The inconsistent-bit phenomenon is evident across genders and different filter types. Possible causes of inconsistencies, such as segmentation, alignment issues, and different filters are investigated. The inconsistencies are largely due to the coarse quantization of the phase response. Masking iris code bits corresponding to complex filter responses near the axes of the complex plane improves the separation between the match and nonmatch Hamming distance distributions.
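
    The fractional Hamming distance described above can be illustrated in a few lines; the iris codes and occlusion masks here are random stand-ins rather than real filter outputs.

```python
# Minimal illustration of the fractional Hamming distance between two binary
# iris codes, comparing only bits that are unmasked in both codes.
import numpy as np

rng = np.random.default_rng(3)
code_a = rng.integers(0, 2, size=2048, dtype=np.uint8)
code_b = code_a.copy()
flip = rng.random(2048) < 0.15               # simulate 15% disagreeing bits
code_b[flip] ^= 1
mask_a = rng.random(2048) > 0.1              # True where the bit is usable
mask_b = rng.random(2048) > 0.1

def fractional_hamming(a, b, ma, mb):
    usable = ma & mb                          # mutually unmasked bits only
    return np.count_nonzero(a[usable] != b[usable]) / np.count_nonzero(usable)

print("fractional HD:", round(fractional_hamming(code_a, code_b, mask_a, mask_b), 3))
```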

  17. Construction of new quantum MDS codes derived from constacyclic codes

    Science.gov (United States)

    Taneja, Divya; Gupta, Manish; Narula, Rajesh; Bhullar, Jaskaran

    Obtaining quantum maximum distance separable (MDS) codes from dual-containing classical constacyclic codes using the Hermitian construction has paved a path to undertake the challenges related to such constructions. Using the same technique, some new parameters of quantum MDS codes have been constructed here. One set of parameters obtained in this paper achieves a much larger distance than earlier work. The remaining constructed parameters of quantum MDS codes have large minimum distance and had not been explored before.

  18. Decoding of concatenated codes with interleaved outer codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Thommesen, Christian; Høholdt, Tom

    2004-01-01

    Recently Bleichenbacher et al. proposed a decoding algorithm for interleaved (N, K) Reed-Solomon codes, which allows close to N-K errors to be corrected in many cases. We discuss the application of this decoding algorithm to concatenated codes.

  19. Decoding of concatenated codes with interleaved outer codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Høholdt, Tom; Thommesen, Christian

    2004-01-01

    Recently Bleichenbacher et al. proposed a decoding algorithm for interleaved (N, K) Reed-Solomon codes, which allows close to N-K errors to be corrected in many cases. We discuss the application of this decoding algorithm to concatenated codes.

  20. Performance analysis of multiple interference suppression over asynchronous/synchronous optical code-division multiple-access system based on complementary/prime/shifted coding scheme

    Science.gov (United States)

    Nieh, Ta-Chun; Yang, Chao-Chin; Huang, Jen-Fa

    2011-08-01

    A complete complementary/prime/shifted prime (CPS) code family for the optical code-division multiple-access (OCDMA) system is proposed. Based on the ability of complete complementary (CC) code, the multiple-access interference (MAI) can be suppressed and eliminated via spectral amplitude coding (SAC) OCDMA system under asynchronous/synchronous transmission. By utilizing the shifted prime (SP) code in the SAC scheme, the hardware implementation of encoder/decoder can be simplified with a reduced number of optical components, such as arrayed waveguide grating (AWG) and fiber Bragg grating (FBG). This system has a superior performance as compared to previous bipolar-bipolar coding OCDMA systems.

  1. Fast H.264/AVC FRExt intra coding using belief propagation.

    Science.gov (United States)

    Milani, Simone

    2011-01-01

    In the H.264/AVC FRExt coder, the coding performance of Intra coding significantly surpasses that of previous still-image coding standards, such as JPEG2000, thanks to a massive use of spatial prediction. Unfortunately, the adoption of an extensive set of predictors induces a significant increase of the computational complexity required by the rate-distortion optimization routine. The paper presents a complexity reduction strategy that aims at reducing the computational load of the Intra coding with a small loss in the compression performance. The proposed algorithm relies on selecting a reduced set of prediction modes according to their probabilities, which are estimated adopting a belief-propagation procedure. Experimental results show that the proposed method permits saving up to 60% of the coding time required by an exhaustive rate-distortion optimization method with a negligible loss in performance. Moreover, it permits accurate control of the computational complexity, unlike other methods where the computational complexity depends upon the coded sequence.

  2. A novel neutron energy spectrum unfolding code using particle swarm optimization

    Science.gov (United States)

    Shahabinejad, H.; Sohrabpour, M.

    2017-07-01

    A novel neutron Spectrum Deconvolution using Particle Swarm Optimization (SDPSO) code has been developed to unfold the neutron spectrum from a pulse height distribution and a response matrix. Particle Swarm Optimization (PSO) imitates the social behavior of bird flocks to solve complex optimization problems. The results of the SDPSO code have been compared with those of the standard spectra and the recently published Two-steps Genetic Algorithm Spectrum Unfolding (TGASU) code. The TGASU code has been previously compared with other codes such as MAXED, GRAVEL, FERDOR and GAMCD and shown to be more accurate than those codes. The results of the SDPSO code have been demonstrated to match well with those of the TGASU code for both under-determined and over-determined problems. In addition, the SDPSO has been shown to be nearly two times faster than the TGASU code.

  3. low bit rate video coding low bit rate video coding

    African Journals Online (AJOL)

    eobe

    Variable length bit rate (VLBR) broadly encompasses video coding which mandates a temporal frequency of 10 frames per ...

  4. Code Flows : Visualizing Structural Evolution of Source Code

    NARCIS (Netherlands)

    Telea, Alexandru; Auber, David

    2008-01-01

    Understanding detailed changes done to source code is of great importance in software maintenance. We present Code Flows, a method to visualize the evolution of source code geared to the understanding of fine and mid-level scale changes across several file versions. We enhance an existing visual

  5. Distributed Video Coding for Multiview and Video-plus-depth Coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo

    The interest in Distributed Video Coding (DVC) systems has grown considerably in the academic world in recent years. With DVC the correlation between frames is exploited at the decoder (joint decoding). The encoder codes the frame independently, performing relatively simple operations. Therefore...... and Multiview scenarios. Furthermore on-line correlation noise models are proposed. On-line models are needed to enable the codec to be used in realistic scenarios. Focus is put on developing and investigating robust fusion techniques, able to correctly fuse various Sis. Learning algorithms for improving...... of the need to code the MVs. On the other hand DVC can exploit OF because the Motion Estimation (ME) is only performed at the decoder. In this thesis it is proposed to use OF for joint disparity and motion calculation in M-DVC and for joint motion estimation in texture and depth frames in video...

  6. Difference-Huffman Coding of Multidimensional Databases

    OpenAIRE

    Szépkúti, István

    2011-01-01

    A new compression method called difference-Huffman coding (DHC) is introduced in this paper. It is verified empirically that DHC results in a smaller multidimensional physical representation than those for other previously published techniques (single count header compression, logical position compression, base-offset compression and difference sequence compression). The article examines how caching influences the expected retrieval time of the multidimensional and table representations of re...
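
    The two ingredients the method's name refers to, difference coding of a value sequence followed by Huffman coding of the differences, can be sketched as follows; the toy data is made up and not taken from the paper.

```python
# Illustrative sketch: difference coding followed by Huffman coding.
import heapq
from collections import Counter

def huffman_codes(symbols):
    """Build a prefix code (symbol -> bit string) with a standard Huffman heap."""
    freq = Counter(symbols)
    if len(freq) == 1:
        return {next(iter(freq)): "0"}
    heap = [[w, i, {s: ""}] for i, (s, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        lo, hi = heapq.heappop(heap), heapq.heappop(heap)
        merged = {s: "0" + c for s, c in lo[2].items()}
        merged.update({s: "1" + c for s, c in hi[2].items()})
        heapq.heappush(heap, [lo[0] + hi[0], counter, merged])
        counter += 1
    return heap[0][2]

values = [3, 4, 4, 5, 7, 7, 7, 8, 10, 11, 11, 12]     # made-up sorted values
diffs = [values[0]] + [b - a for a, b in zip(values, values[1:])]
codes = huffman_codes(diffs)
bitstream = "".join(codes[d] for d in diffs)
print("differences:", diffs)
print("code table:", codes)
print("encoded length:", len(bitstream), "bits")
```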

  7. Code domain steganography in video tracks

    Science.gov (United States)

    Rymaszewski, Sławomir

    2008-01-01

    This article is dealing with a practical method of hiding secret information in video stream. Method is dedicated for MPEG-2 stream. The algorithm takes to consider not only MPEG video coding scheme described in standard but also bits PES-packets encapsulation in MPEG-2 Program Stream (PS). This modification give higher capacity and more effective bit rate control for output stream than previously proposed methods.

  8. Coding with Blockly

    CERN Document Server

    Lovett, Amber

    2017-01-01

    "Blockly is a fun, graphical programming language designed to get kids interested in creating their own computer programs. Through simple text written to foster creativity and problem solving, students will the art of innovation. Large, colorful images show students how to complete activities. Additional tools, including a glossary and an index, help students learn new vocabulary and locate information."-- Provided by publisher.

  9. Code-labelling

    DEFF Research Database (Denmark)

    Spangsberg, Thomas Hvid; Brynskov, Martin

    reasons. It seems the exercise invokes an assimilation of students' existing cognitive schemata and supports a deep-learning experience. The exercise is an invitation to other teachers to create further iterations to improve their own teaching. It also seeks to enrich the portfolio of teaching activities...

  10. Case studies in Gaussian process modelling of computer codes

    International Nuclear Information System (INIS)

    Kennedy, Marc C.; Anderson, Clive W.; Conti, Stefano; O'Hagan, Anthony

    2006-01-01

    In this paper we present a number of recent applications in which an emulator of a computer code is created using a Gaussian process model. Tools are then applied to the emulator to perform sensitivity analysis and uncertainty analysis. Sensitivity analysis is used both as an aid to model improvement and as a guide to how much the output uncertainty might be reduced by learning about specific inputs. Uncertainty analysis allows us to reflect output uncertainty due to unknown input parameters, when the finished code is used for prediction. The computer codes themselves are currently being developed within the UK Centre for Terrestrial Carbon Dynamics
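
    The emulation idea can be sketched in a few lines: fit a Gaussian process to a small design of code runs, then query the emulator cheaply with predictive uncertainty. The "computer code" below is a toy function standing in for a real simulator, and the kernel choice is an assumption.

```python
# Sketch of emulating an expensive computer code with a Gaussian process.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def expensive_code(x):
    return np.sin(3 * x) + 0.3 * x ** 2            # placeholder for the simulator

X_train = np.linspace(0, 3, 12).reshape(-1, 1)     # a small design of code runs
y_train = expensive_code(X_train).ravel()

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True)
gp.fit(X_train, y_train)

X_new = np.linspace(0, 3, 5).reshape(-1, 1)
mean, std = gp.predict(X_new, return_std=True)
for x, m, s in zip(X_new.ravel(), mean, std):
    print(f"x={x:.2f}  emulator={m:+.3f} ± {2*s:.3f}  code={expensive_code(x):+.3f}")
```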

  11. The Tap code - a code similar to Morse code for communication by tapping

    OpenAIRE

    Rafler, Stephan

    2013-01-01

    A code is presented for fast, easy and efficient communication over channels that allow only two signal types: a single sound (e.g. a knock), or no sound (i.e. silence). This is a true binary code while Morse code is a ternary code and does not work in such situations. Thus the presented code is more universal than Morse and can be used in much more situations. Additionally it is very tolerant to variations in signal strength or duration. The paper contains various ways in which the code can ...

  12. Computer codes validation for conditions of core voiding

    International Nuclear Information System (INIS)

    Delja, A.; Hawley, P.

    2011-01-01

    Void generation during a Loss of Coolant Accident (LOCA) in a core of a CANDU reactor is of specific importance because of its strong coupling with reactor neutronics. The capability of computer codes to predict the dynamic behaviour of void generation accurately in the temporal and spatial domains of the reactor core is therefore fundamental for the determination of CANDU safety. The Canadian industry has used the RD-14M test facilities for its code validation. The validation exercises for the Canadian computer codes TUF and CATHENA were performed some years ago. Recently, the CNSC has gained access to the USNRC computer code TRACE. This has provided an opportunity to explore the use of this code in CANDU-related applications. As a part of regulatory assessment and resolving identified Generic Issues (GI), and in an effort to build independent thermal-hydraulic computer code assessment capability within the CNSC, preliminary validation exercises were performed using the TRACE computer code for an evaluation of the void generation phenomena. The paper presents a preliminary assessment of the TRACE computer code for an RD-14M channel voiding test. It is also a validation exercise of void generation for the TRACE computer code. The accuracy of the obtained results is discussed and compared with previous validation assessments that were done using the CATHENA and TUF codes. (author)

  13. Deep learning with Python

    CERN Document Server

    Chollet, Francois

    2018-01-01

    DESCRIPTION Deep learning is applicable to a widening range of artificial intelligence problems, such as image classification, speech recognition, text classification, question answering, text-to-speech, and optical character recognition. Deep Learning with Python is structured around a series of practical code examples that illustrate each new concept introduced and demonstrate best practices. By the time you reach the end of this book, you will have become a Keras expert and will be able to apply deep learning in your own projects. KEY FEATURES • Practical code examples • In-depth introduction to Keras • Teaches the difference between Deep Learning and AI ABOUT THE TECHNOLOGY Deep learning is the technology behind photo tagging systems at Facebook and Google, self-driving cars, speech recognition systems on your smartphone, and much more. AUTHOR BIO Francois Chollet is the author of Keras, one of the most widely used libraries for deep learning in Python. He has been working with deep neural ...

  14. Quick response codes in Orthodontics

    Directory of Open Access Journals (Sweden)

    Moidin Shakil

    2015-01-01

    Full Text Available Quick response (QR) codes are two-dimensional barcodes which encode a large amount of information. QR codes in Orthodontics are an innovative approach in which patient details, radiographic interpretation, and treatment plan can be encoded. Implementing QR codes in Orthodontics will save time, reduce paperwork, and minimize manual effort in the storage and retrieval of patient information during subsequent stages of treatment.
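
    Generating such a code is straightforward; the sketch below uses the third-party Python "qrcode" package and a made-up record string, purely as an illustration of the encoding step rather than any workflow from the paper.

```python
# Illustrative only: encoding a fictitious patient record reference in a QR code.
# Requires: pip install qrcode[pil]
import qrcode

record = "patient_id=12345;stage=2;next_visit=2015-06-01"   # made-up example data
img = qrcode.make(record)        # returns a PIL image of the QR symbol
img.save("orthodontic_record_qr.png")
print("QR code written to orthodontic_record_qr.png")
```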

  15. Bar codes for nuclear safeguards

    International Nuclear Information System (INIS)

    Keswani, A.N.; Bieber, A.M. Jr.

    1983-01-01

    Bar codes similar to those used in supermarkets can be used to reduce the effort and cost of collecting nuclear materials accountability data. A wide range of equipment is now commercially available for printing and reading bar-coded information. Several examples of each of the major types of commercially available equipment are given, and considerations are discussed both for planning systems using bar codes and for choosing suitable bar code equipment

  16. The CORSYS neutronics code system

    International Nuclear Information System (INIS)

    Caner, M.; Krumbein, A.D.; Saphier, D.; Shapira, M.

    1994-01-01

    The purpose of this work is to assemble a code package for LWR core physics including coupled neutronics, burnup and thermal hydraulics. The CORSYS system is built around the cell code WIMS (for group microscopic cross section calculations) and the 3-dimensional diffusion code CITATION (for burnup and fuel management). We are implementing such a system on an IBM RS-6000 workstation. The code was tested with a simplified model of the Zion Unit 2 PWR. (authors). 6 refs., 8 figs., 1 tabs

  17. The path of code linting

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Join the path of code linting and discover how it can help you reach higher levels of programming enlightenment. Today we will cover how to embrace code linters to offload the cognitive strain of preserving style standards in your code base as well as avoiding error-prone constructs. Additionally, I will show you the journey ahead for integrating several code linters in the programming tools you already use with very little effort.

  18. Learning "While" Working: Success Stories on Workplace Learning in Europe

    Science.gov (United States)

    Lardinois, Rocio

    2011-01-01

    Cedefop's report "Learning while working: success stories on workplace learning in Europe" presents an overview of key trends in adult learning in the workplace. It takes stock of previous research carried out by Cedefop between 2003 and 2010 on key topics for adult learning: governance and the learning regions; social partner roles in…

  19. The FLIC conversion codes

    International Nuclear Information System (INIS)

    Basher, J.C.

    1965-05-01

    This report describes the FORTRAN programmes, FLIC 1 and FLIC 2. These programmes convert programmes coded in one dialect of FORTRAN to another dialect of the same language. FLIC 1 is a general pattern recognition and replacement programme whereas FLIC 2 contains extensions directed towards the conversion of FORTRAN II and S2 programmes to EGTRAN 1 - the dialect now in use on the Winfrith KDF9. FII or S2 statements are replaced where possible by their E1 equivalents; other statements which may need changing are flagged. (author)

  20. Physical Layer Network Coding

    OpenAIRE

    Shengli, Zhang; Liew, Soung-Chang; Lam, Patrick P. K.

    2007-01-01

    A main distinguishing feature of a wireless network compared with a wired network is its broadcast nature, in which the signal transmitted by a node may reach several other nodes, and a node may receive signals from several other nodes simultaneously. Rather than a blessing, this feature is treated more as an interference-inducing nuisance in most wireless networks today (e.g., IEEE 802.11). This paper shows that the concept of network coding can be applied at the physical layer to turn the b...

  1. Code Generation with Templates

    CERN Document Server

    Arnoldus, Jeroen; Serebrenik, A

    2012-01-01

    Templates are used to generate all kinds of text, including computer code. Over the last decade, the use of templates has gained a lot of popularity due to the increase in dynamic web applications. Templates are a tool for programmers, and implementations of template engines are most often based on practical experience rather than on a theoretical background. This book reveals the mathematical background of templates and shows interesting findings for improving the practical use of templates. First, a framework to determine the necessary computational power for the template metalanguage is presen

  2. Deciphering Neural Codes of Memory during Sleep.

    Science.gov (United States)

    Chen, Zhe; Wilson, Matthew A

    2017-05-01

    Memories of experiences are stored in the cerebral cortex. Sleep is critical for the consolidation of hippocampal memory of wake experiences into the neocortex. Understanding representations of neural codes of hippocampal-neocortical networks during sleep would reveal important circuit mechanisms in memory consolidation and provide novel insights into memory and dreams. Although sleep-associated ensemble spike activity has been investigated, identifying the content of memory in sleep remains challenging. Here we revisit important experimental findings on sleep-associated memory (i.e., neural activity patterns in sleep that reflect memory processing) and review computational approaches to the analysis of sleep-associated neural codes (SANCs). We focus on two analysis paradigms for sleep-associated memory and propose a new unsupervised learning framework ('memory first, meaning later') for unbiased assessment of SANCs. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Cinder creative coding cookbook

    CERN Document Server

    Madeira, Rui

    2013-01-01

    Full of easy-to-follow recipes and images that will teach powerful techniques and algorithms, building from basic projects to challenging applications. This book is for artists, designers, and programmers who have previous knowledge of C++, but not necessarily of Cinder.

  4. The HTM Spatial Pooler—A Neocortical Algorithm for Online Sparse Distributed Coding

    Directory of Open Access Journals (Sweden)

    Yuwei Cui

    2017-11-01

    Full Text Available Hierarchical temporal memory (HTM) provides a theoretical framework that models several key computational principles of the neocortex. In this paper, we analyze an important component of HTM, the HTM spatial pooler (SP). The SP models how neurons learn feedforward connections and form efficient representations of the input. It converts arbitrary binary input patterns into sparse distributed representations (SDRs) using a combination of competitive Hebbian learning rules and homeostatic excitability control. We describe a number of key properties of the SP, including fast adaptation to changing input statistics, improved noise robustness through learning, efficient use of cells, and robustness to cell death. In order to quantify these properties we develop a set of metrics that can be directly computed from the SP outputs. We show how the properties are met using these metrics and targeted artificial simulations. We then demonstrate the value of the SP in a complete end-to-end real-world HTM system. We discuss the relationship with neuroscience and previous studies of sparse coding. The HTM spatial pooler represents a neurally inspired algorithm for learning sparse representations from noisy data streams in an online fashion.
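
    A heavily stripped-down sketch of the spatial-pooler-style computation, overlap scoring, k-winners-take-all sparsification and a Hebbian permanence update, is shown below. The parameters are illustrative and the sketch omits boosting and other mechanisms of the full HTM spatial pooler.

```python
# Minimal spatial-pooler-like step (illustrative, not the HTM implementation).
import numpy as np

rng = np.random.default_rng(4)
n_inputs, n_columns, k_active = 100, 50, 5
perm = rng.uniform(0, 1, size=(n_columns, n_inputs))   # synapse permanences
threshold, p_inc, p_dec = 0.5, 0.03, 0.015

def sp_step(x):
    connected = (perm >= threshold).astype(int)         # connected synapses
    overlap = connected @ x                             # overlap with the input
    active = np.argsort(overlap)[-k_active:]            # k-winners-take-all -> SDR
    # Hebbian learning on the winning columns only.
    perm[active] += np.where(x > 0, p_inc, -p_dec)
    np.clip(perm, 0.0, 1.0, out=perm)
    return active

x = (rng.random(n_inputs) < 0.1).astype(int)            # sparse binary input
print("active columns (SDR):", sorted(sp_step(x)))
```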

  5. Order functions and evaluation codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Pellikaan, Ruud; van Lint, Jack

    1997-01-01

    Based on the notion of an order function we construct and determine the parameters of a class of error-correcting evaluation codes. This class includes the one-point algebraic geometry codes as well as the generalized Reed-Muller codes, and the parameters are determined without using the heavy machinery of algebraic geometry.

  6. Network Coding Over The 232

    DEFF Research Database (Denmark)

    Pedersen, Morten Videbæk; Heide, Janus; Vingelmann, Peter

    2013-01-01

    Creating efficient finite field implementations has been an active research topic for several decades. Many applications in areas such as cryptography, signal processing, erasure coding and now also network coding depend on this research to deliver satisfactory performance. In this paper we... will be useful in many network coding applications where large field sizes are required.

  7. Sub-Transport Layer Coding

    DEFF Research Database (Denmark)

    Hansen, Jonas; Krigslund, Jeppe; Roetter, Daniel Enrique Lucani

    2014-01-01

    oblivious to the congestion control algorithms of the utilised transport layer protocol. Although our coding shim is indifferent towards the transport layer protocol, we focus on the performance of TCP when run on top of our proposed coding mechanism due to its widespread use. The coding shim provides gains...

  8. Strongly-MDS convolutional codes

    NARCIS (Netherlands)

    Gluesing-Luerssen, H; Rosenthal, J; Smarandache, R

    Maximum-distance separable (MDS) convolutional codes have the property that their free distance is maximal among all codes of the same rate and the same degree. In this paper, a class of MDS convolutional codes is introduced whose column distances reach the generalized Singleton bound at the

  9. Geochemical computer codes. A review

    International Nuclear Information System (INIS)

    Andersson, K.

    1987-01-01

    In this report a review of available codes is performed and some code intercomparisons are also discussed. The number of codes treating natural waters (groundwater, lake water, sea water) is large. Most geochemical computer codes treat equilibrium conditions, although some codes with kinetic capability are available. A geochemical equilibrium model consists of a computer code, solving a set of equations by some numerical method, and a data base consisting of the thermodynamic data required for the calculations. There are some codes which treat coupled geochemical and transport modeling. Some of these codes solve the equilibrium and transport equations simultaneously while others solve the equations separately from each other. The coupled codes require a large computer capacity and thus have as yet limited use. Three code intercomparisons have been found in the literature. It may be concluded that there are many codes available for geochemical calculations, but most of them require a user that is quite familiar with the code. The user also has to know the geochemical system in order to judge the reliability of the results. A high quality data base is necessary to obtain a reliable result. The best results may be expected for the major species of natural waters. For more complicated problems, including trace elements, precipitation/dissolution, adsorption, etc., the results seem to be less reliable. (With 44 refs.) (author)

  10. Authorship Attribution of Source Code

    Science.gov (United States)

    Tennyson, Matthew F.

    2013-01-01

    Authorship attribution of source code is the task of deciding who wrote a program, given its source code. Applications include software forensics, plagiarism detection, and determining software ownership. A number of methods for the authorship attribution of source code have been presented in the past. A review of those existing methods is…

  11. Separate Turbo Code and Single Turbo Code Adaptive OFDM Transmissions

    Directory of Open Access Journals (Sweden)

    Burr Alister

    2009-01-01

    Full Text Available This paper discusses the application of adaptive modulation and adaptive rate turbo coding to orthogonal frequency-division multiplexing (OFDM), to increase throughput on the time and frequency selective channel. The adaptive turbo code scheme is based on a subband adaptive method, and compares two adaptive systems: a conventional approach where a separate turbo code is used for each subband, and a single turbo code adaptive system which uses a single turbo code over all subbands. Five modulation schemes (BPSK, QPSK, 8AMPM, 16QAM, and 64QAM) are employed and turbo code rates considered are 1/2 and 1/3. The performances of both systems with high (10−2) and low (10−4) BER targets are compared. Simulation results for throughput and BER show that the single turbo code adaptive system provides a significant improvement.

  12. Separate Turbo Code and Single Turbo Code Adaptive OFDM Transmissions

    Directory of Open Access Journals (Sweden)

    Lei Ye

    2009-01-01

    Full Text Available This paper discusses the application of adaptive modulation and adaptive rate turbo coding to orthogonal frequency-division multiplexing (OFDM), to increase throughput on the time and frequency selective channel. The adaptive turbo code scheme is based on a subband adaptive method, and compares two adaptive systems: a conventional approach where a separate turbo code is used for each subband, and a single turbo code adaptive system which uses a single turbo code over all subbands. Five modulation schemes (BPSK, QPSK, 8AMPM, 16QAM, and 64QAM) are employed and turbo code rates considered are 1/2 and 1/3. The performances of both systems with high (10−2) and low (10−4) BER targets are compared. Simulation results for throughput and BER show that the single turbo code adaptive system provides a significant improvement.

  13. Quantum Codes From Cyclic Codes Over The Ring R 2

    International Nuclear Information System (INIS)

    Altinel, Alev; Güzeltepe, Murat

    2016-01-01

    Let R_2 denote the ring F_2 + μF_2 + υF_2 + μυF_2 + wF_2 + μwF_2 + υwF_2 + μυwF_2. In this study, we construct quantum codes from cyclic codes over the ring R_2, for arbitrary length n, with the restrictions μ^2 = 0, υ^2 = 0, w^2 = 0, μυ = υμ, μw = wμ, υw = wυ and μ(υw) = (μυ)w. Also, we give a necessary and sufficient condition for cyclic codes over R_2 to contain their dual. As a final point, we obtain the parameters of quantum error-correcting codes from cyclic codes over R_2 and we give an example of quantum error-correcting codes from cyclic codes over R_2. (paper)

  14. ClinicalCodes: an online clinical codes repository to improve the validity and reproducibility of research using electronic medical records.

    Directory of Open Access Journals (Sweden)

    David A Springate

    Full Text Available Lists of clinical codes are the foundation for research undertaken using electronic medical records (EMRs). If clinical code lists are not available, reviewers are unable to determine the validity of research, full study replication is impossible, researchers are unable to make effective comparisons between studies, and the construction of new code lists is subject to much duplication of effort. Despite this, the publication of clinical codes is rarely if ever a requirement for obtaining grants, validating protocols, or publishing research. In a representative sample of 450 EMR primary research articles indexed on PubMed, we found that only 19 (5.1%) were accompanied by a full set of published clinical codes and 32 (8.6%) stated that code lists were available on request. To help address these problems, we have built an online repository where researchers using EMRs can upload and download lists of clinical codes. The repository will enable clinical researchers to better validate EMR studies, build on previous code lists and compare disease definitions across studies. It will also assist health informaticians in replicating database studies, tracking changes in disease definitions or clinical coding practice through time and sharing clinical code information across platforms and data sources as research objects.

  15. Genetic code for sine

    Science.gov (United States)

    Abdullah, Alyasa Gan; Wah, Yap Bee

    2015-02-01

    The computation of the approximate values of the trigonometric sines was discovered by Bhaskara I (c. 600-c. 680), a seventh-century Indian mathematician, and is known as Bhaskara I's sine approximation formula. The formula is given in his treatise titled Mahabhaskariya. In the 14th century, Madhava of Sangamagrama, a Kerala mathematician-astronomer, constructed the table of trigonometric sines of various angles. Madhava's table gives the measure of angles in arcminutes, arcseconds and sixtieths of an arcsecond. The search for more accurate formulas led to the discovery of the power series expansion by Madhava of Sangamagrama (c. 1350-c. 1425), the founder of the Kerala school of astronomy and mathematics. In 1715, the Taylor series was introduced by Brook Taylor, an English mathematician. If the Taylor series is centered at zero, it is called a Maclaurin series, named after the Scottish mathematician Colin Maclaurin. Some of the important Maclaurin series expansions include trigonometric functions. This paper introduces the genetic code of the sine of an angle without using power series expansion. The genetic code using the square root approach reveals the pattern in the signs (plus, minus) and sequence of numbers in the sine of an angle. The square root approach complements the Pythagoras method, provides a better understanding of calculating an angle and will be useful for teaching the concepts of angles in trigonometry.
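
    Bhaskara I's approximation mentioned above, sin(x°) ≈ 4x(180 − x) / (40500 − x(180 − x)) for 0 ≤ x ≤ 180, can be compared against the library sine in a few lines; the test angles below are arbitrary, and this sketch illustrates only the classical formula, not the paper's square-root approach.

```python
# Bhaskara I's seventh-century sine approximation (angle in degrees).
import math

def bhaskara_sin(x_deg):
    return 4 * x_deg * (180 - x_deg) / (40500 - x_deg * (180 - x_deg))

for x in (0, 15, 30, 45, 60, 90, 120, 150):
    approx = bhaskara_sin(x)
    exact = math.sin(math.radians(x))
    print(f"sin({x:>3}°) ≈ {approx:.5f}   exact {exact:.5f}   error {approx - exact:+.5f}")
```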

  16. Tangent: Automatic Differentiation Using Source Code Transformation in Python

    OpenAIRE

    van Merriënboer, Bart; Wiltschko, Alexander B.; Moldovan, Dan

    2017-01-01

    Automatic differentiation (AD) is an essential primitive for machine learning programming systems. Tangent is a new library that performs AD using source code transformation (SCT) in Python. It takes numeric functions written in a syntactic subset of Python and NumPy as input, and generates new Python functions which calculate a derivative. This approach to automatic differentiation is different from existing packages popular in machine learning, such as TensorFlow and Autograd. Advantages ar...
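
    A minimal usage sketch based on the library's published grad interface is given below; the function f is our own example, and whether this exact snippet runs depends on the installed Tangent version.

```python
# Tangent generates a new Python function for the derivative via source code
# transformation (pip install tangent).
import tangent

def f(x):
    return x * x + 3.0 * x

df = tangent.grad(f)        # derivative function generated by SCT
print(df(2.0))              # expected 2*x + 3 = 7.0
```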

  17. Continuous Non-malleable Codes

    DEFF Research Database (Denmark)

    Faust, Sebastian; Mukherjee, Pratyay; Nielsen, Jesper Buus

    2014-01-01

    Non-malleable codes are a natural relaxation of error correcting/ detecting codes that have useful applications in the context of tamper resilient cryptography. Informally, a code is non-malleable if an adversary trying to tamper with an encoding of a given message can only leave it unchanged or ...... applications of non-malleable codes in this setting required to perfectly erase the entire memory after each execution and required the adversary to be restricted in memory. We show that continuous non-malleable codes avoid these restrictions....

  18. The Art of Readable Code

    CERN Document Server

    Boswell, Dustin

    2011-01-01

    As programmers, we've all seen source code that's so ugly and buggy it makes our brain ache. Over the past five years, authors Dustin Boswell and Trevor Foucher have analyzed hundreds of examples of "bad code" (much of it their own) to determine why they're bad and how they could be improved. Their conclusion? You need to write code that minimizes the time it would take someone else to understand it-even if that someone else is you. This book focuses on basic principles and practical techniques you can apply every time you write code. Using easy-to-digest code examples from different languag

  19. Elements of algebraic coding systems

    CERN Document Server

    Cardoso da Rocha, Jr, Valdemar

    2014-01-01

    Elements of Algebraic Coding Systems is an introductory text to algebraic coding theory. In the first chapter, you'll gain inside knowledge of coding fundamentals, which is essential for a deeper understanding of state-of-the-art coding systems. This book is a quick reference for those who are unfamiliar with this topic, as well as for use with specific applications such as cryptography and communication. Linear error-correcting block codes through elementary principles span eleven chapters of the text. Cyclic codes, some finite field algebra, Goppa codes, algebraic decoding algorithms, and applications in public-key cryptography and secret-key cryptography are discussed, including problems and solutions at the end of each chapter. Three appendices cover the Gilbert bound and some related derivations, a derivation of the Mac- Williams' identities based on the probability of undetected error, and two important tools for algebraic decoding-namely, the finite field Fourier transform and the Euclidean algorithm f...

  20. Introduction of SCIENCE code package

    International Nuclear Information System (INIS)

    Lu Haoliang; Li Jinggang; Zhu Ya'nan; Bai Ning

    2012-01-01

    The SCIENCE code package is a set of neutronics tools based on 2D assembly calculations and 3D core calculations. It is made up of APOLLO2F, SMART and SQUALE and used to perform the nuclear design and loading pattern analysis for the reactors in operation or under construction of the China Guangdong Nuclear Power Group. The purpose of this paper is to briefly present the physical and numerical models used in each computational code of the SCIENCE code package, including the description of the general structure of the code package, the coupling relationship of the APOLLO2-F transport lattice code and the SMART core nodal code, and the SQUALE code used for processing the core maps. (authors)

  1. Learning optimized features for hierarchical models of invariant object recognition.

    Science.gov (United States)

    Wersing, Heiko; Körner, Edgar

    2003-07-01

    There is an ongoing debate over the capabilities of hierarchical neural feedforward architectures for performing real-world invariant object recognition. Although a variety of hierarchical models exists, appropriate supervised and unsupervised learning methods are still an issue of intense research. We propose a feedforward model for recognition that shares components like weight sharing, pooling stages, and competitive nonlinearities with earlier approaches but focuses on new methods for learning optimal feature-detecting cells in intermediate stages of the hierarchical network. We show that principles of sparse coding, which were previously mostly applied to the initial feature detection stages, can also be employed to obtain optimized intermediate complex features. We suggest a new approach to optimize the learning of sparse features under the constraints of a weight-sharing or convolutional architecture that uses pooling operations to achieve gradual invariance in the feature hierarchy. The approach explicitly enforces symmetry constraints like translation invariance on the feature set. This leads to a dimension reduction in the search space of optimal features and allows determining more efficiently the basis representatives, which achieve a sparse decomposition of the input. We analyze the quality of the learned feature representation by investigating the recognition performance of the resulting hierarchical network on object and face databases. We show that a hierarchy with features learned on a single object data set can also be applied to face recognition without parameter changes and is competitive with other recent machine learning recognition approaches. To investigate the effect of the interplay between sparse coding and processing nonlinearities, we also consider alternative feedforward pooling nonlinearities such as presynaptic maximum selection and sum-of-squares integration. The comparison shows that a combination of strong competitive

  2. Galois LCD Codes over Finite Fields

    OpenAIRE

    Liu, Xiusheng; Fan, Yun; Liu, Hualu

    2017-01-01

    In this paper, we study complementary dual codes in a more general setting (which are called Galois LCD codes) by a uniform method. A necessary and sufficient condition for linear codes to be Galois LCD codes is determined, and constacyclic codes that are Galois LCD codes are characterized. Some illustrative examples of constacyclic codes that are Galois LCD MDS codes are provided as well. In particular, we study Hermitian LCD constacyclic codes. Finally, we present a construction of a class of ...

  3. GOC: General Orbit Code

    International Nuclear Information System (INIS)

    Maddox, L.B.; McNeilly, G.S.

    1979-08-01

    GOC (General Orbit Code) is a versatile program which will perform a variety of calculations relevant to isochronous cyclotron design studies. In addition to the usual calculations of interest (e.g., equilibrium and accelerated orbits, focusing frequencies, field isochronization, etc.), GOC has a number of options to calculate injections with a charge change. GOC provides both printed and plotted output, and will follow groups of particles to allow determination of finite-beam properties. An interactive PDP-10 program called GIP, which prepares input data for GOC, is available. GIP is a very easy and convenient way to prepare complicated input data for GOC. Enclosed with this report are several microfiche containing source listings of GOC and other related routines and the printed output from a multiple-option GOC run

  4. Code des baux 2018

    CERN Document Server

    Vial-Pedroletti, Béatrice; Kendérian, Fabien; Chavance, Emmanuelle; Coutan-Lapalus, Christelle

    2017-01-01

    The Code des baux 2018 offers extremely practical and reliable content, up to date as of 1 August 2017. This 16th edition notably incorporates: the decree of 27 July 2017 on the evolution of certain rents in the context of a new letting or a lease renewal, issued pursuant to Article 18 of Law No. 89-462 of 6 July 1989; the law of 27 January 2017 on equality and citizenship; the law of 9 December 2016 on transparency, the fight against corruption and the modernisation of economic life; and the law of 18 November 2016 on the modernisation of justice for the 21st century.

  5. 1994 Building energy codes and standards workshops: Summary and documentation

    Energy Technology Data Exchange (ETDEWEB)

    Sandahl, L.J.; Shankle, D.L.

    1994-09-01

    During the spring of 1994, Pacific Northwest Laboratory (PNL), on behalf of the U.S. Department of Energy (DOE) Office of Codes and Standards, conducted five two-day Regional Building Energy Codes and Standards workshops across the United States. Workshops were held in Chicago, Philadelphia, Atlanta, Dallas, and Denver. The workshops were designed to benefit state-level officials including staff of building code commissions, energy offices, public utility commissions, and others involved with adopting/updating, implementing, and enforcing state building codes in their states. The workshops provided an opportunity for state and other officials to learn more about the Energy Policy Act of 1992 (EPAct) requirements for residential and commercial building energy codes, the Climate Change Action Plan, the role of the U.S. Department of Energy and the Building Energy Standards Program at Pacific Northwest Laboratory, the commercial and residential codes and standards, the Home Energy Rating Systems (HERS), Energy Efficient Mortgages (EEM), training issues, and other topics related to the development, adoption, implementation, and enforcement of building energy codes. In addition to receiving information on the above topics, workshop participants were also encouraged to inform DOE of their needs, particularly with regard to implementing building energy codes, enhancing current implementation efforts, and building on training efforts already in place. This paper documents the workshop findings and workshop planning and follow-up processes.

  6. Using Quick Response Codes in the Classroom: Quality Outcomes.

    Science.gov (United States)

    Zurmehly, Joyce; Adams, Kellie

    2017-10-01

    With smart device technology emerging, educators are challenged with redesigning teaching strategies using technology to allow students to participate dynamically and provide immediate answers. To facilitate integration of technology and to actively engage students, quick response codes were included in a medical surgical lecture. Quick response codes are two-dimensional square patterns that enable the coding or storage of more than 7000 characters that can be accessed via a quick response code scanning application. The aim of this quasi-experimental study was to explore quick response code use in a lecture and measure students' satisfaction (met expectations, increased interest, helped understand, and provided practice and prompt feedback) and engagement (liked most, liked least, wanted changed, and kept involved), assessed using an investigator-developed instrument. Although there was no statistically significant correlation of quick response use to examination scores, satisfaction scores were high, and there was a small yet positive association between how students perceived their learning with quick response codes and overall examination scores. Furthermore, on open-ended survey questions, students responded that they were satisfied with the use of quick response codes, appreciated the immediate feedback, and planned to use them in the clinical setting. Quick response codes offer a way to integrate technology into the classroom to provide students with instant positive feedback.

  7. Entanglement-assisted quantum MDS codes from negacyclic codes

    Science.gov (United States)

    Lu, Liangdong; Li, Ruihu; Guo, Luobin; Ma, Yuena; Liu, Yang

    2018-03-01

    The entanglement-assisted formalism generalizes the standard stabilizer formalism, which can transform arbitrary classical linear codes into entanglement-assisted quantum error-correcting codes (EAQECCs) by using pre-shared entanglement between the sender and the receiver. In this work, we construct six classes of q-ary entanglement-assisted quantum MDS (EAQMDS) codes based on classical negacyclic MDS codes by exploiting two or more pre-shared maximally entangled states. We show that two of these six classes of q-ary EAQMDS codes have minimum distance larger than q+1. Most of these q-ary EAQMDS codes are new in the sense that their parameters are not covered by the codes available in the literature.

  8. High-Fidelity Coding with Correlated Neurons

    Science.gov (United States)

    da Silveira, Rava Azeredo; Berry, Michael J.

    2014-01-01

    Positive correlations in the activity of neurons are widely observed in the brain. Previous studies have shown these correlations to be detrimental to the fidelity of population codes, or at best marginally favorable compared to independent codes. Here, we show that positive correlations can enhance coding performance by astronomical factors. Specifically, the probability of discrimination error can be suppressed by many orders of magnitude. Likewise, the number of stimuli encoded—the capacity—can be enhanced more than tenfold. These effects do not necessitate unrealistic correlation values, and can occur for populations with a few tens of neurons. We further show that both effects benefit from heterogeneity commonly seen in population activity. Error suppression and capacity enhancement rest upon a pattern of correlation. Tuning of one or several effective parameters can yield a limit of perfect coding: the corresponding pattern of positive correlation leads to a ‘lock-in’ of response probabilities that eliminates variability in the subspace relevant for stimulus discrimination. We discuss the nature of this pattern and we suggest experimental tests to identify it. PMID:25412463

  9. Improvement of Parallel Algorithm for MATRA Code

    International Nuclear Information System (INIS)

    Kim, Seong-Jin; Seo, Kyong-Won; Kwon, Hyouk; Hwang, Dae-Hyun

    2014-01-01

    A feasibility study to parallelize the MATRA code was conducted at KAERI early this year. As a result, a parallel algorithm for the MATRA code has been developed to decrease the considerable computing time required to solve a large problem, such as a whole-core pin-by-pin problem of a general PWR reactor, and to improve the overall performance of multi-physics coupling calculations. It was shown that the performance of the MATRA code was greatly improved by implementing the parallel algorithm using MPI communication. For 1/8-core and whole-core problems for the SMART reactor, the speedup was evaluated as about 10 when 25 processors were used. However, it was also shown that the performance deteriorated as the axial node number increased. In this paper, the procedure of communication between processors is optimized to improve the previous parallel algorithm. To address the performance deterioration of the parallelized MATRA code, a new communication algorithm between processors is presented. It was shown that the speedup was improved and stable regardless of the axial node number
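
    The record above does not reproduce the revised communication scheme itself. Purely as an illustration of the kind of inter-processor exchange a parallelized subchannel code performs, the sketch below shows a generic halo (boundary-node) exchange with mpi4py; the array names, node counts and neighbour layout are assumptions, not MATRA's actual algorithm.

```python
# Hypothetical sketch of a 1-D domain decomposition with halo exchange,
# illustrating the type of MPI communication neighbouring processors must
# perform. Not MATRA code.
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_axial_local = 10                                       # axial nodes owned by this rank (assumed)
local_field = np.full(n_axial_local + 2, float(rank))    # +2 ghost cells at both ends

up = rank + 1 if rank + 1 < size else MPI.PROC_NULL
down = rank - 1 if rank - 1 >= 0 else MPI.PROC_NULL

# Exchange boundary values with the neighbours in both directions.
comm.Sendrecv(sendbuf=local_field[-2:-1], dest=up, recvbuf=local_field[0:1], source=down)
comm.Sendrecv(sendbuf=local_field[1:2], dest=down, recvbuf=local_field[-1:], source=up)
```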

  10. Diagonal Eigenvalue Unity (DEU) code for spectral amplitude coding-optical code division multiple access

    Science.gov (United States)

    Ahmed, Hassan Yousif; Nisar, K. S.

    2013-08-01

    Codes with ideal in-phase cross correlation (CC) and practical code length to support a high number of users are required in spectral amplitude coding-optical code division multiple access (SAC-OCDMA) systems. SAC systems are becoming more attractive in the field of OCDMA because of their ability to eliminate the influence of multiple access interference (MAI) and also suppress the effect of phase induced intensity noise (PIIN). In this paper, we propose new Diagonal Eigenvalue Unity (DEU) code families with ideal in-phase CC based on the Jordan block matrix, using simple algebraic methods. Four sets of DEU code families based on the code weight W and number of users N for the combinations (even, even), (even, odd), (odd, odd) and (odd, even) are constructed. This combination gives the DEU code more flexibility in the selection of code weight and number of users, and makes it a compelling candidate for future optical communication systems. Numerical results show that the proposed DEU system outperforms previously reported codes. In addition, simulation results from a commercial optical systems simulator, Virtual Photonic Instrument (VPI™), show that, using point-to-multipoint transmission in a passive optical network (PON), DEU has better performance and can support long spans at high data rates.

  11. Selection of Code and Interleaver for Turbo Coding

    DEFF Research Database (Denmark)

    Andersen, Jakob Dahl

    1998-01-01

    The selection of component codes for turbo coding has often been based on the performance at high SNR's. However, we will argue that the selection mainly should be based on the performance at low SNR's, i.e. the convergence properties. Further, we will present a way to construct interleavers ... that significantly improve the performance of the turbo coding scheme at high SNR's, i.e. lower the error floor ...

  12. 22 CFR 40.91 - Certain aliens previously removed.

    Science.gov (United States)

    2010-04-01

    22 Foreign Relations (2010-04-01 edition): Immigrants under the Immigration and Nationality Act, as Amended; Aliens Previously Removed. § 40.91 Certain aliens previously removed. (a) 5-year bar. An alien who has been found inadmissible, whether as a result...

  13. Derivation of the physical equations solved in the inertial confinement stability code DOC. Informal report

    International Nuclear Information System (INIS)

    Scannapieco, A.J.; Cranfill, C.W.

    1978-11-01

    There now exists an inertial confinement stability code called DOC, which runs as a postprocessor. DOC (a code that has evolved from a previous code, PANSY) is a spherical harmonic linear stability code that integrates, in time, a set of Lagrangian perturbation equations. Effects due to real equations of state, asymmetric energy deposition, thermal conduction, shock propagation, and a time-dependent zeroth-order state are handled in the code. We present here a detailed derivation of the physical equations that are solved in the code

  14. Multiple priming instances increase the impact of practice-based but not verbal code-based stimulus-response associations.

    Science.gov (United States)

    Pfeuffer, Christina U; Moutsopoulou, Karolina; Waszak, Florian; Kiesel, Andrea

    2017-05-13

    Stimulus-response (S-R) associations, the basis of learning and behavioral automaticity, are formed by the (repeated) co-occurrence of stimuli and responses and render stimuli able to automatically trigger associated responses. The strength and behavioral impact of these S-R associations increase with the number of priming instances (i.e., practice). Here we investigated whether multiple priming instances of a special form of instruction, verbal coding, also lead to the formation of stronger S-R associations in comparison to a single instance of priming. Participants either actively classified stimuli or passively attended to verbal codes denoting responses once or four times before S-R associations were probed. We found that whereas S-R associations formed on the basis of active task execution (i.e., practice) were strengthened by multiple priming instances, S-R associations formed on the basis of verbal codes (i.e., instruction) did not benefit from additional priming instances. These findings indicate a difference in the mechanisms underlying the encoding and/or retrieval of previously executed and verbally coded S-R associations. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Polynomial Batch Codes for Efficient IT-PIR

    Directory of Open Access Journals (Sweden)

    Henry Ryan

    2016-10-01

    Private information retrieval (PIR) is a way for clients to query a remote database without the database holder learning the clients' query terms or the responses they generate. Compelling applications for PIR abound in the cryptographic and privacy research literature, yet existing PIR techniques are notoriously inefficient. Consequently, no such PIR-based application to date has seen real-world at-scale deployment. This paper proposes new "batch coding" techniques to help address PIR's efficiency problem. The new techniques exploit the connection between ramp secret sharing schemes and efficient information-theoretically secure PIR (IT-PIR) protocols. This connection was previously observed by Henry, Huang, and Goldberg (NDSS 2013), who used ramp schemes to construct efficient "batch queries" with which clients can fetch several database records for the same cost as fetching a single record using a standard, non-batch query. The new techniques in this paper generalize and extend those of Henry et al. to construct "batch codes" with which clients can fetch several records for only a fraction of the cost of fetching a single record using a standard non-batch query over an unencoded database. The batch codes are highly tuneable, providing a means to trade off (i) lower server-side computation cost, (ii) lower server-side storage cost, and/or (iii) lower uni- or bi-directional communication cost, in exchange for a comparatively modest decrease in resilience to Byzantine database servers.

  16. The Impact of Codes of Conduct on Stakeholders

    Science.gov (United States)

    Newman, Wayne R.

    2015-01-01

    The purpose of this study was to determine how an urban school district's code of conduct aligned with actual school/class behaviors, and how stakeholders perceived the ability of this document to achieve its number one goal: safe and productive learning environments. Twenty participants including students, teachers, parents, and administrators…

  17. Maybe it's not Python that sucks, maybe it's my code

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Did you know that in Python integers from -5 to 257 are preallocated? Reusing them 1000 times, instead of allocating memory for a bigger integer, saves a whopping 1 millisecond of code's execution time! Isn't that thrilling? Well, before you get that crazy, learn some basic performance tricks that you can start using today.
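
    The cached-integer claim is easy to check for yourself; the snippet below is an illustration of that behaviour (CPython caches small integers, commonly -5 through 256), not material from the talk.

```python
# Illustration (not from the talk): equal small values share one
# preallocated object, while larger values are usually built anew.
x = 255
b = x + 1               # run-time addition; the result 256 comes from the cache
a = 256
print(a is b)           # True: 256 lives in the small-int cache

c = int("1000")
d = int("1000")
print(c == d, c is d)   # True False (typically): 1000 is allocated anew each time
```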

  18. RunJumpCode: An Educational Game for Educating Programming

    Science.gov (United States)

    Hinds, Matthew; Baghaei, Nilufar; Ragon, Pedrito; Lambert, Jonathon; Rajakaruna, Tharindu; Houghton, Travers; Dacey, Simon

    2017-01-01

    Programming promotes critical thinking, problem solving and analytic skills through creating solutions that can solve everyday problems. However, learning programming can be a daunting experience for a lot of students. "RunJumpCode" is an educational 2D platformer video game, designed and developed in Unity, to teach players the…

  19. Diglossia and Code Switching in Nigeria: Implications for English ...

    African Journals Online (AJOL)

    FIRST LADY

    implications of the diglossic situations in Nigeria for English language teaching and learning. Key words: Diglossia, bilinguals, multilinguals, code-switching. Introduction. The act of choosing a language or variety with which to communicate, at any given time, is a common feature of bilingual or multilingual societies. In such.

  20. Bit-Scalable Deep Hashing With Regularized Similarity Learning for Image Retrieval and Person Re-Identification.

    Science.gov (United States)

    Zhang, Ruimao; Lin, Liang; Zhang, Rui; Zuo, Wangmeng; Zhang, Lei

    2015-12-01

    Extracting informative image features and learning effective approximate hashing functions are two crucial steps in image retrieval. Conventional methods often study these two steps separately, e.g., learning hash functions from a predefined hand-crafted feature space. Meanwhile, the bit lengths of output hashing codes are preset in most previous methods, neglecting the significance level of different bits and restricting their practical flexibility. To address these issues, we propose a supervised learning framework to generate compact and bit-scalable hashing codes directly from raw images. We pose hashing learning as a problem of regularized similarity learning. In particular, we organize the training images into a batch of triplet samples, each sample containing two images with the same label and one with a different label. With these triplet samples, we maximize the margin between the matched pairs and the mismatched pairs in the Hamming space. In addition, a regularization term is introduced to enforce the adjacency consistency, i.e., images of similar appearances should have similar codes. The deep convolutional neural network is utilized to train the model in an end-to-end fashion, where discriminative image features and hash functions are simultaneously optimized. Furthermore, each bit of our hashing codes is unequally weighted, so that we can manipulate the code lengths by truncating the insignificant bits. Our framework outperforms state-of-the-art methods on public benchmarks of similar image search and also achieves promising results in the application of person re-identification in surveillance. It is also shown that the generated bit-scalable hashing codes well preserve the discriminative power with shorter code lengths.
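
    As a rough, framework-free illustration of the triplet-margin and bit-truncation ideas described in the abstract (a sketch on assumed toy data, not the authors' network or training code):

```python
import numpy as np

def triplet_margin_loss(h_a, h_p, h_n, margin=1.0):
    """Hinge loss on relaxed Hamming distances: the anchor/positive pair
    should be at least `margin` closer than the anchor/negative pair."""
    d_pos = np.sum((h_a - h_p) ** 2)
    d_neg = np.sum((h_a - h_n) ** 2)
    return max(0.0, margin + d_pos - d_neg)

def truncate_code(real_code, bit_weights, keep_bits):
    """Binarize a real-valued code and keep only the highest-weighted bits,
    giving a shorter, bit-scalable hash code."""
    order = np.argsort(bit_weights)[::-1][:keep_bits]
    return (real_code[order] > 0).astype(np.uint8)

# Toy usage with random 48-dimensional real-valued codes and bit weights.
rng = np.random.default_rng(0)
h_a, h_p, h_n = rng.standard_normal((3, 48))
print(triplet_margin_loss(h_a, h_p, h_n))
print(truncate_code(h_a, rng.random(48), keep_bits=16))
```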

  1. Determining root correspondence between previously and newly detected objects

    Science.gov (United States)

    Paglieroni, David W.; Beer, N Reginald

    2014-06-17

    A system that applies attribute and topology based change detection to networks of objects that were detected on previous scans of a structure, roadway, or area of interest. The attributes capture properties or characteristics of the previously detected objects, such as location, time of detection, size, elongation, orientation, etc. The topology of the network of previously detected objects is maintained in a constellation database that stores attributes of previously detected objects and implicitly captures the geometrical structure of the network. A change detection system detects change by comparing the attributes and topology of new objects detected on the latest scan to the constellation database of previously detected objects.
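
    As a toy illustration of attribute-based matching against a constellation of previously detected objects (the attribute names and tolerance below are hypothetical, not taken from the patent):

```python
import numpy as np

# Each object: (x, y, size) attribute vector. The "constellation database"
# is simply the array of previously detected objects in this sketch.
previous = np.array([[10.0, 5.0, 2.0],
                     [40.0, 7.5, 1.5],
                     [62.0, 3.2, 3.1]])
new = np.array([[10.2, 5.1, 2.1],      # same object, slight drift
                [80.0, 9.0, 2.5]])     # genuinely new object

def detect_changes(previous, new, tol=1.0):
    """Return indices of new objects with no previously detected object
    within `tol` (Euclidean distance over the attribute vector)."""
    changes = []
    for i, obj in enumerate(new):
        d = np.linalg.norm(previous - obj, axis=1)
        if d.min() > tol:
            changes.append(i)
    return changes

print(detect_changes(previous, new))   # -> [1]
```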

  2. A Case for Dynamic Reverse-code Generation

    DEFF Research Database (Denmark)

    Lee, Jooyong

    2007-01-01

    Backtracking (i.e. reverse execution) helps the user of a debugger to naturally think backwards along the execution path of a program, and thinking backwards makes it easy to locate the origin of a bug. So far backtracking has been implemented mostly by state saving or by checkpointing ... These implementations, however, inherently do not scale. As has often been said, the ultimate solution for backtracking is to use reverse code: executing the reverse code restores the previous states of a program. In our earlier work, we presented a method to generate reverse code on the fly while running a debugger ... This article presents a case study of dynamic reverse-code generation. We compare the memory usage of various backtracking methods in a simple but nontrivial example, a bounded-buffer program. In the case of non-deterministic programs such as this bounded-buffer program, our dynamic reverse-code generation can ...

  3. A Case for Increased Training in the Nemeth Code of Braille Mathematics for Teachers of Students Who Are Visually Impaired.

    Science.gov (United States)

    Kapperman, Gaylen; Sticken, Jodi

    2003-01-01

    This article discusses the lack of preparation teachers of students with visual impairments have in the Nemeth Code (the Braille code for mathematics). It then describes a Windows-based tutorial for sighted persons to learn the Nemeth Code, a tutorial for teachers with blindness, and how to access the tutorials. (Contains 10 references.) (CR)

  4. A Memory Reduction Approach for MPEG Video Coding

    OpenAIRE

    Kajiwara, Naoki; Moshnyaga, Vasily G.

    2003-01-01

    This paper presents an architectural enhancement for reducing the memory requirements of MPEG video coding based on incremental memory sharing between the reconstructed picture frames. The method exploits the temporal locality of block-based hybrid coding by dynamically replacing the processed macroblocks of the previously reconstructed picture frame with macroblocks of a newly reconstructed picture frame. Simulation results show that using this method we can reduce the total memory size by 13%...

  5. The flat's co-property in the new civil code

    OpenAIRE

    Velas, Michal

    2015-01-01

    This thesis analyzes two issues. In the first, more extensive and more significant part, the author focuses on residential co-ownership in the new Civil Code. In comparison with the previous legislation, the new one brings a slight shift in the understanding of this comprehensive legal institute. The new Civil Code uses the term residential co-ownership instead of flat ownership, thus emphasizing co-ownership of the house in addition to securing housing needs. Part of this thesis is dedicated to accessory co-ownership, as spec...

  6. Tail Biting Trellis Representation of Codes: Decoding and Construction

    Science.gov (United States)

    Shao, Rose Y.; Lin, Shu; Fossorier, Marc

    1999-01-01

    This paper presents two new iterative algorithms for decoding linear codes based on their tail biting trellises, one unidirectional and the other bidirectional. Both algorithms are computationally efficient and achieve virtually optimum error performance with a small number of decoding iterations. They outperform all previous suboptimal decoding algorithms. The bidirectional algorithm also reduces decoding delay. Also presented in the paper is a method for constructing tail biting trellises for linear block codes.

  7. Overlaid Alice: a statistical model computer code including fission and preequilibrium models. [FORTRAN, cross sections

    Energy Technology Data Exchange (ETDEWEB)

    Blann, M.

    1976-01-01

    This is the most recent edition of an evaporation code written previously and since then frequently updated and improved. This version replaces the Alice version described previously. A brief summary is given of the types of calculations which can be done. A listing of the code and the results of several sample calculations are presented. (JFP)

  8. Stochastic Decoding of Turbo Codes

    OpenAIRE

    Dong, Q. T.; Arzel, Matthieu; Jego, Christophe; Gross, W. J.

    2010-01-01

    Stochastic computation is a technique in which operations on probabilities are performed on random bit streams. Stochastic decoding of forward error-correction (FEC) codes is inspired by this technique. This paper extends the application of the stochastic decoding approach to the families of convolutional codes and turbo codes. It demonstrates that stochastic computation is a promising solution to improve the data throughput of turbo decoders with very simple implement...

  9. Evaluation of an electrocardiogram on QR code.

    Science.gov (United States)

    Nakayama, Masaharu; Shimokawa, Hiroaki

    2013-01-01

    An electrocardiogram (ECG) is an indispensable tool to diagnose cardiac diseases, such as ischemic heart disease, myocarditis, arrhythmia, and cardiomyopathy. Since ECG patterns vary depending on patient status, it is also used to monitor patients during treatment, and comparison with previous ECGs is important for accurate diagnosis. However, the comparison requires connection to an ECG data server in a hospital, and the availability of data connections among hospitals is limited. To improve the portability and availability of ECG data regardless of server connection, we here introduce conversion of ECG data into 2D barcodes as text data and decoding of the QR code for drawing the ECG with the Google Chart API. Fourteen cardiologists and six general physicians evaluated the system using an iPhone and an iPad. Overall, they were satisfied with the system in terms of usability and the accuracy of the decoded ECG compared to the original ECG. This new coding system may be useful for utilizing ECG data irrespective of server connections.
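
    The record gives no implementation details beyond "ECG as text inside a QR code". As a purely illustrative sketch, assuming the third-party qrcode package and a made-up comma-separated sample format (not the system described in the abstract):

```python
# Illustrative only: serialize a short ECG trace as text and store it in a
# QR code. The payload format and the use of the `qrcode` package are
# assumptions for this sketch.
import qrcode

ecg_samples = [0.00, 0.05, 0.12, 0.90, -0.30, 0.10, 0.04, 0.00]  # mV, hypothetical
payload = "ECG;fs=250;" + ",".join(f"{v:.2f}" for v in ecg_samples)

img = qrcode.make(payload)   # returns a PIL image of the 2D barcode
img.save("ecg_qr.png")

# A reader would scan the code, split the text on ';' and ',' and redraw
# the waveform (the paper used the Google Chart API for rendering).
```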

  10. Integrating Women into Previously All Male Air Force Units.

    Science.gov (United States)

    1980-05-31

    structure and anomie ." In Social Theory and Social Structure. Glencoe, IL: Free Press. Minnigerode, Fred A. 1976 "Attitudes toward women, sex role...learning and reinforcement theory where clear distinctions are made in terms of the quality and quantity of learning which take place in conditions...females. Thus, these young men become more willing to attribute life events to forces beyond their control and if Rotter’s theory is correct they also

  11. Temporal-pattern learning in neural models

    CERN Document Server

    Genís, Carme Torras

    1985-01-01

    While the ability of animals to learn rhythms is an unquestionable fact, the underlying neurophysiological mechanisms are still no more than conjectures. This monograph explores the requirements of such mechanisms, reviews those previously proposed and postulates a new one based on a direct electric coding of stimulation frequencies. Experimental support for the option taken is provided both at the single neuron and neural network levels. More specifically, the material presented divides naturally into four parts: a description of the experimental and theoretical framework where this work becomes meaningful (Chapter 2), a detailed specification of the pacemaker neuron model proposed together with its validation through simulation (Chapter 3), an analytic study of the behavior of this model when submitted to rhythmic stimulation (Chapter 4) and a description of the neural network model proposed for learning, together with an analysis of the simulation results obtained when varying several factors r...

  12. QR code for medical information uses.

    Science.gov (United States)

    Fontelo, Paul; Liu, Fang; Ducut, Erick G

    2008-11-06

    We developed QR code online tools, simulated and tested QR code applications for medical information uses including scanning QR code labels, URLs and authentication. Our results show possible applications for QR code in medicine.

  13. Ultrasound imaging using coded signals

    DEFF Research Database (Denmark)

    Misaridis, Athanasios

    coded excitation can be used for increasing the frame rate. The work includes both simulated results using Field II, and experimental results based on measurements on phantoms as well as clinical images. Initially a mathematical foundation of signal modulation is given. Pulse compression based ... is described. Application of coded excitation in array imaging is evaluated through simulations in Field II. The low degree of orthogonality among coded signals for ultrasound systems is first discussed, and the effect of mismatched filtering on the cross-correlation properties of the signals is evaluated ... emissions. Finally, a novel coding technique which uses pulse train excitation is presented ...
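
    As a generic illustration of the pulse-compression principle that coded excitation builds on (a sketch with assumed chirp parameters, not taken from the thesis), a linear FM chirp can be compressed with a matched filter:

```python
import numpy as np

fs = 40e6                          # sampling rate, Hz (assumed)
t = np.arange(0, 20e-6, 1 / fs)    # 20 microsecond excitation
f0, f1 = 3e6, 7e6                  # chirp start/stop frequencies (assumed)

# Linear FM (chirp) excitation and its matched filter (time-reversed copy).
chirp = np.sin(2 * np.pi * (f0 * t + 0.5 * (f1 - f0) / t[-1] * t**2))
matched = chirp[::-1]

# Echo from a single point scatterer buried in noise.
rng = np.random.default_rng(1)
echo = np.zeros(4000)
echo[1500:1500 + chirp.size] += chirp
echo += 0.5 * rng.standard_normal(echo.size)

compressed = np.convolve(echo, matched, mode="same")
print("peak index:", int(np.argmax(np.abs(compressed))))  # near the scatterer
```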

  14. Grassmann codes and Schubert unions

    DEFF Research Database (Denmark)

    Hansen, Johan Peder; Johnsen, Trygve; Ranestad, Kristian

    2009-01-01

    We study subsets of Grassmann varieties over a field , such that these subsets are unions of Schubert cycles, with respect to a fixed flag. We study such sets in detail, and give applications to coding theory, in particular for Grassmann codes. For much is known about such Schubert unions...... with a maximal number of -rational points for a given spanning dimension. We study the case and give a conjecture for general . We also define Schubert union codes in general, and study the parameters and support weights of these codes....

  15. High Order Modulation Protograph Codes

    Science.gov (United States)

    Nguyen, Thuy V. (Inventor); Nosratinia, Aria (Inventor); Divsalar, Dariush (Inventor)

    2014-01-01

    Digital communication coding methods for designing protograph-based bit-interleaved code modulation that is general and applies to any modulation. The general coding framework can support not only multiple rates but also adaptive modulation. The method is a two stage lifting approach. In the first stage, an original protograph is lifted to a slightly larger intermediate protograph. The intermediate protograph is then lifted via a circulant matrix to the expected codeword length to form a protograph-based low-density parity-check code.
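
    To illustrate what lifting "via a circulant matrix" means in general (a generic sketch with a toy base matrix and random shifts; not the patented two-stage construction):

```python
import numpy as np

def circulant_lift(base, Z, rng):
    """Expand a binary protograph `base` into a parity-check matrix by
    replacing each 1 with a Z x Z cyclically shifted identity block and
    each 0 with a Z x Z zero block (random shifts, parallel edges ignored)."""
    m, n = base.shape
    H = np.zeros((m * Z, n * Z), dtype=np.uint8)
    I = np.eye(Z, dtype=np.uint8)
    for i in range(m):
        for j in range(n):
            if base[i, j]:
                shift = rng.integers(Z)
                H[i*Z:(i+1)*Z, j*Z:(j+1)*Z] = np.roll(I, shift, axis=1)
    return H

base = np.array([[1, 1, 1, 0],
                 [0, 1, 1, 1]], dtype=np.uint8)   # toy protograph
H = circulant_lift(base, Z=8, rng=np.random.default_rng(0))
print(H.shape)   # (16, 32)
```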

  16. Coding, cryptography and combinatorics

    CERN Document Server

    Niederreiter, Harald; Xing, Chaoping

    2004-01-01

    It has long been recognized that there are fascinating connections between coding theory, cryptology, and combinatorics. Therefore it seemed desirable to us to organize a conference that brings together experts from these three areas for a fruitful exchange of ideas. We decided on a venue in the Huang Shan (Yellow Mountain) region, one of the most scenic areas of China, so as to provide the additional inducement of an attractive location. The conference was planned for June 2003 with the official title Workshop on Coding, Cryptography and Combinatorics (CCC 2003). Those who are familiar with events in East Asia in the first half of 2003 can guess what happened in the end, namely the conference had to be cancelled in the interest of the health of the participants. The SARS epidemic posed too serious a threat. At the time of the cancellation, the organization of the conference was at an advanced stage: all invited speakers had been selected and all abstracts of contributed talks had been screened by the p...

  17. Coded aperture tomography revisited

    International Nuclear Information System (INIS)

    Bizais, Y.; Rowe, R.W.; Zubal, I.G.; Bennett, G.W.; Brill, A.B.

    1983-01-01

    Coded aperture (CA) tomography never achieved widespread use in Nuclear Medicine, except for the degenerate case of Seven Pinhole Tomography (7PHT). However, it enjoys several attractive features (high sensitivity and tomographic ability with a static detector). On the other hand, resolution is usually poor, especially along the depth axis, and the reconstructed volume is rather limited. Arguments are presented justifying the position that CA tomography can be useful for imaging time-varying 3D structures, if its major drawbacks (poor longitudinal resolution and difficulty in quantification) are overcome. Poor results obtained with 7PHT can be explained by both a very limited angular range sampled and a crude modelling of the image formation process. Therefore improvements can be expected by the use of a dual-detector system, along with a better understanding of its sampling properties and the use of more powerful reconstruction algorithms. Non-overlapping multipinhole plates, because they do not involve a decoding procedure, should be considered first for practical applications. Use of real CA should be considered for cases in which non-overlapping multipinhole plates do not lead to satisfactory solutions. We have been and currently are carrying out theoretical and experimental work, in order to define the factors which limit CA imaging and to propose satisfactory solutions for Dynamic Emission Tomography

  18. Computer code abstract: NESTLE

    International Nuclear Information System (INIS)

    Turinsky, P.J.; Al-Chalabi, R.M.K.; Engrand, P.; Sarsour, H.N.; Faure, F.X.; Guo, W.

    1995-01-01

    NESTLE is a few-group neutron diffusion equation solver utilizing the nodal expansion method (NEM) for eigenvalue, adjoint, and fixed-source steady-state and transient problems. The NESTLE code solves eigenvalue (criticality), eigenvalue adjoint, external fixed-source steady-state, and external fixed-source or eigenvalue-initiated transient problems. The eigenvalue problem allows criticality searches to be completed, and the external fixed-source steady-state problem can search to achieve a specified power level. Transient problems model delayed neutrons via precursor groups. Several core properties can be input as time dependent. Two or four energy groups can be utilized, with all energy groups being thermal groups (i.e., upscatter exists) if desired. Core geometries modeled include Cartesian and hexagonal. Three-, two-, and one-dimensional models can be utilized with various symmetries. The thermal conditions predicted by the thermal-hydraulic model of the core are used to correct cross sections for temperature and density effects. Cross sections are parameterized by color, control rod state (i.e., in or out), and burnup, allowing fuel depletion to be modeled. Either a macroscopic or microscopic model may be employed

  19. Bounce-averaged Fokker-Planck code for stellarator transport

    International Nuclear Information System (INIS)

    Mynick, H.E.; Hitchon, W.N.G.

    1985-07-01

    A computer code for solving the bounce-averaged Fokker-Planck equation appropriate to stellarator transport has been developed, and its first applications made. The code is much faster than the bounce-averaged Monte-Carlo codes, which up to now have provided the most efficient numerical means for studying stellarator transport. Moreover, because the connection to analytic kinetic theory of the Fokker-Planck approach is more direct than for the Monte-Carlo approach, a comparison of theory and numerical experiment is now possible at a considerably more detailed level than previously

  20. Robust image transmission performed by SPIHT and turbo-codes

    Directory of Open Access Journals (Sweden)

    Lakhdar Moulay Abdelmounaim

    2008-01-01

    This work describes a method for providing robustness to errors from a binary symmetric channel for SPIHT image compression. The source rate and channel rate are jointly optimized by a stream of fixed-size channel packets. Punctured turbo codes are used for the channel coding, providing stronger error protection than previously available codes. We use the most appropriate set of puncturing patterns to ensure the best source rate. The presented rate allocation scheme obtains all necessary information from the SPIHT encoder, without requiring image decompression.

  1. Empirical Evaluation of Superposition Coded Multicasting for Scalable Video

    KAUST Repository

    Chun Pong Lau

    2013-03-01

    In this paper we investigate cross-layer superposition coded multicast (SCM). Previous studies have proven its effectiveness in exploiting better channel capacity and service granularities via both analytical and simulation approaches. However, it has never been practically implemented using a commercial 4G system. This paper demonstrates our prototype achieving SCM using a standard 802.16-based testbed for scalable video transmissions. In particular, to implement the superposition coded (SPC) modulation, we take advantage of a novel software approach, namely logical SPC (L-SPC), which aims to mimic physical-layer superposition coded modulation. The emulation results show improved throughput compared with the generic multicast method.

  2. Investigating the Simulink Auto-Coding Process

    Science.gov (United States)

    Gualdoni, Matthew J.

    2016-01-01

    Ramses model that should be further investigated. Several skills had to be built up over the course of the internship project. First and foremost, my Simulink skills have improved drastically, as much of my experience had been modeling electronic circuits as opposed to software models. Furthermore, I am now comfortable working with the Simulink Auto-coder, a tool I had never used until this summer; this tool also tested my critical thinking and C++ knowledge, as I had to interpret the C++ code it was generating and attempt to understand how the Simulink model affected the generated code. I had come into the internship with a solid understanding of Matlab code, but had done very little with using it to automate tasks, particularly Simulink tasks; along the same lines, I had rarely used shell scripts to automate and interface with programs, which I gained a fair amount of experience with this summer, including how to use regular expressions. Lastly, soft skills are an area everyone can continuously improve on; having never worked with NASA engineers, who to me seem to be a completely different breed than what I am used to (commercial electronics engineers), I learned to utilize the wealth of knowledge present at JSC. I wish I had come into the internship knowing exactly how helpful everyone in my branch would be, as I would have picked up on this sooner. I hope that having gained such a strong foundation in Simulink over this summer will open the opportunity to return to work on this project, or potentially other opportunities within the division. The idea of leaving a project I devoted ten weeks to is a hard one to cope with, so having the chance to pick up where I left off sounds appealing; alternatively, I am interested to see if there are any openings in the future that would allow me to work on a project that is more in line with my research in estimation algorithms. Regardless, this summer has been a milestone in my professional career, and I hope this has

  3. Turbo-Gallager Codes: The Emergence of an Intelligent Coding ...

    African Journals Online (AJOL)

    In 1948, C. Shannon developed fundamental limits on the efficiency of communication over noisy channels. However it is only in 1993 (about half a century later) that Berrou, Glavieux and Thitimajshima developed turbo codes and demonstrated performance close to that limit. Overnight, much of the algebraic coding ...

  4. Morse code recognition system with fuzzy algorithm for disabled persons.

    Science.gov (United States)

    Wu, C-M; Luo, C-H

    2002-01-01

    It is generally known that Morse code is an efficient input method for one or two switches; it is made from long and short sounds separated by silence between the sounds. The long-to-short ratio in the definition is always 3 to 1, but the long-to-short ratio variation for a disabled person is so large that it is difficult to recognize. In the last few years, several Morse code recognition methods have been successfully built on LMS adaptive algorithms and neural network algorithms. But LMS-related adaptive algorithms need massive computation to infer the characteristics of the controller; also, the neural network must learn first, by inputting some data before it is used to recognize the Morse code sequence. In this study, two fuzzy algorithms are used to recognize the unstable Morse code sequences, and the result demonstrates a significant improvement in recognition for real-time signal processing in a single-chip microprocessor.
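
    The abstract does not specify the membership functions used. The toy sketch below (assumed triangular memberships over the ratio of element duration to a running dot-length estimate) shows how a fuzzy rule can classify an erratic keyed element as a dot or a dash:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def classify_element(duration, dot_ref):
    """Fuzzy dot/dash decision for one keyed element.
    `dot_ref` is a running estimate of the user's dot length."""
    r = duration / dot_ref
    mu_dot = tri(r, 0.0, 1.0, 2.5)    # ideal dot ratio ~1 (assumed shape)
    mu_dash = tri(r, 1.5, 3.0, 6.0)   # ideal dash ratio ~3 (assumed shape)
    return "dot" if mu_dot >= mu_dash else "dash"

# An erratic user whose "dashes" are only about twice the dot length.
for d in [0.09, 0.11, 0.22, 0.31]:
    print(d, classify_element(d, dot_ref=0.10))   # dot, dot, dash, dash
```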

  5. High Order Tensor Formulation for Convolutional Sparse Coding

    KAUST Repository

    Bibi, Adel Aamer

    2017-12-25

    Convolutional sparse coding (CSC) has gained attention for its successful role as a reconstruction and a classification tool in the computer vision and machine learning community. Current CSC methods can only reconstruct single-feature 2D images independently. However, learning multidimensional dictionaries and sparse codes for the reconstruction of multi-dimensional data is very important, as it examines correlations among all the data jointly. This provides more capacity for the learned dictionaries to better reconstruct data. In this paper, we propose a generic and novel formulation for the CSC problem that can handle an arbitrary order tensor of data. Backed with experimental results, our proposed formulation can not only tackle applications that are not possible with standard CSC solvers, including colored video reconstruction (5D tensors), but it also performs favorably in reconstruction with far fewer parameters as compared to naive extensions of standard CSC to multiple features/channels.
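
    For readers unfamiliar with CSC, the standard single-image objective that such formulations generalize (written here in its usual textbook form, not copied from the paper) is

```latex
\min_{\{d_k\},\{z_k\}}\; \frac{1}{2}\Big\lVert x - \sum_{k=1}^{K} d_k * z_k \Big\rVert_2^2
  + \lambda \sum_{k=1}^{K} \lVert z_k \rVert_1
  \quad \text{s.t.}\quad \lVert d_k \rVert_2^2 \le 1 \;\; \forall k,
```

    where x is the image, d_k are the convolutional dictionary filters, z_k the sparse coefficient maps and * denotes convolution; the paper's contribution is to pose such a problem for arbitrary-order tensors rather than single-feature 2D images.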

  6. Probable relationship between partitions of the set of codons and the origin of the genetic code.

    Science.gov (United States)

    Salinas, Dino G; Gallardo, Mauricio O; Osorio, Manuel I

    2014-03-01

    Here we study the distribution of randomly generated partitions of the set of amino acid-coding codons. Some results are an application of previous work on the Stirling numbers of the second kind and triplet codes, both to the case of triplet codes having four stop codons, as in the mammalian mitochondrial genetic code, and to hypothetical doublet codes. Extending previous results, in this work it is found that the most probable number of blocks of synonymous codons in a genetic code is similar to the number of amino acids when there are four stop codons, as it could also be for a primigenious doublet code. We also study the integer partitions associated with patterns of synonymous codons and show, for the canonical code, that the standard deviation inside an integer partition is one of the most probable. We think that, in some early epoch, the genetic code might have had a maximum of disorder or entropy, independent of the assignment between codons and amino acids, reaching a state similar to the "code freeze" proposed by Francis Crick. In later stages, deterministic rules may have reassigned codons to amino acids, forming the natural codes, such as the canonical code, but keeping the numerical features describing the set partitions and the integer partitions, like "fossil numbers"; both kinds of partitions concern the set of amino acid-coding codons. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
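
    As a small illustration of the kind of computation involved (a sketch of the standard recurrence, not the authors' code), the Stirling numbers of the second kind give the distribution of the number of blocks in a uniformly random set partition; with four stop codons there are 60 amino-acid-coding codons to partition:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def stirling2(n, k):
    """Stirling number of the second kind S(n, k): number of ways to
    partition an n-element set into k non-empty blocks."""
    if n == k:
        return 1
    if k == 0 or k > n:
        return 0
    return k * stirling2(n - 1, k) + stirling2(n - 1, k - 1)

n = 60                                    # sense codons when 4 codons are stops
bell = sum(stirling2(n, k) for k in range(1, n + 1))
probs = {k: stirling2(n, k) / bell for k in range(1, n + 1)}
k_star = max(probs, key=probs.get)
print("most probable number of synonymous blocks:", k_star)
```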

  7. Allele coding in genomic evaluation

    Directory of Open Access Journals (Sweden)

    Christensen Ole F

    2011-06-01

    Abstract Background Genomic data are used in animal breeding to assist genetic evaluation. Several models to estimate genomic breeding values have been studied. In general, two approaches have been used. One approach estimates the marker effects first and then, genomic breeding values are obtained by summing marker effects. In the second approach, genomic breeding values are estimated directly using an equivalent model with a genomic relationship matrix. Allele coding is the method chosen to assign values to the regression coefficients in the statistical model. A common allele coding is zero for the homozygous genotype of the first allele, one for the heterozygote, and two for the homozygous genotype for the other allele. Another common allele coding changes these regression coefficients by subtracting a value from each marker such that the mean of regression coefficients is zero within each marker. We call this centered allele coding. This study considered effects of different allele coding methods on inference. Both marker-based and equivalent models were considered, and restricted maximum likelihood and Bayesian methods were used in inference. Results Theoretical derivations showed that parameter estimates and estimated marker effects in marker-based models are the same irrespective of the allele coding, provided that the model has a fixed general mean. For the equivalent models, the same results hold, even though different allele coding methods lead to different genomic relationship matrices. Calculated genomic breeding values are independent of allele coding when the estimate of the general mean is included into the values. Reliabilities of estimated genomic breeding values calculated using elements of the inverse of the coefficient matrix depend on the allele coding because different allele coding methods imply different models. Finally, allele coding affects the mixing of Markov chain Monte Carlo algorithms, with the centered coding being
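
    A minimal numerical sketch of the two codings discussed (an illustration using VanRaden-style centering on made-up genotypes, not the paper's code):

```python
import numpy as np

# Genotypes for 4 animals x 3 markers, coded as copies of the second allele.
M = np.array([[0, 1, 2],
              [1, 1, 0],
              [2, 0, 1],
              [2, 2, 1]], dtype=float)      # "0/1/2" allele coding

p = M.mean(axis=0) / 2.0                    # allele frequencies per marker
Z = M - 2.0 * p                             # centered allele coding

# Genomic relationship matrix (VanRaden-style) from the centered coding.
G = Z @ Z.T / (2.0 * np.sum(p * (1.0 - p)))
print(np.round(G, 2))
```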

  8. MARS Code in Linux Environment

    International Nuclear Information System (INIS)

    Hwang, Moon Kyu; Bae, Sung Won; Jung, Jae Joon; Chung, Bub Dong

    2005-01-01

    The two-phase system analysis code MARS has been incorporated into Linux system. The MARS code was originally developed based on the RELAP5/MOD3.2 and COBRA-TF. The 1-D module which evolved from RELAP5 alone could be applied for the whole NSSS system analysis. The 3-D module developed based on the COBRA-TF, however, could be applied for the analysis of the reactor core region where 3-D phenomena would be better treated. The MARS code also has several other code units that could be incorporated for more detailed analysis. The separate code units include containment analysis modules and 3-D kinetics module. These code modules could be optionally invoked to be coupled with the main MARS code. The containment code modules (CONTAIN and CONTEMPT), for example, could be utilized for the analysis of the plant containment phenomena in a coupled manner with the nuclear reactor system. The mass and energy interaction during the hypothetical coolant leakage accident could, thereby, be analyzed in a more realistic manner. In a similar way, 3-D kinetics could be incorporated for simulating the three dimensional reactor kinetic behavior, instead of using the built-in point kinetics model. The MARS code system, developed initially for the MS Windows environment, however, would not be adequate enough for the PC cluster system where multiple CPUs are available. When parallelism is to be eventually incorporated into the MARS code, MS Windows environment is not considered as an optimum platform. Linux environment, on the other hand, is generally being adopted as a preferred platform for the multiple codes executions as well as for the parallel application. In this study, MARS code has been modified for the adaptation of Linux platform. For the initial code modification, the Windows system specific features have been removed from the code. Since the coupling code module CONTAIN is originally in a form of dynamic load library (DLL) in the Windows system, a similar adaptation method

  9. Civil Code, 11 December 1987.

    Science.gov (United States)

    1988-01-01

    Article 162 of this Mexican Code provides, among other things, that "Every person has the right freely, responsibly, and in an informed fashion to determine the number and spacing of his or her children." When a marriage is involved, this right is to be observed by the spouses "in agreement with each other." The civil codes of the following states contain the same provisions: 1) Baja California (Art. 159 of the Civil Code of 28 April 1972 as revised in Decree No. 167 of 31 January 1974); 2) Morelos (Art. 255 of the Civil Code of 26 September 1949 as revised in Decree No. 135 of 29 December 1981); 3) Queretaro (Art. 162 of the Civil Code of 29 December 1950 as revised in the Act of 9 January 1981); 4) San Luis Potosi (Art. 147 of the Civil Code of 24 March 1946 as revised in 13 June 1978); Sinaloa (Art. 162 of the Civil Code of 18 June 1940 as revised in Decree No. 28 of 14 October 1975); 5) Tamaulipas (Art. 146 of the Civil Code of 21 November 1960 as revised in Decree No. 20 of 30 April 1975); 6) Veracruz-Llave (Art. 98 of the Civil Code of 1 September 1932 as revised in the Act of 30 December 1975); and 7) Zacatecas (Art. 253 of the Civil Code of 9 February 1965 as revised in Decree No. 104 of 13 August 1975). The Civil Codes of Puebla and Tlaxcala provide for this right only in the context of marriage with the spouses in agreement. See Art. 317 of the Civil Code of Puebla of 15 April 1985 and Article 52 of the Civil Code of Tlaxcala of 31 August 1976 as revised in Decree No. 23 of 2 April 1984. The Family Code of Hidalgo requires as a formality of marriage a certification that the spouses are aware of methods of controlling fertility, responsible parenthood, and family planning. In addition, Article 22 the Civil Code of the Federal District provides that the legal capacity of natural persons is acquired at birth and lost at death; however, from the moment of conception the individual comes under the protection of the law, which is valid with respect to the

  10. MARS Code in Linux Environment

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Moon Kyu; Bae, Sung Won; Jung, Jae Joon; Chung, Bub Dong [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    2005-07-01

    The two-phase system analysis code MARS has been incorporated into Linux system. The MARS code was originally developed based on the RELAP5/MOD3.2 and COBRA-TF. The 1-D module which evolved from RELAP5 alone could be applied for the whole NSSS system analysis. The 3-D module developed based on the COBRA-TF, however, could be applied for the analysis of the reactor core region where 3-D phenomena would be better treated. The MARS code also has several other code units that could be incorporated for more detailed analysis. The separate code units include containment analysis modules and 3-D kinetics module. These code modules could be optionally invoked to be coupled with the main MARS code. The containment code modules (CONTAIN and CONTEMPT), for example, could be utilized for the analysis of the plant containment phenomena in a coupled manner with the nuclear reactor system. The mass and energy interaction during the hypothetical coolant leakage accident could, thereby, be analyzed in a more realistic manner. In a similar way, 3-D kinetics could be incorporated for simulating the three dimensional reactor kinetic behavior, instead of using the built-in point kinetics model. The MARS code system, developed initially for the MS Windows environment, however, would not be adequate enough for the PC cluster system where multiple CPUs are available. When parallelism is to be eventually incorporated into the MARS code, MS Windows environment is not considered as an optimum platform. Linux environment, on the other hand, is generally being adopted as a preferred platform for the multiple codes executions as well as for the parallel application. In this study, MARS code has been modified for the adaptation of Linux platform. For the initial code modification, the Windows system specific features have been removed from the code. Since the coupling code module CONTAIN is originally in a form of dynamic load library (DLL) in the Windows system, a similar adaptation method

  11. PSNR Improvement Using Different Prediction Coding

    Directory of Open Access Journals (Sweden)

    Khamies Khalaf Hasan

    2015-02-01

    Differential Pulse Code Modulation (DPCM) is one of the predictive coding techniques. The number of previous pixels employed in the estimate operation is referred to as the order of the predictor. A predictor using one pixel for estimation is called a "first order predictor", a "second order predictor" utilizes two pixels, and an "nth order predictor" employs n previous pixels. In this work, the prediction mean square error (MSE) was tested using different numbers of previous picture elements. The results show that the MSE decreases significantly when using up to three pixels, and further decreases of the MSE are rather small when using more than three pixels; the performance improvement becomes negligible, and only a marginal gain beyond a third-order predictor can be achieved. Accordingly, the Peak Signal to Noise Ratio (PSNR) increases significantly with increasing predictor order, but the improvement becomes negligible beyond a third-order predictor.
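
    To make the order-versus-MSE statement concrete, the sketch below fits least-squares predictors of increasing order to a synthetic scan line with third-order memory (assumed data, not the images used in the paper) and prints the residual MSE:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic scan line with third-order memory (an AR(3) process, assumed data).
n_samples, phi = 5000, (0.75, 0.15, 0.05)
x = np.zeros(n_samples)
for t in range(3, n_samples):
    x[t] = phi[0]*x[t-1] + phi[1]*x[t-2] + phi[2]*x[t-3] + rng.standard_normal()

def prediction_mse(x, order):
    """MSE of a least-squares linear predictor using `order` previous samples."""
    X = np.column_stack([x[order - k - 1: len(x) - k - 1] for k in range(order)])
    y = x[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(np.mean((y - X @ coeffs) ** 2))

for n in range(1, 6):
    print(f"order {n}: prediction MSE = {prediction_mse(x, n):.3f}")
```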

  12. Computational Approaches Reveal New Insights into Regulation and Function of Non; coding RNAs and their Targets

    KAUST Repository

    Alam, Tanvir

    2016-11-28

    Regulation and function of protein-coding genes are increasingly well understood, but no comparable evidence exists for non-coding RNA (ncRNA) genes, which appear to be more numerous than protein-coding genes. We developed a novel machine-learning model to distinguish promoters of long ncRNA (lncRNA) genes from those of protein-coding genes. This represents the first attempt to make this distinction based on properties of the associated gene promoters. From our analyses, several transcription factors (TFs), which are known to be regulated by lncRNAs, also emerged as potential global regulators of lncRNAs, suggesting that lncRNAs and TFs may participate in a bidirectional feedback regulatory network. Our results also raise the possibility that, due to the historical dependence on protein-coding genes in defining the chromatin states of active promoters, an adjustment of these chromatin signature profiles to incorporate lncRNAs is warranted in the future. Secondly, we developed a novel method to infer functions for lncRNA and microRNA (miRNA) transcripts based on their transcriptional regulatory networks in 119 tissues and 177 primary cells of human. This method for the first time combines information on cell/tissue-specific expression of a transcript and the TFs and transcription co-factors (TcoFs) that control activation of that transcript. Transcripts were annotated using statistically enriched GO terms, pathways and diseases across cells/tissues, and an associated knowledgebase (FARNA) was developed. FARNA, having the most comprehensive function annotation of the considered ncRNAs across the widest spectrum of cells/tissues, has the potential to contribute to our understanding of ncRNA roles and their regulatory mechanisms in human. Thirdly, we developed a novel machine-learning model to identify the LD motif (a protein interaction motif) of paxillin, an ncRNA target that is involved in cell motility and cancer metastasis. Our recognition model identified new proteins not

  13. Learning during Processing: Word Learning Doesn't Wait for Word Recognition to Finish

    Science.gov (United States)

    Apfelbaum, Keith S.; McMurray, Bob

    2017-01-01

    Previous research on associative learning has uncovered detailed aspects of the process, including what types of things are learned, how they are learned, and where in the brain such learning occurs. However, perceptual processes, such as stimulus recognition and identification, take time to unfold. Previous studies of learning have not addressed…

  14. Coding in pigeons: Multiple-coding versus single-code/default strategies.

    Science.gov (United States)

    Pinto, Carlos; Machado, Armando

    2015-05-01

    To investigate the coding strategies that pigeons may use in a temporal discrimination task, pigeons were trained on a matching-to-sample procedure with three sample durations (2s, 6s and 18s) and two comparisons (red and green hues). One comparison was correct following 2-s samples and the other was correct following both 6-s and 18-s samples. Tests were then run to contrast the predictions of two hypotheses concerning the pigeons' coding strategies, the multiple-coding and the single-code/default. According to the multiple-coding hypothesis, three response rules are acquired, one for each sample. According to the single-code/default hypothesis, only two response rules are acquired, one for the 2-s sample and a "default" rule for any other duration. In retention interval tests, pigeons preferred the "default" key, a result predicted by the single-code/default hypothesis. In no-sample tests, pigeons preferred the key associated with the 2-s sample, a result predicted by multiple-coding. Finally, in generalization tests, when the sample duration equaled 3.5s, the geometric mean of 2s and 6s, pigeons preferred the key associated with the 6-s and 18-s samples, a result predicted by the single-code/default hypothesis. The pattern of results suggests the need for models that take into account multiple sources of stimulus control. © Society for the Experimental Analysis of Behavior.

  15. Cyclic codes of length 2

    Indian Academy of Sciences (India)


    [X]/⟨X^(2^m) − 1⟩ are given. Cyclic codes of length 2^m over the finite field F_q, of odd characteristic, are defined in terms of their generator polynomials. The exact minimum distance and the dimension of the codes are obtained. Keywords.

  16. Code breaking in the pacific

    CERN Document Server

    Donovan, Peter

    2014-01-01

    Covers the historical context and the evolution of the technically complex Allied Signals Intelligence (Sigint) activity against Japan from 1920 to 1945 Describes, explains and analyzes the code breaking techniques developed during the war in the Pacific Exposes the blunders (in code construction and use) made by the Japanese Navy that led to significant US Naval victories

  17. NETWORK CODING BY BEAM FORMING

    DEFF Research Database (Denmark)

    2013-01-01

    Network coding by beam forming in networks, for example in single frequency networks, can help increase spectral efficiency. When network coding by beam forming and user cooperation are combined, spectral efficiency gains may be achieved. According to certain embodiments, a method...

  18. On Network Coded Filesystem Shim

    DEFF Research Database (Denmark)

    Sørensen, Chres Wiant; Roetter, Daniel Enrique Lucani; Médard, Muriel

    2017-01-01

    benefits to any application in a computer. However, incorporating new protocols into the Internet is a challenging and slow process. Second, coding can be deployed at the application layer, which forces each application to implement network coding. This paper proposes an alternative approach through the use...

  19. LFSC - Linac Feedback Simulation Code

    Energy Technology Data Exchange (ETDEWEB)

    Ivanov, Valentin; /Fermilab

    2008-05-01

    The computer program LFSC (Linac Feedback Simulation Code) is a numerical tool for simulating beam-based feedback in high performance linacs. The code LFSC is based on an earlier version developed by a collective of authors at SLAC (L. Hendrickson, R. McEwen, T. Himel, H. Shoaee, S. Shah, P. Emma, P. Schultz) during 1990-2005. That code was successively used in simulations of the SLC, TESLA, CLIC and NLC projects. It can simulate pulse-to-pulse feedback on a timescale corresponding to 5-100 Hz, as well as slower feedbacks operating in the 0.1-1 Hz range, in the Main Linac and Beam Delivery System. The code LFSC runs under Matlab for the MS Windows operating system. It contains about 30,000 lines of source code in more than 260 subroutines. The code uses LIAR ('Linear Accelerator Research code') for particle tracking under ground motion and technical noise perturbations. It uses the Guinea Pig code to simulate the luminosity performance. A set of input files includes the lattice description (XSIF format), and plain text files with numerical parameters, wake fields, ground motion data etc. The Matlab environment provides a flexible system for graphical output.

  20. The reactor dynamic code RETRANS

    International Nuclear Information System (INIS)

    Kamelander, G.; Woloch, F.; Sdouz, G.; Koinig, H.

    1984-08-01

    This report gives a general view on the reactor dynamic code RETRANS. The subroutines and common-blocks are described in detail to facilitate code-modifications and improvements of physical models or numerical methods. Furthermore the report contains an users guide. Finally some test examples are given. The physical meaning of the results is discussed. (Author) [de