WorldWideScience

Sample records for code transformation task

  1. Performance measures for transform data coding.

    Science.gov (United States)

    Pearl, J.; Andrews, H. C.; Pratt, W. K.

    1972-01-01

    This paper develops performance criteria for evaluating transform data coding schemes under computational constraints. Computational constraints that conform with the proposed basis-restricted model give rise to suboptimal coding efficiency characterized by a rate-distortion relation R(D) similar in form to the theoretical rate-distortion function. Numerical examples of this performance measure are presented for Fourier, Walsh, Haar, and Karhunen-Loeve transforms.
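
    To make the comparison concrete, here is a minimal numpy sketch (not from the paper) that measures how well different bases compact the energy of a first-order Gauss-Markov source via the classical transform coding gain; the block size, correlation coefficient, and the omission of the Haar basis are illustrative choices only.

        import numpy as np
        from scipy.linalg import hadamard

        N = 8
        rho = 0.95
        # Covariance of a first-order Gauss-Markov (AR(1)) source, a common image model.
        R = rho ** np.abs(np.subtract.outer(np.arange(N), np.arange(N)))

        def coding_gain(T, R):
            """Transform coding gain: ratio of arithmetic to geometric mean
            of the transform-coefficient variances (diagonal of T R T^H)."""
            var = np.real(np.diag(T @ R @ T.conj().T))
            return var.mean() / np.exp(np.log(var).mean())

        # Unitary DFT, Walsh-Hadamard, and KLT (eigenvectors of R) bases.
        F = np.fft.fft(np.eye(N)) / np.sqrt(N)
        H = hadamard(N) / np.sqrt(N)
        klt = np.linalg.eigh(R)[1].T

        for name, T in [("Fourier", F), ("Walsh", H), ("KLT", klt)]:
            print(f"{name:8s} gain = {coding_gain(T, R):.3f}")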

  2. Transition from Target to Gaze Coding in Primate Frontal Eye Field during Memory Delay and Memory–Motor Transformation

    Science.gov (United States)

    Sajad, Amirsaman; Sadeh, Morteza; Yan, Xiaogang; Wang, Hongying

    2016-01-01

    The frontal eye fields (FEFs) participate in both working memory and sensorimotor transformations for saccades, but their role in integrating these functions through time remains unclear. Here, we tracked FEF spatial codes through time using a novel analytic method applied to the classic memory-delay saccade task. Three-dimensional recordings of head-unrestrained gaze shifts were made in two monkeys trained to make gaze shifts toward briefly flashed targets after a variable delay (450-1500 ms). A preliminary analysis of visual and motor response fields in 74 FEF neurons eliminated most potential models for spatial coding at the neuron population level, as in our previous study (Sajad et al., 2015). We then focused on the spatiotemporal transition from an eye-centered target code (T; preferred in the visual response) to an eye-centered intended gaze position code (G; preferred in the movement response) during the memory delay interval. We treated neural population codes as a continuous spatiotemporal variable by dividing the space spanning T and G into intermediate T–G models and dividing the task into discrete steps through time. We found that FEF delay activity, especially in visuomovement cells, progressively transitions from T through intermediate T–G codes that approach, but do not reach, G. This was followed by a final discrete transition from these intermediate T–G delay codes to a “pure” G code in movement cells without delay activity. These results demonstrate that FEF activity undergoes a series of sensory–memory–motor transformations, including a dynamically evolving spatial memory signal and an imperfect memory-to-motor transformation. PMID:27092335

  3. Motion-adaptive intraframe transform coding of video signals

    NARCIS (Netherlands)

    With, de P.H.N.

    1989-01-01

    Spatial transform coding has been widely applied for image compression because of its high coding efficiency. However, in many intraframe systems, in which every TV frame is independently processed, coding of moving objects in the case of interlaced input signals is not addressed. In this paper, we

  4. Human Motion Capture Data Tailored Transform Coding.

    Science.gov (United States)

    Hou, Junhui; Chau, Lap-Pui; Magnenat-Thalmann, Nadia; He, Ying

    2015-07-01

    Human motion capture (mocap) is a widely used technique for digitalizing human movements. With growing usage, compressing mocap data has received increasing attention, since compact data size enables efficient storage and transmission. Our analysis shows that mocap data have some unique characteristics that distinguish them from images and videos. Therefore, directly borrowing image or video compression techniques, such as the discrete cosine transform, does not work well. In this paper, we propose a novel mocap-tailored transform coding algorithm that takes advantage of these features. Our algorithm segments the input mocap sequences into clips, which are represented as 2D matrices. Then it computes a set of data-dependent orthogonal bases to transform the matrices to the frequency domain, in which the transform coefficients have significantly less dependency. Finally, the compression is obtained by entropy coding of the quantized coefficients and the bases. Our method has low computational cost and can be easily extended to compress mocap databases. It also requires neither training nor complicated parameter setting. Experimental results demonstrate that the proposed scheme significantly outperforms state-of-the-art algorithms in terms of compression performance and speed.
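
    A schematic numpy sketch of the pipeline described above — clip matrix, data-dependent orthogonal basis, quantized coefficients — can make the idea tangible; the clip shape, quantization step, and the use of a plain SVD are placeholder assumptions, not the authors' exact construction.

        import numpy as np

        def encode_clip(clip, q_step=0.05):
            """clip: frames x joint-coordinates matrix for one mocap segment.
            Compute a data-dependent orthogonal basis (left singular vectors)
            and quantize the transform coefficients."""
            U, s, Vt = np.linalg.svd(clip, full_matrices=False)
            coeff = U.T @ clip                    # decorrelated coefficients
            q = np.round(coeff / q_step).astype(np.int32)
            return U, q                           # basis + coefficients would be entropy coded

        def decode_clip(U, q, q_step=0.05):
            return U @ (q * q_step)

        # toy usage: a smooth synthetic "motion" of 100 frames x 30 channels
        t = np.linspace(0, 2 * np.pi, 100)
        clip = np.stack([np.sin(k * t) for k in range(1, 31)], axis=1)
        U, q = encode_clip(clip)
        print("max reconstruction error:", np.abs(decode_clip(U, q) - clip).max())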

  5. A progressive data compression scheme based upon adaptive transform coding: Mixture block coding of natural images

    Science.gov (United States)

    Rost, Martin C.; Sayood, Khalid

    1991-01-01

    A method for efficiently coding natural images using a vector-quantized variable-blocksize transform source coder is presented. The method, mixture block coding (MBC), incorporates variable-rate coding by using a mixture of discrete cosine transform (DCT) source coders. The choice of coder for any given image region is made through a threshold-driven distortion criterion. In this paper, MBC is used in two different applications. The base method is concerned with single-pass low-rate image data compression. The second is a natural extension of the base method which allows for low-rate progressive transmission (PT). Since the base method adapts easily to progressive coding, it offers the aesthetic advantage of progressive coding without incorporating extensive channel overhead. Image compression rates of approximately 0.5 bit/pel are demonstrated for both monochrome and color images.

  6. Effect of task demands on dual coding of pictorial stimuli.

    Science.gov (United States)

    Babbitt, B C

    1982-01-01

    Recent studies have suggested that verbal labeling of a picture does not occur automatically. Although several experiments using paired-associate tasks produced little evidence indicating the use of a verbal code with picture stimuli, the tasks were probably not sensitive to whether the codes were activated initially. It is possible that verbal labels were activated at input, but not used later in performing the tasks. The present experiment used a color-naming interference task in order to assess, with a more sensitive measure, the amount of verbal coding occurring in response to word or picture input. Subjects named the color of ink in which words were printed following either word or picture input. If verbal labeling of the input occurs, then latency of color naming should increase when the input item and color-naming word are related. The results provided substantial evidence of such verbal activation when the input items were words. However, the presence of verbal activation with picture input was a function of task demands. Activation occurred when a recall memory test was used, but not when a recognition memory test was used. The results support the conclusion that name information (labels) need not be activated during presentation of visual stimuli.

  7. High-radix transforms for Reed-Solomon codes over Fermat primes

    Science.gov (United States)

    Liu, K. Y.; Reed, I. S.; Truong, T. K.

    1977-01-01

    A method is proposed to streamline the transform decoding algorithm for Reed-Solomon (RS) codes of length equal to 2^(2^n). It is shown that a high-radix fast Fourier transform (FFT) type algorithm with generator equal to 3 over GF(F_n), where F_n is a Fermat prime, can be used to decode RS codes of this length. For a 256-symbol RS code, a radix-4 and a radix-16 FFT over GF(F_3) require, respectively, 30% and 70% fewer modulo-F_n multiplications than the usual radix-2 FFT.
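
    The transform in question is a Fourier-like (number-theoretic) transform over GF(F_n). A small Python sketch over GF(F_3) = GF(257) with generator 3, written in plain O(n^2) form for clarity, shows the mechanics; the radix-4/radix-16 factorizations that yield the quoted savings are not reproduced here.

        P = 257                       # F_3 = 2**(2**3) + 1, a Fermat prime
        G = 3                         # primitive element of GF(257), order 256

        def ntt(a, root):
            """Fourier-like transform over GF(257): A_k = sum_j a_j * root**(j*k)."""
            n = len(a)
            return [sum(a[j] * pow(root, j * k, P) for j in range(n)) % P
                    for k in range(n)]

        n = 256                       # transform length divides P - 1
        w = pow(G, (P - 1) // n, P)   # an n-th root of unity (here simply 3)
        a = list(range(n))
        A = ntt(a, w)
        back = [(x * pow(n, P - 2, P)) % P for x in ntt(A, pow(w, P - 2, P))]
        assert back == a
        print("length-256 Fourier-like transform over GF(257) inverts exactly")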

  8. Code Generation by Model Transformation : A Case Study in Transformation Modularity

    NARCIS (Netherlands)

    Hemel, Z.; Kats, L.C.L.; Visser, E.

    2008-01-01

    Preprint of paper published in: Theory and Practice of Model Transformations (ICMT 2008), Lecture Notes in Computer Science 5063; doi:10.1007/978-3-540-69927-9_13. The realization of model-driven software development requires effective techniques for implementing code generators for domain-specific

  9. Feasibility Study for Applicability of the Wavelet Transform to Code Accuracy Quantification

    International Nuclear Information System (INIS)

    Kim, Jong Rok; Choi, Ki Yong

    2012-01-01

    A purpose of the assessment process of large thermal-hydraulic system codes is to verify their quality by comparing code predictions against experimental data. This process is essential for reliable safety analysis of nuclear power plants. Extensive experimental programs have been conducted in order to support the development and validation activities of best-estimate thermal-hydraulic codes. So far, the Fast Fourier Transform Based Method (FFTBM) has been widely used for quantification of prediction accuracy, despite its limitation that it does not provide any time resolution for a local event. As alternative options, several time-windowing methods (running average, short-time Fourier transform, etc.) can be utilized, but such methods also have the limitation of a fixed resolution. This limitation can be overcome by a wavelet transform, because the resolution of the wavelet transform effectively varies in the time-frequency plane depending on the choice of basis functions, which need not be sinusoidal. In this study, the feasibility of a new code accuracy quantification methodology using the wavelet transform is pursued.
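
    As a point of reference, the FFTBM figure of merit mentioned above reduces to a one-line formula; the sketch below (with an invented transient signal) computes it, and a wavelet-based method would replace the single global FFT with time-localized coefficients.

        import numpy as np

        def fftbm_average_amplitude(exp, calc):
            """FFTBM figure of merit: AA = sum|FFT(calc - exp)| / sum|FFT(exp)|.
            A single global number -- no time resolution, which is the
            limitation that motivates the wavelet-based alternative."""
            err = np.fft.rfft(np.asarray(calc) - np.asarray(exp))
            ref = np.fft.rfft(np.asarray(exp))
            return np.abs(err).sum() / np.abs(ref).sum()

        t = np.linspace(0, 100, 2000)                               # transient time base
        exp = 7.0e6 * np.exp(-t / 40) + 1.0e5 * np.sin(0.8 * t)     # "measured" signal
        calc = 7.1e6 * np.exp(-t / 42) + 0.9e5 * np.sin(0.8 * t)    # code prediction
        print("AA =", round(fftbm_average_amplitude(exp, calc), 4))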

  10. QR code-based non-linear image encryption using Shearlet transform and spiral phase transform

    Science.gov (United States)

    Kumar, Ravi; Bhaduri, Basanta; Hennelly, Bryan

    2018-02-01

    In this paper, we propose a new quick response (QR) code-based non-linear technique for image encryption using the Shearlet transform (ST) and the spiral phase transform. The input image is first converted into a QR code and then scrambled using the Arnold transform. The scrambled image is then decomposed into five coefficients using the ST, and the first Shearlet coefficient, C1, is interchanged with a security key before performing the inverse ST. The output after the inverse ST is then modulated with a random phase mask and further spiral phase transformed to get the final encrypted image. The first coefficient, C1, is used as a private key for decryption. The sensitivity of the security keys is analysed in terms of correlation coefficient and peak signal-to-noise ratio. The robustness of the scheme is also checked against various attacks such as noise, occlusion and special attacks. Numerical simulation results are shown in support of the proposed technique, and an optoelectronic set-up for encryption is also proposed.
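
    The Arnold-transform scrambling step is easy to sketch; the map below acts on a square (QR-code-sized) array, and the iteration count serves as one of the keys. The image size and iteration count are illustrative.

        import numpy as np

        def arnold(img, iterations=1):
            """Arnold cat map scramble of a square image:
            (x, y) -> (x + y, x + 2y) mod N."""
            n = img.shape[0]
            out = img
            for _ in range(iterations):
                x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
                nxt = np.empty_like(out)
                nxt[(x + y) % n, (x + 2 * y) % n] = out[x, y]
                out = nxt
            return out

        def arnold_inverse(img, iterations=1):
            """Undo the scramble with the inverse map (x, y) -> (2x - y, y - x) mod N."""
            n = img.shape[0]
            out = img
            for _ in range(iterations):
                x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
                nxt = np.empty_like(out)
                nxt[(2 * x - y) % n, (y - x) % n] = out[x, y]
                out = nxt
            return out

        img = np.arange(64 * 64).reshape(64, 64)
        assert np.array_equal(arnold_inverse(arnold(img, 7), 7), img)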

  11. Transformational leadership and task cohesion in sport: the mediating role of inside sacrifice.

    Science.gov (United States)

    Cronin, Lorcan Donal; Arthur, Calum Alexander; Hardy, James; Callow, Nichola

    2015-02-01

    In this cross-sectional study, we examined a mediational model whereby transformational leadership is related to task cohesion via sacrifice. Participants were 381 American (mean age = 19.87 years, SD = 1.41) Division I university athletes (188 males, 193 females) who competed in a variety of sports. Participants completed measures of coach transformational leadership, personal and teammate inside sacrifice, and task cohesion. After conducting multilevel mediation analysis, we found that both personal and teammate inside sacrifice significantly mediated the relationships between transformational leadership behaviors and task cohesion. However, there were differential patterns of these relationships for male and female athletes. Interpretation of the results highlights that coaches should endeavor to display transformational leadership behaviors as they are related to personal and teammate inside sacrifices and task cohesion.

  12. On transform coding tools under development for VP10

    Science.gov (United States)

    Parker, Sarah; Chen, Yue; Han, Jingning; Liu, Zoe; Mukherjee, Debargha; Su, Hui; Wang, Yongzhe; Bankoski, Jim; Li, Shunyao

    2016-09-01

    Google started the WebM Project in 2010 to develop open source, royalty-free video codecs designed specifically for media on the Web. The second-generation codec released by the WebM project, VP9, is currently served by YouTube and enjoys billions of views per day. Realizing the need for even greater compression efficiency to cope with the growing demand for video on the web, the WebM team embarked on an ambitious project to develop a next-edition codec, VP10, that achieves at least a generational improvement in coding efficiency over VP9. Starting from VP9, a set of new experimental coding tools has already been added to VP10 to achieve decent coding gains. Subsequently, Google joined a consortium of major tech companies called the Alliance for Open Media to jointly develop a new codec, AV1. As a result, the VP10 effort is largely expected to merge with AV1. In this paper, we focus primarily on new tools in VP10 that improve coding of the prediction residue using transform coding techniques. Specifically, we describe tools that increase the flexibility of available transforms, allowing the codec to handle a more diverse range of residue structures. Results are presented on a standard test set.
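
    A toy numpy illustration of the transform-flexibility idea: a separable 2D transform where each direction independently picks a DCT or a sine transform (standing in here for the ADST family), chosen by a simple sparsity proxy. Actual VP10/AV1 kernels, scans, and signaling are not modeled.

        import numpy as np
        from scipy.fftpack import dct, dst

        def t1d(x, kind, axis):
            """Orthonormal 1D kernel: DCT-II, or DST-II as a stand-in for ADST."""
            f = dct if kind == "dct" else dst
            return f(x, type=2, norm="ortho", axis=axis)

        def best_transform(block):
            """Try every vertical/horizontal kernel pair and keep the one whose
            coefficients are cheapest under an L1 (sparsity) proxy."""
            best = None
            for vk in ("dct", "dst"):
                for hk in ("dct", "dst"):
                    coeff = t1d(t1d(block, vk, axis=0), hk, axis=1)
                    cost = np.abs(coeff).sum()
                    if best is None or cost < best[0]:
                        best = (cost, vk, hk, coeff)
            return best[1], best[2], best[3]

        block = np.add.outer(np.arange(8.0), np.arange(8.0))   # a smooth ramp residue
        vk, hk, coeff = best_transform(block)
        print("chosen kernels (vertical, horizontal):", vk, hk)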

  13. Transform coding for hardware-accelerated volume rendering.

    Science.gov (United States)

    Fout, Nathaniel; Ma, Kwan-Liu

    2007-01-01

    Hardware-accelerated volume rendering using the GPU is now the standard approach for real-time volume rendering, although limited graphics memory can present a problem when rendering large volume data sets. Volumetric compression in which the decompression is coupled to rendering has been shown to be an effective solution to this problem; however, most existing techniques were developed in the context of software volume rendering, and all but the simplest approaches are prohibitive in a real-time hardware-accelerated volume rendering context. In this paper we present a novel block-based transform coding scheme designed specifically with real-time volume rendering in mind, such that the decompression is fast without sacrificing compression quality. This is made possible by consolidating the inverse transform with dequantization in such a way as to allow most of the reprojection to be precomputed. Furthermore, we take advantage of the freedom afforded by off-line compression in order to optimize the encoding as much as possible while hiding this complexity from the decoder. In this context we develop a new block classification scheme which allows us to preserve perceptually important features in the compression. The result of this work is an asymmetric transform coding scheme that allows very large volumes to be compressed and then decompressed in real-time while rendering on the GPU.

  14. Reward Motivation Enhances Task Coding in Frontoparietal Cortex.

    Science.gov (United States)

    Etzel, Joset A; Cole, Michael W; Zacks, Jeffrey M; Kay, Kendrick N; Braver, Todd S

    2016-04-01

    Reward motivation often enhances task performance, but the neural mechanisms underlying such cognitive enhancement remain unclear. Here, we used a multivariate pattern analysis (MVPA) approach to test the hypothesis that motivation-related enhancement of cognitive control results from improved encoding and representation of task set information. Participants underwent two fMRI sessions of cued task switching, the first under baseline conditions, and the second with randomly intermixed reward incentive and no-incentive trials. Information about the upcoming task could be successfully decoded from cue-related activation patterns in a set of frontoparietal regions typically associated with task control. More critically, MVPA classifiers trained on the baseline session had significantly higher decoding accuracy on incentive than non-incentive trials, with decoding improvement mediating reward-related enhancement of behavioral performance. These results strongly support the hypothesis that reward motivation enhances cognitive control by improving the discriminability of task-relevant information coded and maintained in frontoparietal brain regions.

  15. Data compression using adaptive transform coding. Appendix 1: Item 1. Ph.D. Thesis

    Science.gov (United States)

    Rost, Martin Christopher

    1988-01-01

    Adaptive low-rate source coders are described in this dissertation. These coders adapt by adjusting the complexity of the coder to match the local coding difficulty of the image. This is accomplished by using a threshold-driven maximum-distortion criterion to select the specific coder used. The different coders are built using variable-blocksize transform techniques, and the threshold criterion selects small transform blocks to code the more difficult regions and larger blocks to code the less complex regions. A theoretical framework is constructed from which the study of these coders can be explored. An algorithm for selecting the optimal bit allocation for the quantization of transform coefficients is developed; it can be used to achieve more accurate bit assignments than the algorithms currently used in the literature. Some upper and lower bounds for the bit-allocation distortion-rate function are developed. An obtainable distortion-rate function is developed for a particular scalar quantizer mixing method that can be used to code transform coefficients at any rate.
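
    For orientation, the textbook high-rate allocation rule that work like this refines assigns b_k = R + (1/2)log2(sigma_k^2 / GM), where GM is the geometric mean of the coefficient variances; the sketch below implements that classical baseline (with iterative clipping of negative allocations), not the dissertation's improved algorithm.

        import numpy as np

        def allocate_bits(variances, avg_rate):
            """Classical high-rate bit allocation: b_k = R + 0.5*log2(var_k / GM),
            iteratively zeroing negative allocations and redistributing the budget."""
            var = np.asarray(variances, dtype=float)
            active = np.ones(len(var), dtype=bool)
            b = np.zeros(len(var))
            while True:
                gm = np.exp(np.log(var[active]).mean())        # geometric mean
                R = avg_rate * len(var) / active.sum()         # budget per active coeff
                b[active] = R + 0.5 * np.log2(var[active] / gm)
                neg = active & (b < 0)
                if not neg.any():
                    break
                b[neg] = 0.0
                active &= ~neg
            return b

        sigma2 = [90.0, 30.0, 8.0, 2.0, 0.5, 0.1]   # transform-coefficient variances
        print(allocate_bits(sigma2, avg_rate=1.0).round(2))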

  16. Towards Automatic Learning of Heuristics for Mechanical Transformations of Procedural Code

    Directory of Open Access Journals (Sweden)

    Guillermo Vigueras

    2017-01-01

    The current trends in next-generation exascale systems go towards integrating a wide range of specialized (co-)processors into traditional supercomputers. Due to the efficiency of heterogeneous systems in terms of Watts and FLOPS per surface unit, opening access to heterogeneous platforms for a wider range of users is an important problem to be tackled. However, heterogeneous platforms limit the portability of applications and increase development complexity due to the programming skills required. Program transformation can help make programming heterogeneous systems easier by defining a step-wise transformation process that translates a given initial code into a semantically equivalent final code adapted to a specific platform. Program transformation systems require the definition of efficient transformation strategies to tackle the combinatorial problem that emerges due to the large set of transformations applicable at each step of the process. In this paper we propose a machine learning-based approach to learn heuristics for defining program transformation strategies. Our approach proposes a novel combination of reinforcement learning and classification methods to efficiently tackle the problems inherent to this type of system. Preliminary results demonstrate the suitability of this approach.

  17. pix2code: Generating Code from a Graphical User Interface Screenshot

    OpenAIRE

    Beltramelli, Tony

    2017-01-01

    Transforming a graphical user interface screenshot created by a designer into computer code is a typical task conducted by a developer in order to build customized software, websites, and mobile applications. In this paper, we show that deep learning methods can be leveraged to train a model end-to-end to automatically generate code from a single input image with over 77% accuracy for three different platforms (i.e., iOS, Android and web-based technologies).

  18. The fast decoding of Reed-Solomon codes using high-radix Fermat theoretic transforms

    Science.gov (United States)

    Liu, K. Y.; Reed, I. S.; Truong, T. K.

    1976-01-01

    Fourier-like transforms over GF(F_n), where F_n = 2^(2^n) + 1 is a Fermat prime, are applied in decoding Reed-Solomon codes. It is shown that such transforms can be computed using high-radix fast Fourier transform (FFT) algorithms requiring considerably fewer multiplications than the more usual radix-2 FFT algorithm. A special 256-symbol, 16-symbol-error-correcting Reed-Solomon (RS) code for space communication-link applications can be encoded and decoded using this high-radix FFT algorithm over GF(F_3).

  19. The fast decoding of Reed-Solomon codes using Fermat theoretic transforms and continued fractions

    Science.gov (United States)

    Reed, I. S.; Scholtz, R. A.; Welch, L. R.; Truong, T. K.

    1978-01-01

    It is shown that Reed-Solomon (RS) codes can be decoded by using a fast Fourier transform (FFT) algorithm over finite fields GF(F_n), where F_n is a Fermat prime, and continued fractions. This new transform decoding method is simpler than the standard method for RS codes. In software, the computing time of this new decoding algorithm can be less than that of the standard decoding method for RS codes.

  20. Block-based wavelet transform coding of mammograms with region-adaptive quantization

    Science.gov (United States)

    Moon, Nam Su; Song, Jun S.; Kwon, Musik; Kim, JongHyo; Lee, ChoongWoong

    1998-06-01

    To achieve both a high compression ratio and information preservation, an efficient approach is to combine segmentation with a lossy compression scheme. Microcalcification in mammograms is one of the most significant signs of early-stage breast cancer. In coding, therefore, detecting and segmenting microcalcifications enables them to be preserved well by allocating more bits to them than to other regions. Segmentation of microcalcification is performed both in the spatial domain and in the wavelet transform domain. A peak-error-controllable quantization step, designed off-line, is suitable for medical image compression. For region-adaptive quantization, block-based wavelet transform coding is adopted, and different peak-error-constrained quantizers are applied to blocks according to the segmentation result. In terms of preservation of microcalcification, the proposed coding scheme shows better performance than JPEG.

  1. Variable Rate, Adaptive Transform Tree Coding Of Images

    Science.gov (United States)

    Pearlman, William A.

    1988-10-01

    A tree code, asymptotically optimal for stationary Gaussian sources and squared error distortion [2], is used to encode transforms of image sub-blocks. The variance spectrum of each sub-block is estimated and specified uniquely by a set of one-dimensional auto-regressive parameters. The expected distortion is set to a constant for each block and the rate is allowed to vary to meet the given level of distortion. Since the spectrum and rate are different for every block, the code tree differs for every block. Coding simulations for target block distortion of 15 and average block rate of 0.99 bits per pel (bpp) show that very good results can be obtained at high search intensities at the expense of high computational complexity. The results at the higher search intensities outperform a parallel simulation with quantization replacing tree coding. Comparative coding simulations also show that the reproduced image with variable block rate and average rate of 0.99 bpp has 2.5 dB less distortion than a similarly reproduced image with a constant block rate equal to 1.0 bpp.

  2. Towards Product Lining Model-Driven Development Code Generators

    OpenAIRE

    Roth, Alexander; Rumpe, Bernhard

    2015-01-01

    A code generator systematically transforms compact models to detailed code. Today, code generation is regarded as an integral part of model-driven development (MDD). Despite its relevance, the development of code generators is an inherently complex task and common methodologies and architectures are lacking. Additionally, reuse and extension of existing code generators exist only for individual parts. A systematic development and reuse based on a code generator product line is still in its inf...

  3. Reduction and coding of synthetic aperture radar data with Fourier transforms

    Science.gov (United States)

    Tilley, David G.

    1995-01-01

    Recently, aboard the Space Radar Laboratory (SRL), the two roles of Fourier Transforms for ocean image synthesis and surface wave analysis have been implemented with a dedicated radar processor to significantly reduce Synthetic Aperture Radar (SAR) ocean data before transmission to the ground. The objective was to archive the SAR image spectrum, rather than the SAR image itself, to reduce data volume and capture the essential descriptors of the surface wave field. SAR signal data are usually sampled and coded in the time domain for transmission to the ground where Fourier Transforms are applied both to individual radar pulses and to long sequences of radar pulses to form two-dimensional images. High resolution images of the ocean often contain no striking features and subtle image modulations by wind generated surface waves are only apparent when large ocean regions are studied, with Fourier transforms, to reveal periodic patterns created by wind stress over the surface wave field. Major ocean currents and atmospheric instability in coastal environments are apparent as large scale modulations of SAR imagery. This paper explores the possibility of computing complex Fourier spectrum codes representing SAR images, transmitting the coded spectra to Earth for data archives and creating scenes of surface wave signatures and air-sea interactions via inverse Fourier transformations with ground station processors.
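
    The core transform-and-archive idea can be sketched in a few lines of numpy; the scene, quantization step, and reconstruction below are illustrative stand-ins for the SRL processing chain.

        import numpy as np

        rng = np.random.default_rng(0)
        scene = rng.standard_normal((256, 256))           # stand-in for a SAR ocean image
        scene += np.sin(np.arange(256) * 2 * np.pi / 32)  # a periodic "wave field"

        spectrum = np.fft.fft2(scene)
        q = 50.0                                          # coarse complex quantization step
        coded = np.round(spectrum.real / q) + 1j * np.round(spectrum.imag / q)

        # ground-station side: inverse transform of the archived spectrum
        recon = np.fft.ifft2(coded * q).real
        print("rms error:", np.sqrt(np.mean((recon - scene) ** 2)))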

  4. Optical image encryption using QR code and multilevel fingerprints in gyrator transform domains

    Science.gov (United States)

    Wei, Yang; Yan, Aimin; Dong, Jiabin; Hu, Zhijuan; Zhang, Jingtao

    2017-11-01

    We present a novel optical image encryption method using a quick response (QR) code and multilevel fingerprint keys in gyrator transform (GT) domains. In this method, an original image is first transformed into a QR code, which is placed in the input plane of cascaded GTs. Subsequently, the QR code is encrypted into the cipher-text by using multilevel fingerprint keys. The original image can be obtained easily by reading the high-quality retrieved QR code with hand-held devices. The main parameters used as private keys are the GTs' rotation angles and the multilevel fingerprints. Biometrics and cryptography are thus integrated with each other to improve data security. Numerical simulations are performed to demonstrate the validity and feasibility of the proposed encryption scheme. The approach of applying QR codes and fingerprints in GT domains holds much potential for information security.

  5. How Knowledge Worker Teams Deal Effectively with Task Uncertainty: The Impact of Transformational Leadership and Group Development.

    Science.gov (United States)

    Leuteritz, Jan-Paul; Navarro, José; Berger, Rita

    2017-01-01

    The purpose of this paper is to clarify how leadership is able to improve team effectiveness, by means of its influence on group processes (i.e., increasing group development) and on the group task (i.e., decreasing task uncertainty). Four hundred and eight members of 107 teams in a German research and development (R&D) organization completed a web-based survey; they provided measures of transformational leadership, group development, 2 aspects of task uncertainty, task interdependence, and team effectiveness. In 54 of these teams, the leaders answered a web-based survey on team effectiveness. We tested the model with the data from team members, using structural equation modeling. Group development and a task uncertainty measurement that refers to unstable demands from outside the team partially mediate the effect of transformational leadership on team effectiveness in R&D organizations. The results also indicate that transformational leaders reduce unclarity of goals. This paper contributes to clarifying the impact of transformational leadership and team processes on team effectiveness, considering the task characteristics of uncertainty and interdependence.

  6. Multispectral data compression through transform coding and block quantization

    Science.gov (United States)

    Ready, P. J.; Wintz, P. A.

    1972-01-01

    Transform coding and block quantization techniques are applied to multispectral aircraft scanner data and digitized satellite imagery. The multispectral source is defined and an appropriate mathematical model is proposed. The Karhunen-Loeve, Fourier, and Hadamard encoders are considered and are compared to the rate distortion function for the equivalent Gaussian source and to the performance of the single-sample PCM encoder.

  7. Systematic Luby Transform codes as incremental redundancy scheme

    CSIR Research Space (South Africa)

    Grobler, TL; Ackermann, ER; Olivier, JC; Van Zyl, AJ

    2011-09-01

  8. Architecture for time or transform domain decoding of reed-solomon codes

    Science.gov (United States)

    Shao, Howard M. (Inventor); Truong, Trieu-Kie (Inventor); Hsu, In-Shek (Inventor); Deutsch, Leslie J. (Inventor)

    1989-01-01

    Two pipeline (255,223) RS decoders, one a time domain decoder and the other a transform domain decoder, use the same first part to develop an errata locator polynomial τ(x) and an errata evaluator polynomial A(x). Both the time domain decoder and the transform domain decoder have a modified GCD that uses an input multiplexer and an output demultiplexer to reduce the number of GCD cells required. The time domain decoder uses a Chien search and polynomial evaluator on the GCD outputs τ(x) and A(x) for the final decoding steps, while the transform domain decoder uses a transform error pattern algorithm operating on τ(x) and the initial syndrome computation S(x), followed by an inverse transform algorithm in sequence for the final decoding steps, prior to adding the received RS coded message to produce a decoded output message.
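
    The Chien search named above is an exhaustive, incrementally updated evaluation of the locator polynomial at every field element. A sketch over the prime field GF(257) (chosen so the arithmetic stays elementary; the patent's decoders work over GF(2^8)) illustrates the structure.

        P, ALPHA = 257, 3                     # GF(257), primitive element 3

        def chien_search(coeffs):
            """Chien-style root search over GF(P): evaluate the polynomial at
            alpha^i for every i by updating each term multiplicatively
            (t_j *= alpha^j) instead of recomputing powers -- the hardware trick."""
            terms = list(coeffs)              # t_j = c_j * alpha^(j*i), starting at i = 0
            step = [pow(ALPHA, j, P) for j in range(len(coeffs))]
            roots = []
            for i in range(P - 1):
                if sum(terms) % P == 0:
                    roots.append(i)           # alpha^i is a root
                terms = [(t * s) % P for t, s in zip(terms, step)]
            return roots

        # locator with known roots alpha^2 and alpha^5:
        # tau(x) = (1 - x/alpha^2)(1 - x/alpha^5), expanded into coefficients
        inv = lambda v: pow(v, P - 2, P)
        a2, a5 = pow(ALPHA, 2, P), pow(ALPHA, 5, P)
        c = [1, (-(inv(a2) + inv(a5))) % P, (inv(a2) * inv(a5)) % P]
        print(chien_search(c))                # -> [2, 5]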

  9. Improved virtual channel noise model for transform domain Wyner-Ziv video coding

    DEFF Research Database (Denmark)

    Huang, Xin; Forchhammer, Søren

    2009-01-01

    Distributed video coding (DVC) has been proposed as a new video coding paradigm to deal with lossy source coding using side information to exploit the statistics at the decoder to reduce computational demands at the encoder. A virtual channel noise model is utilized at the decoder to estimate the noise distribution between the side information frame and the original frame. This is one of the most important aspects influencing the coding performance of DVC. Noise models with different granularity have been proposed. In this paper, an improved noise model for transform domain Wyner-Ziv video coding is proposed, which utilizes cross-band correlation to estimate the Laplacian parameters more accurately. Experimental results show that the proposed noise model can improve the rate-distortion (RD) performance.
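
    The Laplacian-fitting step itself is compact; a sketch (with synthetic per-band residuals, not the paper's cross-band refinement) shows how the scale parameter is estimated from the residual between side-information and original coefficients.

        import numpy as np

        def laplacian_alpha(residual):
            """ML fit of the Laplacian scale: for f(x) = (alpha/2) exp(-alpha|x|),
            the estimate is alpha = 1 / mean(|x|) (equivalently sqrt(2)/sigma)."""
            return 1.0 / np.mean(np.abs(residual))

        # toy per-band residuals between side-information and original DCT coefficients
        rng = np.random.default_rng(1)
        for band, b in enumerate([8.0, 3.0, 1.0]):
            r = rng.laplace(0.0, b, size=10_000)      # Laplace scale b: alpha = 1/b
            print(f"band {band}: alpha_hat = {laplacian_alpha(r):.3f} (true {1 / b:.3f})")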

  10. 3D Scan-Based Wavelet Transform and Quality Control for Video Coding

    Directory of Open Access Journals (Sweden)

    Parisot Christophe

    2003-01-01

    Wavelet coding has been shown to achieve better compression than DCT coding and moreover allows scalability. 2D DWT can be easily extended to 3D and thus applied to video coding. However, 3D subband coding of video suffers from two drawbacks. The first is the amount of memory required for coding large 3D blocks; the second is the lack of temporal quality due to the sequence temporal splitting. In fact, 3D block-based video coders produce jerks, which appear at blocks' temporal borders during video playback. In this paper, we propose a new temporal scan-based wavelet transform method for video coding, combining the advantages of wavelet coding (performance, scalability) with acceptably reduced memory requirements, no additional CPU complexity, and no jerks. We also propose an efficient quality allocation procedure to ensure a constant quality over time.

  11. Low-Complexity Multiple Description Coding of Video Based on 3D Block Transforms

    Directory of Open Access Journals (Sweden)

    Andrey Norkin

    2007-02-01

    The paper presents a multiple description (MD) video coder based on three-dimensional (3D) transforms. Two balanced descriptions are created from a video sequence. In the encoder, the video sequence is represented in the form of a coarse sequence approximation (shaper), included in both descriptions, and a residual sequence (details), which is split between the two descriptions. The shaper is obtained by block-wise pruned 3D-DCT. The residual sequence is coded by 3D-DCT or a hybrid LOT+DCT 3D transform. The coding scheme is targeted to mobile devices. It has low computational complexity and improved robustness of transmission over unreliable networks. The coder is able to work at very low redundancies. The coding scheme is simple, yet it outperforms some MD coders based on motion-compensated prediction, especially in the low-redundancy region. The margin is up to 3 dB for reconstruction from one description.

  12. Spatial Block Codes Based on Unitary Transformations Derived from Orthonormal Polynomial Sets

    Directory of Open Access Journals (Sweden)

    Mandyam Giridhar D

    2002-01-01

    Recent work in the development of diversity transformations for wireless systems has produced a theoretical framework for space-time block codes. Such codes are beneficial in that they may be easily concatenated with interleaved trellis codes and yet still may be decoded separately. In this paper, a theoretical framework is provided for the generation of spatial block codes of arbitrary dimensionality through the use of orthonormal polynomial sets. While these codes cannot maximize theoretical diversity performance for a given dimensionality, they still provide performance improvements over the single-antenna case. In particular, their application to closed-loop transmit diversity systems is proposed, as the bandwidth necessary for feedback using these types of codes is fixed regardless of the number of antennas used. Simulation data are provided demonstrating these codes' performance under this implementation as compared not only to the single-antenna case but also to the two-antenna code derived from the Radon-Hurwitz construction.

  13. Tangent: Automatic Differentiation Using Source Code Transformation in Python

    OpenAIRE

    van Merriënboer, Bart; Wiltschko, Alexander B.; Moldovan, Dan

    2017-01-01

    Automatic differentiation (AD) is an essential primitive for machine learning programming systems. Tangent is a new library that performs AD using source code transformation (SCT) in Python. It takes numeric functions written in a syntactic subset of Python and NumPy as input, and generates new Python functions which calculate a derivative. This approach to automatic differentiation is different from existing packages popular in machine learning, such as TensorFlow and Autograd. Advantages ar...
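
    Usage is along the lines sketched below, following the project's README (hedged here, since this research library is archived and may not track current Python versions); the key point is that df is ordinary generated Python, not a graph or a tape.

        # Following the README of google/tangent (an assumption: API details
        # may differ across versions of this archived research library).
        import tangent

        def f(x):
            return x * x + 3.0 * x

        # Source code transformation: tangent parses the AST of f and emits a
        # new Python function computing df/dx, which it compiles and returns.
        df = tangent.grad(f, verbose=1)   # verbose=1 prints the generated source
        print(df(2.0))                    # derivative 2*x + 3 at x = 2 -> 7.0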

  14. Warped Discrete Cosine Transform-Based Low Bit-Rate Block Coding Using Image Downsampling

    Directory of Open Access Journals (Sweden)

    Ertürk Sarp

    2007-01-01

    This paper presents warped discrete cosine transform (WDCT)-based low bit-rate block coding using image downsampling. While the WDCT aims to improve the performance of the conventional DCT by frequency warping, it has only been applicable to high bit-rate coding applications because of the overhead required to define the parameters of the warping filter. Recently, low bit-rate block coding based on image downsampling prior to block coding, followed by upsampling after the decoding process, has been proposed to improve the compression performance of low bit-rate block coders. This paper demonstrates that a superior performance can be achieved if the WDCT is used in conjunction with image downsampling-based block coding for low bit-rate applications.

  15. cDNA sequence of human transforming gene hst and identification of the coding sequence required for transforming activity

    International Nuclear Information System (INIS)

    Taira, M.; Yoshida, T.; Miyagawa, K.; Sakamoto, H.; Terada, M.; Sugimura, T.

    1987-01-01

    The hst gene was originally identified as a transforming gene in DNAs from human stomach cancers and from a noncancerous portion of stomach mucosa by DNA-mediated transfection assay using NIH3T3 cells. cDNA clones of hst were isolated from the cDNA library constructed from poly(A)+ RNA of a secondary transformant induced by the DNA from a stomach cancer. The sequence analysis of the hst cDNA revealed the presence of two open reading frames. When this cDNA was inserted into an expression vector containing the simian virus 40 promoter, it efficiently induced the transformation of NIH3T3 cells upon transfection. It was found that one of the reading frames, which coded for 206 amino acids, was responsible for the transforming activity.

  16. Mutiple LDPC Decoding using Bitplane Correlation for Transform Domain Wyner-Ziv Video Coding

    DEFF Research Database (Denmark)

    Luong, Huynh Van; Huang, Xin; Forchhammer, Søren

    2011-01-01

    Distributed video coding (DVC) is an emerging video coding paradigm for systems which fully or partly exploit the source statistics at the decoder to reduce the computational burden at the encoder. This paper considers a Low Density Parity Check (LDPC) based Transform Domain Wyner-Ziv (TDWZ) video codec. To improve the LDPC coding performance in the context of TDWZ, this paper proposes a Wyner-Ziv video codec using bitplane correlation through multiple parallel LDPC decoding. The proposed scheme utilizes inter-bitplane correlation to enhance the bitplane decoding performance. Experimental results

  17. Wavelet transform and Huffman coding based electrocardiogram compression algorithm: Application to telecardiology

    International Nuclear Information System (INIS)

    Chouakri, S A; Djaafri, O; Taleb-Ahmed, A

    2013-01-01

    We present in this work an algorithm for electrocardiogram (ECG) signal compression aimed at its transmission via a telecommunication channel. The proposed ECG compression algorithm is built on the wavelet transform, which separates low- and high-frequency components; high-order-statistics-based thresholding, using a level-adjusted kurtosis value, to denoise the ECG signal; and a linear predictive coding filter applied to the wavelet coefficients, producing a lower-variance signal. The latter is coded using Huffman encoding, yielding an optimal code length in terms of the average number of bits per sample. At the receiver end, assuming an ideal communication channel, the inverse processes are carried out, namely Huffman decoding, inverse linear predictive filtering, and the inverse discrete wavelet transform, leading to the estimated version of the ECG signal. The proposed ECG compression algorithm is tested on a set of ECG records extracted from the MIT-BIH Arrhythmia Database, including different cardiac anomalies as well as normal ECG signals. The obtained results are evaluated in terms of compression ratio and mean square error, which are, respectively, around 1:8 and 7%. Beyond the numerical evaluation, visual inspection demonstrates the high quality of the ECG signal reconstruction, with the different ECG waves recovered correctly.
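
    A condensed sketch of the two endpoints of this pipeline — wavelet decomposition with thresholding, and Huffman coding of the quantized coefficients — is shown below; the kurtosis-adjusted threshold and the LPC stage are omitted, and the wavelet, level, threshold, and quantizer step are illustrative.

        import heapq
        from collections import Counter
        import numpy as np
        import pywt

        def huffman_code(symbols):
            """Build a Huffman code book {symbol: bitstring} from symbol counts."""
            heap = [[w, i, {s: ""}] for i, (s, w) in enumerate(Counter(symbols).items())]
            heapq.heapify(heap)
            i = len(heap)
            while len(heap) > 1:
                lo, hi = heapq.heappop(heap), heapq.heappop(heap)
                book = {s: "0" + c for s, c in lo[2].items()}
                book.update({s: "1" + c for s, c in hi[2].items()})
                heapq.heappush(heap, [lo[0] + hi[0], i, book])
                i += 1
            return heap[0][2]

        ecg = np.sin(np.linspace(0, 20 * np.pi, 2048)) ** 15   # toy QRS-like spikes
        coeffs = pywt.wavedec(ecg, "db4", level=5)             # DWT analysis
        flat = np.concatenate(coeffs)
        flat[np.abs(flat) < 0.05] = 0                          # denoise/threshold
        q = np.round(flat / 0.01).astype(int)                  # uniform quantizer
        book = huffman_code(q.tolist())
        bits = sum(len(book[s]) for s in q.tolist())
        print(f"average code length: {bits / len(q):.2f} bits/sample")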

  18. Transformation of Graphical ECA Policies into Executable PonderTalk Code

    Science.gov (United States)

    Romeikat, Raphael; Sinsel, Markus; Bauer, Bernhard

    Rules are becoming more and more important in business modeling and systems engineering and are recognized as a high-level programming paradigm. For the effective development of rules it is desirable to start at a high level, e.g. with graphical rules, and to refine them into code of a particular rule language for implementation purposes later. A model-driven approach is presented in this paper to transform graphical rules into executable code in a fully automated way. The focus is on event-condition-action policies as a special rule type. These are modeled graphically and translated into the PonderTalk language. The approach may be extended to integrate other rule types and languages as well.
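
    The transformation direction can be illustrated with a toy generator that renders a declarative ECA rule as rule-language text; the emitted syntax below is only PonderTalk-flavored pseudocode for illustration, not verified Ponder2 API.

        # Hypothetical ECA rule model; the emitted text is PonderTalk-flavored
        # pseudocode for illustration, not verified Ponder2 syntax.
        from dataclasses import dataclass

        @dataclass
        class EcaPolicy:
            name: str
            event: str
            condition: str
            action: str

        def to_pondertalk(p: EcaPolicy) -> str:
            return (
                f'policy := root/factory/ecapolicy create.\n'
                f'policy event: "{p.event}";\n'
                f'       condition: [ {p.condition} ];\n'
                f'       action: [ {p.action} ].\n'
                f'root/policies at: "{p.name}" put: policy.'
            )

        rule = EcaPolicy("highTemp", "sensor/temperature", "value > 80",
                         "root/actuators/fan turnOn")
        print(to_pondertalk(rule))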

  19. Adaptive discrete cosine transform coding algorithm for digital mammography

    Science.gov (United States)

    Baskurt, Atilla M.; Magnin, Isabelle E.; Goutte, Robert

    1992-09-01

    The need for storage, transmission, and archiving of medical images has led researchers to develop adaptive and efficient data compression techniques. Among medical images, x-ray radiographs of the breast are especially difficult to process because of their particularly low contrast and very fine structures. A block adaptive coding algorithm based on the discrete cosine transform to compress digitized mammograms is described. A homogeneous repartition of the degradation in the decoded images is obtained using a spatially adaptive threshold. This threshold depends on the coding error associated with each block of the image. The proposed method is tested on a limited number of pathological mammograms including opacities and microcalcifications. A comparative visual analysis is performed between the original and the decoded images. Finally, it is shown that data compression with rather high compression rates (11 to 26) is possible in the mammography field.

  20. Effects of Secondary Task Modality and Processing Code on Automation Trust and Utilization During Simulated Airline Luggage Screening

    Science.gov (United States)

    Phillips, Rachel; Madhavan, Poornima

    2010-01-01

    The purpose of this research was to examine the impact of environmental distractions on human trust and utilization of automation during the process of visual search. Participants performed a computer-simulated airline luggage screening task with the assistance of a 70% reliable automated decision aid (called DETECTOR) both with and without environmental distractions. The distraction was implemented as a secondary task in either a competing modality (visual) or non-competing modality (auditory). The secondary task processing code either competed with the luggage screening task (spatial code) or with the automation's textual directives (verbal code). We measured participants' system trust, perceived reliability of the system (when a target weapon was present and absent), compliance, reliance, and confidence when agreeing and disagreeing with the system under both distracted and undistracted conditions. Results revealed that system trust was lower in the visual-spatial and auditory-verbal conditions than in the visual-verbal and auditory-spatial conditions. Perceived reliability of the system (when the target was present) was significantly higher when the secondary task was visual rather than auditory. Compliance with the aid increased in all conditions except for the auditory-verbal condition, where it decreased. Similar to the pattern for trust, reliance on the automation was lower in the visual-spatial and auditory-verbal conditions than in the visual-verbal and auditory-spatial conditions. Confidence when agreeing with the system decreased with the addition of any kind of distraction; however, confidence when disagreeing increased with the addition of an auditory secondary task but decreased with the addition of a visual task. A model was developed to represent the research findings and demonstrate the relationship between secondary task modality, processing code, and automation use. Results suggest that the nature of environmental distractions influence

  1. Comparative performance evaluation of transform coding in image pre-processing

    Science.gov (United States)

    Menon, Vignesh V.; NB, Harikrishnan; Narayanan, Gayathri; CK, Niveditha

    2017-07-01

    We are in the midst of a communication transformation that drives both the development and the dissemination of new communication systems with ever-increasing fidelity and resolution. Much research has gone into image processing techniques, driven by a growing demand for faster and easier encoding, storage and transmission of visual information. In this paper, the researchers examine techniques that can be used at the transmitter end to ease the transmission and reconstruction of images. They investigate and compare the performance of different image transform coding schemes used in pre-processing, along with their effectiveness, the necessary and sufficient conditions, their properties, and their implementation complexity. Building on prior advances in image processing, the researchers compare the performance of several contemporary image pre-processing frameworks: compressed sensing, singular value decomposition, and the integer wavelet transform. The paper shows the potential of the integer wavelet transform to be an efficient pre-processing scheme.
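
    Part of what makes the integer wavelet transform attractive as a pre-processing scheme is that its lifting form maps integers to integers and is exactly invertible. A one-level Haar-style (S-transform) sketch:

        import numpy as np

        def iwt_haar(x):
            """One-level integer Haar (S-transform): maps integers to integers
            and is exactly reversible, unlike a floating-point DWT."""
            a, b = x[0::2].astype(np.int64), x[1::2].astype(np.int64)
            d = a - b                  # detail
            s = b + (d >> 1)           # approximation = floor((a + b) / 2)
            return s, d

        def iiwt_haar(s, d):
            b = s - (d >> 1)
            a = d + b
            x = np.empty(2 * len(s), dtype=np.int64)
            x[0::2], x[1::2] = a, b
            return x

        x = np.random.default_rng(2).integers(0, 256, size=512)
        s, d = iwt_haar(x)
        assert np.array_equal(iiwt_haar(s, d), x)   # perfect reconstruction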

  2. PACC information management code for common cause failures analysis

    International Nuclear Information System (INIS)

    Ortega Prieto, P.; Garcia Gay, J.; Mira McWilliams, J.

    1987-01-01

    The purpose of this paper is to present the PACC code, which, through adequate data management, makes the task of computerized common-mode failure analysis easier. PACC processes and generates information in order to carry out the corresponding qualitative analysis, by means of the Boolean technique of transformation of variables, and the quantitative analysis, either using one of several parametric methods or a direct database. As far as the qualitative analysis is concerned, the code creates several functional forms for the transformation equations according to the user's choice. These equations are subsequently processed by Boolean manipulation codes, such as SETS. The quantitative calculations of the code can be carried out in two different ways: either starting from a common cause database, or through parametric methods, such as the Binomial Failure Rate Method, the Basic Parameters Method or the Multiple Greek Letter Method, among others. (orig.)

  3. Fast heap transform-based QR-decomposition of real and complex matrices: algorithms and codes

    Science.gov (United States)

    Grigoryan, Artyom M.

    2015-03-01

    In this paper, we describe a new look at the application of Givens rotations to the QR-decomposition problem, which is similar to the method of Householder transformations. We apply the concept of the discrete heap transform, or signal-induced unitary transforms, which was introduced by Grigoryan (2006) and used in signal and image processing. Both cases of real and complex nonsingular matrices are considered, and examples of performing QR-decomposition of square matrices are given. The proposed method of QR-decomposition for complex matrices is novel, differs from the known method of complex Givens rotations, and is based on analytical equations for the heap transforms. Many examples illustrating the proposed heap-transform method of QR-decomposition are given, algorithms are described in detail, and MATLAB-based codes are included.
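
    For comparison, the classical Givens-rotation QR-decomposition that the paper takes as its starting point can be written compactly in numpy (real case only; the heap-transform and complex variants are not reproduced):

        import numpy as np

        def givens_qr(A):
            """QR-decomposition of a real matrix by Givens rotations: each 2x2
            rotation zeroes one sub-diagonal entry of R while Q accumulates the
            transposed rotations, so A = Q @ R holds throughout."""
            m, n = A.shape
            R = A.astype(float).copy()
            Q = np.eye(m)
            for j in range(n):                      # column by column
                for i in range(m - 1, j, -1):       # zero R[i, j] from the bottom up
                    a, b = R[i - 1, j], R[i, j]
                    r = np.hypot(a, b)
                    if r == 0.0:
                        continue
                    c, s = a / r, b / r
                    G = np.array([[c, s], [-s, c]])
                    R[[i - 1, i], :] = G @ R[[i - 1, i], :]
                    Q[:, [i - 1, i]] = Q[:, [i - 1, i]] @ G.T
            return Q, R

        A = np.random.default_rng(3).standard_normal((5, 4))
        Q, R = givens_qr(A)
        assert np.allclose(Q @ R, A) and np.allclose(Q.T @ Q, np.eye(5))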

  4. Transform domain Wyner-Ziv video coding with refinement of noise residue and side information

    DEFF Research Database (Denmark)

    Huang, Xin; Forchhammer, Søren

    2010-01-01

    Distributed Video Coding (DVC) is a video coding paradigm which mainly exploits the source statistics at the decoder based on the availability of side information at the decoder. This paper considers feedback channel based Transform Domain Wyner-Ziv (TDWZ) DVC. The coding efficiency of TDWZ video coding does not yet match that of conventional video coding, mainly due to the quality of side information and inaccurate noise estimation. In this context, a novel TDWZ video decoder with noise residue refinement (NRR) and side information refinement (SIR) is proposed. The proposed refinement schemes successively update the estimated noise residue for noise modeling and the side information frame quality during decoding. Experimental results show that the proposed decoder can improve the Rate-Distortion (RD) performance of a state-of-the-art Wyner-Ziv video codec for the set of test sequences.

  5. Feature-selective Attention in Frontoparietal Cortex: Multivoxel Codes Adjust to Prioritize Task-relevant Information.

    Science.gov (United States)

    Jackson, Jade; Rich, Anina N; Williams, Mark A; Woolgar, Alexandra

    2017-02-01

    Human cognition is characterized by astounding flexibility, enabling us to select appropriate information according to the objectives of our current task. A circuit of frontal and parietal brain regions, often referred to as the frontoparietal attention network or multiple-demand (MD) regions, are believed to play a fundamental role in this flexibility. There is evidence that these regions dynamically adjust their responses to selectively process information that is currently relevant for behavior, as proposed by the "adaptive coding hypothesis" [Duncan, J. An adaptive coding model of neural function in prefrontal cortex. Nature Reviews Neuroscience, 2, 820-829, 2001]. Could this provide a neural mechanism for feature-selective attention, the process by which we preferentially process one feature of a stimulus over another? We used multivariate pattern analysis of fMRI data during a perceptually challenging categorization task to investigate whether the representation of visual object features in the MD regions flexibly adjusts according to task relevance. Participants were trained to categorize visually similar novel objects along two orthogonal stimulus dimensions (length/orientation) and performed short alternating blocks in which only one of these dimensions was relevant. We found that multivoxel patterns of activation in the MD regions encoded the task-relevant distinctions more strongly than the task-irrelevant distinctions: The MD regions discriminated between stimuli of different lengths when length was relevant and between the same objects according to orientation when orientation was relevant. The data suggest a flexible neural system that adjusts its representation of visual objects to preferentially encode stimulus features that are currently relevant for behavior, providing a neural mechanism for feature-selective attention.

  6. Contributions of Sensory Coding and Attentional Control to Individual Differences in Performance in Spatial Auditory Selective Attention Tasks.

    Science.gov (United States)

    Dai, Lengshi; Shinn-Cunningham, Barbara G

    2016-01-01

    Listeners with normal hearing thresholds (NHTs) differ in their ability to steer attention to whatever sound source is important. This ability depends on top-down executive control, which modulates the sensory representation of sound in the cortex. Yet, this sensory representation also depends on the coding fidelity of the peripheral auditory system. Both of these factors may thus contribute to the individual differences in performance. We designed a selective auditory attention paradigm in which we could simultaneously measure envelope following responses (EFRs, reflecting peripheral coding), onset event-related potentials (ERPs) from the scalp (reflecting cortical responses to sound) and behavioral scores. We performed two experiments that varied stimulus conditions to alter the degree to which performance might be limited due to fine stimulus details vs. due to control of attentional focus. Consistent with past work, in both experiments we find that attention strongly modulates cortical ERPs. Importantly, in Experiment I, where coding fidelity limits the task, individual behavioral performance correlates with subcortical coding strength (derived by computing how the EFR is degraded for fully masked tones compared to partially masked tones); however, in this experiment, the effects of attention on cortical ERPs were unrelated to individual subject performance. In contrast, in Experiment II, where sensory cues for segregation are robust (and thus less of a limiting factor on task performance), inter-subject behavioral differences correlate with subcortical coding strength. In addition, after factoring out the influence of subcortical coding strength, behavioral differences are also correlated with the strength of attentional modulation of ERPs. These results support the hypothesis that behavioral abilities amongst listeners with NHTs can arise due to both subcortical coding differences and differences in attentional control, depending on stimulus characteristics

  7. Contributions of sensory coding and attentional control to individual differences in performance in spatial auditory selective attention tasks

    Directory of Open Access Journals (Sweden)

    Lengshi Dai

    2016-10-01

    Listeners with normal hearing thresholds differ in their ability to steer attention to whatever sound source is important. This ability depends on top-down executive control, which modulates the sensory representation of sound in cortex. Yet, this sensory representation also depends on the coding fidelity of the peripheral auditory system. Both of these factors may thus contribute to the individual differences in performance. We designed a selective auditory attention paradigm in which we could simultaneously measure envelope following responses (EFRs), reflecting peripheral coding; onset event-related potentials from the scalp (ERPs), reflecting cortical responses to sound; and behavioral scores. We performed two experiments that varied stimulus conditions to alter the degree to which performance might be limited due to fine stimulus details vs. due to control of attentional focus. Consistent with past work, in both experiments we find that attention strongly modulates cortical ERPs. Importantly, in Experiment I, where coding fidelity limits the task, individual behavioral performance correlates with subcortical coding strength (derived by computing how the EFR is degraded for fully masked tones compared to partially masked tones); however, in this experiment, the effects of attention on cortical ERPs were unrelated to individual subject performance. In contrast, in Experiment II, where sensory cues for segregation are robust (and thus less of a limiting factor on task performance), inter-subject behavioral differences correlate with subcortical coding strength. In addition, after factoring out the influence of subcortical coding strength, behavioral differences are also correlated with the strength of attentional modulation of ERPs. These results support the hypothesis that behavioral abilities amongst listeners with normal hearing thresholds can arise due to both subcortical coding differences and differences in attentional control, depending on

  8. Television system in which digitised picture signals subjected to a transform coding are transmitted from an encoding station to a decoding station

    NARCIS (Netherlands)

    1989-01-01

    In a television system a digital picture signal is subjected to a transform coding for the purpose of bit rate reduction. In order to detect motion

  9. Harmonization of nuclear codes and standards, pacific nuclear council working and task group report

    International Nuclear Information System (INIS)

    Dua, S.S.

    2006-01-01

    The codes and standards, both at the national and international level, have had a major impact on the industry worldwide and served it well in maintaining the performance and safety of nuclear reactors and facilities. The codes and standards, in general, are consensus documents and do seek public input at various levels before they are finalized and rolled out for use by the nuclear vendors, consultants, utilities and regulatory bodies. However, the extensive development of prescriptive national standards, if unchecked against the global environment and trade agreements (NAFTA, WTO, etc.), can also create barriers and make it difficult to compete in the world market. During the last decade, national and international standards-writing bodies have recognized these issues and are moving towards the rationalization and harmonization of their standards with the more widely accepted generic standards. The Pacific Nuclear Council (PNC) recognized the need for harmonization of the nuclear codes and standards for its member countries and formed a Task Group to achieve its objectives. The Task Group has a number of members from the PNC member countries. In 2005 PNC further raised the importance of this activity and formed a Working Group to cover a broader scope. The Working Group (WG) mandate is to identify and analyze the different codes and standards introduced in the Pacific Basin region, in order to achieve mutual understanding, harmonization and application in each country. This requires the WG to develop and encourage the use of reasonably consistent criteria for the design and development, engineering, procurement, fabrication, construction, testing, operations, maintenance, waste management, decommissioning and management of commercial nuclear power plants in the Pacific Basin, so as to promote consistent safety, quality, environmental and management standards for nuclear energy and other peaceful applications of nuclear

  10. The Morse code effect: A crystal-crystal transformation observed in gel-grown lead (II) oxalate crystals

    Science.gov (United States)

    Lisgarten, J. N.; Marks, J. A.

    2018-05-01

    This paper reports on an unusual crystal-crystal transformation phenomenon, which we have called the Morse Code Effect, based on the change in appearance of lead(II) oxalate crystals grown in agarose gels.

  11. Review of finite fields: Applications to discrete Fourier transforms and Reed-Solomon coding

    Science.gov (United States)

    Wong, J. S. L.; Truong, T. K.; Benjauthrit, B.; Mulhall, B. D. L.; Reed, I. S.

    1977-01-01

    An attempt is made to provide a step-by-step approach to the subject of finite fields. Rigorous proofs and highly theoretical materials are avoided. The simple concepts of groups, rings, and fields are discussed and developed more or less heuristically. Examples are used liberally to illustrate the meaning of definitions and theories. Applications include discrete Fourier transforms and Reed-Solomon coding.
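
    As a concrete illustration of the record's subject, the following is a minimal sketch of arithmetic in a prime field GF(p) and of a Fourier-like transform over it (a number-theoretic transform). The choice p = 17 and the length-8 transform are illustrative assumptions, not taken from the report.

```python
# A minimal sketch of arithmetic in a prime field GF(p); the record's
# Reed-Solomon application builds on exactly this kind of arithmetic.
p = 17  # a prime, so Z/pZ is a field (illustrative choice)

def add(a, b):  # addition modulo p
    return (a + b) % p

def mul(a, b):  # multiplication modulo p
    return (a * b) % p

def inv(a):     # multiplicative inverse via Fermat's little theorem: a^(p-2) mod p
    return pow(a, p - 2, p)

# A discrete Fourier transform over GF(p) ("number-theoretic transform"):
# 3 is a primitive root mod 17, so w = 3^((p-1)/n) is a primitive n-th root of unity.
def ntt(x, w):
    n = len(x)
    return [sum(x[j] * pow(w, i * j, p) for j in range(n)) % p for i in range(n)]

n = 8
w = pow(3, (p - 1) // n, p)                        # primitive 8th root of unity in GF(17)
X = ntt([1, 2, 3, 4, 0, 0, 0, 0], w)               # forward transform
x_back = [mul(v, inv(n)) for v in ntt(X, inv(w))]  # inverse transform recovers the input
print(X, x_back)
```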

  12. Design implications for task-specific search utilities for retrieval and re-engineering of code

    Science.gov (United States)

    Iqbal, Rahat; Grzywaczewski, Adam; Halloran, John; Doctor, Faiyaz; Iqbal, Kashif

    2017-05-01

    The importance of information retrieval systems is unquestionable in modern society, and both individuals and enterprises recognise the benefits of being able to find information effectively. Current code-focused information retrieval systems such as Google Code Search, Codeplex or Koders produce results based on specific keywords. However, these systems do not take into account developers' context, such as the development language, technology framework, goal of the project, project complexity and the developer's domain expertise. They also impose an additional cognitive burden on users in switching between different interfaces and clicking through to find the relevant code. Hence, they are not used by software developers. In this paper, we discuss how software engineers interact with information and general-purpose information retrieval systems (e.g. Google, Yahoo!) and investigate to what extent domain-specific search and recommendation utilities can be developed in order to support their work-related activities. In order to investigate this, we conducted a user study and found that software engineers followed many identifiable and repeatable work tasks and behaviours. These behaviours can be used to develop implicit relevance feedback-based systems based on the observed retention actions. Moreover, we discuss the implications for the development of task-specific search and collaborative recommendation utilities embedded with the Google standard search engine and Microsoft IntelliSense for the retrieval and re-engineering of code. Based on implicit relevance feedback, we have implemented a prototype of the proposed collaborative recommendation system, which was evaluated in a controlled environment simulating the real-world situation of professional software engineers. The evaluation has achieved promising initial results on the precision and recall performance of the system.

  13. Towards Sustaining Levels of Reflective Learning: How Do Transformational Leadership, Task Interdependence, and Self-Efficacy Shape Teacher Learning in Schools?

    Directory of Open Access Journals (Sweden)

    Arnoud Oude Groote Beverborg

    2015-03-01

    Full Text Available Whereas cross-sectional research has shown that transformational leadership, task interdependence, and self-efficacy are positively related to teachers' engagement in reflective learning activities, the causal direction of these relations needs further inquiry. At the same time, individual teacher learning might play a mutual role in strengthening school-level capacity for sustained improvement. Building on previous research, this longitudinal study therefore examines how transformational leadership, task interdependence, self-efficacy, and teachers' engagement in self-reflection mutually affect each other over time. Questionnaire data gathered on three measurement occasions from 655 Dutch Vocational Education and Training teachers were analyzed using a multivariate Latent Difference Score model. Results indicate that self-reflection and task interdependence reciprocally influence each other's change. A considerate and stimulating transformational leader was found to contribute to this process. Change in self-efficacy was influenced by self-reflection, indicating that learning leads to competency beliefs. Together, the findings point to the important role transformational leadership practices play in facilitating teamwork and sustaining teachers' levels of learning in schools.

  14. Summary report for ITER task - D10: Update and implementation of neutron transport and activation codes and processed libraries

    International Nuclear Information System (INIS)

    Attaya, H.

    1995-01-01

    The primary goal of this task is to provide capabilities in the activation code RACC to treat pulsed operation modes. In addition, it is required that the code utilize the same spatial mesh and geometrical models as employed in the one- or multidimensional neutron transport codes used in the ITER design. This would ensure the use of the same neutron flux generated by those codes to calculate the different activation parameters. It is also required to have capabilities for generating graphical outputs for the calculated activation parameters.

  15. Discrete Ramanujan transform for distinguishing the protein coding regions from other regions.

    Science.gov (United States)

    Hua, Wei; Wang, Jiasong; Zhao, Jian

    2014-01-01

    Based on the study of the Ramanujan sum and the Ramanujan coefficient, this paper introduces the concepts of the discrete Ramanujan transform and spectrum. Using the Voss numerical representation, one maps a symbolic DNA strand to a numerical DNA sequence and deduces the discrete Ramanujan spectrum of the numerical DNA sequence. It is well known that the discrete Fourier power spectrum of a protein coding sequence has an important feature of 3-base periodicity, which is widely used for DNA sequence analysis by the technique of the discrete Fourier transform. The analysis is performed by testing the signal-to-noise ratio at frequency N/3 as a criterion, where N is the length of the sequence. The results presented in this paper show that the property of 3-base periodicity can be identified as a prominent spike of the discrete Ramanujan spectrum at period 3 for the protein coding regions. The signal-to-noise ratio for the discrete Ramanujan spectrum is defined for numerical measurement. Therefore, the discrete Ramanujan spectrum and the signal-to-noise ratio of a DNA sequence can be used for distinguishing the protein coding regions from the noncoding regions. All the exon and intron sequences in whole chromosomes 1, 2, 3 and 4 of Caenorhabditis elegans have been tested, and the histograms and tables from the computational results illustrate the reliability of our method. In addition, we have shown theoretically that the algorithm for calculating the discrete Ramanujan spectrum has lower computational complexity and higher computational accuracy. The computational experiments show that the technique of using the discrete Ramanujan spectrum for classifying different DNA sequences is a fast and effective method. Copyright © 2014 Elsevier Ltd. All rights reserved.
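
    For comparison, here is a minimal sketch of the classic DFT-based 3-base periodicity test the record refers to, using the Voss indicator sequences; the toy sequences and the peak-over-mean definition of the signal-to-noise ratio are illustrative assumptions.

```python
import numpy as np

def voss_spectrum(seq):
    """Sum of DFT power spectra of the four Voss indicator sequences."""
    N = len(seq)
    S = np.zeros(N)
    for base in "ACGT":
        u = np.array([1.0 if c == base else 0.0 for c in seq])
        S += np.abs(np.fft.fft(u)) ** 2
    return S

def snr_at_third(seq):
    """Signal-to-noise ratio at frequency N/3 (peak over average power)."""
    S = voss_spectrum(seq)
    N = len(seq)
    return S[N // 3] / S[1:N // 2].mean()

# A codon-repeat toy "exon" shows a strong N/3 peak; a random strand does not.
rng = np.random.default_rng(0)
exon_like = "ATG" * 100
random_like = "".join(rng.choice(list("ACGT"), size=300))
print(snr_at_third(exon_like), snr_at_third(random_like))
```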

  16. A Complete Video Coding Chain Based on Multi-Dimensional Discrete Cosine Transform

    Directory of Open Access Journals (Sweden)

    T. Fryza

    2010-09-01

    Full Text Available The paper deals with a video compression method based on the multi-dimensional discrete cosine transform. In the text, the encoder and decoder architectures, including the definitions of all mathematical operations such as the forward and inverse 3-D DCT, quantization and thresholding, are presented. According to the particular number of currently processed pictures, new quantization tables and entropy code dictionaries are proposed in the paper. The practical properties of the 3-D DCT coding chain compared with modern video compression methods (such as H.264 and WebM) and the computing complexity are presented as well. It is shown that the best compression properties are achieved by the more complex H.264 codec; on the other hand, the computing complexity, especially on the encoding side, is lower for the 3-D DCT method.
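
    A minimal sketch of the 3-D DCT coding step described above: transform a group of frames, keep only the large coefficients, and invert. The block of random frames, the number of stacked pictures, and the threshold are illustrative assumptions, not the paper's actual parameters.

```python
import numpy as np
from scipy.fft import dctn, idctn

frames = np.random.rand(8, 16, 16)            # 8 frames of 16x16 "video"
coeffs = dctn(frames, norm="ortho")           # forward 3-D DCT (separable)

threshold = 0.5
sparse = np.where(np.abs(coeffs) >= threshold, coeffs, 0.0)  # crude thresholding
kept = np.count_nonzero(sparse) / coeffs.size

recon = idctn(sparse, norm="ortho")           # inverse 3-D DCT
mse = np.mean((frames - recon) ** 2)
print(f"kept {kept:.1%} of coefficients, MSE = {mse:.4f}")
```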

  17. Dynamic spatial coding within the dorsal frontoparietal network during a visual search task.

    Directory of Open Access Journals (Sweden)

    Wieland H Sommer

    Full Text Available To what extent are the left and right visual hemifields spatially coded in the dorsal frontoparietal attention network? In many experiments with neglect patients, the left hemisphere shows a contralateral hemifield preference, whereas the right hemisphere represents both hemifields. This pattern of spatial coding is often used to explain the right-hemispheric dominance of lesions causing hemispatial neglect. However, the pathophysiological mechanisms of hemispatial neglect are controversial because recent experiments on healthy subjects produced conflicting results regarding the spatial coding of visual hemifields. We used an fMRI paradigm that allowed us to distinguish two attentional subprocesses during a visual search task. Either within the left or right hemifield, subjects first attended to stationary locations (spatial orienting) and then shifted their attentional focus to search for a target line. Dynamic changes in spatial coding of the left and right hemifields were observed within subregions of the dorsal frontoparietal network: during stationary spatial orienting, we found the well-known spatial pattern described above, with a bilateral hemifield representation in the right hemisphere and a contralateral preference in the left hemisphere. However, during search, the right hemisphere had a contralateral preference and the left hemisphere equally represented both hemifields. This finding leads to novel perspectives regarding models of visuospatial attention and hemispatial neglect.

  18. FFT-BM, Code Accuracy Evaluations with the 1D Fast Fourier Transform (FFT) Methodology

    International Nuclear Information System (INIS)

    D'Auria, F.

    2004-01-01

    1 - Description of program or function: FFT-BM is an integrated version of the programs package performing code accuracy evaluations with the 1D Fast Fourier Transform (FFT) methodology. It contains two programs: - CASEM: Takes care of the complete manipulation of data in order to evaluate the quantities through which the FFT method quantifies the code accuracy. - AAWFTO: Completes the evaluation of the average accuracy (AA) and related weighted frequency (WF) values in order to obtain the AAtot and WFtot values characterising the global calculation performance. 2 - Methods: The Fast Fourier Transform, or FFT, which is based on the Fourier analysis method, is an optimised method for calculating the amplitude vs. frequency of functions or experimental or computed data. In order to apply this methodology, after selecting the parameters to be analyzed, it is necessary to choose the following: - number of curves (exp + calc) to be analyzed; - number of time windows to be analyzed; - sampling frequency; - cut-off frequency; - time begin and time end of each time window. 3 - Restrictions on the complexity of the problem: Up to 30 curves (exp + calc) and 5 time windows may be analyzed.
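
    A minimal sketch of the two figures of merit named above, the average accuracy (AA) and weighted frequency (WF), computed with NumPy's FFT. The exponential test signals and the exact normalization conventions are assumptions rather than the package's actual implementation.

```python
import numpy as np

def fftbm(exp_sig, calc_sig, dt):
    """AA and WF from the amplitude spectra of the error and experimental signals."""
    err = calc_sig - exp_sig
    f = np.fft.rfftfreq(len(exp_sig), dt)
    E = np.abs(np.fft.rfft(err))
    X = np.abs(np.fft.rfft(exp_sig))
    AA = E.sum() / X.sum()          # dimensionless accuracy: smaller is better
    WF = (f * E).sum() / E.sum()    # where in frequency the error concentrates
    return AA, WF

t = np.linspace(0.0, 10.0, 1000)
exp_sig = np.exp(-0.3 * t)                           # "experimental" trend
calc_sig = np.exp(-0.3 * t) + 0.02 * np.sin(8 * t)   # code result with a small error
print(fftbm(exp_sig, calc_sig, t[1] - t[0]))
```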

  19. A code to compute the action-angle transformation for a particle in an arbitrary potential well

    International Nuclear Information System (INIS)

    Berg, J.S.; Warnock, R.L.

    1995-01-01

    For a Vlasov treatment of longitudinal stability under an arbitrary wake field, with the solution of the Haissinski equation as the unperturbed distribution, it is important to have the action-angle transformation for the distorted potential well in a convenient form. The authors have written a code that gives the transformation q,p → J, φ, with q(J,φ) as a Fourier series in φ, the Fourier coefficients and the Hamiltonian H(J) being spline functions of J in C^2 (having continuous second derivatives).
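
    A minimal sketch of the first half of such a transformation, computing the action J(E) for a particle of unit mass in a given well by direct quadrature; the quartic potential and the grid resolution are illustrative assumptions, and the angle variable and Fourier coefficients produced by the full code are not reproduced here.

```python
import numpy as np

def action(E, V, qmin, qmax, n=20001):
    """J(E) = (1/pi) * integral of sqrt(2*(E - V(q))) dq between the turning points."""
    q = np.linspace(qmin, qmax, n)
    p = np.sqrt(np.maximum(2.0 * (E - V(q)), 0.0))   # momentum; clipped outside the well
    dq = q[1] - q[0]
    return np.sum(0.5 * (p[:-1] + p[1:])) * dq / np.pi  # trapezoidal rule

V = lambda q: 0.25 * q**4          # illustrative quartic potential well
E = 1.0
qt = (4.0 * E) ** 0.25             # turning point where V(q) = E
print(action(E, V, -qt, qt))
```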

  20. Visualizing code and coverage changes for code review

    NARCIS (Netherlands)

    Oosterwaal, Sebastiaan; van Deursen, A.; De Souza Coelho, R.; Sawant, A.A.; Bacchelli, A.

    2016-01-01

    One of the tasks of reviewers is to verify that code modifications are well tested. However, current tools offer little support in understanding precisely how changes to the code relate to changes to the tests. In particular, it is hard to see whether (modified) test code covers the changed code.

  1. Harmonization of nuclear codes and standards. Pacific nuclear council working and task group report

    International Nuclear Information System (INIS)

    Dua, Shami

    2008-01-01

    Nuclear codes and standards have been an integral part of the nuclear industry since its inception. As the industry came into the mainstream over the second half of the 20th century, a number of national and international standards were developed to support each specific nuclear reactor concept. These codes and standards have been a key component of the industry's focus on nuclear safety, reliability and quality. Both national and international standards have served the industry well in obtaining public, shareholder and regulatory acceptance. The existing suite of national and international standards is required to support the emerging nuclear renaissance. However, as noted under the Pacific Nuclear Council (PNC), Multinational Design Evaluation Program (MDEP) and SMiRT discussions, the time has now come for the codes and standards writing bodies and the industry to take the next step and examine the relevance of the existing suite in view of current needs and challenges. This review must account for the changing global environment, including the global supply chain and regulatory framework, resources, deregulation, free trade, and the industry's need for competitiveness and performance excellence. The Task Group (TG) has made limited progress in this review period, as no additional information on the listing of codes and standards has been received from the members. However, the TG Chair has been successful in obtaining considerable interest from some additional individuals from the member countries. It is important that PNC management seek additional participation from the member countries and ask for their active engagement in the Working Group (WG) and TG activities to achieve its mandate and deliverables. The harmonization of codes and standards is a key area for the emerging nuclear renaissance, and as noted by a number of international organizations (refer to the MDEP action noted above), these tasks cannot be completed unless we have the right level of resources and

  2. Evaluating the ONEBFP transport code for possible use in the proton radiography program. Final report, Task 47

    International Nuclear Information System (INIS)

    Marr, D.R.; Prael, R.E.; Adams, K.J.

    1996-10-01

    This is notification of the completion of Task 47 and a summary of the fulfillment of the requirements thereof. Deliverables for Task 47 include the data test files and a final report. The test files have been delivered to the customer and the attached paper satisfies the requirements for a final report. Detail on the completion of each of the subtasks described in the Statement of Work follow. The author repeats the complete list of subtasks for Task 47: (1) The software engineer will modify the ONEBFP code to generate a logarithmic distribution of discrete angles and an associated set of quadrature weights; (2) The software engineer will work with Group XTM personnel to obtain the required cross-section data for protons/nuclear cascade particles; and (3) The software engineer will perform 5 test calculations using the modified ONEBFP code to assess its accuracy and efficiency for proton transport problems. The test calculations will be documented in a brief report. Appendix C of the paper describes the quadrature set capability installed in the ONEBFP code pertinent to the fulfillment of subtask 1. A portion of the body of the paper describes the source and modeling and Appendix A describes the extraction of the cross section data used in this study, fulfilling subtask 2. The bulk of the attached report describes the test problems, states the modeling used for each problem, shows the results in both graphical and tabular form, and discusses the implications of the results. This fulfills the requirements of subtask 3

  3. An overview of the activities of the OECD/NEA Task Force on adapting computer codes in nuclear applications to parallel architectures

    Energy Technology Data Exchange (ETDEWEB)

    Kirk, B.L. [Oak Ridge National Lab., TN (United States); Sartori, E. [OCDE/OECD NEA Data Bank, Issy-les-Moulineaux (France); Viedma, L.G. de [Consejo de Seguridad Nuclear, Madrid (Spain)

    1997-06-01

    Subsequent to the introduction of High Performance Computing in the developed countries, the Organization for Economic Cooperation and Development/Nuclear Energy Agency (OECD/NEA) created the Task Force on Adapting Computer Codes in Nuclear Applications to Parallel Architectures (under the guidance of the Nuclear Science Committee's Working Party on Advanced Computing) to study the growth area in supercomputing and its applicability to the nuclear community's computer codes. The result has been four years of investigation for the Task Force in different subject fields - deterministic and Monte Carlo radiation transport, computational mechanics and fluid dynamics, nuclear safety, atmospheric models and waste management.

  4. An overview of the activities of the OECD/NEA Task Force on adapting computer codes in nuclear applications to parallel architectures

    International Nuclear Information System (INIS)

    Kirk, B.L.; Sartori, E.; Viedma, L.G. de

    1997-01-01

    Subsequent to the introduction of High Performance Computing in the developed countries, the Organization for Economic Cooperation and Development/Nuclear Energy Agency (OECD/NEA) created the Task Force on Adapting Computer Codes in Nuclear Applications to Parallel Architectures (under the guidance of the Nuclear Science Committee's Working Party on Advanced Computing) to study the growth area in supercomputing and its applicability to the nuclear community's computer codes. The result has been four years of investigation for the Task Force in different subject fields - deterministic and Monte Carlo radiation transport, computational mechanics and fluid dynamics, nuclear safety, atmospheric models and waste management

  5. Interframe transform coding of picture data

    Science.gov (United States)

    Ahmed, N.; Natarajan, T. R.

    1976-01-01

    This semi-tutorial paper describes the process of using orthogonal transforms for the purposes of encoding TV picture data. Results pertaining to a 6:1 data compression experiment using the Walsh-Hadamard transform are included.
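
    A minimal sketch of the kind of Walsh-Hadamard block coding experiment the paper describes: transform an 8x8 block, keep a quarter of the coefficients, and invert. The block content, block size, and 4:1 retention ratio are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import hadamard

n = 8
H = hadamard(n) / np.sqrt(n)          # orthonormal Hadamard matrix (scaled +-1 entries)

block = np.random.rand(n, n)
coeffs = H @ block @ H.T              # separable 2-D forward transform

# keep the 16 largest-magnitude coefficients out of 64 (4:1 "compression")
cutoff = np.sort(np.abs(coeffs).ravel())[-16]
kept = np.where(np.abs(coeffs) >= cutoff, coeffs, 0.0)

recon = H.T @ kept @ H                # inverse transform (H is orthogonal)
print("MSE:", np.mean((block - recon) ** 2))
```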

  6. Offshore Code Comparison Collaboration within IEA Wind Task 23: Phase IV Results Regarding Floating Wind Turbine Modeling; Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Jonkman, J.; Larsen, T.; Hansen, A.; Nygaard, T.; Maus, K.; Karimirad, M.; Gao, Z.; Moan, T.; Fylling, I.

    2010-04-01

    Offshore wind turbines are designed and analyzed using comprehensive simulation codes that account for the coupled dynamics of the wind inflow, aerodynamics, elasticity, and controls of the turbine, along with the incident waves, sea current, hydrodynamics, and foundation dynamics of the support structure. This paper describes the latest findings of the code-to-code verification activities of the Offshore Code Comparison Collaboration, which operates under Subtask 2 of the International Energy Agency Wind Task 23. In the latest phase of the project, participants used an assortment of codes to model the coupled dynamic response of a 5-MW wind turbine installed on a floating spar buoy in 320 m of water. Code predictions were compared from load-case simulations selected to test different model features. The comparisons have resulted in a greater understanding of offshore floating wind turbine dynamics and modeling techniques, and better knowledge of the validity of various approximations. The lessons learned from this exercise have improved the participants' codes, thus improving the standard of offshore wind turbine modeling.

  7. Optimized nonorthogonal transforms for image compression.

    Science.gov (United States)

    Guleryuz, O G; Orchard, M T

    1997-01-01

    The transform coding of images is analyzed from a common standpoint in order to generate a framework for the design of optimal transforms. It is argued that all transform coders are alike in the way they manipulate the data structure formed by transform coefficients. A general energy compaction measure is proposed to generate optimized transforms with desirable characteristics particularly suited to the simple transform coding operation of scalar quantization and entropy coding. It is shown that the optimal linear decoder (inverse transform) must be an optimal linear estimator, independent of the structure of the transform generating the coefficients. A formulation that sequentially optimizes the transforms is presented, and design equations and algorithms for its computation are provided. The properties of the resulting transform systems are investigated. In particular, it is shown that the resulting bases are nonorthogonal and complete, producing energy-compaction-optimized, decorrelated transform coefficients. Quantization issues related to nonorthogonal expansion coefficients are addressed with a simple, efficient algorithm. Two implementations are discussed, and image coding examples are given. It is shown that the proposed design framework results in systems with superior energy compaction properties and excellent coding results.
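
    To make the energy-compaction idea concrete, here is a minimal sketch that estimates a Karhunen-Loeve basis from a sample covariance and measures how the coefficient variances concentrate. The AR(1) source model and dimensions are assumptions, and this is the classical orthogonal KLT rather than the paper's optimized nonorthogonal design.

```python
import numpy as np

rng = np.random.default_rng(1)
N, n = 5000, 8
rho = 0.95
cov = rho ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))  # AR(1) model
x = rng.multivariate_normal(np.zeros(n), cov, size=N)               # sample vectors

C = np.cov(x, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(C)   # KLT basis = eigenvectors of the covariance
y = x @ eigvecs                        # decorrelated transform coefficients

var = np.sort(np.var(y, axis=0))[::-1]
print("fraction of energy in top 2 of 8 coefficients:", var[:2].sum() / var.sum())
```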

  8. Task-set inertia and memory-consolidation bottleneck in dual tasks.

    Science.gov (United States)

    Koch, Iring; Rumiati, Raffaella I

    2006-11-01

    Three dual-task experiments examined the influence of processing a briefly presented visual object for deferred verbal report on performance in an unrelated auditory-manual reaction time (RT) task. RT was increased at short stimulus-onset asynchronies (SOAs) relative to long SOAs, showing that memory consolidation processes can produce a functional processing bottleneck in dual-task performance. In addition, the experiments manipulated the spatial compatibility of the orientation of the visual object and the side of the speeded manual response. This cross-task compatibility produced relative RT benefits only when the instruction for the visual task emphasized overlap at the level of response codes across the task sets (Experiment 1). However, once the effective task set was in place, it continued to produce cross-task compatibility effects even in single-task situations ("ignore" trials in Experiment 2) and when instructions for the visual task did not explicitly require spatial coding of object orientation (Experiment 3). Taken together, the data suggest a considerable degree of task-set inertia in dual-task performance, which is also reinforced by finding costs of switching task sequences (e.g., AC --> BC vs. BC --> BC) in Experiment 3.

  9. A systematic literature review of automated clinical coding and classification systems.

    Science.gov (United States)

    Stanfill, Mary H; Williams, Margaret; Fenton, Susan H; Jenders, Robert A; Hersh, William R

    2010-01-01

    Clinical coding and classification processes transform natural language descriptions in clinical text into data that can subsequently be used for clinical care, research, and other purposes. This systematic literature review examined studies that evaluated all types of automated coding and classification systems to determine the performance of such systems. Studies indexed in Medline or other relevant databases prior to March 2009 were considered. The 113 studies included in this review show that automated tools exist for a variety of coding and classification purposes, focus on various healthcare specialties, and handle a wide variety of clinical document types. Automated coding and classification systems themselves are not generalizable, nor are the results of the studies evaluating them. Published research shows these systems hold promise, but these data must be considered in context, with performance relative to the complexity of the task and the desired outcome.

  10. Novel Polynomial Basis with Fast Fourier Transform and Its Application to Reed-Solomon Erasure Codes

    KAUST Repository

    Lin, Sian-Jheng

    2016-09-13

    In this paper, we present a fast Fourier transform (FFT) algorithm over extension binary fields, where the polynomial is represented in a non-standard basis. The proposed Fourier-like transform requires O(h lg(h)) field operations, where h is the number of evaluation points. Based on the proposed Fourier-like algorithm, we then develop the encoding/decoding algorithms for (n = 2^m, k) Reed-Solomon erasure codes. The proposed encoding/erasure decoding algorithm requires O(n lg(n)) operations in both additive and multiplicative complexities. As the complexity leading factor is small, the proposed algorithms are advantageous in practical applications. Finally, approaches to convert between the monomial basis and the new basis are proposed.
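
    A minimal sketch of the Reed-Solomon erasure principle behind the record: a codeword is a polynomial evaluated at n points, and any k surviving evaluations recover the k message symbols by interpolation. For brevity this uses naive O(n^2) arithmetic in the prime field GF(257) rather than the paper's O(n lg n) transform over extension binary fields.

```python
p = 257  # prime > 256, so any byte value is a field element (illustrative choice)

def encode(data, n):
    """Evaluate the degree-(k-1) polynomial with coefficients `data` at n points."""
    return [sum(c * pow(i, j, p) for j, c in enumerate(data)) % p for i in range(n)]

def decode(points, k):
    """Recover the k coefficients from any k surviving (i, y) evaluations."""
    coeffs = [0] * k
    for i, (xi, yi) in enumerate(points):
        # Lagrange basis l_i(x) = prod (x - xj)/(xi - xj), built as a coefficient list
        li = [1]
        denom = 1
        for j, (xj, _) in enumerate(points):
            if j == i:
                continue
            li = [(a - xj * b) % p for a, b in zip([0] + li, li + [0])]  # li *= (x - xj)
            denom = denom * (xi - xj) % p
        scale = yi * pow(denom, p - 2, p) % p
        for d in range(k):
            coeffs[d] = (coeffs[d] + scale * li[d]) % p
    return coeffs

data = [10, 20, 30, 40]                            # k = 4 message symbols
code = encode(data, 7)                             # n = 7 codeword symbols
survivors = [(i, code[i]) for i in (0, 2, 5, 6)]   # any 4 of 7 suffice
print(decode(survivors, 4) == data)
```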

  11. Modification of the fast fourier transform-based method by signal mirroring for accuracy quantification of thermal-hydraulic system code

    Energy Technology Data Exchange (ETDEWEB)

    Ha, Tae Wook; Jeong, Jae Jun [School of Mechanical Engineering, Pusan National University, Busan (Korea, Republic of); Choi, Ki Yong [Korea Atomic Energy Research Institute (KAERI), Daejeon (Korea, Republic of)

    2017-08-15

    A thermal-hydraulic system code is an essential tool for the design and safety analysis of a nuclear power plant, and its accuracy quantification is very important for code assessment and applications. The fast Fourier transform-based method (FFTBM) by signal mirroring (FFTBM-SM) has been used to quantify the accuracy of a system code through a comparison of the experimental data and the calculated results. The method is an improved version of the FFTBM, and it is known that the FFTBM-SM judges the code accuracy in a more consistent and unbiased way. However, in some applications, unrealistic results have been obtained. In this study, it was found that accuracy quantification by FFTBM-SM depends on the frequency spectrum of the fast Fourier transform of the experimental and error signals. The primary objective of this study is to reduce the frequency dependency of the FFTBM-SM evaluation. To this end, it was proposed to reduce the cutoff frequency in FFTBM-SM, which was introduced to remove spurious contributions. A method to determine an appropriate cutoff frequency was also proposed. The FFTBM-SM with the modified cutoff frequency showed a significant improvement in the accuracy quantification.
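
    A minimal sketch of the signal-mirroring idea at the heart of FFTBM-SM: the signal is extended with its time-reversed copy before the FFT, removing the artificial discontinuity between the last and first samples that otherwise spreads spurious energy across the spectrum. The exponential test signal is an illustrative assumption.

```python
import numpy as np

def mirrored_spectrum(x):
    xm = np.concatenate([x, x[::-1]])      # signal followed by its mirror image
    return np.abs(np.fft.rfft(xm))

t = np.linspace(0.0, 1.0, 512, endpoint=False)
x = np.exp(2.0 * t)                        # strongly non-periodic "transient" signal
plain = np.abs(np.fft.rfft(x))
mirrored = mirrored_spectrum(x)
# high-frequency leakage is much weaker in the mirrored spectrum
print(plain[-5:].round(3), mirrored[-5:].round(3))
```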

  12. Modification of the fast fourier transform-based method by signal mirroring for accuracy quantification of thermal-hydraulic system code

    International Nuclear Information System (INIS)

    Ha, Tae Wook; Jeong, Jae Jun; Choi, Ki Yong

    2017-01-01

    A thermal-hydraulic system code is an essential tool for the design and safety analysis of a nuclear power plant, and its accuracy quantification is very important for code assessment and applications. The fast Fourier transform-based method (FFTBM) by signal mirroring (FFTBM-SM) has been used to quantify the accuracy of a system code through a comparison of the experimental data and the calculated results. The method is an improved version of the FFTBM, and it is known that the FFTBM-SM judges the code accuracy in a more consistent and unbiased way. However, in some applications, unrealistic results have been obtained. In this study, it was found that accuracy quantification by FFTBM-SM depends on the frequency spectrum of the fast Fourier transform of the experimental and error signals. The primary objective of this study is to reduce the frequency dependency of the FFTBM-SM evaluation. To this end, it was proposed to reduce the cutoff frequency in FFTBM-SM, which was introduced to remove spurious contributions. A method to determine an appropriate cutoff frequency was also proposed. The FFTBM-SM with the modified cutoff frequency showed a significant improvement in the accuracy quantification.

  13. Information security using multiple reference-based optical joint transform correlation and orthogonal code

    Science.gov (United States)

    Nazrul Islam, Mohammed; Karim, Mohammad A.; Vijayan Asari, K.

    2013-09-01

    Protecting and processing confidential information, such as personal identification and biometrics, remains a challenging task for further research and development. A new methodology to ensure enhanced security of information in images through the use of encryption and multiplexing is proposed in this paper. We use an orthogonal encoding scheme to encode multiple pieces of information independently and then combine them together to save storage space and transmission bandwidth. The encoded and multiplexed image is encrypted employing multiple reference-based joint transform correlation. The encryption key is fed into four channels which are relatively phase shifted by different amounts. The input image is introduced to all the channels and then Fourier transformed to obtain joint power spectra (JPS) signals. The resultant JPS signals are again phase-shifted and then combined to form a modified JPS signal, which yields the encrypted image after an inverse Fourier transformation. The proposed cryptographic system makes the confidential information absolutely inaccessible to any unauthorized intruder, while allowing for the retrieval of the information by the respective authorized recipient without any distortion. The proposed technique is investigated through computer simulations under different practical conditions in order to verify its overall robustness.
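
    A minimal sketch of the orthogonal-encoding and multiplexing step only (the optical joint-transform-correlation encryption stage is not reproduced): two images are spread with orthogonal Walsh-type codes, summed into one array, and recovered by correlation. The image sizes and code length are assumptions.

```python
import numpy as np
from scipy.linalg import hadamard

H = hadamard(4)
code_a, code_b = H[1], H[2]                  # orthogonal +-1 codes (dot product 0)

img_a = np.random.rand(8, 8)
img_b = np.random.rand(8, 8)

# spread each image along a new "chip" axis and add them together
multiplexed = img_a[..., None] * code_a + img_b[..., None] * code_b

# despread: correlate with each code and normalize by its energy
rec_a = multiplexed @ code_a / (code_a @ code_a)
rec_b = multiplexed @ code_b / (code_b @ code_b)
print(np.allclose(rec_a, img_a), np.allclose(rec_b, img_b))
```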

  14. Code of Ethics for the American Association of Physicists in Medicine: report of Task Group 109.

    Science.gov (United States)

    Serago, Christopher F; Adnani, Nabil; Bank, Morris I; BenComo, Jose A; Duan, Jun; Fairobent, Lynne; Freedman, D Jay; Halvorsen, Per H; Hendee, William R; Herman, Michael G; Morse, Richard K; Mower, Herbert W; Pfeiffer, Douglas E; Root, William J; Sherouse, George W; Vossler, Matthew K; Wallace, Robert E; Walters, Barbara

    2009-01-01

    A comprehensive Code of Ethics for the members of the American Association of Physicists in Medicine (AAPM) is presented as the report of Task Group 109, which consolidates previous AAPM ethics policies into a unified document. The membership of the AAPM is increasingly diverse. Prior existing AAPM ethics policies were applicable specifically to medical physicists, and did not encompass other types of members such as health physicists, regulators, corporate affiliates, physicians, scientists, engineers, those in training, or other health care professionals. Prior AAPM ethics policies did not specifically address research, education, or business ethics. The Ethics Guidelines of this new Code of Ethics have four major sections: professional conduct, research ethics, education ethics, and business ethics. Some elements of each major section may be duplicated in other sections, so that readers interested in a particular aspect of the code do not need to read the entire document for all relevant information. The prior Complaint Procedure has also been incorporated into this Code of Ethics. This Code of Ethics (PP 24-A) replaces the following AAPM policies: Ethical Guidelines for Vacating a Position (PP 4-B); Ethical Guidelines for Reviewing the Work of Another Physicist (PP 5-C); Guidelines for Ethical Practice for Medical Physicists (PP 8-D); and Ethics Complaint Procedure (PP 21-A). The AAPM Board of Directors approved this Code of Ethics on July 31, 2008.

  15. Offshore code comparison collaboration continuation within IEA Wind Task 30: Phase II results regarding a floating semisubmersible wind system

    DEFF Research Database (Denmark)

    Robertson, Amy; Jonkman, Jason M.; Vorpahl, Fabian

    2014-01-01

    Offshore wind turbines are designed and analyzed using comprehensive simulation tools (or codes) that account for the coupled dynamics of the wind inflow, aerodynamics, elasticity, and controls of the turbine, along with the incident waves, sea current, hydrodynamics, mooring dynamics, and foundation dynamics of the support structure. This paper describes the latest findings of the code-to-code verification activities of the Offshore Code Comparison Collaboration Continuation project, which operates under the International Energy Agency Wind Task 30. In the latest phase of the project, participants used an assortment of simulation codes to model the coupled dynamic response of a 5-MW wind turbine installed on a floating semisubmersible in 200 m of water. Code predictions were compared from load case simulations selected to test different model features. The comparisons have resulted in a greater understanding of offshore floating wind turbine dynamics and modeling techniques, and better knowledge of the validity of various approximations.

  16. Task representation in individual and joint settings

    Directory of Open Access Journals (Sweden)

    Wolfgang ePrinz

    2015-05-01

    Full Text Available This paper outlines a framework for task representation and discusses applications to interference tasks in individual and joint settings. The framework is derived from the Theory of Event Coding. This theory regards task sets as transient assemblies of event codes in which stimulus and response codes interact and shape each other in particular ways. On the one hand, stimulus and response codes compete with each other within their respective subsets (horizontal interactions). On the other hand, stimulus and response codes cooperate with each other (vertical interactions). Code interactions instantiating competition and cooperation apply to two time scales: on-line performance (i.e., doing the task) and off-line implementation (i.e., setting the task). Interference arises when stimulus and response codes overlap in features that are irrelevant for stimulus identification, but relevant for response selection. To resolve this dilemma, the feature profiles of event codes may become restructured in various ways. The framework is applied to three kinds of interference paradigms. Special emphasis is given to joint settings where tasks are shared between two participants. Major conclusions derived from these applications include: (1) Response competition is the chief driver of interference. Likewise, different modes of response competition give rise to different patterns of interference. (2) The type of features in which stimulus and response codes overlap is also a crucial factor. Different types of such features likewise give rise to different patterns of interference. (3) Task sets for joint settings conflate intraindividual conflicts between responses (what) with interindividual conflicts between responding agents (whom). Features of response codes may, therefore, not only address responses, but also responding agents (both physically and socially).

  17. Task representation in individual and joint settings

    Science.gov (United States)

    Prinz, Wolfgang

    2015-01-01

    This paper outlines a framework for task representation and discusses applications to interference tasks in individual and joint settings. The framework is derived from the Theory of Event Coding (TEC). This theory regards task sets as transient assemblies of event codes in which stimulus and response codes interact and shape each other in particular ways. On the one hand, stimulus and response codes compete with each other within their respective subsets (horizontal interactions). On the other hand, stimulus and response codes cooperate with each other (vertical interactions). Code interactions instantiating competition and cooperation apply to two time scales: on-line performance (i.e., doing the task) and off-line implementation (i.e., setting the task). Interference arises when stimulus and response codes overlap in features that are irrelevant for stimulus identification, but relevant for response selection. To resolve this dilemma, the feature profiles of event codes may become restructured in various ways. The framework is applied to three kinds of interference paradigms. Special emphasis is given to joint settings where tasks are shared between two participants. Major conclusions derived from these applications include: (1) Response competition is the chief driver of interference. Likewise, different modes of response competition give rise to different patterns of interference; (2) The type of features in which stimulus and response codes overlap is also a crucial factor. Different types of such features likewise give rise to different patterns of interference; and (3) Task sets for joint settings conflate intraindividual conflicts between responses (what), with interindividual conflicts between responding agents (whom). Features of response codes may, therefore, not only address responses, but also responding agents (both physically and socially). PMID:26029085

  18. Error-correction coding

    Science.gov (United States)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.

  19. Television system in which digitised picture signals subjected to a transform coding are transmitted from an encoding station to a decoding station

    NARCIS (Netherlands)

    1987-01-01

    In a television system a digital picture signal is subjected to a transform coding for the purpose of bit rate reduction. In order to detect motion effects between the two fields of a picture, these fields are also examined in a motion detector 8310. If no motion is detected, an intraframe transform is applied.

  20. An Integrated Model of Cognitive Control in Task Switching

    Science.gov (United States)

    Altmann, Erik M.; Gray, Wayne D.

    2008-01-01

    A model of cognitive control in task switching is developed in which controlled performance depends on the system maintaining access to a code in episodic memory representing the most recently cued task. The main constraint on access to the current task code is proactive interference from old task codes. This interference and the mechanisms that…

  1. New binary linear codes which are dual transforms of good codes

    NARCIS (Netherlands)

    Jaffe, D.B.; Simonis, J.

    1999-01-01

    If C is a binary linear code, one may choose a subset S of C, and form a new code CST which is the row space of the matrix having the elements of S as its columns. One way of picking S is to choose a subgroup H of Aut(C) and let S be some H-stable subset of C. Using (primarily) this method for choosing S, new binary linear codes are obtained.

  2. ZENO: N-body and SPH Simulation Codes

    Science.gov (United States)

    Barnes, Joshua E.

    2011-02-01

    The ZENO software package integrates N-body and SPH simulation codes with a large array of programs to generate initial conditions and analyze numerical simulations. Written in C, the ZENO system is portable between Mac, Linux, and Unix platforms. It is in active use at the Institute for Astronomy (IfA), at NRAO, and possibly elsewhere. Zeno programs can perform a wide range of simulation and analysis tasks. While many of these programs were first created for specific projects, they embody algorithms of general applicability and embrace a modular design strategy, so existing code is easily applied to new tasks. Major elements of the system include: Structured data file utilities facilitate basic operations on binary data, including import/export of ZENO data to other systems. Snapshot generation routines create particle distributions with various properties. Systems with user-specified density profiles can be realized in collisionless or gaseous form; multiple spherical and disk components may be set up in mutual equilibrium. Snapshot manipulation routines permit the user to sift, sort, and combine particle arrays, translate and rotate particle configurations, and assign new values to data fields associated with each particle. Simulation codes include both pure N-body and combined N-body/SPH programs: pure N-body codes are available in both uniprocessor and parallel versions; SPH codes offer a wide range of options for gas physics, including isothermal, adiabatic, and radiating models. Snapshot analysis programs calculate temporal averages, evaluate particle statistics, measure shapes and density profiles, compute kinematic properties, and identify and track objects in particle distributions. Visualization programs generate interactive displays and produce still images and videos of particle distributions; the user may specify arbitrary color schemes and viewing transformations.

  3. Entanglement-assisted quantum MDS codes from negacyclic codes

    Science.gov (United States)

    Lu, Liangdong; Li, Ruihu; Guo, Luobin; Ma, Yuena; Liu, Yang

    2018-03-01

    The entanglement-assisted formalism generalizes the standard stabilizer formalism; it can transform arbitrary classical linear codes into entanglement-assisted quantum error-correcting codes (EAQECCs) by using pre-shared entanglement between the sender and the receiver. In this work, we construct six classes of q-ary entanglement-assisted quantum MDS (EAQMDS) codes based on classical negacyclic MDS codes by exploiting two or more pre-shared maximally entangled states. We show that two of these six classes of q-ary EAQMDS codes have minimum distance larger than q + 1. Most of these q-ary EAQMDS codes are new in the sense that their parameters are not covered by the codes available in the literature.

  4. Color coding of televised task elements in remote work: a literature review with practical recommendations for a fuel reprocessing facility

    International Nuclear Information System (INIS)

    Clarke, M.M.; Preston-Anderson, A.

    1981-11-01

    The experimental literature on the effects of color in visual displays was reviewed, with particular reference to the performance of remote work in a Hot Experimental Facility (HEF) using real-scene closed-circuit television systems. It was also reviewed with more general reference to the broader range of work-related issues of operator learning and preference, and display specifications. Color has been shown to enhance the performance of tasks requiring search and location, and may also enhance tracking/transportation tasks. However, both HEF large-volume searching and tracking can be computer augmented, alleviating some of the necessity for a color code to assist an operator. Although color enhances long-term memory and is preferred to black and white displays, it has not been shown to have a specific advantage in the performance of unique tasks (where computer augmentation is more problematic and visual input to the operator is critical). Practical display specifications are discussed with reference to hue and size of color code, target size, ambient illumination, multiple displays, and coatings. The authors conclude that the disadvantages of color television in the HEF far outweigh any possible advantages and recommend the use of high-resolution black and white systems, unless future experiments unequivocally indicate that (1) color is superior to black and white for in-situ task performance or (2) it is imperative in terms of long-range psychological well-being.

  5. Second-order statistics of colour codes modulate transformations that effectuate varying degrees of scene invariance and illumination invariance.

    Science.gov (United States)

    Mausfeld, Rainer; Andres, Johannes

    2002-01-01

    We argue, from an ethology-inspired perspective, that the internal concepts 'surface colours' and 'illumination colours' are part of the data format of two different representational primitives. Thus, the internal concept of 'colour' is not a unitary one but rather refers to two different types of 'data structure', each with its own proprietary types of parameters and relations. The relation of these representational structures is modulated by a class of parameterised transformations whose effects are mirrored in the idealised computational achievements of illumination invariance of colour codes, on the one hand, and scene invariance, on the other hand. Because the same characteristics of a light array reaching the eye can be physically produced in many different ways, the visual system has to make an 'inference' as to whether a chromatic deviation of the space-averaged colour codes from the neutral point is due to a 'non-normal', ie chromatic, illumination or due to an imbalanced spectral reflectance composition. We provide evidence that the visual system uses second-order statistics of the chromatic codes of a single view of a scene in order to modulate the corresponding transformations. In our experiments we used centre-surround configurations with inhomogeneous surrounds given by a random structure of overlapping circles, referred to as Seurat configurations. Each family of surrounds has a fixed space-average of colour codes, but differs with respect to the covariance matrix of the colour codes of pixels, which defines the chromatic variance along some chromatic axis and the covariance between luminance and chromatic channels. We found that the dominant wavelengths of red-green equilibrium settings of the infield exhibited a stable and strong dependence on the chromatic variance of the surround. High variances resulted in a tendency towards 'scene invariance', low variances in a tendency towards 'illumination invariance' of the infield.
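
    A minimal sketch of the second-order statistics in question: the space-averaged colour code, the covariance matrix of per-pixel codes, and the chromatic variance along a chosen axis. The three-channel random surround and the axis are illustrative stand-ins for the actual Seurat stimuli.

```python
import numpy as np

rng = np.random.default_rng(2)
# illustrative "Seurat-like" surround: 10000 pixels with three-channel colour codes
pixels = rng.normal(loc=[0.5, 0.5, 0.2], scale=[0.05, 0.2, 0.1], size=(10000, 3))

mean_code = pixels.mean(axis=0)               # space-averaged colour code
cov_code = np.cov(pixels, rowvar=False)       # covariance matrix of colour codes

# chromatic variance along a chosen chromatic axis (unit vector, an assumption)
axis = np.array([0.0, 1.0, 0.0])
chromatic_var = axis @ cov_code @ axis
print(mean_code.round(3), chromatic_var.round(4))
```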

  6. LOW COMPLEXITY HYBRID LOSSY TO LOSSLESS IMAGE CODER WITH COMBINED ORTHOGONAL POLYNOMIALS TRANSFORM AND INTEGER WAVELET TRANSFORM

    Directory of Open Access Journals (Sweden)

    R. Krishnamoorthy

    2012-05-01

    Full Text Available In this paper, a new lossy-to-lossless image coding scheme combining an Orthogonal Polynomials Transform and an Integer Wavelet Transform is proposed. The Lifting Scheme based Integer Wavelet Transform (LS-IWT) is first applied to the image in order to reduce blocking artifacts and memory demand. The Embedded Zerotree Wavelet (EZW) subband coding algorithm is used in this proposed work for progressive image coding, which achieves efficient bit rate reduction. The computational complexity of lower subband coding of the EZW algorithm is reduced in this proposed work with a new integer-based Orthogonal Polynomials transform coding. Normalization and mapping are performed on the subband of the image to exploit the subjective redundancy, and the zerotree structure is obtained for EZW coding, so the computational complexity is greatly reduced in this proposed work. The experimental results of the proposed technique also show that efficient bit rate reduction is achieved for both lossy and lossless compression when compared with existing techniques.
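
    A minimal sketch of one lifting-scheme integer wavelet step (the Haar-like S-transform) of the sort LS-IWT builds on: integer input maps to integer subbands, and the inverse reconstructs the input exactly, which is what makes a lossless mode possible. The specific filter is an illustrative choice, not necessarily the one used in the paper.

```python
import numpy as np

def forward(x):
    even, odd = x[0::2].astype(int), x[1::2].astype(int)
    high = odd - even                 # predict step
    low = even + (high >> 1)          # update step (arithmetic shift = floor division)
    return low, high

def inverse(low, high):
    even = low - (high >> 1)          # undo the update
    odd = high + even                 # undo the prediction
    out = np.empty(2 * len(low), dtype=int)
    out[0::2], out[1::2] = even, odd
    return out

x = np.array([12, 14, 200, 202, 7, 9, 50, 40])
low, high = forward(x)
print(low, high, np.array_equal(inverse(low, high), x))  # exact reconstruction
```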

  7. Scalable-to-lossless transform domain distributed video coding

    DEFF Research Database (Denmark)

    Huang, Xin; Ukhanova, Ann; Veselov, Anton

    2010-01-01

    Distributed video coding (DVC) is a novel approach providing new features such as low-complexity encoding, by mainly exploiting the source statistics at the decoder based on the availability of decoder side information. In this paper, scalable-to-lossless DVC is presented, based on extending a lossy transform domain Wyner-Ziv (TDWZ) video codec; the lossless codec provides frame-by-frame encoding. Comparing the lossless coding efficiency, the proposed scalable-to-lossless TDWZ video codec can save up to 5%-13% bits compared to JPEG LS and H.264 Intra frame lossless coding, and does so as a scalable-to-lossless coding.

  8. Orthogonal transformations for change detection, Matlab code

    DEFF Research Database (Denmark)

    2005-01-01

    Matlab code to do multivariate alteration detection (MAD) analysis, maximum autocorrelation factor (MAF) analysis, canonical correlation analysis (CCA) and principal component analysis (PCA) on image data.

  9. An Effective Transform Unit Size Decision Method for High Efficiency Video Coding

    Directory of Open Access Journals (Sweden)

    Chou-Chen Wang

    2014-01-01

    Full Text Available High efficiency video coding (HEVC) is the latest video coding standard. HEVC can achieve higher compression performance than previous standards, such as MPEG-4, H.263, and H.264/AVC. However, HEVC requires enormous computational complexity in the encoding process due to its quadtree structure. In order to reduce the computational burden of the HEVC encoder, an early transform unit (TU) decision algorithm (ETDA) based on the number of nonzero DCT coefficients (called NNZ-ETDA) is adopted to prune the residual quadtree (RQT) at an early stage and accelerate the encoding process. However, the NNZ-ETDA cannot effectively reduce the computational load for sequences with active motion or rich texture. Therefore, in order to further improve the performance of NNZ-ETDA, we propose an adaptive RQT-depth decision for NNZ-ETDA (called ARD-NNZ-ETDA) by exploiting the characteristics of high temporal-spatial correlation that exist in natural video sequences. Simulation results show that the proposed method can achieve a time improving ratio (TIR) of about 61.26%~81.48% when compared to the HEVC test model 8.1 (HM 8.1), with insignificant loss of image quality. Compared with the NNZ-ETDA, the proposed method can further achieve an average TIR of about 8.29%~17.92%.
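
    A minimal sketch of the NNZ criterion itself: quantize a residual block's DCT and prune the RQT search early when almost no coefficients survive. The block size, quantization step, and the threshold of two nonzero coefficients are illustrative assumptions, not HM 8.1 behavior.

```python
import numpy as np
from scipy.fft import dctn

def nnz(residual, qstep):
    """Number of nonzero quantized DCT coefficients of a residual block."""
    coeffs = dctn(residual, norm="ortho")
    return np.count_nonzero(np.round(coeffs / qstep))

def split_further(residual, qstep, max_nnz=2):
    """Heuristic early TU decision: don't split if the block is nearly all zeros."""
    return nnz(residual, qstep) > max_nnz

flat = np.random.randn(8, 8) * 0.1        # small residual: likely pruned early
busy = np.random.randn(8, 8) * 10.0       # active residual: keep searching the RQT
print(split_further(flat, qstep=1.0), split_further(busy, qstep=1.0))
```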

  10. Discussion on LDPC Codes and Uplink Coding

    Science.gov (United States)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the progress of the workgroup on Low-Density Parity-Check (LDPC) codes for space link coding. The workgroup is tasked with developing and recommending new error correcting codes for near-Earth, Lunar, and deep space applications. Included in the presentation is a summary of the technical progress of the workgroup. Charts that show the LDPC decoder sensitivity to symbol scaling errors are reviewed, as well as a chart showing the performance of several frame synchronizer algorithms compared to that of some good codes, and LDPC decoder tests at ESTL. Also reviewed is a study on Coding, Modulation, and Link Protocol (CMLP), and the recommended codes. A design for the Pseudo-Randomizer with LDPC Decoder and CRC is also reviewed. A chart that summarizes the three proposed coding systems is also presented.

  11. 3-D spherical harmonics code FFT3 by the finite Fourier transformation method

    International Nuclear Information System (INIS)

    Kobayashi, K.

    1997-01-01

    In the odd-order spherical harmonics method, the rigorous boundary condition at material interfaces is that the even moments of the angular flux and the normal components of the even-order moments of the current vectors must be continuous. However, it is difficult to derive spatially discretized equations by the finite difference or finite element methods which satisfy this material interface condition. It is shown that, using the finite Fourier transformation method, spatially discretized equations which satisfy this interface condition can be easily derived. The discrepancies in the flux distribution near void regions between spherical harmonics method codes may be due to differences in the application of the material interface condition. (author)

  12. Imperative-program transformation by instrumented-interpreter specialization

    DEFF Research Database (Denmark)

    Debois, Søren

    2008-01-01

    We describe how to implement strength reduction, loop-invariant code motion and loop quasi-invariant code motion by specializing instrumented interpreters. To curb the code duplication intrinsic to such specialization, we introduce a new program transformation, rewinding, which uses Moore-automata minimization.
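
    For concreteness, here is a minimal before/after sketch of the loop-invariant code motion effect such specialization is meant to achieve, written directly as a source-to-source pair in Python; `expensive` is a hypothetical pure, loop-invariant function.

```python
def expensive(k):
    # hypothetical pure function whose value does not change inside the loop
    return sum(i * i for i in range(k))

def before(xs, k):
    out = []
    for x in xs:
        out.append(x * expensive(k))   # expensive(k) recomputed every iteration
    return out

def after(xs, k):
    inv = expensive(k)                 # hoisted: computed once outside the loop
    out = []
    for x in xs:
        out.append(x * inv)
    return out

assert before([1, 2, 3], 100) == after([1, 2, 3], 100)  # same result, less work
```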

  13. The role and tasks of industrial hygienists in occupational and environmental medicine and their code of ethics

    Directory of Open Access Journals (Sweden)

    Jan Grzesik

    2012-12-01

    Full Text Available The paper considers changes in occupational medicine over the last fifty years, describes industrial hygienists' tasks and the reasons why their activities have grown in importance, and discusses the need for compliance with their own professional Code of Ethics. The Universal Declaration of Human Rights, voted and accepted by the United Nations in 1948, changed the strategic target of occupational medicine. Since then, the most important task has been the prevention of health damage caused by work, which should enable employees to stay healthy throughout the whole period of professional activity. Before that, the main target was to restore the health of employees injured by work. For the preventive measures used to be effective, they must be selected appropriately for the occupational hazards threatening employees' health. This requires revealing all potentially harmful factors occurring in the work environment, measuring their concentrations or intensities, determining the employees' exposure to those factors, and estimating the level of health risk caused by this complex exposure. Occupational medicine services now also extend their preventive supervision to municipal environments, because these have become seriously polluted by the emission of harmful industrial pollutants, which has a negative impact on the health of exposed dwellers. Those activities, being to a large extent outside the scope of competence and tasks of doctors specializing in occupational medicine, are performed by industrial hygienists, who acquired the required knowledge and skills during university studies in technical and natural science faculties. This has caused a substantial increase in the role of industrial hygienists in the present activity of occupational medicine services and has simultaneously brought the ethical aspects of the work of these professionals into consideration. Evaluation of the backbone and the scope of work drew attention not only to the

  14. Bayesian analogy with relational transformations.

    Science.gov (United States)

    Lu, Hongjing; Chen, Dawn; Holyoak, Keith J

    2012-07-01

    How can humans acquire relational representations that enable analogical inference and other forms of high-level reasoning? Using comparative relations as a model domain, we explore the possibility that bottom-up learning mechanisms applied to objects coded as feature vectors can yield representations of relations sufficient to solve analogy problems. We introduce Bayesian analogy with relational transformations (BART) and apply the model to the task of learning first-order comparative relations (e.g., larger, smaller, fiercer, meeker) from a set of animal pairs. Inputs are coded by vectors of continuous-valued features, based either on human magnitude ratings, normed feature ratings (De Deyne et al., 2008), or outputs of the topics model (Griffiths, Steyvers, & Tenenbaum, 2007). Bootstrapping from empirical priors, the model is able to induce first-order relations represented as probabilistic weight distributions, even when given positive examples only. These learned representations allow classification of novel instantiations of the relations and yield a symbolic distance effect of the sort obtained with both humans and other primates. BART then transforms its learned weight distributions by importance-guided mapping, thereby placing distinct dimensions into correspondence. These transformed representations allow BART to reliably solve 4-term analogies (e.g., larger:smaller::fiercer:meeker), a type of reasoning that is arguably specific to humans. Our results provide a proof-of-concept that structured analogies can be solved with representations induced from unstructured feature vectors by mechanisms that operate in a largely bottom-up fashion. We discuss potential implications for algorithmic and neural models of relational thinking, as well as for the evolution of abstract thought. Copyright 2012 APA, all rights reserved.

  15. Approximating the Analytic Fourier Transform with the Discrete Fourier Transform

    OpenAIRE

    Axelrod, Jeremy

    2015-01-01

    The Fourier transform is approximated over a finite domain using a Riemann sum. This Riemann sum is then expressed in terms of the discrete Fourier transform, which allows the sum to be computed with a fast Fourier transform algorithm more rapidly than via a direct matrix multiplication. Advantages and limitations of using this method to approximate the Fourier transform are discussed, and prototypical MATLAB codes implementing the method are presented.
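
    A minimal NumPy sketch of the same idea (the paper's prototypical codes are in MATLAB): under the convention F(k) = integral of f(x) exp(-2 pi i k x) dx, the Riemann sum over a grid x_n = x0 + n*dx reduces to a DFT multiplied by dx and a phase factor accounting for x0.

        # Riemann-sum approximation of the Fourier transform via the FFT,
        # checked against a Gaussian whose transform is known analytically.
        import numpy as np

        N, dx, x0 = 1024, 0.05, -25.6          # grid: x0 + n*dx, n = 0..N-1
        x = x0 + dx * np.arange(N)
        f = np.exp(-np.pi * x**2)              # test signal with known FT

        k = np.fft.fftfreq(N, d=dx)            # frequencies of the DFT bins
        F = dx * np.exp(-2j * np.pi * k * x0) * np.fft.fft(f)

        # The analytic transform of exp(-pi x^2) is exp(-pi k^2).
        print(np.max(np.abs(F - np.exp(-np.pi * k**2))))   # ~1e-15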

  16. Nonlinear QR code based optical image encryption using spiral phase transform, equal modulus decomposition and singular value decomposition

    Science.gov (United States)

    Kumar, Ravi; Bhaduri, Basanta; Nishchal, Naveen K.

    2018-01-01

    In this study, we propose a quick response (QR) code based nonlinear optical image encryption technique using spiral phase transform (SPT), equal modulus decomposition (EMD) and singular value decomposition (SVD). First, the primary image is converted into a QR code and then multiplied with a spiral phase mask (SPM). Next, the product is spiral phase transformed with a particular spiral phase function, and further, EMD is performed on the output of the SPT, which results in two complex images, Z1 and Z2. Among these, Z1 is further Fresnel propagated with distance d, and Z2 is reserved as a decryption key. Afterwards, SVD is performed on the Fresnel-propagated output to get three decomposed matrices, i.e. one diagonal matrix and two unitary matrices. The two unitary matrices are modulated with two different SPMs and then the inverse SVD is performed using the diagonal matrix and modulated unitary matrices to get the final encrypted image. Numerical simulation results confirm the validity and effectiveness of the proposed technique. The proposed technique is robust against noise attack, specific attack, and brute-force attack. Simulation results are presented in support of the proposed idea.
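
    The sketch below illustrates only the SVD stage of such a scheme, under simplifying assumptions: random phase masks stand in for the spiral phase masks, and the SPT, EMD and Fresnel-propagation steps are omitted.

        # Simplified sketch of the SVD stage: decompose a complex field,
        # modulate the unitary factors with random phase masks (stand-ins
        # for the SPMs), recompose, and invert with the masks as keys.
        import numpy as np

        rng = np.random.default_rng(1)
        n = 64
        field = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

        U, s, Vh = np.linalg.svd(field)
        spm1 = np.exp(2j * np.pi * rng.random((n, n)))   # phase mask 1
        spm2 = np.exp(2j * np.pi * rng.random((n, n)))   # phase mask 2

        Ue, Ve = U * spm1, Vh * spm2                     # modulated factors
        encrypted = Ue @ np.diag(s) @ Ve                 # "inverse SVD"

        # Decryption keys: the masks and the factor split.
        recovered = (Ue / spm1) @ np.diag(s) @ (Ve / spm2)
        print(np.allclose(recovered, field))             # True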

  17. Parallelization of one image compression method. Wavelet, Transform, Vector Quantization and Huffman Coding

    International Nuclear Information System (INIS)

    Moravie, Philippe

    1997-01-01

    Today, in the digitized satellite image domain, the need for high-dimension images is increasing considerably. To transmit or to store such images (more than 6000 by 6000 pixels), we need to reduce their data volume, and so we have to use real-time image compression techniques. The large amount of computation required by image compression algorithms prohibits the use of common sequential processors, to the benefit of parallel computers. The study presented here deals with the parallelization of a very efficient image compression scheme based on three techniques: Wavelet Transform (WT), Vector Quantization (VQ) and Entropic Coding (EC). First, we studied and implemented the parallelism of each algorithm, in order to determine the architectural characteristics needed for real-time image compression. Then, we defined eight parallel architectures: 3 for the Mallat algorithm (WT), 3 for Tree-Structured Vector Quantization (VQ) and 2 for Huffman Coding (EC). As our system has to be multi-purpose, we chose 3 global architectures from all of the 3x3x2 systems available. Because, for technological reasons, real time is not always reached (for all the compression parameter combinations), we also defined and evaluated two algorithmic optimizations: fixed-point precision and merging entropy coding into vector quantization. As a result, we defined a new multi-purpose multi-SMIMD parallel machine, able to compress digitized satellite images in real time. The definition of the best-suited architecture for real-time image compression was answered by presenting 3 parallel machines, among which one is multi-purpose, embedded, and might be used for other applications on board. (author) [fr]
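
    Of the three stages, the entropy-coding stage is the easiest to sketch in isolation. The following is a minimal Huffman coder in Python (the paper's implementations are parallel and in a compiled language; byte values stand in for VQ indices).

        # Minimal Huffman coder for the EC stage; byte values stand in
        # for the VQ indices that the real pipeline would entropy-code.
        import heapq
        from collections import Counter

        def huffman_code(data: bytes) -> dict:
            """Build a prefix code {symbol: bit string} from symbol counts."""
            heap = [[count, i, {sym: ""}]
                    for i, (sym, count) in enumerate(Counter(data).items())]
            heapq.heapify(heap)
            next_id = len(heap)
            while len(heap) > 1:
                lo = heapq.heappop(heap)
                hi = heapq.heappop(heap)
                lo[2] = {s: "0" + c for s, c in lo[2].items()}
                hi[2] = {s: "1" + c for s, c in hi[2].items()}
                heapq.heappush(heap, [lo[0] + hi[0], next_id, {**lo[2], **hi[2]}])
                next_id += 1
            return heap[0][2]

        data = b"abracadabra"
        code = huffman_code(data)
        bits = "".join(code[b] for b in data)
        print(len(bits), "bits instead of", 8 * len(data))   # 23 vs 88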

  18. EEG Analysis during complex diagnostic tasks in Nuclear Power Plants - Simulator-based Experimental Study

    Energy Technology Data Exchange (ETDEWEB)

    Ha, Jun Su; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    2005-07-01

    In the literature there are many studies based on EEG signals recorded during human cognitive activities, but most of them dealt with simple activities such as transforming letters into Morse code, subtraction, reading, semantic memory search, visual search, and memorizing a set of words. In this work, EEG signals were analyzed during complex diagnostic tasks in an NPP simulator-based environment. The theta, alpha, beta, and gamma band EEG powers during the diagnostic tasks are investigated. The experimental design and procedure are presented in section 2 and the results are shown in section 3. Finally, some considerations are discussed and the direction for further work is proposed in section 4.

  19. EEG Analysis during complex diagnostic tasks in Nuclear Power Plants - Simulator-based Experimental Study

    International Nuclear Information System (INIS)

    Ha, Jun Su; Seong, Poong Hyun

    2005-01-01

    In the literature there are many studies based on EEG signals recorded during human cognitive activities, but most of them dealt with simple activities such as transforming letters into Morse code, subtraction, reading, semantic memory search, visual search, and memorizing a set of words. In this work, EEG signals were analyzed during complex diagnostic tasks in an NPP simulator-based environment. The theta, alpha, beta, and gamma band EEG powers during the diagnostic tasks are investigated. The experimental design and procedure are presented in section 2 and the results are shown in section 3. Finally, some considerations are discussed and the direction for further work is proposed in section 4.

  20. Cracking the code of oscillatory activity.

    Directory of Open Access Journals (Sweden)

    Philippe G Schyns

    2011-05-01

    Full Text Available Neural oscillations are ubiquitous measurements of cognitive processes and dynamic routing and gating of information. The fundamental and so far unresolved problem for neuroscience remains to understand how oscillatory activity in the brain codes information for human cognition. In a biologically relevant cognitive task, we instructed six human observers to categorize facial expressions of emotion while we measured the observers' EEG. We combined state-of-the-art stimulus control with statistical information theory analysis to quantify how the three parameters of oscillations (i.e., power, phase, and frequency) code the visual information relevant for behavior in a cognitive task. We make three points: First, we demonstrate that phase codes considerably more information (2.4 times) relating to the cognitive task than power. Second, we show that the conjunction of power and phase coding reflects detailed visual features relevant for behavioral response--that is, features of facial expressions predicted by behavior. Third, we demonstrate, in analogy to communication technology, that oscillatory frequencies in the brain multiplex the coding of visual features, increasing coding capacity. Together, our findings about the fundamental coding properties of neural oscillations will redirect the research agenda in neuroscience by establishing the differential role of frequency, phase, and amplitude in coding behaviorally relevant information in the brain.
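
    A toy numerical version of the paper's central comparison can be sketched as follows: synthetic trials in which the stimulus category shifts the oscillation's phase but not its amplitude, with a simple histogram plug-in estimator of mutual information. The signal parameters and estimator are illustrative assumptions, not the authors' pipeline.

        # Toy comparison: information about a binary stimulus category in
        # the phase versus the power of a band-limited signal.
        import numpy as np
        from scipy.signal import hilbert

        rng = np.random.default_rng(2)
        n_trials, fs, f0 = 400, 256, 10.0
        t = np.arange(fs) / fs
        labels = rng.integers(0, 2, n_trials)        # category per trial

        # Category shifts the oscillation's phase; amplitude is uninformative.
        phases = np.where(labels == 1, np.pi / 2, 0.0)
        trials = np.sin(2 * np.pi * f0 * t + phases[:, None]) \
                 + 0.5 * rng.standard_normal((n_trials, fs))

        analytic = hilbert(trials, axis=1)
        mid = fs // 2
        phase = np.angle(analytic[:, mid])           # phase at one time point
        power = np.abs(analytic[:, mid]) ** 2

        def mutual_info(x, y, bins=8):
            """Plug-in MI estimate (bits) between continuous x and binary y."""
            pxy, _, _ = np.histogram2d(x, y, bins=(bins, 2))
            pxy /= pxy.sum()
            px, py = pxy.sum(1, keepdims=True), pxy.sum(0, keepdims=True)
            nz = pxy > 0
            return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

        print("MI(phase; label) =", mutual_info(phase, labels))
        print("MI(power; label) =", mutual_info(power, labels))  # near zero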

  1. Analysis of quantum error-correcting codes: Symplectic lattice codes and toric codes

    Science.gov (United States)

    Harrington, James William

    Quantum information theory is concerned with identifying how quantum mechanical resources (such as entangled quantum states) can be utilized for a number of information processing tasks, including data storage, computation, communication, and cryptography. Efficient quantum algorithms and protocols have been developed for performing some tasks (e.g. , factoring large numbers, securely communicating over a public channel, and simulating quantum mechanical systems) that appear to be very difficult with just classical resources. In addition to identifying the separation between classical and quantum computational power, much of the theoretical focus in this field over the last decade has been concerned with finding novel ways of encoding quantum information that are robust against errors, which is an important step toward building practical quantum information processing devices. In this thesis I present some results on the quantum error-correcting properties of oscillator codes (also described as symplectic lattice codes) and toric codes. Any harmonic oscillator system (such as a mode of light) can be encoded with quantum information via symplectic lattice codes that are robust against shifts in the system's continuous quantum variables. I show the existence of lattice codes whose achievable rates match the one-shot coherent information over the Gaussian quantum channel. Also, I construct a family of symplectic self-dual lattices and search for optimal encodings of quantum information distributed between several oscillators. Toric codes provide encodings of quantum information into two-dimensional spin lattices that are robust against local clusters of errors and which require only local quantum operations for error correction. Numerical simulations of this system under various error models provide a calculation of the accuracy threshold for quantum memory using toric codes, which can be related to phase transitions in certain condensed matter models. I also present

  2. Limitations in dual-task performance

    OpenAIRE

    Pannebakker, Merel Mathilde

    2009-01-01

    In this thesis, the effect of information-processing overload on working-memory dependent information processing was examined using dual-task paradigms. The experiments described strengthen the importance of a functional explanation for dual-task limitations. First, it showed evidence for a unified coding medium (as put forward in the theory of event coding; Hommel, Müsseler, Aschersleben, & Prinz, 2001) in which features, operations and responses are available and can influence each other. A...

  3. Hadamard Transforms

    CERN Document Server

    Agaian, Sos; Egiazarian, Karen; Astola, Jaakko

    2011-01-01

    The Hadamard matrix and Hadamard transform are fundamental problem-solving tools in a wide spectrum of scientific disciplines and technologies, such as communication systems, signal and image processing (signal representation, coding, filtering, recognition, and watermarking), digital logic (Boolean function analysis and synthesis), and fault-tolerant system design. Hadamard Transforms intends to bring together different topics concerning current developments in Hadamard matrices, transforms, and their applications. Each chapter begins with the basics of the theory, progresses to more advanced
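
    A minimal sketch of the transform at the heart of these applications, the O(N log N) fast Walsh-Hadamard butterfly (natural ordering), assuming an input length that is a power of two:

        # Fast Walsh-Hadamard transform (returns a new array); since
        # H_N * H_N = N * I, applying it twice and dividing by N inverts it.
        import numpy as np

        def fwht(a):
            """Fast Walsh-Hadamard transform; len(a) must be a power of 2."""
            a = np.asarray(a, dtype=float).copy()
            h = 1
            while h < len(a):
                for i in range(0, len(a), 2 * h):
                    x, y = a[i:i + h].copy(), a[i + h:i + 2 * h].copy()
                    a[i:i + h], a[i + h:i + 2 * h] = x + y, x - y
                h *= 2
            return a

        v = np.array([1.0, 0.0, 1.0, 0.0, 0.0, 1.0, 1.0, 0.0])
        print(fwht(v))                   # transform coefficients
        print(fwht(fwht(v)) / len(v))    # recovers v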

  4. An electrophysiological study of task demands on concreteness effects: evidence for dual coding theory.

    Science.gov (United States)

    Welcome, Suzanne E; Paivio, Allan; McRae, Ken; Joanisse, Marc F

    2011-07-01

    We examined ERP responses during the generation of word associates or mental images in response to concrete and abstract concepts. Of interest were the predictions of dual coding theory (DCT), which proposes that processing lexical concepts depends on functionally independent but interconnected verbal and nonverbal systems. ERP responses were time-locked to either stimulus onset or response to compensate for potential latency differences across conditions. During word associate generation, but not mental imagery, concrete items elicited a greater N400 than abstract items. A concreteness effect emerged at a later time point during the mental imagery task. Data were also analyzed using time-frequency analysis that investigated synchronization of neuronal populations over time during processing. Concrete words elicited an enhanced late going desynchronization of theta-band power (723-938 ms post stimulus onset) during associate generation. During mental imagery, abstract items elicited greater delta-band power from 800 to 1,000 ms following stimulus onset, theta-band power from 350 to 205 ms before response, and alpha-band power from 900 to 800 ms before response. Overall, the findings support DCT in suggesting that lexical concepts are not amodal and that concreteness effects are modulated by tasks that focus participants on verbal versus nonverbal, imagery-based knowledge.

  5. COMPOSE-HPC: A Transformational Approach to Exascale

    Energy Technology Data Exchange (ETDEWEB)

    Bernholdt, David E [ORNL; Allan, Benjamin A. [Sandia National Laboratories (SNL); Armstrong, Robert C. [Sandia National Laboratories (SNL); Chavarria-Miranda, Daniel [Pacific Northwest National Laboratory (PNNL); Dahlgren, Tamara L. [Lawrence Livermore National Laboratory (LLNL); Elwasif, Wael R [ORNL; Epperly, Tom [Lawrence Livermore National Laboratory (LLNL); Foley, Samantha S [ORNL; Hulette, Geoffrey C. [Sandia National Laboratories (SNL); Krishnamoorthy, Sriram [Pacific Northwest National Laboratory (PNNL); Prantl, Adrian [Lawrence Livermore National Laboratory (LLNL); Panyala, Ajay [Louisiana State University; Sottile, Matthew [Galois, Inc.

    2012-04-01

    The goal of the COMPOSE-HPC project is to 'democratize' tools for automatic transformation of program source code so that it becomes tractable for the developers of scientific applications to create and use their own transformations reliably and safely. This paper describes our approach to this challenge, the creation of the KNOT tool chain, which includes tools for the creation of annotation languages to control the transformations (PAUL), for performing the transformations (ROTE), and for optimization and code generation (BRAID), which can be used individually and in combination. We also provide examples of current and future uses of the KNOT tools, which include transforming code to use different programming models and environments, providing tests that can be used to detect errors in software or its execution, as well as composition of software written in different programming languages, or with different threading patterns.
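
    The KNOT tools themselves are not reproduced here, but the kind of automatic source-to-source transformation they mechanize can be illustrated with Python's ast module; the function names below are hypothetical.

        # Toy source-to-source transformation: every call to `log(...)`
        # is rewritten to `log_checked(...)` by walking the syntax tree.
        import ast

        class RenameCalls(ast.NodeTransformer):
            """Rewrite log(...) calls to log_checked(...)."""
            def visit_Call(self, node):
                self.generic_visit(node)
                if isinstance(node.func, ast.Name) and node.func.id == "log":
                    node.func = ast.copy_location(
                        ast.Name("log_checked", ast.Load()), node.func)
                return node

        src = "def run(x):\n    log(x)\n    return x + 1\n"
        tree = RenameCalls().visit(ast.parse(src))
        print(ast.unparse(ast.fix_missing_locations(tree)))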

  6. Coding in pigeons: Multiple-coding versus single-code/default strategies.

    Science.gov (United States)

    Pinto, Carlos; Machado, Armando

    2015-05-01

    To investigate the coding strategies that pigeons may use in temporal discrimination tasks, pigeons were trained on a matching-to-sample procedure with three sample durations (2s, 6s and 18s) and two comparisons (red and green hues). One comparison was correct following 2-s samples and the other was correct following both 6-s and 18-s samples. Tests were then run to contrast the predictions of two hypotheses concerning the pigeons' coding strategies, the multiple-coding and the single-code/default. According to the multiple-coding hypothesis, three response rules are acquired, one for each sample. According to the single-code/default hypothesis, only two response rules are acquired, one for the 2-s sample and a "default" rule for any other duration. In retention interval tests, pigeons preferred the "default" key, a result predicted by the single-code/default hypothesis. In no-sample tests, pigeons preferred the key associated with the 2-s sample, a result predicted by multiple-coding. Finally, in generalization tests, when the sample duration equaled 3.5s, the geometric mean of 2s and 6s, pigeons preferred the key associated with the 6-s and 18-s samples, a result predicted by the single-code/default hypothesis. The pattern of results suggests the need for models that take into account multiple sources of stimulus control. © Society for the Experimental Analysis of Behavior.

  7. Mattson Solomon transform and algebra codes

    DEFF Research Database (Denmark)

    Martínez-Moro, E.; Benito, Diego Ruano

    2009-01-01

    In this note we review some results of the first author on the structure of codes defined as subalgebras of a commutative semisimple algebra over a finite field (see Martínez-Moro in Algebra Discrete Math. 3:99-112, 2007). Generator theory and those aspects related to the theory of Gröbner bases ...

  8. Majorana fermion codes

    International Nuclear Information System (INIS)

    Bravyi, Sergey; Terhal, Barbara M; Leemhuis, Bernhard

    2010-01-01

    We initiate the study of Majorana fermion codes (MFCs). These codes can be viewed as extensions of Kitaev's one-dimensional (1D) model of unpaired Majorana fermions in quantum wires to higher spatial dimensions and interacting fermions. The purpose of MFCs is to protect quantum information against low-weight fermionic errors, that is, operators acting on sufficiently small subsets of fermionic modes. We examine to what extent MFCs can surpass qubit stabilizer codes in terms of their stability properties. A general construction of 2D MFCs is proposed that combines topological protection based on a macroscopic code distance with protection based on fermionic parity conservation. Finally, we use MFCs to show how to transform any qubit stabilizer code to a weakly self-dual CSS code.

  9. Double-digit coding of examination math problems

    Directory of Open Access Journals (Sweden)

    Agnieszka Sułowska

    2013-09-01

    Full Text Available Various methods are used worldwide to evaluate student solutions to examination tasks. Usually the results simply provide information about student competency and, after aggregation, are also used as a tool for making comparisons between schools. In particular, the standard evaluation methods do not allow conclusions to be drawn about possible improvements to teaching methods. There are, however, task assessment methods which allow description not only of student achievement but also of possible causes of failure. One such method, which can be applied to extended-response tasks, is double-digit coding, which has been used in some international educational research. This paper presents the first Polish experiences of applying this method to examination tasks in mathematics, using a special coding key to carry out the evaluation. Lessons learned during the coding key construction and its application in the assessment process are described.

  10. A code for leakage neutron spectra through thick shields

    International Nuclear Information System (INIS)

    Nagarajan, P.S.; Sethulakshmi, P.; Raghavendran, C.P.

    1975-01-01

    An exponential transform Monte Carlo code has been developed for deep penetration of neutrons, and the leakage neutron spectra produced by this code have been compared with those of a basic Monte Carlo code for small thicknesses. The development of the code and the optimisation of certain transform parameters are discussed, and results are presented for a few thick shields of concrete and water in the context of neutron monitoring in the environs of accelerator and reactor shields. (author)
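
    The exponential transform is a standard variance-reduction device, and its core can be sketched in one dimension: sample the free flight from a stretched exponential and carry the likelihood-ratio weight. The cross section, slab depth and biasing parameter below are illustrative, not the report's values.

        # Exponential transform (path-length biasing) for deep penetration:
        # estimate P(first flight > T) through a thick slab.
        import numpy as np

        rng = np.random.default_rng(3)
        sigma, T, n = 1.0, 20.0, 200_000       # cross section, slab depth
        exact = np.exp(-sigma * T)             # ~2e-9: hopeless for analog MC

        # Analog sampling: essentially no sample ever crosses the slab.
        analog = (rng.exponential(1 / sigma, n) > T).mean()

        # Biased sampling with reduced cross section sigma_b < sigma; each
        # sample carries weight f(d)/f_b(d) = (sigma/sigma_b) e^{-(sigma-sigma_b) d}.
        sigma_b = 0.1
        d = rng.exponential(1 / sigma_b, n)
        w = (sigma / sigma_b) * np.exp(-(sigma - sigma_b) * d)
        biased = np.where(d > T, w, 0.0).mean()

        print(f"exact {exact:.3e}  analog {analog:.3e}  biased {biased:.3e}")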

  11. Quantum BCH Codes Based on Spectral Techniques

    International Nuclear Information System (INIS)

    Guo Ying; Zeng Guihua

    2006-01-01

    When the time variable in quantum signal processing is discrete, the Fourier transform exists on the vector space of n-tuples over the Galois field F_2, which plays an important role in the investigation of quantum signals. By using Fourier transforms, the idea of quantum coding theory can be described in a setting that is much different from that seen thus far. Quantum BCH codes can be defined as codes whose quantum states have certain specified consecutive spectral components equal to zero, and the error-correcting ability is also described by the number of consecutive zeros. Moreover, the decoding of quantum codes can be described spectrally with more efficiency.

  12. Evidence for modality-independent order coding in working memory.

    Science.gov (United States)

    Depoorter, Ann; Vandierendonck, André

    2009-03-01

    The aim of the present study was to investigate the representation of serial order in working memory, more specifically whether serial order is coded by means of a modality-dependent or a modality-independent order code. This was investigated by means of a series of four experiments based on a dual-task methodology in which one short-term memory task was embedded between the presentation and recall of another short-term memory task. Two aspects were varied in these memory tasks--namely, the modality of the stimulus materials (verbal or visuo-spatial) and the presence of an order component in the task (an order or an item memory task). The results of this study showed impaired primary-task recognition performance when both the primary and the embedded task included an order component, irrespective of the modality of the stimulus materials. If one or both of the tasks did not contain an order component, less interference was found. The results of this study support the existence of a modality-independent order code.

  13. Effects of visual and verbal interference tasks on olfactory memory: the role of task complexity.

    Science.gov (United States)

    Annett, J M; Leslie, J C

    1996-08-01

    Recent studies have demonstrated that visual and verbal suppression tasks interfere with olfactory memory in a manner which is partially consistent with a dual coding interpretation. However, it has been suggested that total task complexity rather than modality specificity of the suppression tasks might account for the observed pattern of results. This study addressed the issue of whether or not the level of difficulty and complexity of suppression tasks could explain the apparent modality effects noted in earlier experiments. A total of 608 participants were each allocated to one of 19 experimental conditions involving interference tasks which varied suppression type (visual or verbal), nature of complexity (single, double or mixed) and level of difficulty (easy, optimal or difficult) and presented with 13 target odours. Either recognition of the odours or free recall of the odour names was tested on one occasion, either within 15 minutes of presentation or one week later. Both recognition and recall performance showed an overall effect for suppression nature, suppression level and time of testing with no effect for suppression type. The results lend only limited support to Paivio's (1986) dual coding theory, but have a number of characteristics which suggest that an adequate account of olfactory memory may be broadly similar to current theories of face and object recognition. All of these phenomena might be dealt with by an appropriately modified version of dual coding theory.

  14. Interrelations of codes in human semiotic systems.

    OpenAIRE

    Somov, Georgij

    2016-01-01

    Codes can be viewed as mechanisms that enable relations of signs and their components, i.e., semiosis is actualized. The combinations of these relations produce new relations as new codes are building over other codes. Structures appear in the mechanisms of codes. Hence, codes can be described as transformations of structures from some material systems into others. Structures belong to different carriers, but exist in codes in their "pure" form. Building of codes over other codes fosters t...

  15. FJET Database Project: Extract, Transform, and Load

    Science.gov (United States)

    Samms, Kevin O.

    2015-01-01

    The Data Mining & Knowledge Management team at Kennedy Space Center is providing data management services to the Frangible Joint Empirical Test (FJET) project at Langley Research Center (LARC). FJET is a project under the NASA Engineering and Safety Center (NESC). The purpose of FJET is to conduct an assessment of mild detonating fuse (MDF) frangible joints (FJs) for human spacecraft separation tasks in support of the NASA Commercial Crew Program. The Data Mining & Knowledge Management team has been tasked with creating and managing a database for the efficient storage and retrieval of FJET test data. This paper details the Extract, Transform, and Load (ETL) process as it is related to gathering FJET test data into a Microsoft SQL relational database, and making that data available to the data users. Lessons learned, procedures implemented, and programming code samples are discussed to help detail the learning experienced as the Data Mining & Knowledge Management team adapted to changing requirements and new technology while maintaining flexibility of design in various aspects of the data management project.
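
    A minimal sketch of the ETL pattern described above, assuming pandas and substituting SQLite for the project's Microsoft SQL Server; the file, table, and column names are hypothetical.

        # Extract-Transform-Load in miniature: CSV in, relational table out.
        import sqlite3
        import pandas as pd

        # Extract: read raw test data from a delimited file (hypothetical).
        raw = pd.read_csv("fjet_raw.csv")

        # Transform: normalize column names, coerce types, drop bad rows.
        raw.columns = [c.strip().lower().replace(" ", "_") for c in raw.columns]
        raw["test_date"] = pd.to_datetime(raw["test_date"], errors="coerce")
        clean = raw.dropna(subset=["test_id", "test_date"])

        # Load: write into a relational table for the data users to query.
        with sqlite3.connect("fjet.db") as conn:
            clean.to_sql("test_results", conn, if_exists="append", index=False)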

  16. Code Blue Emergencies: A Team Task Analysis and Educational Initiative.

    Science.gov (United States)

    Price, James W; Applegarth, Oliver; Vu, Mark; Price, John R

    2012-01-01

    The objective of this study was to identify factors that have a positive or negative influence on resuscitation team performance during emergencies in the operating room (OR) and post-operative recovery unit (PAR) at a major Canadian teaching hospital. This information was then used to implement a team training program for code blue emergencies. In 2009/10, all OR and PAR nurses and 19 anesthesiologists at Vancouver General Hospital (VGH) were invited to complete an anonymous, 10 minute written questionnaire regarding their code blue experience. Survey questions were devised by 10 recovery room and operation room nurses as well as 5 anesthesiologists representing 4 different hospitals in British Columbia. Three iterations of the survey were reviewed by a pilot group of nurses and anesthesiologists and their feedback was integrated into the final version of the survey. Both nursing staff (n = 49) and anesthesiologists (n = 19) supported code blue training and believed that team training would improve patient outcome. Nurses noted that it was often difficult to identify the leader of the resuscitation team. Both nursing staff and anesthesiologists strongly agreed that too many people attending the code blue with no assigned role hindered team performance. Identifiable leadership and clear communication of roles were identified as keys to resuscitation team functioning. Decreasing the number of people attending code blue emergencies with no specific role, increased access to mock code blue training, and debriefing after crises were all identified as areas requiring improvement. Initial team training exercises have been well received by staff.

  17. Code Blue Emergencies: A Team Task Analysis and Educational Initiative

    Directory of Open Access Journals (Sweden)

    James W. Price

    2012-04-01

    Full Text Available Introduction: The objective of this study was to identify factors that have a positive or negative influence on resuscitation team performance during emergencies in the operating room (OR) and post-operative recovery unit (PAR) at a major Canadian teaching hospital. This information was then used to implement a team training program for code blue emergencies. Methods: In 2009/10, all OR and PAR nurses and 19 anesthesiologists at Vancouver General Hospital (VGH) were invited to complete an anonymous, 10 minute written questionnaire regarding their code blue experience. Survey questions were devised by 10 recovery room and operation room nurses as well as 5 anesthesiologists representing 4 different hospitals in British Columbia. Three iterations of the survey were reviewed by a pilot group of nurses and anesthesiologists and their feedback was integrated into the final version of the survey. Results: Both nursing staff (n = 49) and anesthesiologists (n = 19) supported code blue training and believed that team training would improve patient outcome. Nurses noted that it was often difficult to identify the leader of the resuscitation team. Both nursing staff and anesthesiologists strongly agreed that too many people attending the code blue with no assigned role hindered team performance. Conclusion: Identifiable leadership and clear communication of roles were identified as keys to resuscitation team functioning. Decreasing the number of people attending code blue emergencies with no specific role, increased access to mock code blue training, and debriefing after crises were all identified as areas requiring improvement. Initial team training exercises have been well received by staff.

  18. FCG: a code generator for lazy functional languages

    NARCIS (Netherlands)

    Kastens, U.; Langendoen, K.G.; Hartel, Pieter H.; Pfahler, P.

    1992-01-01

    The FCG code generator produces portable code that supports efficient two-space copying garbage collection. The code generator transforms the output of the FAST compiler front end into an abstract machine code. This code explicitly uses a call stack, which is accessible to the garbage collector. In

  19. [Learning virtual routes: what does verbal coding do in working memory?].

    Science.gov (United States)

    Gyselinck, Valérie; Grison, Élise; Gras, Doriane

    2015-03-01

    Two experiments were run to complete our understanding of the role of verbal and visuospatial encoding in the construction of a spatial model from visual input. In Experiment 1, a dual-task paradigm was applied to young adults who learned a route in a virtual environment and then performed a series of nonverbal tasks to assess spatial knowledge. Results indicated that landmark knowledge, as assessed by the visual recognition of landmarks, was not impaired by any of the concurrent tasks. Route knowledge, assessed by recognition of directions, was impaired by both a tapping task and a concurrent articulation task. Interestingly, the pattern was modulated when no landmarks were available to perform the direction task. A second experiment was designed to explore the role of verbal coding in the construction of landmark and route knowledge. A lexical-decision task was used as a verbal-semantic dual task, and a tone-decision task as a nonsemantic auditory task. Results show that these new concurrent tasks impaired landmark knowledge and route knowledge differently. The results can be interpreted as showing that the coding of route knowledge could be grounded both in a coding of the sequence of events and in a semantic coding of information. These findings also point to some limits of Baddeley's working memory model. (PsycINFO Database Record (c) 2015 APA, all rights reserved).

  20. The Hermite transform-applications

    NARCIS (Netherlands)

    Martens, J.B.

    It is demonstrated how the Hermite transform can be used for image coding and analysis. Hierarchical coding structures based on increasingly specified basic patterns, i.e. general 2-D patterns, general 1-D patterns, and specific 1-D patterns such as edges and corners, are presented. In the image

  1. Computer codes for level 1 probabilistic safety assessment

    International Nuclear Information System (INIS)

    1990-06-01

    Probabilistic Safety Assessment (PSA) entails several laborious tasks suitable for computer-code assistance. This guide identifies these tasks, presents guidelines for selecting and utilizing computer codes in the conduct of the PSA tasks and for the use of PSA results in safety management, and provides information on available codes suggested or applied in performing PSA in nuclear power plants. The guidance is intended for use by nuclear power plant system engineers, safety and operating personnel, and regulators. Large efforts are being made today to provide PC-based software systems and processed PSA information in a form that enables their use as a safety-management tool by overall nuclear power plant management. Guidelines on the characteristics of the software needed are also provided, so that management can specify software meeting their particular needs. Most of these computer codes are also applicable to PSA of other industrial facilities. The scope of this document is limited to computer codes used for the treatment of internal events. It does not address other codes available mainly for the analysis of external events (e.g. seismic, flood and fire analysis). Codes discussed in the document are those used for probabilistic rather than for phenomenological modelling. It should also be appreciated that these guidelines are not intended to lead the user to the selection of one specific code; they simply provide criteria for the selection. Refs and tabs

  2. An approach to improving the structure of error-handling code in the linux kernel

    DEFF Research Database (Denmark)

    Saha, Suman; Lawall, Julia; Muller, Gilles

    2011-01-01

    The C language does not provide any abstractions for exception handling or other forms of error handling, leaving programmers to devise their own conventions for detecting and handling errors. The Linux coding style guidelines suggest placing error handling code at the end of each function, where… We propose an automatic program transformation that transforms error-handling code into this style. We have applied our transformation to the Linux 2.6.34 kernel source code, on which it reorganizes the error handling code of over 1800 functions, in about 25 minutes.

  3. The Aster code; Code Aster

    Energy Technology Data Exchange (ETDEWEB)

    Delbecq, J.M

    1999-07-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R and D direction of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (materials behaviour, big deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures); specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results, etc.); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  4. TASK, 1-D Multigroup Diffusion or Transport Theory Reactor Kinetics with Delayed Neutron

    International Nuclear Information System (INIS)

    Buhl, A.R.; Hermann, O.W.; Hinton, R.J.; Dodds, H.L. Jr.; Robinson, J.C.; Lillie, R.A.

    1975-01-01

    1 - Description of problem or function: TASK solves the one-dimensional multigroup form of the reactor kinetics equations, using either transport or diffusion theory and allowing an arbitrary number of delayed neutron groups. The program can also be used to solve standard static problems efficiently such as eigenvalue problems, distributed source problems, and boundary source problems. Convergence problems associated with sources in highly multiplicative media are circumvented, and such problems are readily calculable. 2 - Method of solution: TASK employs a combination scattering and transfer matrix method to eliminate certain difficulties that arise in classical finite difference approximations. As such, within-group (inner) iterations are eliminated and solution convergence is independent of spatial mesh size. The time variable is removed by Laplace transformation. (A later version will permit direct time solutions.) The code can be run either in an outer iteration mode or in closed (non-iterative) form. The running mode is dictated by the number of groups times the number of angles, consistent with available storage. 3 - Restrictions on the complexity of the problem: The principal restrictions are available storage and computation time. Since the code is flexibly-dimensioned and has an outer iteration option there are no internal restrictions on group structure, quadrature, and number of ordinates. The flexible-dimensioning scheme allows optional use of core storage. The generalized cylindrical geometry option is not complete in Version I of the code. The feedback options and omega-mode search options are not included in Version I
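
    Not the TASK code itself, but a small sketch of the kind of static eigenvalue problem mentioned above: a one-group, bare-slab diffusion eigenvalue computed by finite differences, with illustrative cross sections and a closed-form check.

        # One-group 1-D diffusion eigenvalue in a bare slab:
        # -D u'' + Sigma_a u = (1/k) nu*Sigma_f u, with u = 0 at both faces.
        import numpy as np

        D, sig_a, nu_sig_f, L, N = 1.0, 0.05, 0.06, 100.0, 200
        h = L / (N + 1)

        # Loss operator A (diffusion + absorption) on the interior mesh.
        main = (2 * D / h**2 + sig_a) * np.ones(N)
        off = (-D / h**2) * np.ones(N - 1)
        A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

        # With F = nu_sig_f * I, the largest k solving A u = (1/k) F u is
        # nu_sig_f divided by the smallest eigenvalue of A.
        k = nu_sig_f / np.linalg.eigvalsh(A).min()

        # Analytic check: k = nu_sig_f / (sig_a + D * (pi/L)^2).
        print(k, nu_sig_f / (sig_a + D * (np.pi / L)**2))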

  5. LiveCode mobile development

    CERN Document Server

    Lavieri, Edward D

    2013-01-01

    A practical guide written in a tutorial style, "LiveCode Mobile Development Hotshot" walks you step-by-step through 10 individual projects. Every project is divided into subtasks to make learning more organized and easy to follow along, with explanations, diagrams, screenshots, and downloadable material. This book is great for anyone who wants to develop mobile applications using LiveCode. You should be familiar with LiveCode and have access to a smartphone. You are not expected to know how to create graphics or audio clips.

  6. Experiment with expert system guidance of an engineering analysis task

    International Nuclear Information System (INIS)

    Ransom, V.H.; Fink, R.K.; Callow, R.A.

    1986-01-01

    An experiment is being conducted in which expert systems are used to guide the performance of an engineering analysis task. The task chosen for experimentation is the application of a large thermal-hydraulic transient simulation computer code. The expectation from this work is that the expert system will result in an improved analytical result with a reduction in the amount of human labor and expertise required. The code-associated functions of model formulation, data input, code execution, and analysis of the computed output have all been identified as candidate tasks that could benefit from the use of expert systems. Expert system modules have been built for the model building and data input tasks. Initial results include the observation that human expectations of an intelligent environment rapidly escalate, and that structured or stylized tasks that are tolerated in the unaided system become frustrating within the intelligent environment.

  7. On-line monitoring and inservice inspection in codes

    International Nuclear Information System (INIS)

    Bartonicek, J.; Zaiss, W.; Bath, H.R.

    1999-01-01

    The relevant regulatory codes determine the ISI tasks and the time intervals for recurrent component testing for the evaluation of operation-induced damage or ageing, in order to ensure component integrity on the basis of the latest available quality data. In-service quality monitoring is carried out through on-line monitoring and recurrent testing. The requirements defined by the engineering codes elaborated by various institutions are comparable, with the KTA nuclear engineering and safety codes being the most complete provisions for quality evaluation and assurance after different, defined service periods. German conventional codes for assuring component integrity provide exclusively for recurrent inspection regimes (mainly pressure tests and optical testing). The requirements defined in the KTA codes, however, have always demanded more specific inspections relying on recurrent testing as well as on-line monitoring. Foreign codes for ensuring component integrity concentrate on NDE tasks at regular time intervals, with the time intervals and scope of testing activities being defined on the basis of the ASME code, section XI. (orig./CB) [de]

  8. User instructions for the DESCARTES environmental accumulation code

    International Nuclear Information System (INIS)

    Miley, T.B.; Eslinger, P.W.; Nichols, W.E.; Lessor, K.S.; Ouderkirk, S.J.

    1994-05-01

    The purpose of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate the radiation dose that individuals could have received as a result of emissions since 1944 from the Hanford Site near Richland, Washington. The HEDR Project work is conducted under several technical and administrative tasks, among which is the Environmental Pathways and Dose Estimates task. The staff on this task have developed a suite of computer codes which are used to estimate doses to individuals in the public. This document contains the user instructions for the DESCARTES (Dynamic estimates of concentrations and Accumulated Radionuclides in Terrestrial Environments) suite of codes. In addition to the DESCARTES code, this includes two air data preprocessors, a database postprocessor, and several utility routines that are used to format input data needed for DESCARTES

  9. Monomial codes seen as invariant subspaces

    Directory of Open Access Journals (Sweden)

    García-Planas María Isabel

    2017-08-01

    Full Text Available It is well known that cyclic codes are very useful because of their applications, since they are not computationally expensive and encoding can be easily implemented. The relationship between cyclic codes and invariant subspaces is also well known. In this paper a generalization of this relationship is presented between monomial codes over a finite field and hyperinvariant subspaces of F^n under an appropriate linear transformation. Using techniques of linear algebra it is possible to deduce certain properties for this particular type of codes, generalizing known results on cyclic codes.
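
    The classical end of this relationship is easy to check numerically: a cyclic code is a subspace invariant under the cyclic-shift map. The sketch below does so for the binary (7,4) cyclic code generated by g(x) = 1 + x + x^3.

        # Cyclic code as a shift-invariant subspace of GF(2)^7.
        import numpy as np
        from itertools import product

        g = np.array([1, 1, 0, 1, 0, 0, 0])             # coefficients of g(x)
        G = np.array([np.roll(g, i) for i in range(4)])  # generator matrix

        # All 16 codewords (arithmetic over GF(2)).
        codewords = {tuple(np.dot(m, G) % 2) for m in product([0, 1], repeat=4)}

        # Invariance: cyclically shifting any codeword yields a codeword.
        print(all(tuple(np.roll(c, 1)) in codewords for c in codewords))  # True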

  10. UEP LT Codes with Intermediate Feedback

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Popovski, Petar; Østergaard, Jan

    2013-01-01

    We analyze a class of rateless codes, called Luby transform (LT) codes with unequal error protection (UEP). We show that while these codes successfully provide UEP, there is a significant price in terms of redundancy in the lower prioritized segments. We propose a modification with a single intermediate feedback message. Our analysis shows a dramatic improvement in the decoding performance of the lower prioritized segment.
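
    A bare-bones sketch of the machinery the analysis builds on, under simplifying assumptions: a fixed toy degree distribution instead of the robust soliton, no UEP weighting, and an erasure-free channel.

        # Minimal LT encoder plus peeling decoder over XORed source symbols.
        import random

        rng = random.Random(4)

        def encode_symbol(data):
            d = rng.choice([1, 2, 3, 4])              # toy degree distribution
            idx = set(rng.sample(range(len(data)), d))
            val = 0
            for i in idx:
                val ^= data[i]
            return idx, val

        def peel(symbols, k):
            """Peeling decoder: repeatedly solve degree-one symbols."""
            decoded = {}
            progress = True
            while progress:
                progress = False
                for idx, val in symbols:
                    live = idx - set(decoded)
                    if len(live) == 1:
                        i = live.pop()
                        for j in idx & set(decoded):
                            val ^= decoded[j]
                        decoded[i] = val
                        progress = True
            return decoded if len(decoded) == k else None

        data = [rng.randrange(256) for _ in range(20)]
        out = None
        while out is None:                 # retry on rare decoding failure
            symbols = [encode_symbol(data) for _ in range(40)]  # ~2x overhead
            out = peel(symbols, len(data))
        print([out[i] for i in range(20)] == data)    # True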

  11. Uncovering the relationship between transformational leaders and followers' task performance.

    NARCIS (Netherlands)

    Breevaart, K.; Bakker, A. B.; Demerouti, E.; Sleebos, D. M.; Maduro, V.

    2015-01-01

    The purpose of the present study was to unravel the mechanisms underlying the relationship between transformational leadership, follower work engagement, and follower job performance and to investigate a possible boundary condition of transformational leadership. We used structural equation modeling

  12. The archaeology of computer codes - illustrated on the basis of the code SABINE

    International Nuclear Information System (INIS)

    Sdouz, G.

    1987-02-01

    Computer codes used by the physics group of the Institute for Reactor Safety are stored on back-up-tapes. However during the last years both the computer and the system have been changed. For new tasks these programmes have to be available. A new procedure is necessary to find and to activate a stored programme. This procedure is illustrated on the basis of the code SABINE. (Author)

  13. Efficient coding of spectrotemporal binaural sounds leads to emergence of the auditory space representation

    Directory of Open Access Journals (Sweden)

    Wiktor Młynarski

    2014-03-01

    Full Text Available To date a number of studies have shown that receptive field shapes of early sensory neurons can be reproduced by optimizing coding efficiency of natural stimulus ensembles. A still unresolved question is whether the efficient-coding hypothesis explains formation of neurons which explicitly represent environmental features of different functional importance. This paper proposes that the spatial selectivity of higher auditory neurons emerges as a direct consequence of learning efficient codes for natural binaural sounds. Firstly, it is demonstrated that a linear efficient coding transform, Independent Component Analysis (ICA), trained on spectrograms of naturalistic simulated binaural sounds extracts spatial information present in the signal. A simple hierarchical ICA extension allowing for decoding of sound position is proposed. Furthermore, it is shown that units revealing spatial selectivity can be learned from a binaural recording of a natural auditory scene. In both cases a relatively small subpopulation of learned spectrogram features suffices to perform accurate sound localization. Representation of the auditory space is therefore learned in a purely unsupervised way by maximizing the coding efficiency and without any task-specific constraints. These results imply that efficient coding is a useful strategy for learning structures which allow for making behaviorally vital inferences about the environment.
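
    The first step, the linear efficient-coding transform, can be sketched with scikit-learn's FastICA on synthetic mixtures; real binaural spectrograms are replaced here by toy sources, so this shows only the unmixing, not the spatial decoding stage.

        # FastICA recovers independent sources from linear mixtures.
        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(5)
        t = np.linspace(0, 8, 2000)
        S = np.c_[np.sin(2 * t),                       # source 1
                  np.sign(np.sin(3 * t)),              # source 2
                  rng.laplace(size=t.size)]            # source 3 (sparse)
        A = rng.standard_normal((3, 3))                # unknown mixing matrix
        X = S @ A.T                                    # observed mixtures

        ica = FastICA(n_components=3, random_state=0)
        S_hat = ica.fit_transform(X)                   # estimated sources

        # Correlation between true and recovered sources (up to order/sign).
        C = np.corrcoef(S.T, S_hat.T)[:3, 3:]
        print(np.round(np.abs(C), 2))   # near a permutation of the identity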

  14. Brain-computer interface analysis of a dynamic visuo-motor task.

    Science.gov (United States)

    Logar, Vito; Belič, Aleš

    2011-01-01

    The area of brain-computer interfaces (BCIs) represents one of the more interesting fields in neurophysiological research, since it investigates the development of machines that perform transformations of the brain's "thoughts" into certain pre-defined actions. Experimental studies have reported some successful implementations of BCIs; however, much of the field still remains unexplored. According to some recent reports, the phase coding of informational content is an important mechanism in the brain's function and cognition, and has the potential to explain various mechanisms of the brain's data transfer, but it has yet to be scrutinized in the context of brain-computer interfaces. Therefore, if the mechanism of phase coding is plausible, one should be able to extract the phase-coded content carried by brain signals using appropriate signal-processing methods. In our previous studies we have shown that by using a phase-demodulation-based signal-processing approach it is possible to decode some relevant information on the current motor action in the brain from electroencephalographic (EEG) data. In this paper the authors present a continuation of their previous work on the brain-information-decoding analysis of visuo-motor (VM) tasks. The present study shows that EEG data measured during more complex, dynamic visuo-motor (dVM) tasks carry enough information about the currently performed motor action for it to be successfully extracted using the appropriate signal-processing and identification methods. The aim of this paper is therefore to present a mathematical model which, by means of the EEG measurements as its inputs, predicts the course of the wrist movements applied by each subject during the task in simulated or real time (BCI analysis). However, several modifications to the existing methodology are needed to achieve optimal decoding results and a real-time data-processing ability. The information extracted from the EEG could
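
    A stripped-down sketch of the phase-demodulation idea, with a synthetic phase-modulated carrier standing in for the EEG; this is not the authors' identification pipeline, only the extraction of an instantaneous phase via the analytic signal.

        # Recover a slow "message" that phase-modulates a carrier, using
        # the Hilbert transform to obtain the instantaneous phase.
        import numpy as np
        from scipy.signal import hilbert

        fs, f_c = 500.0, 40.0                      # sample rate, carrier (Hz)
        t = np.arange(0, 4, 1 / fs)
        message = 0.8 * np.sin(2 * np.pi * 0.5 * t)      # slow "motor" signal
        carrier = np.cos(2 * np.pi * f_c * t + message)  # phase-modulated
        noisy = carrier + 0.1 * np.random.default_rng(6).standard_normal(t.size)

        phase = np.unwrap(np.angle(hilbert(noisy)))      # instantaneous phase
        demod = phase - 2 * np.pi * f_c * t              # strip carrier ramp

        inner = slice(200, -200)                         # ignore edge effects
        resid = demod[inner] - message[inner]
        print(np.std(resid - resid.mean()))              # small residual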

  15. Robot Task Commander with Extensible Programming Environment

    Science.gov (United States)

    Hart, Stephen W (Inventor); Yamokoski, John D. (Inventor); Wightman, Brian J (Inventor); Dinh, Duy Paul (Inventor); Gooding, Dustin R (Inventor)

    2014-01-01

    A system for developing distributed robot application-level software includes a robot having an associated control module which controls motion of the robot in response to a commanded task, and a robot task commander (RTC) in networked communication with the control module over a network transport layer (NTL). The RTC includes a script engine(s) and a GUI, with a processor and a centralized library of library blocks constructed from an interpretive computer programming code and having input and output connections. The GUI provides access to a Visual Programming Language (VPL) environment and a text editor. In executing a method, the VPL is opened, a task for the robot is built from the code library blocks, and data is assigned to input and output connections identifying input and output data for each block. A task sequence(s) is sent to the control module(s) over the NTL to command execution of the task.

  16. Task-oriented maximally entangled states

    International Nuclear Information System (INIS)

    Agrawal, Pankaj; Pradhan, B

    2010-01-01

    We introduce the notion of a task-oriented maximally entangled state (TMES). This notion depends on the task for which a quantum state is used as the resource. TMESs are the states that can be used to carry out the task maximally. This concept may be more useful than that of a general maximally entangled state in the case of a multipartite system. We illustrate this idea by giving an operational definition of maximally entangled states on the basis of communication tasks of teleportation and superdense coding. We also give examples and a procedure to obtain such TMESs for n-qubit systems.
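
    One of the defining tasks, superdense coding, is easy to simulate numerically: two classical bits are carried by one qubit of the Bell state (|00> + |11>)/sqrt(2), and the four encoded states are perfectly distinguishable in the Bell basis.

        # Superdense coding with plain NumPy: Alice applies Z^b1 X^b2 to her
        # half of a Bell pair; Bob's Bell-basis measurement recovers (b1, b2).
        import numpy as np

        I = np.eye(2)
        X = np.array([[0, 1], [1, 0]])
        Z = np.array([[1, 0], [0, -1]])
        bell = np.array([1, 0, 0, 1]) / np.sqrt(2)       # |00> + |11>

        def encode(b1, b2):
            op = np.linalg.matrix_power(Z, b1) @ np.linalg.matrix_power(X, b2)
            return np.kron(op, I) @ bell

        # The four encoded states form the (orthonormal) Bell basis.
        bell_basis = np.array([encode(b1, b2)
                               for b1 in (0, 1) for b2 in (0, 1)])

        for b1 in (0, 1):
            for b2 in (0, 1):
                probs = np.abs(bell_basis @ encode(b1, b2)) ** 2
                print((b1, b2), "->", np.round(probs, 3))  # one outcome = 1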

  17. Model-Driven Engineering of Machine Executable Code

    Science.gov (United States)

    Eichberg, Michael; Monperrus, Martin; Kloppenburg, Sven; Mezini, Mira

    Implementing static analyses of machine-level executable code is labor intensive and complex. We show how to leverage model-driven engineering to facilitate the design and implementation of programs doing static analyses. Further, we report on important lessons learned on the benefits and drawbacks while using the following technologies: using the Scala programming language as the target of code generation, using XML Schema to express a metamodel, and using XSLT to implement (a) transformations and (b) a lint-like tool. Finally, we report on the use of Prolog for writing model transformations.

  18. Channel coding techniques for wireless communications

    CERN Document Server

    Deergha Rao, K

    2015-01-01

    The book discusses modern channel coding techniques for wireless communications such as turbo codes, low-density parity check (LDPC) codes, space–time (ST) coding, RS (or Reed–Solomon) codes and convolutional codes. Many illustrative examples are included in each chapter for easy understanding of the coding techniques. The text is integrated with MATLAB-based programs to enhance the understanding of the subject’s underlying theories. It includes current topics of increasing importance such as turbo codes, LDPC codes, Luby transform (LT) codes, Raptor codes, and ST coding in detail, in addition to the traditional codes such as cyclic codes, BCH (or Bose–Chaudhuri–Hocquenghem) and RS codes and convolutional codes. Multiple-input and multiple-output (MIMO) communications is a multiple antenna technology, which is an effective method for high-speed or high-reliability wireless communications. PC-based MATLAB m-files for the illustrative examples are provided on the book page on Springer.com for free dow...

  19. Automatic generation of data merging program codes.

    OpenAIRE

    Hyensook, Kim; Oussena, Samia; Zhang, Ying; Clark, Tony

    2010-01-01

    Data merging is an essential part of ETL (Extract-Transform-Load) processes used to build a data warehouse system. To avoid reinventing merging techniques, we propose a Data Merging Meta-model (DMM) and its transformation into executable program codes in the manner of model-driven engineering. DMM allows defining relationships of different model entities and their merging types at the conceptual level. Our formalized transformation, described using ATL (ATLAS Transformation Language), enables automatic g...

  20. Geometric Transformations in Engineering Geometry

    Directory of Open Access Journals (Sweden)

    I. F. Borovikov

    2015-01-01

    Full Text Available Recently, in view of current trends and world experience in training engineers, researchers and faculty staff, there has been a practical need to transform traditional courses of descriptive geometry into a course of engineering geometry in which geometrical transformations become the main section. On the basis of critical analysis, the paper gives suggestions to improve the presentation of this section both in the classroom and in academic literature, and to extend the application scope of geometrical transformations to solving position and metric tasks, to the simulation of surfaces, and to the design of complex engineering configurations that meet a number of pre-specified conditions. The article offers a number of considerable amendments to the terms and definitions used in the existing courses of descriptive geometry. It draws some conclusions and makes appropriate proposals on the feasibility of coordinating the teaching of movement transformations in the courses of analytical and descriptive geometry. This would provide interdisciplinary team teaching and allow students to be convinced that a combination of analytical and graphic ways to solve geometric tasks is useful and reasonable. The traditional sections of these courses need to be supplemented with a theory of projective and bi-rational transformations. In terms of simplicity and convenience of application, it is enough to consider central transformations when solving applied tasks. These transformations contain a beam of sub-invariant (low-invariant) straight lines on which the invariant curve induces non-involution and involution projectivities. The expediency of applying nonlinear transformations is shown in the article by a specific example of geometric modeling of the interfacing surface "spar-blade". Implementation of these suggestions will contribute to a real transformation of a traditional course of descriptive geometry into engineering geometry

  1. Construction of FuzzyFind Dictionary using Golay Coding Transformation for Searching Applications

    Science.gov (United States)

    Kowsari, Kamran

    2015-03-01

    Searching through a large volume of data is critical for companies, scientists, and search-engine applications due to its time and memory complexity. In this paper, a new technique for generating a FuzzyFind Dictionary for text mining is introduced. We map words onto 23-bit vectors reflecting the presence or absence of particular letters of the English alphabet, stored in a FuzzyFind Dictionary (more than 23 bits can be handled by using additional FuzzyFind Dictionaries). This representation preserves closeness of word distortions in terms of closeness of the created binary vectors within a Hamming distance of 2 deviations. The paper discusses the Golay Coding Transformation Hash Table and how it can be used with a FuzzyFind Dictionary as a new technique for searching through big data. The method offers linear time complexity for generating the dictionary, constant time complexity for accessing the data, and update time linear in the number of new data points. The technique is based on searching over the letters of the English alphabet, with each segment holding 23 bits; it can also work with more bits by using additional segments as reference tables.
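
    Not the paper's Golay-coded dictionary, but a simplified sketch of the underlying idea: represent each word by a letter presence/absence bit vector (a plain 26-bit vector here, where the paper compresses to 23 bits) and retrieve entries within a small Hamming distance, so that word distortions map to nearby binary vectors.

        # Fuzzy word lookup via letter-presence bit vectors and Hamming distance.
        def signature(word: str) -> int:
            """26-bit presence/absence vector over the English alphabet."""
            bits = 0
            for ch in word.lower():
                if ch.isalpha():
                    bits |= 1 << (ord(ch) - ord("a"))
            return bits

        def fuzzy_lookup(query: str, index, max_dist: int = 2):
            """Return stored words whose signatures are within max_dist bits."""
            q = signature(query)
            return [w for s, w in index if bin(q ^ s).count("1") <= max_dist]

        words = ["transform", "transfrom", "code", "coder", "task"]
        index = [(signature(w), w) for w in words]
        print(fuzzy_lookup("transforn", index))   # catches close distortions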

  2. Transform Decoding of Reed-Solomon Codes. Volume II. Logical Design and Implementation.

    Science.gov (United States)

    1982-11-01

    i A. nE aib’ = a(bJ) ; j=0, 1, ... , n-l (2-8) i=01 Similarly, the inverse transform is obtained by interpolation of the polynomial a(z) from its n...with the transform so that either a forward or an inverse transform may be used to encode. The only requirement is that tie reverse of the encoding... inverse transform of the received sequence is the polynomial sum r(z) = e(z) + a(z), where e(z) is the inverse transform of the error polynomial E(z), and a

  3. Cross-band noise model refinement for transform domain Wyner–Ziv video coding

    DEFF Research Database (Denmark)

    Huang, Xin; Forchhammer, Søren

    2012-01-01

    The compression performance of TDWZ video coding trails that of conventional video coding solutions, mainly due to the quality of the side information, inaccurate noise modeling, and loss in the final coding step. The major goal of this paper is to enhance the accuracy of the noise modeling, which is one of the most important aspects influencing the coding performance of DVC. A TDWZ video decoder with a novel cross-band based adaptive noise model is proposed, and a noise residue refinement scheme is introduced to successively update the estimated noise residue for noise modeling after each bit-plane. Experimental results show that the proposed noise model and noise residue refinement scheme can improve the rate-distortion (RD) performance of TDWZ video coding significantly. The quality of the side information modeling is also evaluated by a measure of the ideal code length.

  4. You know the Science. Do you know your Code?

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    This talk is about automated code analysis and transformation tools to support scientific computing. Code bases are difficult to manage because of size, age, or safety requirements. Tools can help scientists and IT engineers understand their code, locate problems, and improve quality. Tools can also help transform the code, by implementing complex refactorings, replatforming, or migration to a modern language. Such tools are themselves difficult to build. This talk describes DMS, a meta-tool for building software analysis tools. DMS is a kind of generalized compiler, and can be configured to process arbitrary programming languages, to carry out arbitrary analyses, and to convert specifications into running code. It has been used for a variety of purposes, including converting embedded mission software in the US B-2 Stealth Bomber, providing the US Social Security Administration with a deep view of how their 200 million lines of COBOL are connected, and reverse-engineering legacy factory process control code i...

  5. Orthogonal transformations for change detection, Matlab code (ENVI-like headers)

    DEFF Research Database (Denmark)

    2007-01-01

    Matlab code to do (iteratively reweighted) multivariate alteration detection (MAD) analysis, maximum autocorrelation factor (MAF) analysis, canonical correlation analysis (CCA) and principal component analysis (PCA) on image data; accommodates ENVI (like) header files.
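
    A rough numpy analogue of the PCA part (plain principal components of stacked bitemporal bands); this is a sketch, not the record's Matlab code, and the MAD/MAF/CCA analyses are omitted:

        import numpy as np

        def pca_change(t1, t2):
            """Plain PCA on stacked bitemporal bands (bands x rows x cols);
            a rough analogue of one analysis, not the record's Matlab code."""
            X = np.vstack([t1.reshape(t1.shape[0], -1),
                           t2.reshape(t2.shape[0], -1)]).astype(float)
            X -= X.mean(axis=1, keepdims=True)
            cov = X @ X.T / (X.shape[1] - 1)
            vals, vecs = np.linalg.eigh(cov)          # eigenvalues ascending
            scores = vecs.T[::-1] @ X                 # components, strongest first
            return scores.reshape(-1, *t1.shape[1:])

        t1 = np.random.rand(3, 64, 64)                # 3-band image, time 1
        t2 = t1 + 0.1 * np.random.rand(3, 64, 64)     # time 2, small changes
        pcs = pca_change(t1, t2)  # leading PCs carry shared signal; trailing PCs change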

  6. Organizing Core Tasks

    DEFF Research Database (Denmark)

    Boll, Karen

    has remained much the same within the last 10 years. However, how the core task has been organized has changed considerably under the influence of various "organizing devices". The paper focusses on how organizing devices such as risk assessment, output-focus, effect orientation, and treatment projects influence the organization of core tasks within the tax administration. The paper shows that the organizational transformations based on the use of these devices have had consequences both for the overall collection of revenue and for the employees' feeling of "making a difference". All in all...

  7. A Verification Method of Inter-Task Cooperation in Embedded Real-time Systems and its Evaluation

    Science.gov (United States)

    Yoshida, Toshio

    In the software development process of embedded real-time systems, the design of the task cooperation process is very important. The cooperating process of such tasks is specified by task cooperation patterns. Adoption of unsuitable task cooperation patterns has a fatal influence on system performance, quality, and extendibility. In order to prevent repetitive work caused by a shortage of task cooperation performance, it is necessary to verify task cooperation patterns in an early software development stage. However, this is very difficult at a stage where the task program codes are not yet completed. Therefore, we propose a verification method using task skeleton program codes and a real-time kernel that records all events during software execution, such as system calls issued by task program codes, external interrupts, and timer interrupts. In order to evaluate the proposed verification method, we applied it to the software development process of a mechatronics control system.
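
    A minimal sketch of the skeleton-task idea, with hypothetical names, Python threads standing in for real-time tasks, and a plain list standing in for the kernel's event recorder:

        import threading, time, queue

        event_log = []                      # stands in for the kernel's event recorder
        log_lock = threading.Lock()

        def record(task, event):
            with log_lock:
                event_log.append((time.monotonic(), task, event))

        def skeleton_task(name, inbox, outbox):
            """Skeleton task: no application logic, only the cooperation pattern."""
            msg = inbox.get()               # blocking receive, as the real task would
            record(name, "received")
            time.sleep(0.01)                # placeholder for the task's work
            record(name, "sent")
            outbox.put(msg)

        q1, q2 = queue.Queue(), queue.Queue()
        t = threading.Thread(target=skeleton_task, args=("worker", q1, q2))
        t.start()
        record("main", "sent"); q1.put("request")
        q2.get(); record("main", "received")
        t.join()
        print(event_log)                    # inspect the ordering to verify the pattern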

  8. Multi-task pose-invariant face recognition.

    Science.gov (United States)

    Ding, Changxing; Xu, Chang; Tao, Dacheng

    2015-03-01

    Face images captured in unconstrained environments usually contain significant pose variation, which dramatically degrades the performance of algorithms designed to recognize frontal faces. This paper proposes a novel face identification framework capable of handling the full range of pose variations within ±90° of yaw. The proposed framework first transforms the original pose-invariant face recognition problem into a partial frontal face recognition problem. A robust patch-based face representation scheme is then developed to represent the synthesized partial frontal faces. For each patch, a transformation dictionary is learnt under the proposed multi-task learning scheme. The transformation dictionary transforms the features of different poses into a discriminative subspace. Finally, face matching is performed at patch level rather than at the holistic level. Extensive and systematic experimentation on FERET, CMU-PIE, and Multi-PIE databases shows that the proposed method consistently outperforms single-task-based baselines as well as state-of-the-art methods for the pose problem. We further extend the proposed algorithm for the unconstrained face verification problem and achieve top-level performance on the challenging LFW data set.

  9. Task Refinement for Autonomous Robots using Complementary Corrective Human Feedback

    Directory of Open Access Journals (Sweden)

    Cetin Mericli

    2011-06-01

    Full Text Available A robot can perform a given task through a policy that maps its sensed state to appropriate actions. We assume that a hand-coded controller can achieve such a mapping only for the basic cases of the task. Refining the controller becomes harder, more tedious, and more error prone as the complexity of the task increases. In this paper, we present a new learning from demonstration approach to improve the robot's performance through the use of corrective human feedback as a complement to an existing hand-coded algorithm. The human teacher observes the robot as it performs the task using the hand-coded algorithm and takes over control to correct the behavior when the robot selects a wrong action to execute. Corrections are captured as new state-action pairs, and during autonomous execution the default controller output is replaced by the demonstrated corrections whenever the current state of the robot is judged to be similar to a previously corrected state in the correction database. The proposed approach is applied to a complex ball dribbling task performed against stationary defender robots in a robot soccer scenario, using physical Aldebaran Nao humanoid robots. The results of our experiments show an improvement in the robot's performance when the default hand-coded controller is augmented with corrective human demonstration.
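
    A minimal sketch of the correction-reuse rule, with hypothetical state vectors and similarity threshold (not the paper's actual similarity measure):

        import numpy as np

        corrections = []                    # database of (state, corrected_action) pairs

        def add_correction(state, action):
            corrections.append((np.asarray(state, dtype=float), action))

        def select_action(state, default_policy, threshold=0.5):
            """Use the demonstrated correction when a similar state was corrected
            before; otherwise fall back to the hand-coded controller."""
            state = np.asarray(state, dtype=float)
            if corrections:
                dists = [np.linalg.norm(state - s) for s, _ in corrections]
                i = int(np.argmin(dists))
                if dists[i] < threshold:
                    return corrections[i][1]          # replay the human's correction
            return default_policy(state)              # default hand-coded behavior

        add_correction([0.9, 0.1], "dribble_left")
        print(select_action([0.85, 0.12], lambda s: "kick"))  # -> dribble_left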

  10. Dual-Task Crosstalk between Saccades and Manual Responses

    Science.gov (United States)

    Huestegge, Lynn; Koch, Iring

    2009-01-01

    Between-task crosstalk has been discussed as an important source for dual-task costs. In this study, the authors examine concurrently performed saccades and manual responses as a means of studying the role of response-code conflict between 2 tasks. In Experiment 1, participants responded to an imperative auditory stimulus with a left or a right…

  11. Genetic Code Analysis Toolkit: A novel tool to explore the coding properties of the genetic code and DNA sequences

    Science.gov (United States)

    Kraljić, K.; Strüngmann, L.; Fimmel, E.; Gumbel, M.

    2018-01-01

    The genetic code is degenerate, and it is assumed that this redundancy provides error detection and correction mechanisms in the translation process. However, the biological meaning of the code's structure is still under current research. This paper presents the Genetic Code Analysis Toolkit (GCAT), which provides workflows and algorithms for the analysis of the structure of nucleotide sequences. In particular, sets or sequences of codons can be transformed and tested for circularity, comma-freeness, dichotomic partitions and other properties. GCAT comes with a fertile editor custom-built to work with the genetic code and a batch mode for multi-sequence processing. With the ability to read FASTA files or load sequences from GenBank, the tool can be used for the mathematical and statistical analysis of existing sequence data. GCAT is Java-based and provides a plug-in concept for extensibility. Availability: open source. Homepage: http://www.gcat.bio/
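
    A minimal sketch of one such test: a brute-force check of the classical comma-free property over codon pairs, simplified relative to GCAT's implementation:

        def is_comma_free(codons):
            """A codon set X is comma-free if no frame-shifted codon read across
            the concatenation of any two codons of X lies in X itself."""
            X = set(codons)
            for a in X:
                for b in X:
                    pair = a + b
                    if pair[1:4] in X or pair[2:5] in X:   # reading frames shifted by 1 and 2
                        return False
            return True

        print(is_comma_free({"ACG", "TCG"}))   # True for this small candidate set
        print(is_comma_free({"AAA"}))          # False: AAA shifted is AAA again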

  12. The task-relevant attribute representation can mediate the Simon effect.

    Directory of Open Access Journals (Sweden)

    Dandan Tang

    Full Text Available Researchers have previously suggested a working memory (WM) account of spatial codes, and based on this suggestion, the present study carried out three experiments to investigate how the task-relevant attribute representation (verbal or visual) in the typical Simon task affects the Simon effect. Experiment 1 compared the Simon effect between between- and within-category color conditions, which required subjects to discriminate between red and blue stimuli (presumed to be represented by verbal WM codes, because it was easy and fast to name the colors verbally) and to discriminate between two similar green stimuli (presumed to be represented by visual WM codes, because it was hard and time-consuming to name the colors verbally), respectively. The results revealed a reliable Simon effect only in the between-category condition. Experiment 2 assessed the Simon effect by requiring subjects to discriminate between two different isosceles trapezoids (within-category shapes) and to discriminate an isosceles trapezoid from a rectangle (between-category shapes); the results replicated and extended the findings of Experiment 1. In Experiment 3, subjects performed both tasks from Experiment 1: in Experiment 3A the between-category task preceded the within-category task, and in Experiment 3B the task order was reversed. The results showed a reliable Simon effect when subjects represented the task-relevant stimulus attributes by verbal WM encoding. In addition, the response time (RT) distribution analysis for both the between- and within-category conditions of Experiments 3A and 3B showed a decreasing Simon effect as RTs lengthened. Altogether, although the present results are consistent with the temporal coding account, we propose that the Simon effect also depends on the verbal WM representation of the task-relevant stimulus attribute.

  13. Using ADA Tasks to Simulate Operating Equipment

    Science.gov (United States)

    DeAcetis, Louis A.; Schmidt, Oron; Krishen, Kumar

    1990-01-01

    A method of simulating equipment using ADA tasks is discussed. Individual units of equipment are coded as concurrently running tasks that monitor and respond to input signals. This technique has been used in a simulation of the space-to-ground Communications and Tracking subsystem of Space Station Freedom.

  14. Design and implementation in VHDL code of the two-dimensional fast Fourier transform for frequency filtering, convolution and correlation operations

    Science.gov (United States)

    Vilardy, Juan M.; Giacometto, F.; Torres, C. O.; Mattos, L.

    2011-01-01

    The two-dimensional Fast Fourier Transform (FFT 2D) is an essential tool in two-dimensional discrete signal analysis and processing, which enables a large number of applications. This article presents the description and synthesis in VHDL code of the FFT 2D with fixed-point binary representation using the Simulink HDL Coder tool of Matlab, showing a quick and easy way to handle overflow and underflow and to create registers, adders and multipliers for complex data in VHDL, as well as the generation of test benches for verification of the generated code in the ModelSim tool. The main objective in developing the hardware architecture of the FFT 2D is the subsequent implementation of the following operations applied to images: frequency filtering, convolution and correlation. The description and synthesis of the hardware architecture uses the XC3S1200E Spartan 3E family FPGA from the manufacturer Xilinx.
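
    All three target operations reduce to elementwise products in the transform domain; a minimal floating-point numpy sketch of that reduction (not the fixed-point VHDL architecture):

        import numpy as np

        img = np.random.rand(64, 64)
        F = np.fft.fft2(img)

        # Frequency filtering: ideal low-pass mask applied in the transform domain
        fy, fx = np.meshgrid(np.fft.fftfreq(64), np.fft.fftfreq(64), indexing="ij")
        lowpass = (np.hypot(fy, fx) < 0.15).astype(float)
        filtered = np.fft.ifft2(F * lowpass).real

        # Convolution and correlation via the convolution theorem
        kernel = np.zeros((64, 64)); kernel[:3, :3] = 1 / 9.0
        conv = np.fft.ifft2(F * np.fft.fft2(kernel)).real          # circular convolution
        corr = np.fft.ifft2(F * np.conj(np.fft.fft2(kernel))).real # circular correlation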

  15. Captioning Transformer with Stacked Attention Modules

    Directory of Open Access Journals (Sweden)

    Xinxin Zhu

    2018-05-01

    Full Text Available Image captioning is a challenging task that requires the machine to understand the meaning of an image. In recent years, image captioning models have usually used long short-term memory (LSTM) decoders to generate sentences, and these models show excellent performance. Although the LSTM can memorize dependencies, its structure is complicated and inherently sequential across time. To address these issues, recent work has shown the benefits of the Transformer for machine translation. Inspired by this success, we develop a Captioning Transformer (CT) model with stacked attention modules, introducing the Transformer to the image captioning task. The CT model contains only attention modules, without time dependencies; it can not only memorize dependencies between sequence elements but also be trained in parallel. Moreover, we propose multi-level supervision to make the Transformer achieve better performance. Extensive experiments are carried out on the challenging MSCOCO dataset, and the proposed Captioning Transformer achieves competitive performance compared with some state-of-the-art methods.

  16. Uncovering the underlying relationship between transformational leaders and followers’ task performance

    NARCIS (Netherlands)

    Breevaart, K.; Bakker, A.B.; Demerouti, E.; Sleebos, D.M.; Maduro, V.

    2014-01-01

    The purpose of the present study was to unravel the mechanisms underlying the relationship between transformational leadership, follower work engagement, and follower job performance and to investigate a possible boundary condition of transformational leadership. We used structural equation modeling

  17. Predictive performance models and multiple task performance

    Science.gov (United States)

    Wickens, Christopher D.; Larish, Inge; Contorer, Aaron

    1989-01-01

    Five models that predict how performance of multiple tasks will interact in complex task scenarios are discussed. The models are shown in terms of the assumptions they make about human operator divided attention. The different assumptions about attention are then empirically validated in a multitask helicopter flight simulation. It is concluded from this simulation that the most important assumption relates to the coding of demand level of different component tasks.

  18. A neural mechanism of dynamic gating of task-relevant information by top-down influence in primary visual cortex.

    Science.gov (United States)

    Kamiyama, Akikazu; Fujita, Kazuhisa; Kashimori, Yoshiki

    2016-12-01

    Visual recognition involves bidirectional information flow, which consists of bottom-up information coding from the retina and top-down information coding from higher visual areas. Recent studies have demonstrated the involvement of early visual areas such as primary visual cortex (V1) in recognition and memory formation. V1 neurons are not passive transformers of sensory inputs but work as adaptive processors, changing their function according to behavioral context. Top-down signals affect the tuning properties of V1 neurons and contribute to the gating of sensory information relevant to behavior. However, little is known about the neuronal mechanism underlying the gating of task-relevant information in V1. To address this issue, we focus on task-dependent tuning modulations of V1 neurons in two perceptual learning tasks. We develop a model of V1 which receives feedforward input from the lateral geniculate nucleus and top-down input from a higher visual area. We show here that a change in the balance between excitation and inhibition in V1 connectivity is necessary for gating task-relevant information in V1. The balance change accounts well for the modulations of the tuning characteristics and temporal properties of V1 neuronal responses. We also show that the balance change of V1 connectivity is shaped by top-down signals with temporal correlations reflecting the perceptual strategies of the two tasks. We propose a learning mechanism by which the synaptic balance is modulated. To conclude, top-down signals change the synaptic balance between excitation and inhibition in V1 connectivity, enabling early visual areas such as V1 to gate context-dependent information under multiple task performances.

  19. Transforming Cobol Legacy Software to a Generic Imperative Model

    National Research Council Canada - National Science Library

    Moraes, Dina L.

    1999-01-01

    This research develops a transformation system to convert COBOL code into a generic imperative model, recapturing the initial design and deciphering the requirements implemented by the legacy code...

  20. Elements of algebraic coding systems

    CERN Document Server

    Cardoso da Rocha, Jr, Valdemar

    2014-01-01

    Elements of Algebraic Coding Systems is an introductory text to algebraic coding theory. In the first chapter, you'll gain inside knowledge of coding fundamentals, which is essential for a deeper understanding of state-of-the-art coding systems. This book is a quick reference for those who are unfamiliar with this topic, as well as for use with specific applications such as cryptography and communication. Linear error-correcting block codes through elementary principles span eleven chapters of the text. Cyclic codes, some finite field algebra, Goppa codes, algebraic decoding algorithms, and applications in public-key cryptography and secret-key cryptography are discussed, including problems and solutions at the end of each chapter. Three appendices cover the Gilbert bound and some related derivations, a derivation of the MacWilliams' identities based on the probability of undetected error, and two important tools for algebraic decoding, namely, the finite field Fourier transform and the Euclidean algorithm f...

  1. Coding for Language Complexity: The Interplay among Methodological Commitments, Tools, and Workflow in Writing Research

    Science.gov (United States)

    Geisler, Cheryl

    2018-01-01

    Coding, the analytic task of assigning codes to nonnumeric data, is foundational to writing research. A rich discussion of methodological pluralism has established the foundational importance of systematicity in the task of coding, but less attention has been paid to the equally important commitment to language complexity. Addressing the interplay…

  2. Noise Residual Learning for Noise Modeling in Distributed Video Coding

    DEFF Research Database (Denmark)

    Luong, Huynh Van; Forchhammer, Søren

    2012-01-01

    Distributed video coding (DVC) is a coding paradigm which exploits the source statistics at the decoder side to reduce the complexity at the encoder. The noise model is one of the inherently difficult challenges in DVC. This paper considers Transform Domain Wyner-Ziv (TDWZ) coding and proposes...

  3. RIA Fuel Codes Benchmark - Volume 1

    International Nuclear Information System (INIS)

    Marchand, Olivier; Georgenthum, Vincent; Petit, Marc; Udagawa, Yutaka; Nagase, Fumihisa; Sugiyama, Tomoyuki; Arffman, Asko; Cherubini, Marco; Dostal, Martin; Klouzal, Jan; Geelhood, Kenneth; Gorzel, Andreas; Holt, Lars; Jernkvist, Lars Olof; Khvostov, Grigori; Maertens, Dietmar; Spykman, Gerold; Nakajima, Tetsuo; Nechaeva, Olga; Panka, Istvan; Rey Gayo, Jose M.; Sagrado Garcia, Inmaculada C.; Shin, An-Dong; Sonnenburg, Heinz Guenther; Umidova, Zeynab; Zhang, Jinzhao; Voglewede, John

    2013-01-01

    Reactivity-initiated accident (RIA) fuel rod codes have been developed over a significant period of time, and they have all shown their ability to reproduce some experimental results with a certain degree of adequacy. However, they sometimes rely on different specific modelling assumptions whose influence on the final results of the calculations is difficult to evaluate. The NEA Working Group on Fuel Safety (WGFS) is tasked with advancing the understanding of fuel safety issues by assessing the technical basis for current safety criteria and their applicability to high burnup and to new fuel designs and materials. The group aims at facilitating international convergence in this area, including the review of experimental approaches as well as the interpretation and use of experimental data relevant for safety. As a contribution to this task, WGFS conducted a RIA code benchmark based on RIA tests performed in the Nuclear Safety Research Reactor in Tokai, Japan, and tests performed or planned in the CABRI reactor in Cadarache, France. Emphasis was on assessment of different modelling options for RIA fuel rod codes in terms of reproducing experimental results as well as extrapolating to typical reactor conditions. This report provides a summary of the results of this task. (authors)

  4. Creation of the Naturalistic Engagement in Secondary Tasks (NEST) distracted driving dataset.

    Science.gov (United States)

    Owens, Justin M; Angell, Linda; Hankey, Jonathan M; Foley, James; Ebe, Kazutoshi

    2015-09-01

    Distracted driving has become a topic of critical importance to driving safety research over the past several decades. Naturalistic driving data offer a unique opportunity to study how drivers engage with secondary tasks in real-world driving; however, the complexities involved with identifying and coding relevant epochs of naturalistic data have limited its accessibility to the general research community. This project was developed to help address this problem by creating an accessible dataset of driver behavior and situational factors observed during distraction-related safety-critical events and baseline driving epochs, using the Strategic Highway Research Program 2 (SHRP2) naturalistic dataset. The new NEST (Naturalistic Engagement in Secondary Tasks) dataset was created using crashes and near-crashes from the SHRP2 dataset that were identified as including secondary task engagement as a potential contributing factor. Data coding included frame-by-frame video analysis of secondary task and hands-on-wheel activity, as well as summary event information. In addition, information about each secondary task engagement within the trip prior to the crash/near-crash was coded at a higher level. Data were also coded for four baseline epochs and trips per safety-critical event. 1,180 events and baseline epochs were coded, and a dataset was constructed. The project team is currently working to determine the most useful way to allow broad public access to the dataset. We anticipate that the NEST dataset will be extraordinarily useful in allowing qualified researchers access to timely, real-world data concerning how drivers interact with secondary tasks during safety-critical events and baseline driving. The coded dataset developed for this project will allow future researchers to have access to detailed data on driver secondary task engagement in the real world. It will be useful for standalone research, as well as for integration with additional SHRP2 data to enable the

  5. RetroTransformDB: A Dataset of Generic Transforms for Retrosynthetic Analysis

    Directory of Open Access Journals (Sweden)

    Svetlana Avramova

    2018-04-01

    Full Text Available Presently, software tools for retrosynthetic analysis are widely used by organic, medicinal, and computational chemists. Rule-based systems extensively use collections of retro-reactions (transforms). While there are many public datasets with reactions in the synthetic direction (usually non-generic reactions), there are no publicly available databases with generic reactions in computer-readable format which can be used for the purposes of retrosynthetic analysis. Here we present RetroTransformDB, a dataset of transforms compiled and coded by us in SMIRKS line notation. The collection comprises more than 100 records, each including the reaction name, the SMIRKS linear notation, the functional group to be obtained, and the transform type classification. All SMIRKS transforms were tested syntactically, semantically, and from a chemical point of view in different software platforms. The overall dataset design and the retrosynthetic fitness were analyzed and curated by organic chemistry experts. The RetroTransformDB dataset may be used by open-source and commercial software packages, as well as chemoinformatics tools.
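
    To illustrate how such computer-readable transforms are applied, here is a minimal sketch using RDKit (assuming it is installed); the ester-cleavage SMIRKS below is a hypothetical example, not a record from RetroTransformDB:

        from rdkit import Chem
        from rdkit.Chem import AllChem

        # Hypothetical retro-transform: disconnect an ester into acid + alcohol
        retro = AllChem.ReactionFromSmarts("[C:1](=[O:2])O[C:3]>>[C:1](=[O:2])O.O[C:3]")

        target = Chem.MolFromSmiles("CC(=O)OC")        # methyl acetate
        for precursors in retro.RunReactants((target,)):
            print([Chem.MolToSmiles(m) for m in precursors])   # ['CC(=O)O', 'CO']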

  6. Transform Decoding of Reed-Solomon Codes. Volume I. Algorithm and Signal Processing Structure

    Science.gov (United States)

    1982-11-01

    systematic channel code. 1. Take the inverse transform of the received sequence. 2. Isolate the error syndrome from the inverse transform and use... inverse transform is identical with interpolation of the polynomial a(z) from its n values. In order to generate a Reed-Solomon (n,k) code, we let the set... in accordance with the transform of equation (4). If we were to apply the inverse transform of equation (6) to the coefficient sequence of A(z), we

  7. Computer codes for tasks in the fields of isotope and radiation research

    International Nuclear Information System (INIS)

    Friedrich, K.; Gebhardt, O.

    1978-11-01

    Concise descriptions are compiled of computer codes developed for solving problems in the fields of isotope and radiation research at the Zentralinstitut fuer Isotopen- und Strahlenforschung (ZfI). In part two, the structure of the ZfI program library MABIF is outlined and a complete list of all available codes is given.

  8. Information preserving coding for multispectral data

    Science.gov (United States)

    Duan, J. R.; Wintz, P. A.

    1973-01-01

    A general formulation of the data compression system is presented. For error-free coding of data with incomplete knowledge of the probability density function, a method of instantaneous expansion of the quantization levels is implemented by reserving two codewords in the codebook to perform a fold-over in quantization. Results for simple DPCM with folding and for an adaptive transform coding technique followed by DPCM are compared using ERTS-1 data.
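
    A minimal sketch of plain closed-loop DPCM with a uniform quantizer (the fold-over extension and the adaptive transform stage are omitted; the details are generic, not the paper's implementation):

        import numpy as np

        def dpcm_encode(x, step):
            """Quantize the prediction residual; the previous reconstruction predicts the next sample."""
            codes, pred = [], 0.0
            for v in x:
                q = int(round((v - pred) / step))     # quantized residual index
                codes.append(q)
                pred += q * step                      # track the decoder-side reconstruction
            return codes

        def dpcm_decode(codes, step):
            out, pred = [], 0.0
            for q in codes:
                pred += q * step
                out.append(pred)
            return np.array(out)

        x = np.cumsum(np.random.randn(100))           # correlated test signal
        rec = dpcm_decode(dpcm_encode(x, step=0.5), step=0.5)
        print(np.max(np.abs(rec - x)) <= 0.25)        # error bounded by step/2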

  9. Distinct timescales of population coding across cortex.

    Science.gov (United States)

    Runyan, Caroline A; Piasini, Eugenio; Panzeri, Stefano; Harvey, Christopher D

    2017-08-03

    The cortex represents information across widely varying timescales. For instance, sensory cortex encodes stimuli that fluctuate over few tens of milliseconds, whereas in association cortex behavioural choices can require the maintenance of information over seconds. However, it remains poorly understood whether diverse timescales result mostly from features intrinsic to individual neurons or from neuronal population activity. This question remains unanswered, because the timescales of coding in populations of neurons have not been studied extensively, and population codes have not been compared systematically across cortical regions. Here we show that population codes can be essential to achieve long coding timescales. Furthermore, we find that the properties of population codes differ between sensory and association cortices. We compared coding for sensory stimuli and behavioural choices in auditory cortex and posterior parietal cortex as mice performed a sound localization task. Auditory stimulus information was stronger in auditory cortex than in posterior parietal cortex, and both regions contained choice information. Although auditory cortex and posterior parietal cortex coded information by tiling in time neurons that were transiently informative for approximately 200 milliseconds, the areas had major differences in functional coupling between neurons, measured as activity correlations that could not be explained by task events. Coupling among posterior parietal cortex neurons was strong and extended over long time lags, whereas coupling among auditory cortex neurons was weak and short-lived. Stronger coupling in posterior parietal cortex led to a population code with long timescales and a representation of choice that remained consistent for approximately 1 second. In contrast, auditory cortex had a code with rapid fluctuations in stimulus and choice information over hundreds of milliseconds. Our results reveal that population codes differ across cortex

  10. Building energy, building leadership : recommendations for the adoption, development, and implementation of a commercial building energy code in Manitoba

    Energy Technology Data Exchange (ETDEWEB)

    Akerstream, T. [Manitoba Hydro, Winnipeg, MB (Canada); Allard, K. [City of Thompson, Thompson, MB (Canada); Anderson, N.; Beacham, D. [Manitoba Office of the Fire Commissioner, Winnipeg, MB (Canada); Andrich, R. [The Forks North Portage Partnership, MB (Canada); Auger, A. [Natural Resources Canada, Ottawa, ON (Canada). Office of Energy Efficiency; Downs, R.G. [Shindico Realty Inc., Winnipeg, MB (Canada); Eastwood, R. [Number Ten Architectural Group, Winnipeg, MB (Canada); Hewitt, C. [SMS Engineering Ltd., Winnipeg, MB (Canada); Joshi, D. [City of Winnipeg, Winnipeg, MB (Canada); Klassen, K. [Manitoba Dept. of Energy Science and Technology, Winnipeg, MB (Canada); Phillips, B. [Unies Ltd., Winnipeg, MB (Canada); Wiebe, R. [Ben Wiebe Construction Ltd., Winnipeg, MB (Canada); Woelk, D. [Bockstael Construction Ltd., Winnipeg, MB (Canada); Ziemski, S. [CREIT Management LLP, Winnipeg, MB (Canada)

    2006-09-15

    This report presented a strategy and a set of recommendations for the adoption, development and implementation of an energy code for new commercial construction in Manitoba. The report was compiled by an advisory committee comprised of industry representatives and government agency representatives. Recommendations were divided into 4 categories: (1) advisory committee recommendations; (2) code adoption recommendations; (3) code development recommendations; and (4) code implementation recommendations. It was suggested that Manitoba should adopt an amended version of the Model National Energy Code for Buildings (1997) as a regulation under the Buildings and Mobile Homes Act. Participation in a national initiative to update the Model National Energy Code for Buildings was also advised. It was suggested that the energy code should be considered as the first step in a longer-term process towards a sustainable commercial building code. However, the code should be adopted within the context of a complete market transformation approach. Other recommendations included: the establishment of a multi-stakeholder energy code task group; the provision of information and technical resources to help build industry capacity; the establishment of a process for energy code compliance; and an ongoing review of the energy code to assess impacts and progress. Supplemental recommendations for future discussion included the need for integrated design by building design teams in Manitoba; the development of a program to provide technical assistance to building design teams; and collaboration between post-secondary institutions to develop and deliver courses on integrated building design to students and professionals. 17 refs.

  11. Optical image encryption based on real-valued coding and subtracting with the help of QR code

    Science.gov (United States)

    Deng, Xiaopeng

    2015-08-01

    A novel optical image encryption method based on real-valued coding and subtracting is proposed with the help of a quick response (QR) code. In the encryption process, the original image to be encoded is first transformed into the corresponding QR code, which is then encoded into two phase-only masks (POMs) by using basic vector operations. Finally, the absolute values of the real or imaginary parts of the two POMs are chosen as the ciphertexts. In the decryption process, the QR code can be approximately restored by recording the intensity of the subtraction between the ciphertexts, and hence the original image can be retrieved without any quality loss by scanning the restored QR code with a smartphone. Simulation results and actual smartphone-collected results show that the method is feasible and has strong tolerance to noise, phase difference, and the ratio between the intensities of the two decryption light beams.

  12. Fast Transform Decoding Of Nonsystematic Reed-Solomon Codes

    Science.gov (United States)

    Truong, Trieu-Kie; Cheung, Kar-Ming; Shiozaki, A.; Reed, Irving S.

    1992-01-01

    A fast, efficient Fermat number transform is used to compute F'(x), analogous to the computation of the syndrome in a conventional decoding scheme. This eliminates polynomial multiplications and reduces the number of multiplications in the reconstruction of F'(x) to n log(n). The Euclidean algorithm is used to evaluate F(x) directly, without going through the intermediate steps of solving the error-locator and error-evaluator polynomials. The algorithm is suitable for implementation in very-large-scale integrated circuits.
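
    A minimal sketch of a length-16 Fermat number transform modulo the Fermat prime F_3 = 257, where the root of unity is 2 so every twiddle multiplication is a bit shift in hardware (illustrative only; a practical decoder uses longer transforms and a fast O(n log n) algorithm):

        M, N, ROOT = 257, 16, 2       # Fermat prime F_3 = 2**8 + 1; 2 has order 16 mod 257

        def fnt(a, root=ROOT):
            """Naive O(N^2) number-theoretic transform over GF(257); multiplying
            by powers of 2 amounts to bit shifts in hardware."""
            return [sum(a[i] * pow(root, i * j, M) for i in range(N)) % M
                    for j in range(N)]

        def ifnt(A):
            n_inv = pow(N, -1, M)
            return [n_inv * x % M for x in fnt(A, root=pow(ROOT, -1, M))]

        a = [5, 0, 0, 3] + [0] * 12
        assert ifnt(fnt(a)) == a      # transform round trip recovers the sequence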

  13. Semantic and phonological coding in poor and normal readers.

    Science.gov (United States)

    Vellutino, F R; Scanlon, D M; Spearing, D

    1995-02-01

    Three studies were conducted evaluating semantic and phonological coding deficits as alternative explanations of reading disability. In the first study, poor and normal readers in second and sixth grade were compared on various tests evaluating semantic development as well as on tests evaluating rapid naming and pseudoword decoding as independent measures of phonological coding ability. In a second study, the same subjects were given verbal memory and visual-verbal learning tasks using high and low meaning words as verbal stimuli and Chinese ideographs as visual stimuli. On the semantic tasks, poor readers performed below the level of the normal readers only at the sixth grade level, but, on the rapid naming and pseudoword learning tasks, they performed below the normal readers at the second as well as at the sixth grade level. On both the verbal memory and visual-verbal learning tasks, performance in poor readers approximated that of normal readers when the word stimuli were high in meaning but not when they were low in meaning. These patterns were essentially replicated in a third study that used some of the same semantic and phonological measures used in the first experiment, and verbal memory and visual-verbal learning tasks that employed word lists and visual stimuli (novel alphabetic characters) that more closely approximated those used in learning to read. It was concluded that semantic coding deficits are an unlikely cause of reading difficulties in most poor readers at the beginning stages of reading skills acquisition, but accrue as a consequence of prolonged reading difficulties in older readers. It was also concluded that phonological coding deficits are a probable cause of reading difficulties in most poor readers.

  14. Application of software quality assurance to a specific scientific code development task

    International Nuclear Information System (INIS)

    Dronkers, J.J.

    1986-03-01

    This paper describes an application of software quality assurance to a specific scientific code development program. The software quality assurance program consists of three major components: administrative control, configuration management, and user documentation. The program attempts to be consistent with existing local traditions of scientific code development while at the same time providing a controlled process of development

  15. A Measure of Search Efficiency in a Real World Search Task (PREPRINT)

    Science.gov (United States)

    2009-02-16

    Beck, Melissa R. (LSU); Lohrenz, Maura C. (NRL Code 7440.1); Trafton, J. Gregory (NRL Code 5515) [Standard Form 298 documentation-page fields omitted; abstract not recovered]

  16. Retrofilling of Railroad Transformers

    Science.gov (United States)

    1978-09-01

    The objective of this program was to assess the effectiveness of retrofilling an askarel transformer supplied by the United States Department of Transportation with a 50 centistokes silicone fluid. The work tasks included an assessment of the electri...

  17. When Content Matters: The Role of Processing Code in Tactile Display Design.

    Science.gov (United States)

    Ferris, Thomas K; Sarter, Nadine

    2010-01-01

    The distribution of tasks and stimuli across multiple modalities has been proposed as a means to support multitasking in data-rich environments. Recently, the tactile channel and, more specifically, communication via the use of tactile/haptic icons have received considerable interest. Past research has examined primarily the impact of concurrent task modality on the effectiveness of tactile information presentation. However, it is not well known to what extent the interpretation of iconic tactile patterns is affected by another attribute of information: the information processing codes of concurrent tasks. In two driving simulation studies (n = 25 for each), participants decoded icons composed of either spatial or nonspatial patterns of vibrations (engaging spatial and nonspatial processing code resources, respectively) while concurrently interpreting spatial or nonspatial visual task stimuli. As predicted by Multiple Resource Theory, performance was significantly worse (approximately 5-10 percent worse) when the tactile icons and visual tasks engaged the same processing code, with the overall worst performance in the spatial-spatial task pairing. The findings from these studies contribute to an improved understanding of information processing and can serve as input to multidimensional quantitative models of timesharing performance. From an applied perspective, the results suggest that competition for processing code resources warrants consideration, alongside other factors such as the naturalness of signal-message mapping, when designing iconic tactile displays. Nonspatially encoded tactile icons may be preferable in environments which already rely heavily on spatial processing, such as car cockpits.

  18. A Radiation Solver for the National Combustion Code

    Science.gov (United States)

    Sockol, Peter M.

    2015-01-01

    A methodology is given that converts an existing finite volume radiative transfer method that requires input of local absorption coefficients to one that can treat a mixture of combustion gases and compute the coefficients on the fly from the local mixture properties. The full-spectrum k-distribution method is used to transform the radiative transfer equation (RTE) to an alternate wave number variable, g. The coefficients in the transformed equation are calculated at discrete temperatures and participating species mole fractions that span the values of the problem for each value of g. These results are stored in a table and interpolation is used to find the coefficients at every cell in the field. Finally, the transformed RTE is solved for each g and Gaussian quadrature is used to find the radiant heat flux throughout the field. The present implementation is in an existing Cartesian/cylindrical grid radiative transfer code and the local mixture properties are given by a solution of the National Combustion Code (NCC) on the same grid. Based on this work, the intention is to apply this method to an existing unstructured grid radiation code, which can then be coupled directly to NCC.
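
    A minimal sketch of the table-interpolation and quadrature steps, with hypothetical tabulated coefficients and a stand-in for the per-g RTE solve (Gauss-Legendre points mapped to g in [0, 1]):

        import numpy as np

        # Hypothetical precomputed table: absorption coefficient k(g, T) on a grid
        g_grid = np.linspace(0.0, 1.0, 32)
        T_grid = np.linspace(300.0, 2400.0, 8)
        k_table = np.random.rand(8, 32)            # placeholder for tabulated values

        def k_local(g, T):
            """Bilinear interpolation of the tabulated coefficient at (g, T)."""
            ks = [np.interp(g, g_grid, row) for row in k_table]
            return np.interp(T, T_grid, ks)

        # Gaussian quadrature over g in [0, 1]: flux ~ sum_i w_i * I(g_i)
        nodes, weights = np.polynomial.legendre.leggauss(8)
        g_pts = 0.5 * (nodes + 1.0); w = 0.5 * weights

        def intensity(g, T=1500.0):                # stand-in for the per-g RTE solve
            return np.exp(-k_local(g, T))

        flux = sum(wi * intensity(gi) for wi, gi in zip(w, g_pts))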

  19. Improved side information generation for distributed video coding

    DEFF Research Database (Denmark)

    Huang, Xin; Forchhammer, Søren

    2008-01-01

    As a new coding paradigm, distributed video coding (DVC) deals with lossy source coding using side information to exploit the statistics at the decoder to reduce computational demands at the encoder. The performance of DVC highly depends on the quality of the side information. With a better side information generation method, fewer bits will be requested from the encoder and more reliable decoded frames will be obtained. In this paper, a side information generation method is introduced to further improve the rate-distortion (RD) performance of transform domain distributed video coding. This algorithm

  20. Design of variable-weight quadratic congruence code for optical CDMA

    Science.gov (United States)

    Feng, Gang; Cheng, Wen-Qing; Chen, Fu-Jun

    2015-09-01

    A variable-weight code family referred to as variable-weight quadratic congruence code (VWQCC) is constructed by algebraic transformation for incoherent synchronous optical code division multiple access (OCDMA) systems. Compared with the quadratic congruence code (QCC), VWQCC doubles the code cardinality and provides multiple code-sets with variable code-weights. Moreover, the bit-error rate (BER) performance of VWQCC is superior to that of conventional variable-weight codes by removing or padding pulses under the same chip-power assumption. Experimental results show that VWQCC can be well applied to OCDMA with quality of service (QoS) requirements.

  1. Fusing a Transformation Language with an Open Compiler

    NARCIS (Netherlands)

    Kalleberg, K.T.; Visser, E.

    2007-01-01

    Program transformation systems provide powerful analysis and transformation frameworks as well as concise languages for language processing, but instantiating them for every subject language is an arduous task, most often resulting in half-completed frontends. Compilers provide mature frontends with

  2. Articulating training methods using Job Task Analysis (JTA) - determined proficiency levels

    International Nuclear Information System (INIS)

    McDonald, B.A.

    1985-01-01

    The INPO task analysis process, as well as that of many utilities, is based on the approach used by the US Navy. This is undoubtedly due to the Navy nuclear background of many of those involved in introducing the systems approach to training to the nuclear power industry. This report outlines an approach, used by a major North-Central utility, which includes a process developed by the Air Force. Air Force task analysis and instructional system development include the use of a proficiency code. The code includes consideration of three types of learning - task performance, task knowledge, and subject knowledge - and four levels of competence for each. The use of this classification system facilitates the identification of desired competency levels at the completion of formal training in the classroom and lab, and of informal training on the job. By using the Air Force's proficiency code, the utility's program developers were able to develop generic training for its main training facility and site-specific training at its nuclear plants, using the most efficient and cost-effective training methods.

  3. Myths and realities of rateless coding

    KAUST Repository

    Bonello, Nicholas

    2011-08-01

    Fixed-rate and rateless channel codes are generally treated separately in the related research literature and so, a novice in the field inevitably gets the impression that these channel codes are unrelated. By contrast, in this treatise, we endeavor to further develop a link between the traditional fixed-rate codes and the recently developed rateless codes by delving into their underlying attributes. This joint treatment is beneficial for two principal reasons. First, it facilitates the task of researchers and practitioners, who might be familiar with fixed-rate codes and would like to jump-start their understanding of the recently developed concepts in the rateless reality. Second, it provides grounds for extending the use of the well-understood code-design tools, originally contrived for fixed-rate codes, to the realm of rateless codes. Indeed, these versatile tools proved to be vital in the design of diverse fixed-rate-coded communications systems, and thus our hope is that they will further elucidate the associated performance ramifications of the rateless coded schemes. © 2011 IEEE.

  4. Army Business Transformation - Next Steps

    National Research Council Canada - National Science Library

    2006-01-01

    As a follow-on to the Army Science Board 2005 Summer Study on Best Practices, the Army Science Board was tasked to identify areas where alternative approaches and application of transforming practices...

  5. Discrete Sparse Coding.

    Science.gov (United States)

    Exarchakis, Georgios; Lücke, Jörg

    2017-11-01

    Sparse coding algorithms with continuous latent variables have been the subject of a large number of studies. However, discrete latent spaces for sparse coding have been largely ignored. In this work, we study sparse coding with latents described by discrete instead of continuous prior distributions. We consider the general case in which the latents (while being sparse) can take on any value of a finite set of possible values and in which we learn the prior probability of any value from data. This approach can be applied to any data generated by discrete causes, and it can be applied as an approximation of continuous causes. As the prior probabilities are learned, the approach then allows for estimating the prior shape without assuming specific functional forms. To efficiently train the parameters of our probabilistic generative model, we apply a truncated expectation-maximization approach (expectation truncation) that we modify to work with a general discrete prior. We evaluate the performance of the algorithm by applying it to a variety of tasks: (1) we use artificial data to verify that the algorithm can recover the generating parameters from a random initialization, (2) use image patches of natural images and discuss the role of the prior for the extraction of image components, (3) use extracellular recordings of neurons to present a novel method of analysis for spiking neurons that includes an intuitive discretization strategy, and (4) apply the algorithm on the task of encoding audio waveforms of human speech. The diverse set of numerical experiments presented in this letter suggests that discrete sparse coding algorithms can scale efficiently to work with realistic data sets and provide novel statistical quantities to describe the structure of the data.

  6. Supervised Convolutional Sparse Coding

    KAUST Repository

    Affara, Lama Ahmed

    2018-04-08

    Convolutional Sparse Coding (CSC) is a well-established image representation model especially suited for image restoration tasks. In this work, we extend the applicability of this model by proposing a supervised approach to convolutional sparse coding, which aims at learning discriminative dictionaries instead of purely reconstructive ones. We incorporate a supervised regularization term into the traditional unsupervised CSC objective to encourage the final dictionary elements to be discriminative. Experimental results show that using supervised convolutional learning results in two key advantages. First, we learn more semantically relevant filters in the dictionary and second, we achieve improved image reconstruction on unseen data.

  7. Coding task performance in early adolescence: A large-scale controlled study into boy-girl differences

    Directory of Open Access Journals (Sweden)

    Sanne Dekker

    2013-08-01

    Full Text Available This study examined differences between boys and girls regarding the efficiency of information processing in early adolescence. 306 healthy adolescents (50.3% boys) in grades 7 and 9 (aged 13 and 15, respectively) performed a coding task based on over-learned symbols. An age effect was revealed, as subjects in grade 9 performed better than subjects in grade 7. A main effect of sex was found in favor of girls: the 25% best-performing students comprised twice as many girls as boys, and the opposite pattern was found for the worst-performing 25%. In addition, a main effect was found for educational track in favor of the highest track. No interaction effects were found. School grades did not explain additional variance in LDST performance, indicating that cognitive performance is relatively independent of school performance. Student characteristics such as age, sex, and education level were more important for efficiency of information processing than school performance. The findings imply that after age 13, efficiency of information processing is still developing and that girls outperform boys in this respect. The findings provide new information on the mechanisms underlying boy-girl differences in scholastic performance.

  8. Parallel implementation of geometric transformations

    Energy Technology Data Exchange (ETDEWEB)

    Clarke, K A; Ip, H H.S.

    1982-10-01

    An implementation of digitized picture rotation and magnification based on Weiman's algorithm is presented. On a programmable array machine, routines that perform small transformations can be coded efficiently. The method illustrates the interpolative nature of the algorithm. 6 references.

  9. Phonologically-Based Priming in the Same-Different Task With L1 Readers.

    Science.gov (United States)

    Lupker, Stephen J; Nakayama, Mariko; Yoshihara, Masahiro

    2018-02-01

    The present experiment provides an investigation of a promising new tool, the masked priming same-different task, for investigating the orthographic coding process. Orthographic coding is the process of establishing a mental representation of the letters and letter order in the word being read which is then used by readers to access higher-level (e.g., semantic) information about that word. Prior research (e.g., Norris & Kinoshita, 2008) had suggested that performance in this task may be based entirely on orthographic codes. As reported by Lupker, Nakayama, and Perea (2015a), however, in at least some circumstances, phonological codes also play a role. Specifically, even though their 2 languages are completely different orthographically, Lupker et al.'s Japanese-English bilinguals showed priming in this task when masked L1 primes were phonologically similar to L2 targets. An obvious follow-up question is whether Lupker et al.'s effect might have resulted from a strategy that was adopted by their bilinguals to aid in processing of, and memory for, the somewhat unfamiliar L2 targets. In the present experiment, Japanese readers responded to (Japanese) Kanji targets with phonologically identical primes (on "related" trials) being presented in a completely different but highly familiar Japanese script, Hiragana. Once again, significant priming effects were observed, indicating that, although performance in the masked priming same-different task may be mainly based on orthographic codes, phonological codes can play a role even when the stimuli being matched are familiar words from a reader's L1.

  10. Bit-wise arithmetic coding for data compression

    Science.gov (United States)

    Kiely, A. B.

    1994-01-01

    This article examines the problem of compressing a uniformly quantized independent and identically distributed (IID) source. We present a new compression technique, bit-wise arithmetic coding, that assigns fixed-length codewords to the quantizer output and uses arithmetic coding to compress the codewords, treating the codeword bits as independent. We examine the performance of this method and evaluate the overhead required when used block-adaptively. Simulation results are presented for Gaussian and Laplacian sources. This new technique could be used as the entropy coder in a transform or subband coding system.
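
    A minimal sketch of the modelling idea: estimating per-position bit probabilities and the resulting ideal code length (the arithmetic coder itself and the block adaptation are omitted):

        import numpy as np

        def bitwise_code_length(codewords, width):
            """Treat codeword bits as independent: estimate p(bit=1) per position,
            then sum the ideal code length -log2(p) over all bits."""
            bits = np.array([[(c >> k) & 1 for k in range(width)] for c in codewords])
            p1 = bits.mean(axis=0).clip(1e-6, 1 - 1e-6)   # per-position probabilities
            probs = np.where(bits == 1, p1, 1 - p1)
            return -np.log2(probs).sum()                   # total ideal bits

        # Quantizer outputs concentrated near zero compress below fixed length
        samples = np.clip(np.round(np.random.laplace(scale=2.0, size=1000)), -8, 7)
        codewords = samples.astype(int) + 8                # 4-bit fixed-length indices
        print(bitwise_code_length(codewords, 4), "vs", 4 * len(codewords))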

  11. Survey of nuclear fuel-cycle codes

    International Nuclear Information System (INIS)

    Thomas, C.R.; de Saussure, G.; Marable, J.H.

    1981-04-01

    A two-month survey of nuclear fuel-cycle models was undertaken. This report presents the information forthcoming from the survey. Of the nearly thirty codes reviewed in the survey, fifteen of these codes have been identified as potentially useful in fulfilling the tasks of the Nuclear Energy Analysis Division (NEAD) as defined in their FY 1981-1982 Program Plan. Six of the fifteen codes are given individual reviews. The individual reviews address such items as the funding agency, the author and organization, the date of completion of the code, adequacy of documentation, computer requirements, history of use, variables that are input and forecast, type of reactors considered, part of fuel cycle modeled and scope of the code (international or domestic, long-term or short-term, regional or national). The report recommends that the Model Evaluation Team perform an evaluation of the EUREKA uranium mining and milling code

  12. Developing a Coding Scheme to Analyse Creativity in Highly-constrained Design Activities

    DEFF Research Database (Denmark)

    Dekoninck, Elies; Yue, Huang; Howard, Thomas J.

    2010-01-01

    This work is part of a larger project which aims to investigate the nature of creativity and the effectiveness of creativity tools in highly-constrained design tasks. This paper presents the research where a coding scheme was developed and tested with a designer-researcher who conducted two rounds of design and analysis on a highly constrained design task. This paper shows how design changes can be coded using a scheme based on creative 'modes of change'. The coding scheme can show the way a designer moves around the design space, and particularly the strategies that are used by a creative designer. A larger study with more designers working on different types of highly-constrained design task is needed, in order to draw conclusions on the modes of change and their relationship to creativity.

  13. Multispectral code excited linear prediction coding and its application in magnetic resonance images.

    Science.gov (United States)

    Hu, J H; Wang, Y; Cahill, P T

    1997-01-01

    This paper reports a multispectral code excited linear prediction (MCELP) method for the compression of multispectral images. Different linear prediction models and adaptation schemes have been compared. The method that uses a forward adaptive autoregressive (AR) model has been proven to achieve a good compromise between performance, complexity, and robustness. This approach is referred to as the MFCELP method. Given a set of multispectral images, the linear predictive coefficients are updated over nonoverlapping three-dimensional (3-D) macroblocks. Each macroblock is further divided into several 3-D micro-blocks, and the best excitation signal for each microblock is determined through an analysis-by-synthesis procedure. The MFCELP method has been applied to multispectral magnetic resonance (MR) images. To satisfy the high quality requirement for medical images, the error between the original image set and the synthesized one is further specified using a vector quantizer. This method has been applied to images from 26 clinical MR neuro studies (20 slices/study, three spectral bands/slice, 256x256 pixels/band, 12 b/pixel). The MFCELP method provides a significant visual improvement over the discrete cosine transform (DCT) based Joint Photographers Expert Group (JPEG) method, the wavelet transform based embedded zero-tree wavelet (EZW) coding method, and the vector tree (VT) coding method, as well as the multispectral segmented autoregressive moving average (MSARMA) method we developed previously.

  14. Systemizers Are Better Code-Breakers: Self-Reported Systemizing Predicts Code-Breaking Performance in Expert Hackers and Naïve Participants

    Science.gov (United States)

    Harvey, India; Bolgan, Samuela; Mosca, Daniel; McLean, Colin; Rusconi, Elena

    2016-01-01

    Studies on hacking have typically focused on motivational aspects and general personality traits of the individuals who engage in hacking; little systematic research has been conducted on predispositions that may be associated not only with the choice to pursue a hacking career but also with performance in either naïve or expert populations. Here, we test the hypotheses that two traits that are typically enhanced in autism spectrum disorders—attention to detail and systemizing—may be positively related to both the choice of pursuing a career in information security and skilled performance in a prototypical hacking task (i.e., crypto-analysis or code-breaking). A group of naïve participants and of ethical hackers completed the Autism Spectrum Quotient, including an attention to detail scale, and the Systemizing Quotient (Baron-Cohen et al., 2001, 2003). They were also tested with behavioral tasks involving code-breaking and a control task involving security X-ray image interpretation. Hackers reported significantly higher systemizing and attention to detail than non-hackers. We found a positive relation between self-reported systemizing (but not attention to detail) and code-breaking skills in both hackers and non-hackers, whereas attention to detail (but not systemizing) was related with performance in the X-ray screening task in both groups, as previously reported with naïve participants (Rusconi et al., 2015). We discuss the theoretical and translational implications of our findings. PMID:27242491

  15. Systemizers are better code-breakers: Self-reported systemizing predicts code-breaking performance in expert hackers and naïve participants

    Directory of Open Access Journals (Sweden)

    India eHarvey

    2016-05-01

    Full Text Available Studies on hacking have typically focused on motivational aspects and general personality traits of the individuals who engage in hacking; little systematic research has been conducted on predispositions that may be associated not only with the choice to pursue a hacking career but also with performance in either naïve or expert populations. Here we test the hypotheses that two traits that are typically enhanced in autism spectrum disorders - attention to detail and systemizing - may be positively related to both the choice of pursuing a career in information security and skilled performance in a prototypical hacking task (i.e. crypto-analysis or code-breaking). A group of naïve participants and of ethical hackers completed the Autism Spectrum Quotient, including an attention to detail scale, and the Systemizing Quotient (Baron-Cohen et al., 2001; Baron-Cohen et al., 2003). They were also tested with behavioural tasks involving code-breaking and a control task involving security x-ray image interpretation. Hackers reported significantly higher systemizing and attention to detail than non-hackers. We found a positive relation between self-reported systemizing (but not attention to detail) and code-breaking skills in both hackers and non-hackers, whereas attention to detail (but not systemizing) was related with performance in the x-ray screening task in both groups, as previously reported with naïve participants (Rusconi et al., 2015). We discuss the theoretical and translational implications of our findings.

  17. Mechanical Transformation of Task Heuristics into Operational Procedures

    Science.gov (United States)

    1981-04-14

    A central theme of recent research in artificial intelligence is that intelligent task performance requires large amounts of knowledge... [The remainder of the scanned abstract is unrecoverable OCR residue: a LISP production-rule fragment for card play and a citation to R. Davis, "Production rules as a representation for a knowledge based consultation system," Artificial Intelligence 8:15-45, Spring 1977.]

  18. Banking and Financial Services Series. Duty Task List.

    Science.gov (United States)

    Oklahoma State Dept. of Vocational and Technical Education, Stillwater. Curriculum and Instructional Materials Center.

    This document contains the occupational duty/task lists for five occupations in the banking and financial services series. Each occupation is divided into seven or eight duties. A separate page for each duty in the occupation lists the tasks in that duty along with its code number and columns to indicate whether that particular duty has been…

  19. Facial expression coding in children and adolescents with autism: Reduced adaptability but intact norm-based coding.

    Science.gov (United States)

    Rhodes, Gillian; Burton, Nichola; Jeffery, Linda; Read, Ainsley; Taylor, Libby; Ewing, Louise

    2018-05-01

    Individuals with autism spectrum disorder (ASD) can have difficulty recognizing emotional expressions. Here, we asked whether the underlying perceptual coding of expression is disrupted. Typical individuals code expression relative to a perceptual (average) norm that is continuously updated by experience. This adaptability of face-coding mechanisms has been linked to performance on various face tasks. We used an adaptation aftereffect paradigm to characterize expression coding in children and adolescents with autism. We asked whether face expression coding is less adaptable in autism and whether there is any fundamental disruption of norm-based coding. If expression coding is norm-based, then the face aftereffects should increase with adaptor expression strength (distance from the average expression). We observed this pattern in both autistic and typically developing participants, suggesting that norm-based coding is fundamentally intact in autism. Critically, however, expression aftereffects were reduced in the autism group, indicating that expression-coding mechanisms are less readily tuned by experience. Reduced adaptability has also been reported for coding of face identity and gaze direction. Thus, there appears to be a pervasive lack of adaptability in face-coding mechanisms in autism, which could contribute to face processing and broader social difficulties in the disorder. © 2017 The British Psychological Society.

  20. Automatic code generation in practice

    DEFF Research Database (Denmark)

    Adam, Marian Sorin; Kuhrmann, Marco; Schultz, Ulrik Pagh

    2016-01-01

    Mobile robots often use a distributed architecture in which software components are deployed to heterogeneous hardware modules. Ensuring the consistency with the designed architecture is a complex task, notably if functional safety requirements have to be fulfilled. We propose to use a domain-specific language to specify those requirements and to allow for generating a safety-enforcing layer of code, which is deployed to the robot. The paper at hand reports experiences in practically applying code generation to mobile robots. For two cases, we discuss how we addressed challenges, e.g., regarding weaving code generation into proprietary development environments and testing of manually written code. We find that a DSL based on the same conceptual model can be used across different kinds of hardware modules, but a significant adaptation effort is required in practical scenarios involving different kinds...

  1. Fast delta Hadamard transform

    International Nuclear Information System (INIS)

    Fenimore, E.E.; Weston, G.S.

    1981-01-01

    In many fields (e.g., spectroscopy, imaging spectroscopy, photoacoustic imaging, coded aperture imaging) binary bit patterns known as m sequences are used to encode (by multiplexing) a series of measurements in order to obtain a larger throughput. The observed measurements must be decoded to obtain the desired spectrum (or image in the case of coded aperture imaging). Decoding in the past has used a technique called the fast Hadamard transform (FHT) whose chief advantage is that it can reduce the computational effort from N^2 multiplies to N log2 N additions or subtractions. However, the FHT has the disadvantage that it does not readily allow one to sample more finely than the number of bits used in the m sequence. This can limit the obtainable resolution and cause confusion near the sample boundaries (phasing errors). Both 1-D and 2-D methods (called fast delta Hadamard transforms, FDHT) have been developed which overcome both of the above limitations. Applications of the FDHT are discussed in the context of Hadamard spectroscopy and coded aperture imaging with uniformly redundant arrays. Special emphasis has been placed on how the FDHT can unite techniques used by both of these fields into the same mathematical basis.
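
    For reference, the core butterfly that gives the FHT its N log2 N cost can be sketched in a few lines of Python (a plain in-place Walsh-Hadamard sketch; the delta variants add the finer-than-the-bit-pattern sampling described above):

        import numpy as np

        def fwht(a):
            # Fast Walsh-Hadamard transform: log2(N) passes of paired
            # additions/subtractions instead of an N^2 matrix multiply.
            a = np.asarray(a, dtype=float).copy()
            h = 1
            while h < a.size:
                for i in range(0, a.size, 2 * h):
                    for j in range(i, i + h):
                        a[j], a[j + h] = a[j] + a[j + h], a[j] - a[j + h]
                h *= 2
            return a

        print(fwht([1, 0, 1, 0, 0, 1, 1, 0]))  # length must be a power of two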

  2. Coding of auditory temporal and pitch information by hippocampal individual cells and cell assemblies in the rat.

    Science.gov (United States)

    Sakurai, Y

    2002-01-01

    This study reports how hippocampal individual cells and cell assemblies cooperate for neural coding of pitch and temporal information in memory processes for auditory stimuli. Each rat performed two tasks, one requiring discrimination of auditory pitch (high or low) and the other requiring discrimination of their duration (long or short). Some CA1 and CA3 complex-spike neurons showed task-related differential activity between the high and low tones in only the pitch-discrimination task. However, without exception, neurons which showed task-related differential activity between the long and short tones in the duration-discrimination task were always task-related neurons in the pitch-discrimination task. These results suggest that temporal information (long or short), in contrast to pitch information (high or low), cannot be coded independently by specific neurons. The results also indicate that the two different behavioral tasks cannot be fully differentiated by the task-related single neurons alone and suggest a model of cell-assembly coding of the tasks. Cross-correlation analysis among activities of simultaneously recorded multiple neurons supported the suggested cell-assembly model. Considering those results, this study concludes that dual coding by hippocampal single neurons and cell assemblies is working in memory processing of pitch and temporal information of auditory stimuli. The single neurons encode both auditory pitches and their temporal lengths and the cell assemblies encode types of tasks (contexts or situations) in which the pitch and the temporal information are processed.

  3. Research on Primary Shielding Calculation Source Generation Codes

    Science.gov (United States)

    Zheng, Zheng; Mei, Qiliang; Li, Hui; Shangguan, Danhua; Zhang, Guangchun

    2017-09-01

    Primary Shielding Calculation (PSC) plays an important role in reactor shielding design and analysis. In order to facilitate PSC, a source generation code is developed to generate cumulative distribution functions (CDF) for the source particle sample code of the J Monte Carlo Transport (JMCT) code, and a source particle sample code is developed to sample source particle directions, types, coordinates, energy and weights from the CDFs. A source generation code is also developed to transform three-dimensional (3D) power distributions in xyz geometry to source distributions in r θ z geometry for the J Discrete Ordinate Transport (JSNT) code. Validations on the PSC models of the Qinshan No.1 nuclear power plant (NPP) and the CAP1400 and CAP1700 reactors are performed. Numerical results show that the theoretical model and the codes are both correct.
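
    The sampling step can be sketched as an inverse-CDF lookup; the 1-D setting and names below are illustrative, not taken from the JMCT/JSNT codes (which sample directions, types, coordinates, energy and weights from tabulated CDFs):

        import numpy as np

        def sample_from_cdf(values, cdf, n):
            # Draw n samples by inverting a tabulated, non-decreasing CDF
            # (ending at 1.0) with uniform random numbers.
            u = np.random.rand(n)
            return values[np.searchsorted(cdf, u)]

        energies = np.array([1.0, 2.0, 14.1])   # toy source energy bins
        cdf = np.cumsum([0.5, 0.3, 0.2])        # their cumulative probabilities
        print(sample_from_cdf(energies, cdf, 5))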

  4. Short-Term Memory Coding in Children With Intellectual Disabilities

    OpenAIRE

    Henry, L.; Conners, F.

    2008-01-01

    To examine visual and verbal coding strategies, I asked children with intellectual disabilities and peers matched for MA and CA to perform picture memory span tasks with phonologically similar, visually similar, long, or nonsimilar named items. The CA group showed effects consistent with advanced verbal memory coding (phonological similarity and word length effects). Neither the intellectual disabilities nor MA groups showed evidence for memory coding strategies. However, children in these groups with MAs above 6 years showed significant visual similarity and word length effects, broadly consistent with an intermediate stage of dual visual and verbal coding. These results suggest that developmental progressions in memory coding strategies are independent of intellectual disabilities status and consistent with MA.

  5. Development of transformations from business process models to implementations by reuse

    NARCIS (Netherlands)

    Dirgahayu, T.; Quartel, Dick; van Sinderen, Marten J.; Ferreira Pires, Luis; Hammoudi, S.

    2007-01-01

    This paper presents an approach for developing transformations from business process models to implementations that facilitates reuse. A transformation is developed as a composition of three smaller tasks: pattern recognition, pattern realization and activity transformation. The approach allows one

  6. A New Class of Pulse Compression Codes and Techniques.

    Science.gov (United States)

    1980-03-26

    [The abstract of this scanned report is unrecoverable OCR residue of a block diagram: a Frank-code generator feeds a transform and digital-filter network, with a note that both the transform and the inverse transform drive the same digital filter network on transmit and receive.]

  7. Dual Coding and Bilingual Memory.

    Science.gov (United States)

    Paivio, Allan; Lambert, Wallace

    1981-01-01

    Describes study which tested a dual coding approach to bilingual memory using tasks that permit comparison of the effects of bilingual encoding with verbal-nonverbal dual encoding items. Results provide strong support for a version of the independent or separate stores view of bilingual memory. (Author/BK)

  8. Microdosimetry computation code of internal sources - MICRODOSE 1

    International Nuclear Information System (INIS)

    Li Weibo; Zheng Wenzhong; Ye Changqing

    1995-01-01

    This paper describes a microdosimetry computation code, MICRODOSE 1, on the basis of the following methods: (1) the method of calculating f1(z) for charged particles in unit-density tissues; (2) the method of calculating f(z) for a point source; (3) the method of applying the Fourier transform theory to the calculation of the compound Poisson process; (4) the method of using the fast Fourier transform technique to determine f(z). Some computed examples based on the code MICRODOSE 1 are given, including alpha particles emitted from 239Pu in the alveolar lung tissues and from radon progeny RaA and RaC in the human respiratory tract. (author). 13 refs., 6 figs
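
    The Fourier-transform treatment of the compound Poisson process (method 3 above) reduces to one line once the single-event spectrum is tabulated: if F1 is the transform of f1(z), the multi-event spectrum has transform exp(n(F1 - 1)) for mean event number n. The grid handling below is a schematic assumption, not the MICRODOSE 1 implementation:

        import numpy as np

        def multi_event_spectrum(f1, mean_events):
            # f1: single-event spectrum on a uniform z-grid, normalized to sum 1;
            # the grid should be zero-padded so the circular convolution implied
            # by the FFT does not wrap around. The z = 0 bin carries the
            # no-event (Poisson) probability exp(-mean_events).
            F1 = np.fft.fft(f1)
            return np.real(np.fft.ifft(np.exp(mean_events * (F1 - 1.0))))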

  9. Visual search asymmetries within color-coded and intensity-coded displays.

    Science.gov (United States)

    Yamani, Yusuke; McCarley, Jason S

    2010-06-01

    Color and intensity coding provide perceptual cues to segregate categories of objects within a visual display, allowing operators to search more efficiently for needed information. Even within a perceptually distinct subset of display elements, however, it may often be useful to prioritize items representing urgent or task-critical information. The design of symbology to produce search asymmetries (Treisman & Souther, 1985) offers a potential technique for doing this, but it is not obvious from existing models of search that an asymmetry observed in the absence of extraneous visual stimuli will persist within a complex color- or intensity-coded display. To address this issue, in the current study we measured the strength of a visual search asymmetry within displays containing color- or intensity-coded extraneous items. The asymmetry persisted strongly in the presence of extraneous items that were drawn in a different color (Experiment 1) or a lower contrast (Experiment 2) than the search-relevant items, with the targets favored by the search asymmetry producing highly efficient search. The asymmetry was attenuated but not eliminated when extraneous items were drawn in a higher contrast than search-relevant items (Experiment 3). Results imply that the coding of symbology to exploit visual search asymmetries can facilitate visual search for high-priority items even within color- or intensity-coded displays. PsycINFO Database Record (c) 2010 APA, all rights reserved.

  10. The role of social cues in the deployment of spatial attention: Head-body relationships automatically activate directional spatial codes in a Simon task

    Directory of Open Access Journals (Sweden)

    Iwona ePomianowska

    2012-02-01

    Full Text Available The role of body orientation in the orienting and allocation of social attention was examined using an adapted Simon paradigm. Participants categorized the facial expression of forward-facing, computer-generated human figures by pressing one of two response keys, each located left or right of the observers' body midline, while the orientation of the stimulus figure's body (trunk, arms, and legs), which was the task-irrelevant feature of interest, was manipulated (oriented towards the left or right visual hemifield) with respect to the spatial location of the required response. We found that when the orientation of the body was compatible with the required response location, responses were slower relative to when body orientation was incompatible with the response location. This reverse compatibility effect suggests that body orientation is automatically processed into a directional spatial code, but that this code is based on an integration of head and body orientation within an allocentric-based frame of reference. Moreover, we argue that this code may be derived from the motion information implied in the image of a figure when head and body orientation are incongruent. Our results have implications for understanding the nature of the information that affects the allocation of attention for social orienting.

  11. Short-Term Memory Coding in Children with Intellectual Disabilities

    Science.gov (United States)

    Henry, Lucy

    2008-01-01

    To examine visual and verbal coding strategies, I asked children with intellectual disabilities and peers matched for MA and CA to perform picture memory span tasks with phonologically similar, visually similar, long, or nonsimilar named items. The CA group showed effects consistent with advanced verbal memory coding (phonological similarity and…

  12. Large-Signal Code TESLA: Improvements in the Implementation and in the Model

    National Research Council Canada - National Science Library

    Chernyavskiy, Igor A; Vlasov, Alexander N; Anderson, Jr., Thomas M; Cooke, Simon J; Levush, Baruch; Nguyen, Khanh T

    2006-01-01

    We describe the latest improvements made in the large-signal code TESLA, which include transformation of the code to a Fortran-90/95 version with dynamical memory allocation and extension of the model...

  13. Arabic Natural Language Processing System Code Library

    Science.gov (United States)

    2014-06-01

    This technical note provides a brief description of a Java library for Arabic (and also English) natural language processing (NLP), containing code for training and applying the Arabic NLP system described in Stephen Tratz's paper "A Cross-Task Flexible Transition Model for Arabic Tokenization, Affix...".

  14. Feasibility of EPC to BPEL Model Transformations based on Ontology and Patterns

    NARCIS (Netherlands)

    Meertens, Lucas O.; Iacob, Maria Eugenia; Eckartz, Silja M.; Rinderle-Ma, Stefanie; Sadiq, Shazia; Leymann, Frank

    2010-01-01

    Model-Driven Engineering holds the promise of transforming business models into code automatically. This requires the concept of model transformation. In this paper, we assess the feasibility of model transformations from Event-driven Process Chain models to Business Process Execution Language

  15. International building code for bamboo

    NARCIS (Netherlands)

    Janssen, J.J.A.; Kumar, Arun; Ramanuja Rao, I.V.; Sastry, Cherla

    2002-01-01

    One of the recommendations in the International Bamboo Congress and Workshop, held at Bali in 1995, requested the International Network for Bamboo and Rattan (INBAR), "to organize a task force to discuss and finalize a building code for bamboo". Consequently a draft was prepared under the title, "An

  16. Coding and transmission of subband coded images on the Internet

    Science.gov (United States)

    Wah, Benjamin W.; Su, Xiao

    2001-09-01

    Subband-coded images can be transmitted in the Internet using either the TCP or the UDP protocol. Delivery by TCP gives superior decoding quality but with very long delays when the network is unreliable, whereas delivery by UDP has negligible delays but with degraded quality when packets are lost. Although images are delivered currently over the Internet by TCP, we study in this paper the use of UDP to deliver multi-description reconstruction-based subband-coded images. First, in order to facilitate recovery from UDP packet losses, we propose a joint sender-receiver approach for designing optimized reconstruction-based subband transform (ORB-ST) in multi-description coding (MDC). Second, we carefully evaluate the delay-quality trade-offs between the TCP delivery of SDC images and the UDP and combined TCP/UDP delivery of MDC images. Experimental results show that our proposed ORB-ST performs well in real Internet tests, and UDP and combined TCP/UDP delivery of MDC images provide a range of attractive alternatives to TCP delivery.

  17. Joint Source-Channel Coding by Means of an Oversampled Filter Bank Code

    Directory of Open Access Journals (Sweden)

    Marinkovic Slavica

    2006-01-01

    Full Text Available Quantized frame expansions based on block transforms and oversampled filter banks (OFBs) have been considered recently as joint source-channel codes (JSCCs) for erasure and error-resilient signal transmission over noisy channels. In this paper, we consider a coding chain involving an OFB-based signal decomposition followed by scalar quantization and a variable-length code (VLC) or a fixed-length code (FLC). This paper first examines the problem of channel error localization and correction in quantized OFB signal expansions. The error localization problem is treated as an M-ary hypothesis testing problem. The likelihood values are derived from the joint pdf of the syndrome vectors under various hypotheses of impulse noise positions, and in a number of consecutive windows of the received samples. The error amplitudes are then estimated by solving the syndrome equations in the least-squares sense. The message signal is reconstructed from the corrected received signal by a pseudoinverse receiver. We then improve the error localization procedure by introducing per-symbol reliability information in the hypothesis testing procedure of the OFB syndrome decoder. The per-symbol reliability information is produced by the soft-input soft-output (SISO) VLC/FLC decoders. This leads to the design of an iterative algorithm for joint decoding of an FLC and an OFB code. The performance of the algorithms developed is evaluated in a wavelet-based image coding system.

  18. User's guide to the biosphere code ECOS

    International Nuclear Information System (INIS)

    Kane, P.; Thorne, M.C.

    1984-10-01

    This report constitutes the user's guide to the biosphere model ECOS and provides a detailed description of the processes modelled and mathematical formulations used. The FORTRAN code ECOS is an equilibrium-type compartmental biosphere code. ECOS was designed with the objective of producing a general but comprehensive code for use in the assessment of the radiological impact of unspecified geological repositories for radioactive waste. ECOS transforms the rate of release of activity from the geosphere to the rate of accumulation of weighted committed effective dose equivalent (dose). Both maximum individual dose (critical group dose) and collective dose rates may be computed. (author)

  19. A discrete Fourier transform for virtual memory machines

    Science.gov (United States)

    Galant, David C.

    1992-01-01

    An algebraic theory of the Discrete Fourier Transform is developed in great detail. Examination of the details of the theory leads to a computationally efficient fast Fourier transform for use on computers with virtual memory. Such an algorithm is of great use on modern desktop machines. A FORTRAN coded version of the algorithm is given for the case when the length of the sequence of numbers to be transformed is a power of two.
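
    The locality idea behind virtual-memory-friendly FFTs can be illustrated with the well-known four-step decomposition: one long transform becomes two passes of short FFTs over a matrix view of the data, and each pass touches memory in large contiguous blocks. This is a generic Python sketch, not the paper's FORTRAN routine:

        import numpy as np

        def four_step_fft(x, n1, n2):
            # Computes the length-(n1*n2) DFT via row FFTs, a twiddle multiply,
            # and column FFTs; each stage streams over contiguous blocks.
            n = n1 * n2
            a = x.reshape(n2, n1).T                     # A[j1, j2] = x[j1 + n1*j2]
            b = np.fft.fft(a, axis=1)                   # n2-point FFTs along rows
            j1 = np.arange(n1)[:, None]
            k2 = np.arange(n2)[None, :]
            b = b * np.exp(-2j * np.pi * j1 * k2 / n)   # twiddle factors
            d = np.fft.fft(b, axis=0)                   # n1-point FFTs along columns
            return d.reshape(-1)                        # X[n2*k1 + k2] = D[k1, k2]

        x = np.random.rand(12) + 1j * np.random.rand(12)
        print(np.allclose(four_step_fft(x, 3, 4), np.fft.fft(x)))   # True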

  20. Reactor Systems Technology Division code development and configuration/quality control procedures

    International Nuclear Information System (INIS)

    Johnson, E.C.

    1985-06-01

    Procedures are prescribed for executing a code development task and implementing the resulting coding in an official version of a computer code. The responsibilities of the project manager, development staff members, and the Code Configuration/Quality Control Group are defined. Examples of forms, logs, computer job control language, and suggested outlines for reports associated with software production and implementation are included in Appendix A. 1 ref., 2 figs

  1. The Aster code

    International Nuclear Information System (INIS)

    Delbecq, J.M.

    1999-01-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R and D direction of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (materials behaviour, large deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy release rate, energy release rate in thermo-elasto-plasticity, 3D local energy release rate, KI and KII stress intensity factors, calculation of limit loads for structures); specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results, etc.); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and random dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  2. QR Codes as Finding Aides: Linking Electronic and Print Library Resources

    Science.gov (United States)

    Kane, Danielle; Schneidewind, Jeff

    2011-01-01

    As part of a focused, methodical, and evaluative approach to emerging technologies, QR codes are one of many new technologies being used by the UC Irvine Libraries. QR codes provide simple connections between print and virtual resources. In summer 2010, a small task force began to investigate how QR codes could be used to provide information and…

  3. Set-based Tasks within the Singularity-robust Multiple Task-priority Inverse Kinematics Framework: General Formulation, Stability Analysis and Experimental Results

    Directory of Open Access Journals (Sweden)

    Signe eMoe

    2016-04-01

    Full Text Available Inverse kinematics algorithms are commonly used in robotic systems to transform tasks to joint references, and several methods exist to ensure the achievement of several tasks simultaneously. The multiple task-priority inverse kinematics framework allows tasks to be considered in a prioritized order by projecting task velocities through the nullspaces of higher-priority tasks. This paper extends this framework to handle set-based tasks, i.e. tasks with a range of valid values, in addition to equality tasks, which have a specific desired value. Examples of set-based tasks are joint limit and obstacle avoidance. The proposed method is proven to ensure asymptotic convergence of the equality task errors and the satisfaction of all high-priority set-based tasks. The practical implementation of the proposed algorithm is discussed, and experimental results are presented where a number of both set-based and equality tasks have been implemented on a 6-degree-of-freedom UR5, an industrial robotic arm from Universal Robots. The experiments validate the theoretical results and confirm the effectiveness of the proposed approach.
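
    The equality-task core of this framework (projecting each successive task through the nullspace of all higher-priority tasks) can be sketched as the classic recursive prioritized scheme below; the set-based activation/deactivation logic of the paper is not shown:

        import numpy as np

        def prioritized_joint_velocities(jacobians, task_velocities):
            # Each task contributes only within the nullspace projector N of
            # all higher-priority tasks, so lower-priority tasks cannot
            # disturb those above them.
            dof = jacobians[0].shape[1]
            q_dot = np.zeros(dof)
            N = np.eye(dof)
            for J, x_dot in zip(jacobians, task_velocities):
                JN = J @ N
                q_dot = q_dot + np.linalg.pinv(JN) @ (x_dot - J @ q_dot)
                N = N @ (np.eye(dof) - np.linalg.pinv(JN) @ JN)
            return q_dot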

  4. Synthesizing Modular Invariants for Synchronous Code

    Directory of Open Access Journals (Sweden)

    Pierre-Loic Garoche

    2014-12-01

    Full Text Available In this paper, we explore different techniques to synthesize modular invariants for synchronous code encoded as Horn clauses. Modular invariants are a set of formulas that characterizes the validity of predicates. They are very useful for different aspects of analysis, synthesis, testing and program transformation. We describe two techniques to generate modular invariants for code written in the synchronous dataflow language Lustre. The first technique directly encodes the synchronous code in a modular fashion, while the second synthesizes modular invariants starting from a monolithic invariant. Both techniques take advantage of analysis techniques based on property-directed reachability. We also describe a technique to minimize the synthesized invariants.

  5. A Transform-Based Feature Extraction Approach for Motor Imagery Tasks Classification

    Science.gov (United States)

    Khorshidtalab, Aida; Mesbah, Mostefa; Salami, Momoh J. E.

    2015-01-01

    In this paper, we present a new motor imagery classification method in the context of electroencephalography (EEG)-based brain–computer interface (BCI). This method uses a signal-dependent orthogonal transform, referred to as linear prediction singular value decomposition (LP-SVD), for feature extraction. The transform defines the mapping as the left singular vectors of the LP coefficient filter impulse response matrix. Using a logistic tree-based model classifier, the extracted features are classified into one of four motor imagery movements. The proposed approach was first benchmarked against two related state-of-the-art feature extraction approaches, namely, discrete cosine transform (DCT) and adaptive autoregressive (AAR)-based methods. By achieving an accuracy of 67.35%, the LP-SVD approach outperformed the other approaches by large margins (25% compared with DCT and 6% compared with AAR-based methods). To further improve the discriminatory capability of the extracted features and reduce the computational complexity, we enlarged the extracted feature subset by incorporating two extra features, namely, the Q- and Hotelling's T^2 statistics of the transformed EEG, and introduced a new EEG channel selection method. The performance of the EEG classification based on the expanded feature set and channel selection method was compared with that of a number of the state-of-the-art classification methods previously reported with the BCI IIIa competition data set. Our method came second with an average accuracy of 81.38%. PMID:27170898
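
    A rough sketch of the LP-SVD idea, using the left singular vectors of an LP-filter impulse-response matrix as a signal-dependent basis, is given below; the AR fit, matrix shape and projection length are illustrative assumptions, not the authors' exact construction:

        import numpy as np

        def lp_svd_features(x, order=6, L=64, n_feat=4):
            # Fit AR coefficients by the Yule-Walker (autocorrelation) method.
            r = np.correlate(x, x, mode='full')[x.size - 1:x.size + order]
            R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
            a = np.linalg.solve(R, r[1:order + 1])
            # Impulse response of the LP synthesis filter 1/(1 - sum_k a_k z^-k).
            h = np.zeros(L)
            h[0] = 1.0
            for m in range(1, L):
                h[m] = sum(a[k] * h[m - 1 - k] for k in range(min(order, m)))
            # Left singular vectors of the (lower-triangular) impulse-response
            # matrix define the transform; project the segment onto them.
            H = np.array([[h[i - j] if i >= j else 0.0 for j in range(L)] for i in range(L)])
            U, _, _ = np.linalg.svd(H)
            return U[:, :n_feat].T @ x[:L]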

  6. Transformation optics and invisibility cloaks

    DEFF Research Database (Denmark)

    Qiu, Min; Yan, Min; Yan, Wei

    2008-01-01

    In this paper, we briefly summarize the theory of transformation optics and introduce its application in achieving perfect invisibility cloaking. In particular, we theoretically show how the task of realizing cylindrical invisibility cloaks can be eased by using either structural approximation...

  7. Aespoe Hard Rock Laboratory. Aespoe Task Force on Engineered Barrier System. Modelling of THM-coupled processes for benchmark 2.2 with the code GeoSys/RockFlow

    International Nuclear Information System (INIS)

    Nowak, Thomas; Kunz, Herbert

    2010-02-01

    In 2004 the Swedish Nuclear Fuel and Waste Management Co. (SKB) initiated the project 'Task Force on Engineered Barrier Systems'. This project has the objective to verify the feasibility of modelling THM-coupled processes (task 1) and gas migration processes (task 2) in clay-rich buffer materials. The tasks are performed on the basis of appropriate benchmarks. This report documents the modelling results of the THM-benchmark 2.2 - the Canister Retrieval Test - using the code GeoSys/RockFlow. The Temperature Buffer Test which was performed in the immediate vicinity of the Canister Retrieval Test is included in the model. Especially the heat transport requires the handling of the problem in 3-D. Due to limitations imposed by post-processing different spatial discretisations of the model had to be used during the processing of the benchmark. The calculated temperatures agree well with measured data. Concerning hydraulic parameters the values of permeability and tortuosity were varied in the calculations. The time necessary to saturate the buffer is very sensitive to both of these values. In comparison to thermal and hydraulic processes the model only has limited capacity to predict the measured evolution of total pressure

  8. Aespoe Hard Rock Laboratory. Aespoe Task Force on Engineered Barrier System. Modelling of THM-coupled processes for benchmark 2.2 with the code GeoSys/RockFlow

    Energy Technology Data Exchange (ETDEWEB)

    Nowak, Thomas; Kunz, Herbert (Federal Inst. for Geosciences and Natural Resources, Hannover (Germany))

    2010-02-15

    In 2004 the Swedish Nuclear Fuel and Waste Management Co. (SKB) initiated the project 'Task Force on Engineered Barrier Systems'. This project has the objective to verify the feasibility of modelling THM-coupled processes (task 1) and gas migration processes (task 2) in clay-rich buffer materials. The tasks are performed on the basis of appropriate benchmarks. This report documents the modelling results of the THM-benchmark 2.2 - the Canister Retrieval Test - using the code GeoSys/RockFlow. The Temperature Buffer Test which was performed in the immediate vicinity of the Canister Retrieval Test is included in the model. Especially the heat transport requires the handling of the problem in 3-D. Due to limitations imposed by post-processing different spatial discretisations of the model had to be used during the processing of the benchmark. The calculated temperatures agree well with measured data. Concerning hydraulic parameters the values of permeability and tortuosity were varied in the calculations. The time necessary to saturate the buffer is very sensitive to both of these values. In comparison to thermal and hydraulic processes the model only has limited capacity to predict the measured evolution of total pressure

  9. Developing communicative competence through thinking tasks

    DEFF Research Database (Denmark)

    Maslo, Elina

    Developing communicative competence through thinking tasks - Experimenting with Thinking Approach in Danish as Second Language Classroom. Session on Innovations in the classroom, a presentation. Abstract for the conference Creativity & Thinking Skills in Learning, Teaching & Management, Riga, 19-20 September 2014. Elina Maslo, Aarhus University, Department of Education, elma@edu.au.dk. Summary: The goal of this presentation is to present some of the experiences with thinking tasks in the Danish language classroom, conducted in the Nordplus Nordic Language Project "Problem solving tasks for learning of Danish as second and foreign language in transformative learning spaces". Two teachers have developed and tried out some thinking tasks in their classrooms, with the aim of fostering the development of students' communicative competence. The learning processes from two classrooms will be analysed...

  10. The Fastest Fourier Transform in the West

    National Research Council Canada - National Science Library

    Frigo, Matteo; Johnson, Steven G

    1997-01-01

    .... Three main ideas are the keys to FFTW's performance. First, the computation of the transform is performed by an executor consisting of highly-optimized, composable blocks of C code called codelets...

  11. Quantum Fourier Transform Over Galois Rings

    OpenAIRE

    Zhang, Yong

    2009-01-01

    Galois rings are regarded as "building blocks" of a finite commutative ring with identity. Many papers on classical error-correction codes over Galois rings have been published. As an important warm-up before exploring quantum algorithms and quantum error correction codes over Galois rings, we study the quantum Fourier transform (QFT) over Galois rings and prove it can be efficiently performed on a quantum computer. The properties of the QFT over Galois rings lead to the quantum algorit...

  12. A survey and comparison of transformation tools based on the transformation tool contest

    NARCIS (Netherlands)

    Jakumeit, E.; Van Gorp, P.; Buchwald, S.; Rose, L.; Wagelaar, D.; Dan, L.; Hegedüs, Á; Hermannsdörfer, M.; Horn, T.; Kalnina, E.; Krause, C.; Lano, K.; Lepper, M.; Rensink, Arend; Rose, L.M.; Wätzoldt, S.; Mazanek, S.

    Model transformation is one of the key tasks in model-driven engineering and relies on the efficient matching and modification of graph-based data structures; its sibling graph rewriting has been used to successfully model problems in a variety of domains. Over the last years, a wide range of graph

  13. Simplifying the parallelization of scientific codes by a function-centric approach in Python

    International Nuclear Information System (INIS)

    Nilsen, Jon K; Cai Xing; Langtangen, Hans Petter; Hoeyland, Bjoern

    2010-01-01

    The purpose of this paper is to show how existing scientific software can be parallelized using a separate thin layer of Python code where all parallelization-specific tasks are implemented. We provide specific examples of such a Python code layer, which can act as templates for parallelizing a wide set of serial scientific codes. The use of Python for parallelization is motivated by the fact that the language is well suited for reusing existing serial codes programmed in other languages. The extreme flexibility of Python with regard to handling functions makes it very easy to wrap up decomposed computational tasks of a serial scientific application as Python functions. Many parallelization-specific components can be implemented as generic Python functions, which may take as input those wrapped functions that perform concrete computational tasks. The overall programming effort needed by this parallelization approach is limited, and the resulting parallel Python scripts have a compact and clean structure. The usefulness of the parallelization approach is exemplified by three different classes of application in natural and social sciences.
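
    In that spirit, the parallelization-specific layer can be as small as a generic helper that maps wrapped task functions over worker processes; simulate_region below is a hypothetical stand-in for a wrapped serial computation, not a name from the paper:

        from multiprocessing import Pool

        def parallel_map(task_function, tasks, nprocs=4):
            # Generic parallelization-specific component: distribute calls to
            # a wrapped serial function across a pool of worker processes.
            with Pool(nprocs) as pool:
                return pool.map(task_function, tasks)

        def simulate_region(region_id):     # hypothetical wrapped serial task
            return region_id ** 2

        if __name__ == '__main__':
            print(parallel_map(simulate_region, range(8)))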

  14. Identifying and acting on potentially inappropriate care? Inadequacy of current hospital coding for this task.

    Science.gov (United States)

    Cooper, P David; Smart, David R

    2017-06-01

    Recent Australian attempts to facilitate disinvestment in healthcare, by identifying instances of 'inappropriate' care from large Government datasets, are subject to significant methodological flaws. Amongst other criticisms has been the fact that the Government datasets utilized for this purpose correlate poorly with datasets collected by relevant professional bodies. Government data derive from official hospital coding, collected retrospectively by clerical personnel, whilst professional body data derive from unit-specific databases, collected contemporaneously with care by clinical personnel. Assessment of accuracy of official hospital coding data for hyperbaric services in a tertiary referral hospital. All official hyperbaric-relevant coding data submitted to the relevant Australian Government agencies by the Royal Hobart Hospital, Tasmania, Australia for financial year 2010-2011 were reviewed and compared against actual hyperbaric unit activity as determined by reference to original source documents. Hospital coding data contained one or more errors in diagnoses and/or procedures in 70% of patients treated with hyperbaric oxygen that year. Multiple discrete error types were identified, including (but not limited to): missing patients; missing treatments; 'additional' treatments; 'additional' patients; incorrect procedure codes and incorrect diagnostic codes. Incidental observations of errors in surgical, anaesthetic and intensive care coding within this cohort suggest that the problems are not restricted to the specialty of hyperbaric medicine alone. Publications from other centres indicate that these problems are not unique to this institution or State. Current Government datasets are irretrievably compromised and not fit for purpose. Attempting to inform the healthcare policy debate by reference to these datasets is inappropriate. Urgent clinical engagement with hospital coding departments is warranted.

  15. Model Validation in Ontology Based Transformations

    Directory of Open Access Journals (Sweden)

    Jesús M. Almendros-Jiménez

    2012-10-01

    Full Text Available Model Driven Engineering (MDE) is an emerging approach to software engineering. MDE emphasizes the construction of models from which the implementation should be derived by applying model transformations. The Ontology Definition Meta-model (ODM) has been proposed as a profile for UML models of the Web Ontology Language (OWL). In this context, transformations of UML models can be mapped into ODM/OWL transformations. On the other hand, model validation is a crucial task in model transformation. Meta-modeling permits giving a syntactic structure to source and target models. However, semantic requirements have to be imposed on source and target models. A given transformation will be sound when source and target models fulfill the syntactic and semantic requirements. In this paper, we present an approach for model validation in ODM-based transformations. Adopting a logic programming based transformational approach, we show how it is possible to transform and validate models. Properties to be validated range from structural and semantic requirements of models (pre- and post-conditions) to properties of the transformation (invariants). The approach has been applied to a well-known example of model transformation: the Entity-Relationship (ER) to Relational Model (RM) transformation.

  16. The ASME Code today -- Challenges, threats, opportunities

    International Nuclear Information System (INIS)

    Canonico, D.A.

    1995-01-01

    Since its modest beginning as a single volume in 1914, the ASME Code, or some of its parts, is recognized today in 48 of the United States and all provinces of Canada. The ASME Code today is composed of 25 books, including two Code Case books. These books cover the new construction of boilers and pressure vessels and the new construction and in-service inspection of nuclear power plant components. The ASME accredits all manufacturers of boilers and pressure vessels built to the ASME Code. There are approximately 7650 symbol stamps issued throughout the world. Over 23% of the symbol stamps have been issued outside the USA and Canada. The challenge to the ASME Code is to be accepted as the world standard for pressure boundary components. There are activities underway to achieve that goal. The ASME Code is being revised to make it a friendlier document to entities outside of North America. To achieve that end, there are specific tasks underway, which are described here.

  17. Investigation of Advanced Counterrotation Blade Configuration Concepts for High Speed Turboprop Systems. Task 8: Cooling Flow/heat Transfer Analysis User's Manual

    Science.gov (United States)

    Hall, Edward J.; Topp, David A.; Heidegger, Nathan J.; Delaney, Robert A.

    1994-01-01

    The focus of this task was to validate the ADPAC code for heat transfer calculations. To accomplish this goal, the ADPAC code was modified to allow for a Cartesian coordinate system capability and to add boundary conditions to handle spanwise periodicity and transpiration boundaries. This user's manual describes how to use the ADPAC code as developed in Task 5, NAS3-25270, including the modifications made to date in Tasks 7 and 8, NAS3-25270.

  18. A Hybrid Task Graph Scheduler for High Performance Image Processing Workflows.

    Science.gov (United States)

    Blattner, Timothy; Keyrouz, Walid; Bhattacharyya, Shuvra S; Halem, Milton; Brady, Mary

    2017-12-01

    Designing applications for scalability is key to improving their performance in hybrid and cluster computing. Scheduling code to utilize parallelism is difficult, particularly when dealing with data dependencies, memory management, data motion, and processor occupancy. The Hybrid Task Graph Scheduler (HTGS) is an abstract execution model, framework, and API that increases programmer productivity when implementing hybrid workflows for multi-core and multi-GPU systems. HTGS manages dependencies between tasks, represents CPU and GPU memories independently, overlaps computations with disk I/O and memory transfers, keeps multiple GPUs occupied, and uses all available compute resources. Through these abstractions, data motion and memory are explicit; this makes data locality decisions more accessible. To demonstrate the HTGS application program interface (API), we present implementations of two example algorithms: (1) a matrix multiplication that shows how easily task graphs can be used; and (2) a hybrid implementation of microscopy image stitching that reduces code size by ≈43% compared to a manually coded hybrid workflow implementation and showcases the minimal overhead of task graphs in HTGS. Both of the HTGS-based implementations show good performance. In image stitching the HTGS implementation achieves similar performance to the hybrid workflow implementation. Matrix multiplication with HTGS achieves 1.3× and 1.8× speedup over the multi-threaded OpenBLAS library for 16k × 16k and 32k × 32k size matrices, respectively.

  19. Anthropomorphic Coding of Speech and Audio: A Model Inversion Approach

    Directory of Open Access Journals (Sweden)

    W. Bastiaan Kleijn

    2005-06-01

    Full Text Available Auditory modeling is a well-established methodology that provides insight into human perception and that facilitates the extraction of signal features that are most relevant to the listener. The aim of this paper is to provide a tutorial on perceptual speech and audio coding using an invertible auditory model. In this approach, the audio signal is converted into an auditory representation using an invertible auditory model. The auditory representation is quantized and coded. Upon decoding, it is then transformed back into the acoustic domain. This transformation converts a complex distortion criterion into a simple one, thus facilitating quantization with low complexity. We briefly review past work on auditory models and describe in more detail the components of our invertible model and its inversion procedure, that is, the method to reconstruct the signal from the output of the auditory model. We summarize attempts to use the auditory representation for low-bit-rate coding. Our approach also allows the exploitation of the inherent redundancy of the human auditory system for the purpose of multiple description (joint source-channel) coding.

  20. Code Generation from Pragmatics Annotated Coloured Petri Nets

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge

    ...limited work has been done on transforming CPN models to protocol implementations. The goal of the thesis is to be able to automatically generate high-quality implementations of communication protocols based on CPN models. In this thesis, we develop a methodology for generating implementations of protocols... third-party libraries, and the code should be easily usable by third-party code. Finally, the code should be readable by developers with expertise on the considered platforms. In this thesis, we show that our code generation approach is able to generate code for a wide range of platforms without altering... such as games and rich web applications. Finally, we conclude the evaluation of the criteria of our approach by using the WebSocket PA-CPN model to show that we are able to verify fairly large protocols.

  1. Interference with olfactory memory by visual and verbal tasks.

    Science.gov (United States)

    Annett, J M; Cook, N M; Leslie, J C

    1995-06-01

    It has been claimed that olfactory memory is distinct from memory in other modalities. This study investigated the effectiveness of visual and verbal tasks in interfering with olfactory memory and included methodological changes from other recent studies. Subjects were allocated to one of four experimental conditions involving interference tasks [no interference task; visual task; verbal task; visual-plus-verbal task] and were presented with 15 target odours. Either recognition of the odours or free recall of the odour names was tested on one occasion, either within 15 minutes of presentation or one week later. Recognition and recall performance both showed effects of interference from the visual and verbal tasks, but there was no effect of time of testing. While the results may be accommodated within a dual coding framework, further work is indicated to resolve theoretical issues relating to task complexity.

  2. Where Good Pedagogical Ideas Come From: The Story of an EAP Task

    Science.gov (United States)

    Light, Justine; Ranta, Leila

    2016-01-01

    Teachers using a task-based language teaching (TBLT) approach are always searching for learning tasks that have the potential to prepare learners for the real world. In this article, we describe how an authentic academic assignment for graduate students in a teaching English as a second language (TESL) course was transformed into a task-based…

  3. Short-term memory coding in children with intellectual disabilities.

    Science.gov (United States)

    Henry, Lucy

    2008-05-01

    To examine visual and verbal coding strategies, I asked children with intellectual disabilities and peers matched for MA and CA to perform picture memory span tasks with phonologically similar, visually similar, long, or nonsimilar named items. The CA group showed effects consistent with advanced verbal memory coding (phonological similarity and word length effects). Neither the intellectual disabilities nor MA groups showed evidence for memory coding strategies. However, children in these groups with MAs above 6 years showed significant visual similarity and word length effects, broadly consistent with an intermediate stage of dual visual and verbal coding. These results suggest that developmental progressions in memory coding strategies are independent of intellectual disabilities status and consistent with MA.

  4. The Development of the World Anti-Doping Code.

    Science.gov (United States)

    Young, Richard

    2017-01-01

    This chapter addresses both the development and substance of the World Anti-Doping Code, which came into effect in 2003, as well as the subsequent Code amendments, which came into effect in 2009 and 2015. Through an extensive process of stakeholder input and collaboration, the World Anti-Doping Code has transformed the hodgepodge of inconsistent and competing pre-2003 anti-doping rules into a harmonized and effective approach to anti-doping. The Code, as amended, is now widely recognized worldwide as the gold standard in anti-doping. The World Anti-Doping Code originally went into effect on January 1, 2004. The first amendments to the Code went into effect on January 1, 2009, and the second amendments on January 1, 2015. The Code and the related international standards are the product of a long and collaborative process designed to make the fight against doping more effective through the adoption and implementation of worldwide harmonized rules and best practices. © 2017 S. Karger AG, Basel.

  5. Efficient MPEG-2 to H.264/AVC Transcoding of Intra-Coded Video

    Directory of Open Access Journals (Sweden)

    Vetro Anthony

    2007-01-01

    Full Text Available This paper presents an efficient transform-domain architecture and corresponding mode decision algorithms for transcoding intra-coded video from MPEG-2 to H.264/AVC. Low complexity is achieved in several ways. First, our architecture employs direct conversion of the transform coefficients, which eliminates the need for the inverse discrete cosine transform (DCT) and forward H.264/AVC transform. Then, within this transform-domain architecture, we perform macroblock-based mode decisions based on H.264/AVC transform coefficients, which is possible using a novel method of calculating distortion in the transform domain. The proposed method for distortion calculation could be used to make rate-distortion optimized mode decisions with lower complexity. Compared to the pixel-domain architecture with rate-distortion optimized mode decision, simulation results show that there is a negligible loss in quality incurred by the direct conversion of transform coefficients and the proposed transform-domain mode decision algorithms, while complexity is significantly reduced. To further reduce the complexity, we also propose two fast mode decision algorithms. The first algorithm ranks modes based on a simple cost function in the transform domain, then computes the rate-distortion optimal mode from a reduced set of ranked modes. The second algorithm exploits temporal correlations in the mode decision between temporally adjacent frames. Simulation results show that these algorithms provide additional computational savings over the proposed transform-domain architecture while maintaining virtually the same coding efficiency.
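
    The direct-conversion idea can be sketched by composing the 8-point inverse DCT used by MPEG-2 with the 4x4 H.264 core transform. The two basis changes are written out separately here for clarity, but they can be precomputed into a single conversion matrix so no explicit pixel-domain block is formed; scaling and quantization stages are omitted, so this is a schematic sketch rather than the paper's architecture:

        import numpy as np

        # Orthonormal 8-point DCT-II basis, as used by MPEG-2.
        C = np.array([[np.sqrt((1 if k == 0 else 2) / 8) *
                       np.cos((2 * n + 1) * k * np.pi / 16)
                       for n in range(8)] for k in range(8)])
        # H.264 4x4 forward core transform (integer DCT approximation).
        H = np.array([[1, 1, 1, 1],
                      [2, 1, -1, -2],
                      [1, -1, -1, 1],
                      [1, -2, 2, -1]], dtype=float)

        def direct_convert(X):
            # Map one 8x8 block of MPEG-2 DCT coefficients to four 4x4
            # H.264 core-transform blocks.
            x = C.T @ X @ C                        # inverse 2-D DCT
            return [H @ x[r:r + 4, c:c + 4] @ H.T  # forward H.264 transform
                    for r in (0, 4) for c in (0, 4)]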

  6. Malignant transformation of colonic epithelial cells by a colon-derived long noncoding RNA

    International Nuclear Information System (INIS)

    Franklin, Jeffrey L.; Rankin, Carl R.; Levy, Shawn; Snoddy, Jay R.; Zhang, Bing; Washington, Mary Kay; Thomson, J. Michael; Whitehead, Robert H.; Coffey, Robert J.

    2013-01-01

    Highlights: •Non-coding RNAs are found in the colonic crypt progenitor compartment. •Colonocytes transformed by ncNRFR are highly invasive and metastatic. •ncNRFR has a region similar to the miRNA, let-7 family. •ncNRFR expression alters let-7 activity as measured by reporter construct. •ncNRFR expression upregulates let-7b targets. -- Abstract: Recent progress has been made in the identification of protein-coding genes and miRNAs that are expressed in and alter the behavior of colonic epithelia. However, the role of long non-coding RNAs (lncRNAs) in colonic homeostasis is just beginning to be explored. By gene expression profiling of post-mitotic, differentiated tops and proliferative, progenitor-compartment bottoms of microdissected adult mouse colonic crypts, we identified several lncRNAs more highly expressed in crypt bottoms. One identified lncRNA, designated non-coding Nras functional RNA (ncNRFR), resides within the Nras locus but appears to be independent of the Nras coding transcript. Stable overexpression of ncNRFR in non-transformed, conditionally immortalized mouse colonocytes results in malignant transformation, as determined by growth in soft agar and formation of highly invasive tumors in nude mice. Moreover, ncNRFR appears to inhibit the function of the tumor suppressor let-7. These results suggest precise regulation of ncNRFR is necessary for proper cell growth in the colonic crypt, and its misregulation results in neoplastic transformation

  7. The mathematical model of the task of compiling the time-table

    Directory of Open Access Journals (Sweden)

    О.Є. Литвиненко

    2004-01-01

    Full Text Available The mathematical model of the task of compiling the time-table in a high school has been developed. It has been shown that, after identical transformations, the task may be reduced to the canonical form of extremal combinatorial tasks with nonlinear structure. An algorithm for solving the task that realizes the scheme of directed sorting of variants is described.

  8. Code-switching in multilingual aphasia

    Directory of Open Access Journals (Sweden)

    Peggy S. Conner

    2014-04-01

    Results: Spanish: Post-Dutch intervention, we noted an increased frequency of CS in both the action-description (25.4%) and answering-questions (20.2%) tasks; this was accounted for largely by the relative change in whole-word CS type (action-description: 21.1% vs. 4.4% within; answering-questions: 37.3% vs. -17.1% within). Following Russian intervention, an increase in CS was evident, 50.8% for action description and 40.9% for answering questions, reflective of a within-word CS increase (action-description: 39.7%) and more evenly distributed for the latter task. For CS, the non-target language was primarily Italian. Norwegian: Following Dutch treatment, the proportion of code-switched responses remained the same for action description and answering questions. The type of CS changed, with a decrease in whole-word CS (-19.1%) for the answering-questions task. Post-Russian intervention, no production differences were noted in CS frequency for action description, although the within-word type increased (16.7%). On the answering-questions task, CS frequency decreased (-33%), particularly whole-word CS (-26%). Across baselines, non-target whole- and within-word transfer included English, German, Danish, Swedish and Dutch. Discussion: We suggest the increase in CS in Spanish following treatment in Dutch and post-Russian language introduction likely reflects increased lexical inhibition. In contrast, in Norwegian, the proportion of code-switched responses decreased or remained the same and the proportion of within-word CS increased, suggestive of improved lexical access following both interventions. The contrasting effect (see Figure 1) may be due to relative proficiency (greater in Spanish than in Norwegian) or the differential lexical similarities between the target language and the language of intervention.

  9. Offshore code comparison collaboration continuation (OC4), phase I - Results of coupled simulations of an offshore wind turbine with jacket support structure

    DEFF Research Database (Denmark)

    Popko, Wojciech; Vorpahl, Fabian; Zuga, Adam

    2012-01-01

    In this paper, the exemplary results of the IEA Wind Task 30 "Offshore Code Comparison Collaboration Continuation" (OC4) Project - Phase I, focused on the coupled simulation of an offshore wind turbine (OWT) with a jacket support structure, are presented. The focus of this task has been the verification of OWT modeling codes through code-to-code comparisons. The discrepancies between the results are shown and the sources of the differences are discussed. The importance of the local dynamics of the structure is depicted in the simulation results. Furthermore, attention is given to aspects...

  10. A comparison of the effects of a secondary task and lorazepam on cognitive performance.

    Science.gov (United States)

    File, S E

    1992-01-01

    In order to test whether the lorazepam-induced impairments in a variety of cognitive tasks were similar to those of divided attention, the effects of lorazepam (2.5 mg) in healthy volunteers were compared with those of requiring subjects to perform an additional task (detecting silences superimposed onto classical music). Neither treatment impaired implicit memory or judgements of frequency. Both treatments impaired performance in tests of speed, with lorazepam having the greatest effect on number cancellation and the additional task having the greatest effect on simple reaction time. Both treatments impaired performance in a coding task, in a test of explicit episodic memory and in judgements of recency (indicating impaired coding of contextual information). Lorazepam significantly reduced performance in a word completion task, but this was unimpaired in the group performing the additional task. In general, the pattern of results suggests that there are similarities between the effects of divided attention and lorazepam treatment, and that lorazepam-induced cognitive impairments are not restricted to explicit tests of episodic memory.

  11. Implementation of JAERI's reflood model into TRAC-PF1/MOD1 code

    International Nuclear Information System (INIS)

    Akimoto, Hajime; Ohnuki, Akira; Murao, Yoshio

    1993-02-01

    Selected physical models of the REFLA code, a reflood analysis code developed at JAERI, were implemented into the TRAC-PF1/MOD1 code in order to improve the predictive capability of the TRAC-PF1/MOD1 code for core thermal hydraulic behavior during the reflood phase of a PWR LOCA. Through comparisons of physical models between the two codes, (1) the Murao-Iguchi void fraction correlation, (2) the correlation for the drag coefficient acting on drops, (3) the correlation for the wall heat transfer coefficient in the film boiling regime, (4) the quench velocity correlation and (5) heat transfer correlations for the dispersed flow regime were selected from the REFLA code to be implemented into the TRAC-PF1/MOD1 code. A method for transforming the void fraction correlation into an equivalent interfacial friction model was developed, and the effect of the transformation method on the stability of the solution was discussed. Through assessment calculations using data from the CCTF (Cylindrical Core Test Facility) flat power test, it was confirmed that the predictive capability of the TRAC code for core thermal hydraulic behavior during reflood can be improved by the implementation of the selected physical models of the REFLA code. Several user guidelines for the modified TRAC code were proposed based on sensitivity studies on the number of fluid cells in the hydraulic calculation, and on the node number and the effect of axial heat conduction in the heat conduction calculation of the fuel rod. (author)

  12. Considerations Concerning Matrix Diagram Transformations Associated with Mathematical Model Study of a Three-phase Transformer

    Directory of Open Access Journals (Sweden)

    Mihaela Poienar

    2014-09-01

    Full Text Available The clock hour figure mathematical model of a three-phase transformer can be expressed, in its plainest form, through a 3×3 square matrix, called the code matrix. The line positions reflect modifications at the high-voltage winding terminals and the column positions reflect modifications at the low-voltage winding terminals. The main changes to the transformer winding terminals are: circular permutation of the connections between windings; terminal supply reversal; reversed wrapping direction of a phase winding; swapping the beginning and the end of a phase winding; and conversion of the connection from N to Z between phase windings, or the inverse. The analytical form of these changes affects the configuration of the mathematical model, expressed through a transformations diagram that is proposed and analyzed in two versions: bipolar and unipolar (fanwise). The paper ends with considerations on the practical exploitation of the transformations diagram.
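
    One of the listed terminal changes, the circular permutation of connections, has a compact matrix form. A minimal sketch, assuming placeholder matrix entries (the actual code matrix encodes clock hour figures):

        # Circular permutation of winding connections modeled as a cyclic shift of
        # the 3x3 code matrix: rows for the high-voltage side, columns for the
        # low-voltage side. Entries are placeholders, not real clock-hour codes.
        import numpy as np

        code = np.arange(9).reshape(3, 3)       # placeholder code matrix

        hv_permuted = np.roll(code, 1, axis=0)  # permute HV terminal connections
        lv_permuted = np.roll(code, 1, axis=1)  # permute LV terminal connections
        print(hv_permuted)
        print(lv_permuted)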

  13. Unified transform architecture for AVC, AVS, VC-1 and HEVC high-performance codecs

    Science.gov (United States)

    Dias, Tiago; Roma, Nuno; Sousa, Leonel

    2014-12-01

    A unified architecture for fast and efficient computation of the set of two-dimensional (2-D) transforms adopted by the most recent state-of-the-art digital video standards is presented in this paper. Contrasting to other designs with similar functionality, the presented architecture is supported on a scalable, modular and completely configurable processing structure. This flexible structure not only allows the architecture to be easily reconfigured to support different transform kernels, but also permits its resizing to efficiently support transforms of different orders (e.g. order-4, order-8, order-16 and order-32). Consequently, not only is it highly suitable to realize high-performance multi-standard transform cores, but it also offers highly efficient implementations of specialized processing structures addressing only a reduced subset of transforms that are used by a specific video standard. The experimental results that were obtained by prototyping several configurations of this processing structure in a Xilinx Virtex-7 FPGA show the superior performance and hardware efficiency levels provided by the proposed unified architecture for the implementation of transform cores for the Advanced Video Coding (AVC), Audio Video coding Standard (AVS), VC-1 and High Efficiency Video Coding (HEVC) standards. In addition, such results also demonstrate the ability of this processing structure to realize multi-standard transform cores supporting all the standards mentioned above and that are capable of processing the 8k Ultra High Definition Television (UHDTV) video format (7,680 × 4,320 at 30 fps) in real time.
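
    To make the notion of a reconfigurable transform kernel concrete, here is a sketch of one kernel such an architecture must support, the well-known order-4 integer core transform of H.264/AVC, applied separably as Y = C X C^T (post-scaling and quantization omitted; other standards substitute different integer kernels of order 4, 8, 16 or 32):

        # Order-4 H.264/AVC integer core transform applied to a 4x4 residual block.
        # The hardware realizes the same row pass / column pass decomposition.
        import numpy as np

        C = np.array([[1,  1,  1,  1],
                      [2,  1, -1, -2],
                      [1, -1, -1,  1],
                      [1, -2,  2, -1]])

        X = np.arange(16).reshape(4, 4)  # placeholder residual block
        Y = C @ X @ C.T                  # 2-D transform: rows, then columns
        print(Y)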

  14. Development of RESRAD probabilistic computer codes for NRC decommissioning and license termination applications

    International Nuclear Information System (INIS)

    Chen, S. Y.; Yu, C.; Mo, T.; Trottier, C.

    2000-01-01

    In 1999, the US Nuclear Regulatory Commission (NRC) tasked Argonne National Laboratory to modify the existing RESRAD and RESRAD-BUILD codes to perform probabilistic, site-specific dose analysis for use with the NRC's Standard Review Plan for demonstrating compliance with the license termination rule. The RESRAD codes have been developed by Argonne to support the US Department of Energy's (DOE's) cleanup efforts. Through more than a decade of application, the codes have already established a large user base in the nation and rigorous QA support. The primary objectives of the NRC task are to: (1) extend the codes' capabilities to include probabilistic analysis, and (2) develop parameter distribution functions and perform probabilistic analysis with the codes. The new codes also contain user-friendly features, including a specially designed graphical user interface. In October 2000, the revised RESRAD (version 6.0) and RESRAD-BUILD (version 3.0), together with the user's guide and relevant parameter information, were completed and made available to the general public via the Internet.

  15. The Glasgow Parallel Reduction Machine: Programming Shared-memory Many-core Systems using Parallel Task Composition

    Directory of Open Access Journals (Sweden)

    Ashkan Tousimojarad

    2013-12-01

    Full Text Available We present the Glasgow Parallel Reduction Machine (GPRM, a novel, flexible framework for parallel task-composition based many-core programming. We allow the programmer to structure programs into task code, written as C++ classes, and communication code, written in a restricted subset of C++ with functional semantics and parallel evaluation. In this paper we discuss the GPRM, the virtual machine framework that enables the parallel task composition approach. We focus the discussion on GPIR, the functional language used as the intermediate representation of the bytecode running on the GPRM. Using examples in this language we show the flexibility and power of our task composition framework. We demonstrate the potential using an implementation of a merge sort algorithm on a 64-core Tilera processor, as well as on a conventional Intel quad-core processor and an AMD 48-core processor system. We also compare our framework with OpenMP tasks in a parallel pointer chasing algorithm running on the Tilera processor. Our results show that the GPRM programs outperform the corresponding OpenMP codes on all test platforms, and can greatly facilitate writing of parallel programs, in particular non-data parallel algorithms such as reductions.
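
    As an illustration of the task-composition style being benchmarked (not GPRM or GPIR themselves), here is a hedged Python analogue of the merge sort example, with a thread pool standing in for the GPRM runtime:

        # Merge sort expressed as a composition of parallel tasks. Illustrative
        # analogue only; GPRM programs use C++ task classes plus GPIR glue code.
        from concurrent.futures import ThreadPoolExecutor

        def merge(left, right):
            out, i, j = [], 0, 0
            while i < len(left) and j < len(right):
                if left[i] <= right[j]:
                    out.append(left[i]); i += 1
                else:
                    out.append(right[j]); j += 1
            return out + left[i:] + right[j:]

        def merge_sort(data, pool, depth=2):
            if len(data) <= 1:
                return data
            mid = len(data) // 2
            if depth > 0:  # spawn subtasks while parallel slack remains
                lf = pool.submit(merge_sort, data[:mid], pool, depth - 1)
                rf = pool.submit(merge_sort, data[mid:], pool, depth - 1)
                return merge(lf.result(), rf.result())
            return merge(merge_sort(data[:mid], pool, 0),
                         merge_sort(data[mid:], pool, 0))

        with ThreadPoolExecutor(max_workers=4) as pool:
            print(merge_sort([5, 3, 8, 1, 9, 2, 7, 4], pool))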

  16. Compromised Motor Dexterity Confounds Processing Speed Task Outcomes in Stroke Patients

    Directory of Open Access Journals (Sweden)

    Essie Low

    2017-09-01

    Full Text Available Most conventional measures of information processing speed require motor responses to facilitate performance. However, although not often addressed clinically, motor impairment, whether due to age or acquired brain injury, would be expected to confound the outcome measure of such tasks. The current study recruited 29 patients (20 stroke and 9 transient ischemic attack) with documented reduction in dexterity of the dominant hand, and 29 controls, to investigate the extent to which 3 commonly used processing speed measures with varying motor demands (a Visuo-Motor Reaction Time task, and the Wechsler Adult Intelligence Scale-IV Symbol Search and Coding subtests) may be measuring motor-related speed more so than cognitive speed. Analyses included correlations of indices of cognitive and motor speed obtained from two other tasks (Inspection Time and Pegboard task, respectively) with the three speed measures, followed by hierarchical regressions to determine the relative contribution of cognitive and motor speed indices toward task performance. Results revealed that speed outcomes on tasks with relatively high motor demands, such as Coding, were largely reflecting motor speed in individuals with reduced dominant hand dexterity. Thus, findings indicate the importance of employing measures with minimal motor requirements, especially when the assessment of speed is aimed at understanding cognitive rather than physical function.

  17. Applications of (a,b)-continued fraction transformations

    OpenAIRE

    Katok, Svetlana; Ugarcovici, Ilie

    2011-01-01

    We describe a general method of arithmetic coding of geodesics on the modular surface based on a two parameter family of continued fraction transformations studied previously by the authors. The finite rectangular structure of the attractors of the natural extension maps and the corresponding "reduction theory" play an essential role. In special cases, when an (a,b)-expansion admits a so-called "dual", the coding sequences are obtained by juxtaposition of the boundary expansions of the fixed ...

  18. (U) Ristra Next Generation Code Report

    Energy Technology Data Exchange (ETDEWEB)

    Hungerford, Aimee L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Daniel, David John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-22

    LANL’s Weapons Physics management (ADX) and ASC program office have defined a strategy for exascale-class application codes that follows two supportive, and mutually risk-mitigating paths: evolution for established codes (with a strong pedigree within the user community) based upon existing programming paradigms (MPI+X); and Ristra (formerly known as NGC), a high-risk/high-reward push for a next-generation multi-physics, multi-scale simulation toolkit based on emerging advanced programming systems (with an initial focus on data-flow task-based models exemplified by Legion [5]). Development along these paths is supported by the ATDM, IC, and CSSE elements of the ASC program, with the resulting codes forming a common ecosystem, and with algorithm and code exchange between them anticipated. Furthermore, solution of some of the more challenging problems of the future will require a federation of codes working together, using established-pedigree codes in partnership with new capabilities as they come on line. The role of Ristra as the high-risk/high-reward path for LANL’s codes is fully consistent with its role in the Advanced Technology Development and Mitigation (ATDM) sub-program of ASC (see Appendix C), in particular its emphasis on evolving ASC capabilities through novel programming models and data management technologies.

  19. An overview of the major changes in the 2002 APA Ethics Code.

    Science.gov (United States)

    Knapp, Samuel; VandeCreek, Leon

    2003-06-01

    This article summarizes the major changes that were made to the 2002 Ethical Principles and Code of Conduct of the American Psychological Association. The 2002 Ethics Code retains the general format of the 1992 Ethics Code and does not radically alter the obligations of psychologists. One goal of the Ethics Committee Task Force was to reduce the potential of the Ethics Code to be used to unnecessarily punish psychologists. In addition, the revised Ethics Code expresses greater sensitivity to the needs of cultural and linguistic minorities and students. Shortcomings of the 2002 Ethics Code are discussed.

  20. Managing diversity and enhancing team outcomes: the promise of transformational leadership.

    Science.gov (United States)

    Kearney, Eric; Gebert, Diether

    2009-01-01

    In a sample of 62 research and development (R&D) teams, the authors examined transformational leadership as a moderator of the relationship of age, nationality, and educational background diversity with team outcomes. When levels of transformational leadership were high, nationality and educational diversity were positively related to team leaders' longitudinal ratings of team performance. These relationships were nonsignificant when transformational leadership was low. Age diversity was not related to team performance when transformational leadership was high, and it was negatively related to team performance when transformational leadership was low. Two mediated moderation effects help explain these findings. Transformational leadership moderated the relationship of the 3 examined diversity dimensions with the elaboration of task-relevant information, which in turn was positively associated with team performance. Moreover, transformational leadership moderated the relationship of the 3 diversity types with collective team identification, which in turn was positively related to the elaboration of task-relevant information. The authors discuss the theoretical and practical implications of these results. Overall, this study suggests that transformational leadership can foster the utilization of the potential, but frequently untapped, benefits entailed by both demographic and informational/cognitive team diversity. (PsycINFO Database Record (c) 2009 APA, all rights reserved).

  1. Graphical programming of telerobotic tasks

    International Nuclear Information System (INIS)

    Small, D.E.; McDonald, M.J.

    1997-01-01

    With a goal of producing faster, safer, and cheaper technologies for nuclear waste cleanup, Sandia is actively developing and extending intelligent systems technologies. Graphical Programming is a key technology for robotic waste cleanup that Sandia is developing for this goal. This paper describes Sancho, Sandia's most advanced Graphical Programming supervisory software. Sancho, now operational on several robot systems, incorporates all of Sandia's recent advances in supervisory control. Sancho, developed to rapidly apply Graphical Programming to a diverse set of robot systems, uses a general set of tools to implement task and operational behavior. Sancho can be rapidly reconfigured for new tasks and operations without modifying the supervisory code. Other innovations include task-based interfaces, event-based sequencing, and sophisticated GUI design. These innovations have resulted in robot control programs and approaches that are easier and safer to use than teleoperation, off-line programming, or full automation.

  2. Re-estimation of Motion and Reconstruction for Distributed Video Coding

    DEFF Research Database (Denmark)

    Luong, Huynh Van; Raket, Lars Lau; Forchhammer, Søren

    2014-01-01

    Transform domain Wyner-Ziv (TDWZ) video coding is an efficient approach to distributed video coding (DVC), which provides low complexity encoding by exploiting the source statistics at the decoder side. The DVC coding efficiency depends mainly on side information and noise modeling. This paper proposes a motion re-estimation technique based on optical flow to improve side information and noise residual frames by taking partially decoded information into account. To improve noise modeling, a noise residual motion re-estimation technique is proposed. Residual motion compensation with motion...

  3. A task based design procedure and modelling approaches for industrial crystallization processes

    NARCIS (Netherlands)

    Menon, A.R.

    2006-01-01

    A synthesis-based approach to the design of crystallizers and industrial crystallization processes is introduced in this thesis. An ontology for a task-based design procedure has been developed which breaks the crystallization process into a subset of basic functions (physical tasks) which transform

  4. Mathematical Formulation used by MATLAB Code to Convert FTIR Interferograms to Calibrated Spectra

    Energy Technology Data Exchange (ETDEWEB)

    Armstrong, Derek Elswick [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-07-19

    This report discusses the mathematical procedures used to convert raw interferograms from Fourier transform infrared (FTIR) sensors to calibrated spectra. The work discussed in this report was completed as part of the Helios project at Los Alamos National Laboratory. MATLAB code was developed to convert the raw interferograms to calibrated spectra. The report summarizes the developed MATLAB scripts and functions, along with a description of the mathematical methods used by the code. The first step in working with raw interferograms is to convert them to uncalibrated spectra by applying an apodization function to the raw data and then by performing a Fourier transform. The developed MATLAB code also addresses phase error correction by applying the Mertz method. This report provides documentation for the MATLAB scripts.
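
    A minimal sketch of the first processing stage described above, assuming a Hann window for apodization (the report's actual apodization choice and its Mertz phase correction and calibration stages are omitted):

        # Convert a raw interferogram to an uncalibrated magnitude spectrum:
        # remove the DC offset, apodize, then Fourier transform.
        import numpy as np

        def interferogram_to_spectrum(ifg):
            ifg = ifg - ifg.mean()                # remove DC offset
            window = np.hanning(len(ifg))         # apodization (assumed Hann)
            spectrum = np.fft.rfft(ifg * window)  # Fourier transform
            return np.abs(spectrum)               # uncalibrated magnitude

        raw = np.random.default_rng(1).standard_normal(1024)  # placeholder data
        print(interferogram_to_spectrum(raw)[:5])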

  5. Supervised Learning Based on Temporal Coding in Spiking Neural Networks.

    Science.gov (United States)

    Mostafa, Hesham

    2017-08-01

    Gradient descent training techniques are remarkably successful in training analog-valued artificial neural networks (ANNs). Such training techniques, however, do not transfer easily to spiking networks due to the spike generation hard nonlinearity and the discrete nature of spike communication. We show that in a feedforward spiking network that uses a temporal coding scheme where information is encoded in spike times instead of spike rates, the network input-output relation is differentiable almost everywhere. Moreover, this relation is piecewise linear after a transformation of variables. Methods for training ANNs thus carry directly to the training of such spiking networks as we show when training on the permutation invariant MNIST task. In contrast to rate-based spiking networks that are often used to approximate the behavior of ANNs, the networks we present spike much more sparsely and their behavior cannot be directly approximated by conventional ANNs. Our results highlight a new approach for controlling the behavior of spiking networks with realistic temporal dynamics, opening up the potential for using these networks to process spike patterns with complex temporal information.
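
    A sketch of the transformation of variables the abstract alludes to: with non-leaky integrate-and-fire neurons and exponentially decaying synaptic currents, the output spike time is linear in z = exp(t). The snippet below simplifies by assuming every input spike arrives before the output spike (the full method must determine that causal set):

        # Output spike time of a simplified temporal-coding neuron, linear in the
        # transformed variables z = exp(t). Threshold is normalized to 1.
        import numpy as np

        def spike_time(in_times, weights):
            z_in = np.exp(in_times)
            denom = weights.sum() - 1.0
            assert denom > 0, "neuron never fires under this simplification"
            z_out = (weights * z_in).sum() / denom  # linear relation in z
            return np.log(z_out)

        print(spike_time(np.array([0.1, 0.3, 0.5]), np.array([0.6, 0.7, 0.8])))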

  6. Relations and effects of transformational leadership: a comparative analysis with traditional leadership styles.

    Science.gov (United States)

    Molero, Fernando; Cuadrado, Isabel; Navas, Marisol; Morales, J Francisco

    2007-11-01

    This study has two main goals: (a) to compare the relationship between transformational leadership and other important leadership styles (i.e., democratic versus autocratic or relations- and task-oriented leadership) and (b) to compare the effects of transformational leadership and the other styles on some important organizational outcomes such as employees' satisfaction and performance. For this purpose, a sample of 147 participants, working in 35 various work-teams, was used. Results show high correlations between transformational leadership and relations-oriented, democratic, and task-oriented leadership. On the other hand, according to the literature, transformational leadership, especially at high levels, significantly increases the percentage of variance accounted for in relevant organizational outcome variables (subordinates' performance, satisfaction and extra effort) beyond that accounted for by the other leadership styles.

  7. Task 7: ADPAC User's Manual

    Science.gov (United States)

    Hall, E. J.; Topp, D. A.; Delaney, R. A.

    1996-01-01

    The overall objective of this study was to develop a 3-D numerical analysis for compressor casing treatment flowfields. The current version of the computer code resulting from this study is referred to as ADPAC (Advanced Ducted Propfan Analysis Codes-Version 7). This report is intended to serve as a computer program user's manual for the ADPAC code developed under Tasks 6 and 7 of the NASA Contract. The ADPAC program is based on a flexible multiple-block grid discretization scheme permitting coupled 2-D/3-D mesh block solutions with application to a wide variety of geometries. Aerodynamic calculations are based on a four-stage Runge-Kutta time-marching finite volume solution technique with added numerical dissipation. Steady flow predictions are accelerated by a multigrid procedure. An iterative implicit algorithm is available for rapid time-dependent flow calculations, and an advanced two-equation turbulence model is incorporated to predict complex turbulent flows. The consolidated code generated during this study is capable of executing in either a serial or parallel computing mode from a single source code. Numerous examples are given in the form of test cases to demonstrate the utility of this approach for predicting the aerodynamics of modern turbomachinery configurations.

  8. Wavelet based multicarrier code division multiple access ...

    African Journals Online (AJOL)

    This paper presents the study on Wavelet transform based Multicarrier Code Division Multiple Access (MC-CDMA) system for a downlink wireless channel. The performance of the system is studied for Additive White Gaussian Noise Channel (AWGN) and slowly varying multipath channels. The bit error rate (BER) versus ...

  9. Bad-good constraints on a polarity correspondence account for the spatial-numerical association of response codes (SNARC) and markedness association of response codes (MARC) effects.

    Science.gov (United States)

    Leth-Steensen, Craig; Citta, Richie

    2016-01-01

    Performance in numerical classification tasks involving either parity or magnitude judgements is quicker when small numbers are mapped onto a left-sided response and large numbers onto a right-sided response than for the opposite mapping (i.e., the spatial-numerical association of response codes or SNARC effect). Recent research by Gevers et al. [Gevers, W., Santens, S., Dhooge, E., Chen, Q., Van den Bossche, L., Fias, W., & Verguts, T. (2010). Verbal-spatial and visuospatial coding of number-space interactions. Journal of Experimental Psychology: General, 139, 180-190] suggests that this effect also arises for vocal "left" and "right" responding, indicating that verbal-spatial coding has a role to play in determining it. Another presumably verbal-based, spatial-numerical mapping phenomenon is the linguistic markedness association of response codes (MARC) effect whereby responding in parity tasks is quicker when odd numbers are mapped onto left-sided responses and even numbers onto right-sided responses. A recent account of both the SNARC and MARC effects is based on the polarity correspondence principle [Proctor, R. W., & Cho, Y. S. (2006). Polarity correspondence: A general principle for performance of speeded binary classification tasks. Psychological Bulletin, 132, 416-442]. This account assumes that stimulus and response alternatives are coded along any number of dimensions in terms of - and + polarities with quicker responding when the polarity codes for the stimulus and the response correspond. In the present study, even-odd parity judgements were made using either "left" and "right" or "bad" and "good" vocal responses. Results indicated that a SNARC effect was indeed present for the former type of vocal responding, providing further evidence for the sufficiency of the verbal-spatial coding account for this effect. However, the decided lack of an analogous SNARC-like effect in the results for the latter type of vocal responding provides an important

  10. Development of the PARVMEC Code for Rapid Analysis of 3D MHD Equilibrium

    Science.gov (United States)

    Seal, Sudip; Hirshman, Steven; Cianciosa, Mark; Wingen, Andreas; Unterberg, Ezekiel; Wilcox, Robert; ORNL Collaboration

    2015-11-01

    The VMEC three-dimensional (3D) MHD equilibrium code has been used extensively for designing stellarator experiments and analyzing experimental data in such strongly 3D systems. Recent applications of VMEC include 2D systems such as tokamaks (in particular, the D3D experiment), where application of very small (δB/B ~ 10^-3) 3D resonant magnetic field perturbations renders the underlying assumption of axisymmetry invalid. In order to facilitate the rapid analysis of such equilibria (for example, for reconstruction purposes), we have undertaken the task of parallelizing the VMEC code (PARVMEC) to produce a scalable and temporally rapidly convergent equilibrium code for use on parallel distributed memory platforms. The parallelization task naturally splits into three distinct parts: 1) radial surfaces in the fixed-boundary part of the calculation; 2) two 2D angular meshes needed to compute the Green's function integrals over the plasma boundary for the free-boundary part of the code; and 3) the block tridiagonal matrix needed to compute the full (3D) pre-conditioner near the final equilibrium state. Preliminary results show that scalability is achieved for tasks 1 and 3, with task 2 nearing completion. The impact of this work on the rapid reconstruction of D3D plasmas using PARVMEC in the V3FIT code will be discussed. Work supported by U.S. DOE under Contract DE-AC05-00OR22725 with UT-Battelle, LLC.

  11. About the Code of Practice of the European Mathematical Society

    DEFF Research Database (Denmark)

    Jensen, Arne

    2013-01-01

    The Executive Committee of the European Mathematical Society created an Ethics Committee in the Spring of 2010. The first task of the Committee was to prepare a Code of Practice. This task was completed in the Spring of 2012, and the Code went into effect on 1 November 2012. Arne Jensen, author of this article, is Chair of the EMS Ethics Committee.

  12. Task complexity and transformational leadership : The mediating role of leaders' state core self-evaluations

    NARCIS (Netherlands)

    Dóci, Edina; Hofmans, Joeri

    2015-01-01

    While substantial scholarly attention has been paid to the beneficial consequences of transformational leadership and the conditions in which this leadership style is most effective, there is a remarkable shortage of research on the contextual antecedents of transformational leadership behavior.

  13. Parallel processing approach to transform-based image coding

    Science.gov (United States)

    Normile, James O.; Wright, Dan; Chu, Ken; Yeh, Chia L.

    1991-06-01

    This paper describes a flexible parallel processing architecture designed for use in real-time video processing. The system consists of floating-point DSP processors connected to each other via fast serial links; each processor has access to a globally shared memory. A multiple-bus architecture in combination with a dual-ported memory allows communication with a host control processor. The system has been applied to prototyping of video compression and decompression algorithms. The decomposition of transform-based algorithms for decompression into a form suitable for parallel processing is described. A technique for automatic load balancing among the processors is developed and discussed, and results are presented with image statistics and data rates. Finally, techniques for accelerating the system throughput are analyzed, and results from the application of one such modification are described.

  14. Transformation - Herding the Cats Towards Service Interdependence

    National Research Council Canada - National Science Library

    Pope, Thomas

    2004-01-01

    U.S. Department of Defense efforts to transform the military are a daunting task. Adapting to a security environment shaped by faceless threats, globalization, and the emergence of the information age requires a change in Service culture...

  15. Lossy to lossless object-based coding of 3-D MRI data.

    Science.gov (United States)

    Menegaz, Gloria; Thiran, Jean-Philippe

    2002-01-01

    We propose a fully three-dimensional (3-D) object-based coding system exploiting the diagnostic relevance of the different regions of the volumetric data for rate allocation. The data are first decorrelated via a 3-D discrete wavelet transform. The implementation via the lifting-steps scheme allows integer-to-integer mapping, enabling lossless coding, and facilitates the definition of the object-based inverse transform. The coding process assigns disjoint segments of the bitstream to the different objects, which can be independently accessed and reconstructed at any up-to-lossless quality. Two fully 3-D coding strategies are considered: embedded zerotree coding (EZW-3D) and multidimensional layered zero coding (MLZC), both generalized for region of interest (ROI)-based processing. In order to avoid artifacts along region boundaries, some extra coefficients must be encoded for each object. This gives rise to an overhead in the bitstream with respect to the case where the volume is encoded as a whole. The amount of such extra information depends on both the filter length and the decomposition depth. The system is characterized on a set of head magnetic resonance images. Results show that MLZC and EZW-3D have competitive performance. In particular, the best MLZC mode outperforms the other state-of-the-art techniques on one of the datasets for which results are available in the literature.
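
    The integer-to-integer lifting idea behind the lossless mode can be shown in one dimension. A sketch using the LeGall 5/3 wavelet with floor rounding and periodic boundary extension (the paper applies such steps separably in 3-D):

        # LeGall 5/3 lifting with integer rounding: exactly invertible, so the
        # transform supports lossless coding.
        import numpy as np

        def lift_53_forward(x):                    # x: even-length integer array
            even, odd = x[0::2].copy(), x[1::2].copy()
            odd -= (even + np.roll(even, -1)) // 2        # predict step
            even += (odd + np.roll(odd, 1) + 2) // 4      # update step
            return even, odd                              # approximation, detail

        def lift_53_inverse(even, odd):
            even = even - (odd + np.roll(odd, 1) + 2) // 4
            odd = odd + (even + np.roll(even, -1)) // 2
            out = np.empty(even.size + odd.size, dtype=even.dtype)
            out[0::2], out[1::2] = even, odd
            return out

        x = np.array([3, 7, 1, 8, 2, 9, 4, 6])
        a, d = lift_53_forward(x)
        assert np.array_equal(lift_53_inverse(a, d), x)   # lossless round trip
        print(a, d)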

  16. Sex differences in components of imagined perspective transformation.

    Science.gov (United States)

    Gardner, Mark R; Sorhus, Ingrid; Edmonds, Caroline J; Potts, Rosalind

    2012-05-01

    Little research to date has examined whether sex differences in spatial ability extend to the mental self rotation involved in taking on a third party perspective. This question was addressed in the present study by assessing components of imagined perspective transformations in twenty men and twenty women. Participants made speeded left-right judgements about the hand in which an object was held by front- and back- facing schematic human figures in an "own body transformation task." Response times were longer when the figure did not share the same spatial orientation as the participant, and were substantially longer than those made for a control task requiring left-right judgements about the same stimuli from the participant's own point of view. A sex difference in imagined perspective transformation favouring males was found to be restricted to the speed of imagined self rotation, and was not observed for components indexing readiness to take a third party point of view, nor in left-right confusion. These findings indicate that the range of spatial abilities for which a sex difference has been established should be extended to include imagined perspective transformations. They also suggest that imagined perspective transformations may not draw upon those empathic social-emotional perspective taking processes for which females show an advantage. Copyright © 2012 Elsevier B.V. All rights reserved.

  17. Universal Regularizers For Robust Sparse Coding and Modeling

    OpenAIRE

    Ramirez, Ignacio; Sapiro, Guillermo

    2010-01-01

    Sparse data models, where data is assumed to be well represented as a linear combination of a few elements from a dictionary, have gained considerable attention in recent years, and their use has led to state-of-the-art results in many signal and image processing tasks. It is now well understood that the choice of the sparsity regularization term is critical in the success of such models. Based on a codelength minimization interpretation of sparse coding, and using tools from universal coding...

  18. Early puzzle play: a predictor of preschoolers' spatial transformation skill.

    Science.gov (United States)

    Levine, Susan C; Ratliff, Kristin R; Huttenlocher, Janellen; Cannon, Joanna

    2012-03-01

    Individual differences in spatial skill emerge prior to kindergarten entry. However, little is known about the early experiences that may contribute to these differences. The current study examined the relation between children's early puzzle play and their spatial skill. Children and parents (n = 53) were observed at home for 90 min every 4 months (6 times) between 2 and 4 years of age (26 to 46 months). When children were 4 years 6 months old, they completed a spatial task involving mental transformations of 2-dimensional shapes. Children who were observed playing with puzzles performed better on this task than those who did not, controlling for parent education, income, and overall parent word types. Moreover, among those children who played with puzzles, frequency of puzzle play predicted performance on the spatial transformation task. Although the frequency of puzzle play did not differ for boys and girls, the quality of puzzle play (a composite of puzzle difficulty, parent engagement, and parent spatial language) was higher for boys than for girls. In addition, variation in puzzle play quality predicted performance on the spatial transformation task for girls but not for boys. Implications of these findings as well as future directions for research on the role of puzzle play in the development of spatial skill are discussed. PsycINFO Database Record (c) 2012 APA, all rights reserved.

  19. Early Puzzle Play: A predictor of preschoolers’ spatial transformation skill

    Science.gov (United States)

    Levine, S.C.; Ratliff, K.R.; Huttenlocher, J.; Cannon, J.

    2011-01-01

    Individual differences in spatial skill emerge prior to kindergarten entry. However, little is known about the early experiences that may contribute to these differences. The current study examines the relation between children's early puzzle play and their spatial skill. Children and parents (n = 53) were observed at home for 90 minutes every four months (six times) between 2 and 4 years of age (26 to 46 months). When children were 4 years 6 months old, they completed a spatial task involving mental transformations of 2D shapes. Children who were observed playing with puzzles performed better on this task than those who did not, controlling for parent education, income, and overall parent word types. Moreover, among those children who played with puzzles, frequency of puzzle play predicted performance on the spatial transformation task. Although the frequency of puzzle play did not differ for boys and girls, the quality of puzzle play (a composite of puzzle difficulty, parent engagement, and parent spatial language) was higher for boys than girls. In addition, variation in puzzle play quality predicted performance on the spatial transformation task for girls but not boys. Implications of these findings as well as future directions for research on the role of puzzle play in the development of spatial skill are discussed. PMID:22040312

  20. New Mandates and Imperatives in the Revised "ACA Code of Ethics"

    Science.gov (United States)

    Kaplan, David M.; Kocet, Michael M.; Cottone, R. Rocco; Glosoff, Harriet L.; Miranti, Judith G.; Moll, E. Christine; Bloom, John W.; Bringaze, Tammy B.; Herlihy, Barbara; Lee, Courtland C.; Tarvydas, Vilia M.

    2009-01-01

    The first major revision of the "ACA Code of Ethics" in a decade occurred in late 2005, with the updated edition containing important new mandates and imperatives. This article provides interviews with members of the Ethics Revision Task Force that flesh out seminal changes in the revised "ACA Code of Ethics" in the areas of confidentiality,…

  1. Fractional Hartley transform applied to optical image encryption

    Science.gov (United States)

    Jimenez, C.; Torres, C.; Mattos, L.

    2011-01-01

    A new method for image encryption is introduced on the basis of a two-dimensional (2-D) generalization of the 1-D fractional Hartley transform that has been redefined recently in search of its inverse transform. We encrypt the image by two fractional orders and random phase codes. The method has an advantage over the Hartley transform, for its fractional orders can also be used as additional keys, and that, of course, strengthens image security. Only when all of these keys are correct can the image be well decrypted. Computer simulations are also performed to confirm the possibility of the proposed method.

  2. Fractional Hartley transform applied to optical image encryption

    Energy Technology Data Exchange (ETDEWEB)

    Jimenez, C [Grupo GIFES. Universidad de La Guajira. Riohacha (Colombia); Torres, C; Mattos, L, E-mail: carlosj114@gmail.com [Grupo LOI. Universidad Popular del Cesar. Valledupar (Colombia)

    2011-01-01

    A new method for image encryption is introduced on the basis of a two-dimensional (2-D) generalization of the 1-D fractional Hartley transform that has been redefined recently in search of its inverse transform. We encrypt the image by two fractional orders and random phase codes. The method has an advantage over the Hartley transform, for its fractional orders can also be used as additional keys, and that, of course, strengthens image security. Only when all of these keys are correct can the image be well decrypted. Computer simulations are also performed to confirm the possibility of the proposed method.
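
    A hedged sketch of the general idea shared by both records above, using the ordinary Fourier transform in place of the 2-D fractional Hartley transform (no standard library routine is assumed for the latter): two random phase codes encrypt the image, and decryption inverts each stage with the correct keys. In the fractional scheme, the fractional orders serve as additional keys.

        # Double random phase encoding with FFTs standing in for the fractional
        # Hartley transform. Masks are the phase codes; data are placeholders.
        import numpy as np

        rng = np.random.default_rng(42)
        img = rng.random((64, 64))

        mask1 = np.exp(2j * np.pi * rng.random(img.shape))  # input-plane code
        mask2 = np.exp(2j * np.pi * rng.random(img.shape))  # transform-plane code

        cipher = np.fft.fft2(np.fft.fft2(img * mask1) * mask2)

        # Decryption with the correct keys inverts each stage in reverse order.
        plain = np.fft.ifft2(np.fft.ifft2(cipher) / mask2) / mask1
        print(np.allclose(plain.real, img))                 # True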

  3. Consensus Convolutional Sparse Coding

    KAUST Repository

    Choudhury, Biswarup

    2017-12-01

    Convolutional sparse coding (CSC) is a promising direction for unsupervised learning in computer vision. In contrast to recent supervised methods, CSC allows for convolutional image representations to be learned that are equally useful for high-level vision tasks and low-level image reconstruction and can be applied to a wide range of tasks without problem-specific retraining. Due to their extreme memory requirements, however, existing CSC solvers have so far been limited to low-dimensional problems and datasets using a handful of low-resolution example images at a time. In this paper, we propose a new approach to solving CSC as a consensus optimization problem, which lifts these limitations. By learning CSC features from large-scale image datasets for the first time, we achieve significant quality improvements in a number of imaging tasks. Moreover, the proposed method enables new applications in high-dimensional feature learning that has been intractable using existing CSC methods. This is demonstrated for a variety of reconstruction problems across diverse problem domains, including 3D multispectral demosaicing and 4D light field view synthesis.

  4. Consensus Convolutional Sparse Coding

    KAUST Repository

    Choudhury, Biswarup

    2017-04-11

    Convolutional sparse coding (CSC) is a promising direction for unsupervised learning in computer vision. In contrast to recent supervised methods, CSC allows for convolutional image representations to be learned that are equally useful for high-level vision tasks and low-level image reconstruction and can be applied to a wide range of tasks without problem-specific retraining. Due to their extreme memory requirements, however, existing CSC solvers have so far been limited to low-dimensional problems and datasets using a handful of low-resolution example images at a time. In this paper, we propose a new approach to solving CSC as a consensus optimization problem, which lifts these limitations. By learning CSC features from large-scale image datasets for the first time, we achieve significant quality improvements in a number of imaging tasks. Moreover, the proposed method enables new applications in high-dimensional feature learning that has been intractable using existing CSC methods. This is demonstrated for a variety of reconstruction problems across diverse problem domains, including 3D multispectral demosaicking and 4D light field view synthesis.

  5. Consensus Convolutional Sparse Coding

    KAUST Repository

    Choudhury, Biswarup; Swanson, Robin; Heide, Felix; Wetzstein, Gordon; Heidrich, Wolfgang

    2017-01-01

    Convolutional sparse coding (CSC) is a promising direction for unsupervised learning in computer vision. In contrast to recent supervised methods, CSC allows for convolutional image representations to be learned that are equally useful for high-level vision tasks and low-level image reconstruction and can be applied to a wide range of tasks without problem-specific retraining. Due to their extreme memory requirements, however, existing CSC solvers have so far been limited to low-dimensional problems and datasets using a handful of low-resolution example images at a time. In this paper, we propose a new approach to solving CSC as a consensus optimization problem, which lifts these limitations. By learning CSC features from large-scale image datasets for the first time, we achieve significant quality improvements in a number of imaging tasks. Moreover, the proposed method enables new applications in high-dimensional feature learning that has been intractable using existing CSC methods. This is demonstrated for a variety of reconstruction problems across diverse problem domains, including 3D multispectral demosaicing and 4D light field view synthesis.

  6. SCRAM reactivity calculations with the KIKO3D code

    International Nuclear Information System (INIS)

    Hordosy, G.; Kerszturi, A.; Maraczy, Cs.; Temesvari, E.

    1999-01-01

    Discrepancies between calculated static reactivities and measured reactivities evaluated with reactivity meters led to investigating SCRAM with the KIKO3D dynamic code. The time and space dependent neutron flux in the reactor core during the rod drop measurement was calculated by the KIKO3D nodal diffusion code. For calculating the ionisation chamber signals, the Green's function technique was applied. The Green's functions of the ionisation chambers were evaluated by solving the neutron transport equation in the reflector regions with the MCNP Monte Carlo code. The detector signals during asymmetric SCRAM measurements were calculated and compared with measured data using the inverse point kinetics transformation. The satisfactory agreement validates the KIKO3D code for determining reactivities after SCRAM. (Authors)

  7. Cyclic transformation of orbital angular momentum modes

    International Nuclear Information System (INIS)

    Schlederer, Florian; Krenn, Mario; Fickler, Robert; Malik, Mehul; Zeilinger, Anton

    2016-01-01

    The spatial modes of photons are one realization of a QuDit, a quantum system that is described in a D-dimensional Hilbert space. In order to perform quantum information tasks with QuDits, a general class of D-dimensional unitary transformations is needed. Among these, cyclic transformations are an important special case required in many high-dimensional quantum communication protocols. In this paper, we experimentally demonstrate a cyclic transformation in the high-dimensional space of photonic orbital angular momentum (OAM). Using simple linear optical components, we show a successful four-fold cyclic transformation of OAM modes. Interestingly, our experimental setup was found by a computer algorithm. In addition to the four-cyclic transformation, the algorithm also found extensions to higher-dimensional cycles in a hybrid space of OAM and polarization. Besides being useful for quantum cryptography with QuDits, cyclic transformations are key for the experimental production of high-dimensional maximally entangled Bell-states. (paper)
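
    Algebraically, a d-fold cyclic transformation is the unitary that maps mode |k> to |k+1 mod d>. A minimal sketch for the experiment's four-fold cycle (the optical setup realizes this matrix with linear elements; the numbers here are purely illustrative):

        # Cyclic shift unitary (generalized Pauli X) on a d-dimensional mode space.
        import numpy as np

        d = 4
        X = np.roll(np.eye(d), 1, axis=0)   # maps basis state |k> to |k+1 mod d>

        assert np.allclose(X @ X.conj().T, np.eye(d))                # unitary
        assert np.allclose(np.linalg.matrix_power(X, d), np.eye(d))  # d-cycle

        mode = np.zeros(d); mode[2] = 1     # photon in OAM mode |2>
        print(X @ mode)                     # now in mode |3>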

  8. Benchmarking of computer codes and approaches for modeling exposure scenarios

    International Nuclear Information System (INIS)

    Seitz, R.R.; Rittmann, P.D.; Wood, M.I.; Cook, J.R.

    1994-08-01

    The US Department of Energy Headquarters established a performance assessment task team (PATT) to integrate the activities of DOE sites that are preparing performance assessments for the disposal of newly generated low-level waste. The PATT chartered a subteam with the task of comparing computer codes and exposure scenarios used for dose calculations in performance assessments. This report documents the efforts of the subteam. Computer codes considered in the comparison include GENII, PATHRAE-EPA, MICROSHIELD, and ISOSHLD. Calculations were also conducted using spreadsheets to provide a comparison at the most fundamental level. Calculations and modeling approaches are compared for unit radionuclide concentrations in water and soil for the ingestion, inhalation, and external dose pathways. Over 30 tables comparing inputs and results are provided

  9. Hybrid Video Coding Based on Bidimensional Matching Pursuit

    Directory of Open Access Journals (Sweden)

    Lorenzo Granai

    2004-12-01

    Full Text Available Hybrid video coding combines two stages: first, motion estimation and compensation predict each frame from the neighboring frames; then the prediction error is coded, reducing the correlation in the spatial domain. In this work, we focus on the latter stage, presenting a scheme that profits from some of the features introduced by the standard H.264/AVC for motion estimation and replaces the transform in the spatial domain. The prediction error is thus coded using the matching pursuit algorithm, which decomposes the signal over a purpose-designed bidimensional, anisotropic, redundant dictionary. Comparisons are made among the proposed technique, H.264, and a DCT-based coding scheme. Moreover, we introduce fast techniques for atom selection, which exploit the spatial localization of the atoms. An adaptive coding scheme aimed at optimizing the resource allocation is also presented, together with a rate-distortion study for the matching pursuit algorithm. Results show that the proposed scheme outperforms the standard DCT, especially at very low bit rates.
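
    The core loop of matching pursuit is compact. A minimal sketch over a generic overcomplete dictionary (the paper's anisotropic 2-D atoms and its fast, spatially localized atom search are not reproduced here):

        # Greedy matching pursuit: repeatedly pick the atom best correlated with
        # the residual and subtract its contribution.
        import numpy as np

        def matching_pursuit(signal, D, n_atoms):
            """D: columns are unit-norm atoms; returns picks and final residual."""
            residual = signal.astype(float)
            picks = []
            for _ in range(n_atoms):
                corr = D.T @ residual           # correlate atoms with residual
                k = np.argmax(np.abs(corr))     # best-matching atom
                picks.append((k, corr[k]))
                residual = residual - corr[k] * D[:, k]
            return picks, residual

        rng = np.random.default_rng(3)
        D = rng.standard_normal((32, 128))
        D /= np.linalg.norm(D, axis=0)          # unit-norm dictionary
        sig = 2.0 * D[:, 5] + 0.5 * D[:, 90]    # sparse test signal
        picks, res = matching_pursuit(sig, D, 4)
        print(picks, np.linalg.norm(res))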

  10. Multiresolution signal decomposition transforms, subbands, and wavelets

    CERN Document Server

    Akansu, Ali N; Haddad, Paul R

    2001-01-01

    The uniqueness of this book is that it covers such important aspects of modern signal processing as block transforms from subband filter banks and wavelet transforms from a common unifying standpoint, thus demonstrating the commonality among these decomposition techniques. In addition, it covers such "hot" areas as signal compression and coding, including particular decomposition techniques and tables listing coefficients of subband and wavelet filters and other important properties. The field of this book (Electrical Engineering/Computer Science) is currently booming, which is, of course

  11. A Potential Issue Involving the Application of the Unit Base Transformation to the Interpolation of Secondary Energy Distributions

    International Nuclear Information System (INIS)

    T Sutton; T Trumbull

    2005-01-01

    Secondary neutron energy spectra used by Monte Carlo codes are often provided in tabular format. Examples are the spectra obtained from ENDF/B-VI File 5 when the LF parameter has the value 1. These secondary spectra are tabulated on an incident energy mesh, and in a Monte Carlo calculation the tabulated spectra are generally interpolated to the energy of the incident neutron. A common method of interpolation involves the use of the unit base transformation. The details of the implementation vary from code to code, so here we will simply focus on the mathematics of the method. Given an incident neutron with energy E, the bracketing points E_i and E_(i+1) on the incident energy mesh are determined. The corresponding secondary energy spectra are transformed to a dimensionless energy coordinate system in which the secondary energies lie between zero and one. A dimensionless secondary energy is then sampled from a spectrum obtained by linearly interpolating the transformed spectra--often using the method of statistical interpolation. Finally, the sampled secondary energy is transformed back into the normal energy coordinate system. For this inverse transformation, the minimum and maximum energies are linearly interpolated from the values given in the non-transformed secondary spectra. The purpose of the unit base transformation is to preserve (as nearly as possible) the physics of the secondary distribution--in particular, the minimum and maximum energies possible for the secondary neutron. This method is used by several codes, including MCNP and the new MC21 code that is the subject of this paper. In comparing MC21 results to those of MCNP, it was discovered that the nuclear data supplied to MCNP is structured in such a way that the code may not be doing the best possible job of preserving the physics of certain nuclear interactions. In this paper, we describe the problem and explain how it may be avoided.
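
    A sketch of the sampling procedure just described, with toy spectra and illustrative names (the actual codes interpolate full tabulated distributions):

        # Unit-base sampling: pick a bracketing spectrum by statistical
        # interpolation, map the sampled energy to [0, 1], then map back using
        # energy limits interpolated to the incident energy E.
        import numpy as np

        def sample_unit_base(E, E_lo, E_hi, sampler_lo, sampler_hi,
                             lims_lo, lims_hi, rng):
            f = (E - E_lo) / (E_hi - E_lo)          # interpolation fraction
            if rng.random() < f:                    # statistical interpolation
                Es, (a, b) = sampler_hi(rng), lims_hi
            else:
                Es, (a, b) = sampler_lo(rng), lims_lo
            u = (Es - a) / (b - a)                  # unit-base secondary energy
            Emin = lims_lo[0] + f * (lims_hi[0] - lims_lo[0])
            Emax = lims_lo[1] + f * (lims_hi[1] - lims_lo[1])
            return Emin + u * (Emax - Emin)         # back-transform

        rng = np.random.default_rng(7)
        lo = lambda r: r.uniform(0.0, 1.0)          # toy spectrum at E_lo
        hi = lambda r: r.uniform(0.0, 2.0)          # toy spectrum at E_hi
        print(sample_unit_base(1.5, 1.0, 2.0, lo, hi, (0.0, 1.0), (0.0, 2.0), rng))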

  12. RCS modeling with the TSAR FDTD code

    Energy Technology Data Exchange (ETDEWEB)

    Pennock, S.T.; Ray, S.L.

    1992-03-01

    The TSAR electromagnetic modeling system consists of a family of related codes that have been designed to work together to provide users with a practical way to set up, run, and interpret the results from complex 3-D finite-difference time-domain (FDTD) electromagnetic simulations. The software has been in development at the Lawrence Livermore National Laboratory (LLNL) and at other sites since 1987. Active internal use of the codes began in 1988, with limited external distribution and use beginning in 1991. TSAR was originally developed to analyze high-power microwave and EMP coupling problems. However, the general-purpose nature of the tools has enabled us to use the codes to solve a broader class of electromagnetic applications and has motivated the addition of new features. In particular, a family of near-to-far-field transformation routines has been added to the codes, enabling TSAR to be used for radar cross-section and antenna analysis problems.

  13. Impaired visuospatial transformation but intact sequence processing in Parkinson disease.

    Science.gov (United States)

    Leek, E Charles; Kerai, Julie H; Johnston, Stephen J; Hindle, John V; Bracewell, R Martyn

    2014-09-01

    We examined whether visuospatial deficits in Parkinson disease (PD) can be explained by a domain-general, nonspatial impairment in the sequencing or serial chaining of mental operations. PD has been shown to be associated with impaired visuospatial processing, but the mechanisms of this impairment remain unclear. Thirteen patients with PD and 20 age-matched, neurologically normal controls performed a visuospatial grid navigation task requiring sequential spatial transformations. The participants also performed a control task of serial number subtraction designed to assess their nonvisuospatial sequencing. The tasks were matched in structure and difficulty. The patients were impaired on the visuospatial task but not in serial number subtraction. This finding suggests that visuospatial processing impairments in PD do not derive from a general impairment affecting sequencing or serial chaining. We argue that visuospatial deficits in PD result from impairments to spatial transformation routines involved in the computation of mappings between spatial locations. These routines are mediated by dopaminergic pathways linking the basal ganglia, prefrontal cortex, supplementary motor area, and parietal cortex.

  14. Evolving a Dynamic Predictive Coding Mechanism for Novelty Detection

    OpenAIRE

    Haggett, Simon J.; Chu, Dominique; Marshall, Ian W.

    2007-01-01

    Novelty detection is a machine learning technique which identifies new or unknown information in data sets. We present our current work on the construction of a new novelty detector based on a dynamical version of predictive coding. We compare three evolutionary algorithms, a simple genetic algorithm, NEAT and FS-NEAT, for the task of optimising the structure of an illustrative dynamic predictive coding neural network to improve its performance over stimuli from a number of artificially gener...

  15. Codes of conduct in public schools: a legal perspective

    African Journals Online (AJOL)

    Erna Kinsey

    ...education change in South Africa, particularly the transformation of public schools ... been granted legal personality to act as "juristic persons" (i.e. legal persons) ... process, a decision is made to amend, or repeal, the code of conduct, depending on ...

  16. Multispectral image pansharpening based on the contourlet transform

    Energy Technology Data Exchange (ETDEWEB)

    Amro, Israa; Mateos, Javier, E-mail: iamro@correo.ugr.e, E-mail: jmd@decsai.ugr.e [Departamento de Ciencias de la Computacion e I.A., Universidad de Granada, 18071 Granada (Spain)

    2010-02-01

    Pansharpening is a technique that fuses the information of a low resolution multispectral image (MS) and a high resolution panchromatic image (PAN), usually remote sensing images, to provide a high resolution multispectral image. In the literature, this task has been addressed from different points of view, one of the most popular being wavelet-based algorithms. Recently, the contourlet transform has been proposed; this transform combines the advantages of the wavelet transform with a more efficient representation of directional information. In this paper we propose a new pansharpening method based on contourlets, compare it with its wavelet counterpart, and assess its performance numerically and visually.
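
    For readers who wish to experiment, the wavelet counterpart that the paper compares against can be sketched in a few lines of Python with PyWavelets; contourlet implementations are less standardized but would slot into the same place. The sketch assumes the MS band has already been resampled to the PAN grid, and the injection rule is the simplest possible one.

```python
import pywt

def wavelet_pansharpen(ms_band, pan, wavelet="db2", levels=2):
    """Additive wavelet pansharpening of a single MS band (sketch).

    ms_band and pan are 2-D arrays of the same shape. The detail subbands
    of the PAN image are injected into the MS band; a contourlet version
    would replace pywt with a directional multiresolution transform.
    """
    c_ms = pywt.wavedec2(ms_band, wavelet, level=levels)
    c_pan = pywt.wavedec2(pan, wavelet, level=levels)
    # Keep the MS approximation (spectral content), take PAN details (spatial).
    fused = [c_ms[0]] + c_pan[1:]
    return pywt.waverec2(fused, wavelet)
```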

  17. The WIMS family of codes

    International Nuclear Information System (INIS)

    Askew, J.

    1981-01-01

    WIMS-D4 is the latest version of the original form of the Winfrith Improved Multigroup Scheme, developed in 1963-5 for lattice calculations on all types of thermal reactor, whether moderated by graphite, heavy or light water. The code, in earlier versions, has been available from the NEA code centre for a number of years in both IBM and CDC dialects of FORTRAN. An important feature of this code is its rapid, accurate deterministic system for treating resonance capture in heavy nuclides, capable of dealing with both regular pin lattices and with cluster geometries typical of pressure tube and gas cooled reactors. WIMS-E is a compatible code scheme in which each calculation step is bounded by standard interfaces on disc or tape. The interfaces contain files of information in a standard form, restricted to numbers representing physically meaningful quantities such as cross-sections and fluxes. Restricting code intercommunication to this channel limits the possible propagation of errors. A module is capable of transforming WIMS-D output into the standard interface form, and hence the two schemes can be linked if required. LWR-WIMS was developed in 1970 as a method of calculating LWR reloads for the fuel fabricators BNFL/GUNF. It uses the WIMS-E library and a number of the same modules.

  18. ADLIB: A simple database framework for beamline codes

    International Nuclear Information System (INIS)

    Mottershead, C.T.

    1993-01-01

    There are many well-developed codes available for beamline design and analysis. A significant fraction of each of these codes is devoted to processing its own unique input language for describing the problem. None of these large, complex, and powerful codes does everything. Adding a new bit of specialized physics can be a difficult task whose successful completion makes the code even larger and more complex. This paper describes an attempt to move in the opposite direction, toward a family of small, simple, single-purpose physics and utility modules, linked by an open, portable, public domain database framework. These small specialized physics codes begin with the beamline parameters already loaded in the database, and accessible via the handful of subroutines that constitute ADLIB. Such codes are easier to write, and inherently organized in a manner suitable for incorporation in model based control system algorithms. Examples include programs for analyzing beamline misalignment sensitivities, for simulating and fitting beam steering data, and for translating among MARYLIE, TRANSPORT, and TRACE3D formats

  19. A Systematic Hardware Sharing Method for Unified Architecture Design of H.264 Transforms

    Directory of Open Access Journals (Sweden)

    Po-Hung Chen

    2015-01-01

    Multitransform techniques have been widely used in modern video coding and have better compression efficiency than the conventional single-transform technique. However, every transform needs a corresponding hardware implementation, which results in a high hardware cost for multiple transforms. A novel method that includes a five-step operation-sharing synthesis and architecture-unification techniques is proposed to systematically share the hardware and reduce the cost of multitransform coding. In order to demonstrate the effectiveness of the method, a unified architecture is designed using the method for all six transforms involved in the H.264 video codec: 2D 4 × 4 forward and inverse integer transforms, 2D 4 × 4 and 2 × 2 Hadamard transforms, and 1D 8 × 8 forward and inverse integer transforms. Firstly, the six H.264 transform architectures are designed at low cost using the proposed five-step operation-sharing synthesis technique. Secondly, the proposed architecture-unification technique further unifies these six transform architectures into a low-cost hardware-unified architecture. The unified architecture requires only 28 adders, 16 subtractors, 40 shifters, and a proposed mux-based routing network, and the gate count is only 16308. The unified architecture processes 8 pixels/clock-cycle at up to 275 MHz, which is equal to 707 Full-HD 1080p frames/second.

  20. Obtaining a radiation beam poly energy using the code Penelope 2006

    International Nuclear Information System (INIS)

    Andrade, Lucio das Chagas; Peixoto, Jose Guilherme Pereira

    2013-01-01

    Obtaining an X-ray spectrum is not an easy task; one of the techniques used is simulation by the Monte Carlo method. The Penelope code is based on this method and simulates the transport of particles such as electrons, positrons and photons in different media and materials. The 2003 and 2006 versions of this program differ significantly in ease of use. The program allows the construction of the desired geometry and the definition of the simulation parameters. (author)

  1. Optical image encryption with redefined fractional Hartley transform

    Science.gov (United States)

    Zhao, Daomu; Li, Xinxin; Chen, Linfei

    2008-11-01

    A new method for optical image encryption is introduced on the basis of a two-dimensional (2-D) generalization of the 1-D fractional Hartley transform, which has been redefined recently in the search for its inverse transform. We encrypt the image with two fractional orders and random phase codes. The method has an advantage over the Hartley transform in that its fractional orders can also be used as additional keys, which strengthens image security. Only when all of these keys are correct can the image be decrypted properly. The optical realization is then proposed, and computer simulations are also performed to confirm the possibility of the proposed method.

  2. Embodied mental rotation: A special link between egocentric transformation and the bodily self

    Directory of Open Access Journals (Sweden)

    Sandra eKaltner

    2014-06-01

    This experiment investigated the influence of motor expertise on object-based versus egocentric transformations in a chronometric mental rotation task using images of either one's own or another person's body as stimulus material. Following the embodied cognition viewpoint, we hypothesized that motor experts would outperform non-experts specifically in the egocentric condition, because of richer kinesthetic representations and motor simulations compared to object-based transformations. In line with this, we expected that images of one's own body would be processed faster than images of another person's body. Results showed a benefit of motor expertise and of representations of another person's body, but only for the object-based transformation task; this other-advantage diminishes in egocentric transformations. Since the motor experts did not show any specific expertise in rotational movements, we conclude that using human bodies as stimulus material elicits embodied spatial transformations, which facilitates performance exclusively for egocentric transformations. Regarding stimulus material, the other-advantage, ascribed to increased self-awareness consuming attention-demanding resources, disappeared in the egocentric condition. This result may be due to the stronger link between the bodily self and motor representations compared to that emerging in object-based transformations.

  3. Multi-optimization Criteria-based Robot Behavioral Adaptability and Motion Planning

    International Nuclear Information System (INIS)

    Pin, Francois G.

    2002-01-01

    Robotic tasks are typically defined in Task Space (e.g., the 3-D World), whereas robots are controlled in Joint Space (motors). The transformation from Task Space to Joint Space must consider the task objectives (e.g., high precision, strength optimization, torque optimization), the task constraints (e.g., obstacles, joint limits, non-holonomic constraints, contact or tool task constraints), and the robot kinematics configuration (e.g., tools, type of joints, mobile platform, manipulator, modular additions, locked joints). Commercially available robots are optimized for a specific set of tasks, objectives and constraints and, therefore, their control codes are extremely specific to a particular set of conditions. Thus, there exists a multiplicity of codes, each handling a particular set of conditions, but none suitable for use on robots with widely varying tasks, objectives, constraints, or environments. On the other hand, most DOE missions and tasks are typically ''batches of one''. Attempting to use commercial codes for such work requires significant personnel and schedule costs for re-programming or adding code to the robots whenever a change in task objective, robot configuration, number and type of constraint, etc. occurs. The objective of our project is to develop a ''generic code'' to implement this Task-Space to Joint-Space transformation that would allow robot behavior adaptation, in real time (at loop rate), to changes in task objectives, number and type of constraints, modes of control, or kinematics configuration (e.g., new tools, added modules). Our specific goal is to develop a single code for the general solution of under-specified systems of algebraic equations that is suitable for solving the inverse kinematics of robots, is usable for all types of robots (mobile robots, manipulators, mobile manipulators, etc.) with no limitation on the number of joints and the number of controlled Task-Space variables, and can adapt to real time changes in number and
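
    The abstract does not disclose the algorithm itself; as a hedged illustration of the kind of building block involved, the Python sketch below resolves one step of an under-specified (redundant) inverse-kinematics system J dq = dx by damped least-squares, a standard technique rather than the project's generic code.

```python
import numpy as np

def dls_step(jacobian, dx, damping=0.05):
    """One damped-least-squares step for an under-specified system J dq = dx.

    jacobian: (m, n) task Jacobian with n > m (more joints than controlled
    Task-Space variables). Returns a minimum-norm joint update; the damping
    term keeps the step bounded near singularities. Illustrative only.
    """
    J = np.asarray(jacobian)
    m = J.shape[0]
    JJt = J @ J.T + damping**2 * np.eye(m)      # damped normal equations
    return J.T @ np.linalg.solve(JJt, dx)
```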

  4. The Reduction of Directed Cyclic Graph for Task Assignment Problem

    Directory of Open Access Journals (Sweden)

    Ariffin W.N.M.

    2018-01-01

    In this paper, a directed cyclic graph (DCG) is proposed as the task graph. It is undesirable, and may be impossible, to complete the task according to the constraints if a cycle exists. Therefore, an effort should be made to eliminate the cycle and obtain a directed acyclic graph (DAG), so that the minimum amount of time required for the entire task can be found. The technique of reducing the complexity of the directed cyclic graph to a directed acyclic graph by reversing the orientation of a path is the main contribution of this study. The algorithm was coded in Java and consistently produced good assignments and task schedules.
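
    The paper's implementation is in Java; the Python sketch below (using networkx) merely illustrates the idea of repeatedly detecting a cycle and reversing the orientation of one of its edges until a DAG remains. The choice of which edge of the cycle to reverse is a placeholder, not the paper's rule.

```python
import networkx as nx

def break_cycles_by_reversal(edges):
    """Reverse one edge per detected cycle until the task graph is acyclic.

    A crude illustration of the reduction described in the abstract; the
    loop is bounded to guard against pathological inputs.
    """
    g = nx.DiGraph(edges)
    for _ in range(g.number_of_edges() ** 2):
        try:
            cycle = nx.find_cycle(g)            # list of (u, v) edges on a cycle
        except nx.NetworkXNoCycle:
            return g                            # now a DAG
        u, v = cycle[0][:2]
        g.remove_edge(u, v)
        g.add_edge(v, u)                        # reverse the orientation
    raise RuntimeError("failed to obtain a DAG")

# Example: a 3-cycle becomes an acyclic task graph.
dag = break_cycles_by_reversal([(1, 2), (2, 3), (3, 1)])
assert nx.is_directed_acyclic_graph(dag)
```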

  5. The Evolution of a Coding Schema in a Paced Program of Research

    Science.gov (United States)

    Winters, Charlene A.; Cudney, Shirley; Sullivan, Therese

    2010-01-01

    A major task involved in the management, analysis, and integration of qualitative data is the development of a coding schema to facilitate the analytic process. Described in this paper is the evolution of a coding schema that was used in the analysis of qualitative data generated from online forums of middle-aged women with chronic conditions who…

  6. Calculation code NIRVANA for free boundary MHD equilibrium

    International Nuclear Information System (INIS)

    Ninomiya, Hiromasa; Suzuki, Yasuo; Kameari, Akihisa

    1975-03-01

    A calculation method and code for solving the free boundary problem of MHD equilibrium have been developed, and usage of the code ''NIRVANA'' is described. The toroidal plasma current density, determined as a function of the flux function PSI, is replaced by a group of ring currents, whereby the equation of MHD equilibrium is transformed into an integral equation. Either of two iterative methods is chosen to solve the integral equation, depending on the assumptions made about the plasma surface points. Calculation of the magnetic field configurations is possible when the plasma surface coincides self-consistently with the magnetic flux, including the separatrix points. The code is usable for calculating circular or non-circular shell-less tokamak equilibria. (auth.)

  7. Nexus: A modular workflow management system for quantum simulation codes

    Science.gov (United States)

    Krogel, Jaron T.

    2016-01-01

    The management of simulation workflows represents a significant task for the individual computational researcher. Automation of the required tasks involved in simulation work can decrease the overall time to solution and reduce sources of human error. A new simulation workflow management system, Nexus, is presented to address these issues. Nexus is capable of automated job management on workstations and resources at several major supercomputing centers. Its modular design allows many quantum simulation codes to be supported within the same framework. Current support includes quantum Monte Carlo calculations with QMCPACK, density functional theory calculations with Quantum Espresso or VASP, and quantum chemical calculations with GAMESS. Users can compose workflows through a transparent, text-based interface, resembling the input file of a typical simulation code. A usage example is provided to illustrate the process.

  8. Efficient convolutional sparse coding

    Science.gov (United States)

    Wohlberg, Brendt

    2017-06-20

    Computationally efficient algorithms may be applied for fast dictionary learning, solving the convolutional sparse coding problem in the Fourier domain. More specifically, efficient convolutional sparse coding may be derived within an alternating direction method of multipliers (ADMM) framework that utilizes fast Fourier transforms (FFT) to solve the main linear system in the frequency domain. Such algorithms may enable a significant reduction in computational cost over conventional approaches by implementing a linear solver for the most critical and computationally expensive component of the conventional iterative algorithm. The theoretical computational cost of the algorithm may be reduced from O(M^3 N) to O(M N log N), where N is the dimensionality of the data and M is the number of elements in the dictionary. This significant improvement in efficiency may greatly increase the range of problems that can practically be addressed via convolutional sparse representations.
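
    The key to the stated cost reduction is that, in the frequency domain, the main linear system decouples into one rank-one system per frequency, each solvable in O(M) operations via the Sherman-Morrison formula. Below is a minimal numpy sketch of that solve for 1-D signals (illustrative names, ADMM loop omitted).

```python
import numpy as np

def fft_domain_solve(D, b, rho):
    """Solve (rho*I + D^H D) x = b independently at every frequency.

    D: (M, N) dictionary filters, zero-padded to the signal length N.
    b: (M, N) right-hand side in the spatial domain.
    Each frequency gives a rank-one system, solved in O(M) by the
    Sherman-Morrison formula, for O(M N log N) overall cost.
    """
    Dh = np.fft.fft(D, axis=1)                      # frequency-domain dictionary
    bh = np.fft.fft(b, axis=1)
    dhb = np.sum(np.conj(Dh) * bh, axis=0)          # d^H b at each frequency
    dsq = np.sum(np.conj(Dh) * Dh, axis=0).real     # d^H d at each frequency
    xh = (bh - Dh * (dhb / (rho + dsq))) / rho      # Sherman-Morrison solve
    return np.fft.ifft(xh, axis=1).real
```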

  9. Medical image compression with fast Hartley transform

    International Nuclear Information System (INIS)

    Paik, C.H.; Fox, M.D.

    1988-01-01

    The purpose of data compression is the storage and transmission of images with minimal memory for storage and minimal bandwidth for transmission, while maintaining robustness in the presence of transmission noise or storage medium errors. Here, the fast Hartley transform (FHT) is used for the transformation and a new thresholding method is devised. The FHT is used instead of the fast Fourier transform (FFT), providing calculation at least as fast as the fastest FFT algorithm. This real-numbered transform requires only half the memory array space for saving transform coefficients and allows for easy implementation on very large-scale integrated circuits, because the same formula serves for both forward and inverse transformation and the algorithm is conceptually straightforward. Threshold values were adaptively selected according to the correlation factor of each block of equally divided blocks of the image. Therefore, this approach provided a coding scheme that retains maximum information within minimum image bandwidth. Overall, the results suggested that the Hartley transform adaptive thresholding approach results in improved fidelity, shorter decoding time, and greater robustness in the presence of noise than previous approaches.
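
    A hedged Python sketch of the two ingredients follows: a Hartley transform computed from the FFT (H = Re F - Im F, self-inverse under symmetric normalization) and a block threshold adapted to the block's pixel correlation. The adaptation rule shown is an assumption for illustration; the paper's exact rule is not reproduced here.

```python
import numpy as np

def fht2(block):
    """2-D Hartley transform of a real block via the FFT: H = Re(F) - Im(F).

    With symmetric normalization the same function also inverts the
    transform, mirroring the single-formula property noted above.
    """
    F = np.fft.fft2(block)
    return (F.real - F.imag) / np.sqrt(block.size)

def threshold_block(block, base_keep=0.10):
    """Keep more coefficients in blocks whose pixels are weakly correlated
    (more detail); an illustrative stand-in for the paper's rule."""
    h = fht2(block)
    corr = abs(np.corrcoef(block[:, :-1].ravel(), block[:, 1:].ravel())[0, 1])
    keep = base_keep * (2.0 - corr)                 # lower correlation -> keep more
    cut = np.quantile(np.abs(h), 1.0 - keep)
    return np.where(np.abs(h) >= cut, h, 0.0)
```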

  10. Parallel processing of structural integrity analysis codes

    International Nuclear Information System (INIS)

    Swami Prasad, P.; Dutta, B.K.; Kushwaha, H.S.

    1996-01-01

    Structural integrity analysis plays an important role in assessing and demonstrating the safety of nuclear reactor components. This analysis is performed using analytical tools such as the Finite Element Method (FEM) with the help of digital computers. The complexity of the problems involved in nuclear engineering demands high-speed computation facilities to obtain solutions in a reasonable amount of time. Parallel processing systems such as ANUPAM provide an efficient platform for realising such high-speed computation. The development and implementation of software on parallel processing systems is an interesting and challenging task, and the data and algorithm structure of the codes plays an important role in exploiting the parallel processing system's capabilities. Structural analysis codes based on FEM can be divided into two categories with respect to their implementation on parallel processing systems. Codes in the first category, such as those used for harmonic analysis and mechanistic fuel performance codes, do not require parallelisation of individual modules. Codes in the second category, such as conventional FEM codes, require parallelisation of individual modules; here, parallelisation of the equation solution module poses major difficulties. Different solution schemes such as the domain decomposition method (DDM), the parallel active column solver and the substructuring method are currently used on parallel processing systems. Two codes, FAIR and TABS, belonging to these two categories respectively, have been implemented on ANUPAM. The implementation details of these codes and the performance of different equation solvers are highlighted. (author). 5 refs., 12 figs., 1 tab

  11. MHD code using multi graphical processing units: SMAUG+

    Science.gov (United States)

    Gyenge, N.; Griffiths, M. K.; Erdélyi, R.

    2018-01-01

    This paper introduces the Sheffield Magnetohydrodynamics Algorithm Using GPUs (SMAUG+), an advanced numerical code for solving magnetohydrodynamic (MHD) problems using multi-GPU systems. Multi-GPU systems facilitate the development of accelerated codes and enable us to investigate larger model sizes and/or more detailed computational domain resolutions. This is a significant advancement over the parent single-GPU MHD code, SMAUG (Griffiths et al., 2015). Here, we demonstrate the validity of the SMAUG+ code, describe the parallelisation techniques and investigate performance benchmarks. The initial configuration of the Orszag-Tang vortex simulations is distributed among 4, 16, 64 and 100 GPUs. Furthermore, different simulation box resolutions are applied: 1000 × 1000, 2044 × 2044, 4000 × 4000 and 8000 × 8000. We also tested the code with the Brio-Wu shock tube simulations with a model size of 800, employing up to 10 GPUs. Based on the test results, we observed speed-ups and slow-downs, depending on the granularity and the communication overhead of certain parallel tasks. The main aim of the code development is to provide a massively parallel code without the memory limitation of a single GPU. By using our code, the applied model size can be significantly increased. We demonstrate that we are able to successfully compute numerically valid and large 2D MHD problems.

  12. Recurrence of task set-related MEG signal patterns during auditory working memory.

    Science.gov (United States)

    Peters, Benjamin; Bledowski, Christoph; Rieder, Maria; Kaiser, Jochen

    2016-06-01

    Processing of auditory spatial and non-spatial information in working memory has been shown to rely on separate cortical systems. While previous studies have demonstrated differences in spatial versus non-spatial processing from the encoding of to-be-remembered stimuli onwards, here we investigated whether such differences would be detectable already prior to presentation of the sample stimulus. We analyzed broad-band magnetoencephalography data from 15 healthy adults during an auditory working memory paradigm starting with a visual cue indicating the task-relevant stimulus feature for a given trial (lateralization or pitch) and a subsequent 1.5-s pre-encoding phase. This was followed by a sample sound (0.2 s), the delay phase (0.8 s) and a test stimulus (0.2 s), after which participants made a match/non-match decision. Linear discriminant functions were trained to decode task-specific signal patterns throughout the task, and temporal generalization was used to assess whether the neural codes discriminating between the tasks during the pre-encoding phase would recur during later task periods. The spatial versus non-spatial tasks could indeed be discriminated from the onset of the cue onwards, and decoders trained during the pre-encoding phase successfully discriminated the tasks during both sample stimulus encoding and the delay phase. This demonstrates that task-specific neural codes are established already before the memorandum is presented and that the same patterns are reestablished during stimulus encoding and maintenance.
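
    The decoding logic (train a linear discriminant at one time point, test it at every other) can be sketched with scikit-learn as below; this bare-bones Python version omits the cross-validation and preprocessing a real MEG analysis requires.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def temporal_generalization(X, y):
    """Train a decoder at each time point and test it at every other one.

    X: (trials, sensors, times) single-trial data; y: task labels
    (e.g. 0 = lateralization, 1 = pitch). Returns a (times, times)
    accuracy matrix; recurring codes show up as off-diagonal structure.
    """
    n_times = X.shape[2]
    acc = np.zeros((n_times, n_times))
    for t_train in range(n_times):
        clf = LinearDiscriminantAnalysis().fit(X[:, :, t_train], y)
        for t_test in range(n_times):
            acc[t_train, t_test] = clf.score(X[:, :, t_test], y)
    return acc
```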

  13. Theta phase precession and phase selectivity: a cognitive device description of neural coding

    Science.gov (United States)

    Zalay, Osbert C.; Bardakjian, Berj L.

    2009-06-01

    Information in neural systems is carried by way of phase and rate codes. Neuronal signals are processed through transformative biophysical mechanisms at the cellular and network levels. Neural coding transformations can be represented mathematically in a device called the cognitive rhythm generator (CRG). Incoming signals to the CRG are parsed through a bank of neuronal modes that orchestrate proportional, integrative and derivative transformations associated with neural coding. Mode outputs are then mixed through static nonlinearities to encode (spatio)temporal phase relationships. The static nonlinear outputs feed and modulate a ring device (limit cycle) encoding output dynamics. Small coupled CRG networks were created to investigate coding functionality associated with neuronal phase preference and theta precession in the hippocampus. Phase selectivity was found to be dependent on mode shape and polarity, while phase precession was a product of modal mixing (i.e. changes in the relative contribution or amplitude of mode outputs resulted in shifting phase preference). Nonlinear system identification was implemented to help validate the model and explain response characteristics associated with modal mixing; in particular, principal dynamic modes experimentally derived from a hippocampal neuron were inserted into a CRG and the neuron's dynamic response was successfully cloned. From our results, small CRG networks possessing disynaptic feedforward inhibition in combination with feedforward excitation exhibited frequency-dependent inhibitory-to-excitatory and excitatory-to-inhibitory transitions that were similar to transitions seen in a single CRG with quadratic modal mixing. This suggests nonlinear modal mixing to be a coding manifestation of the effect of network connectivity in shaping system dynamic behavior. We hypothesize that circuits containing disynaptic feedforward inhibition in the nervous system may be candidates for interpreting upstream rate codes to

  14. [The ICOH International Code of Ethics for Occupational Health Professionals].

    Science.gov (United States)

    Foà, V

    2010-01-01

    The paper describes all the steps followed by ICOH to finalize the International Code of Ethics for Occupational Health Professionals (OHP). The Code is composed of a "Preface", which explains why occupational health professionals need a specific code different from the codes written for general practitioners or other specializations, followed by an "Introduction", which outlines the targets of occupational health and the professionals who contribute to achieving them. These two parts are followed by a more substantial description of the tasks and duties of the OHP, and the last part of the Code illustrates how to carry out those duties. The principles set out in the ICOH Code of Ethics have been accepted worldwide by OHPs, particularly in Italy, where they have been included in Legislative Decree 81/08.

  15. Accident and safety analyses for the HTR-modul. Partial project 1: Computer codes for system behaviour calculation. Final report. Pt. 2

    International Nuclear Information System (INIS)

    Lohnert, G.; Becker, D.; Dilcher, L.; Doerner, G.; Feltes, W.; Gysler, G.; Haque, H.; Kindt, T.; Kohtz, N.; Lange, L.; Ragoss, H.

    1993-08-01

    The project encompasses the following project tasks and problems: (1) Studies relating to complete failure of the main heat transfer system; (2) Pebble flow; (3) Development of computer codes for detailed calculation of hypothetical accidents: (a) the THERMIX/RZKRIT temperature buildup code (covering, among others, a variation to include exothermal heat sources); (b) the REACT/THERMIX corrosion code (a variation taking into account extremely severe air ingress into the primary loop); (c) the GRECO corrosion code (a variation for treating extremely severe water ingress into the primary loop); (d) the KIND transients code (for treating extremely fast transients during reactivity incidents); (4) Limiting devices for safety-relevant quantities; (5) Analyses relating to hypothetical accidents: (a) hypothetical air ingress; (b) effects on the fuel particles induced by fast transients. The problems of the various tasks are defined in detail and the main results obtained are explained. The contributions reporting the various project tasks and activities have been prepared for separate retrieval from the database. (orig./HP) [de

  16. Accident and safety analyses for the HTR-modul. Partial project 1: Computer codes for system behaviour calculation. Final report. Pt. 1

    International Nuclear Information System (INIS)

    Lohnert, G.; Becker, D.; Dilcher, L.; Doerner, G.; Feltes, W.; Gysler, G.; Haque, H.; Kindt, T.; Kohtz, N.; Lange, L.; Ragoss, H.

    1993-08-01

    The project encompasses the following project tasks and problems: (1) Studies relating to complete failure of the main heat transfer system; (2) Pebble flow; (3) Development of computer codes for detailed calculation of hypothetical accidents: (a) the THERMIX/RZKRIT temperature buildup code (covering, among others, a variation to include exothermal heat sources); (b) the REACT/THERMIX corrosion code (a variation taking into account extremely severe air ingress into the primary loop); (c) the GRECO corrosion code (a variation for treating extremely severe water ingress into the primary loop); (d) the KIND transients code (for treating extremely fast transients during reactivity incidents); (4) Limiting devices for safety-relevant quantities; (5) Analyses relating to hypothetical accidents: (a) hypothetical air ingress; (b) effects on the fuel particles induced by fast transients. The problems of the various tasks are defined in detail and the main results obtained are explained. The contributions reporting the various project tasks and activities have been prepared for separate retrieval from the database. (orig./HP) [de

  17. Integrated Task And Data Parallel Programming: Language Design

    Science.gov (United States)

    Grimshaw, Andrew S.; West, Emily A.

    1998-01-01

    This research investigates the combination of task and data parallel language constructs within a single programming language. There are a number of applications that exhibit properties which would be well served by such an integrated language. Examples include global climate models, aircraft design problems, and multidisciplinary design optimization problems. Our approach incorporates data parallel language constructs into an existing, object-oriented, task parallel language. The language will support creation and manipulation of parallel classes and objects of both types (task parallel and data parallel). Ultimately, the language will allow data parallel and task parallel classes to be used either as building blocks or managers of parallel objects of either type, thus allowing the development of single- and multi-paradigm parallel applications. 1995 Research Accomplishments: In February I presented a paper at Frontiers '95 describing the design of the data parallel language subset. During the spring I wrote and defended my dissertation proposal. Since that time I have developed a runtime model for the language subset. I have begun implementing the model and hand-coding simple examples which demonstrate the language subset. I have identified an astrophysical fluid flow application which will validate the data parallel language subset. 1996 Research Agenda: Milestones for the coming year include implementing a significant portion of the data parallel language subset over the Legion system. Using simple hand-coded methods, I plan to demonstrate (1) concurrent task and data parallel objects and (2) task parallel objects managing both task and data parallel objects. My next steps will focus on constructing a compiler and implementing the fluid flow application with the language. Concurrently, I will conduct a search for a real-world application exhibiting both task and data parallelism within the same program. Additional 1995 Activities: During the fall I collaborated

  18. TRU drum corrosion task team report

    Energy Technology Data Exchange (ETDEWEB)

    Kooda, K.E.; Lavery, C.A.; Zeek, D.P.

    1996-05-01

    During routine inspections in March 1996, transuranic (TRU) waste drums stored at the Radioactive Waste Management Complex (RWMC) were found with pinholes and leaking fluid. These drums were overpacked, and further inspection discovered over 200 drums with similar corrosion. A task team was assigned to investigate the problem with four specific objectives: to identify any other drums in RWMC TRU storage with pinhole corrosion; to evaluate the adequacy of the RWMC inspection process; to determine the precise mechanism(s) generating the pinhole drum corrosion; and to assess the implications of this event for WIPP certifiability of waste drums. The task team investigations identified the source of the pinholes as HCl-induced localized pitting corrosion. HCl formation is directly related to the polychlorinated hydrocarbon volatile organic compounds (VOCs) in the waste. Most of the drums showing pinhole corrosion are from Content Code-003 (CC-003) because they contain the highest amounts of polychlorinated VOCs, as determined by headspace gas analysis. CC-001 drums represent the only other content code with a significant number of pinhole corrosion drums because their headspace gas VOC content, although significantly less than that of CC-003, is far greater than that of the other content codes. The exact mechanisms of HCl formation could not be determined, but radiolytic and reductive dechlorination and direct reduction of halocarbons were analyzed as the likely operable reactions. The team considered the entire range of feasible options, ranked and prioritized the alternatives, and recommended the optimal solution that maximizes protection of worker and public safety while minimizing impacts on RWMC and TRU program operations.

  19. TRU drum corrosion task team report

    International Nuclear Information System (INIS)

    Kooda, K.E.; Lavery, C.A.; Zeek, D.P.

    1996-05-01

    During routine inspections in March 1996, transuranic (TRU) waste drums stored at the Radioactive Waste Management Complex (RWMC) were found with pinholes and leaking fluid. These drums were overpacked, and further inspection discovered over 200 drums with similar corrosion. A task team was assigned to investigate the problem with four specific objectives: to identify any other drums in RWMC TRU storage with pinhole corrosion; to evaluate the adequacy of the RWMC inspection process; to determine the precise mechanism(s) generating the pinhole drum corrosion; and to assess the implications of this event for WIPP certifiability of waste drums. The task team investigations identified the source of the pinholes as HCl-induced localized pitting corrosion. HCl formation is directly related to the polychlorinated hydrocarbon volatile organic compounds (VOCs) in the waste. Most of the drums showing pinhole corrosion are from Content Code-003 (CC-003) because they contain the highest amounts of polychlorinated VOCs, as determined by headspace gas analysis. CC-001 drums represent the only other content code with a significant number of pinhole corrosion drums because their headspace gas VOC content, although significantly less than that of CC-003, is far greater than that of the other content codes. The exact mechanisms of HCl formation could not be determined, but radiolytic and reductive dechlorination and direct reduction of halocarbons were analyzed as the likely operable reactions. The team considered the entire range of feasible options, ranked and prioritized the alternatives, and recommended the optimal solution that maximizes protection of worker and public safety while minimizing impacts on RWMC and TRU program operations.

  20. Development of 2-d cfd code

    International Nuclear Information System (INIS)

    Mirza, S.A.

    1999-01-01

    In the present study, a two-dimensional computer code has been developed in FORTRAN using the CFD technique, which is basically a numerical scheme. This computer code solves the Navier-Stokes equations and the continuity equation to find the velocity and pressure fields within a given domain. The analysis has been done for the flow developed within a square cavity driven by its upper wall, which has become a benchmark for testing and comparing newly developed numerical schemes. Before handling this task, different one-dimensional cases were studied by the CFD technique and their FORTRAN programs written. The cases studied are Couette flow and Poiseuille flow, with and without using the symmetric boundary condition. A comparison between the CFD results and analytical results has also been made. For the cavity flow, results from the developed code have been obtained for different Reynolds numbers and are presented in the form of velocity vectors. The results of the developed code have been compared with those obtained from the shareware version of a commercially available code for a Reynolds number of 10.0. The quantitative and qualitative disagreement between the results at some grid points of the calculation domain has been discussed and future recommendations in this regard have been made. (author)
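
    As a flavour of the one-dimensional validation cases mentioned above, here is a hedged Python sketch (the study itself used FORTRAN) of startup Couette flow solved with an explicit finite-difference scheme and checked against the analytic linear steady-state profile.

```python
import numpy as np

def couette_steady_state(n=21, u_wall=1.0, nu=0.1, dt=1e-3, steps=20000):
    """Explicit finite-difference solution of 1-D startup Couette flow.

    du/dt = nu * d2u/dy2 on y in [0, 1], with u(0) = 0 and u(1) = u_wall.
    The converged profile should match the analytic solution u = u_wall * y.
    """
    y = np.linspace(0.0, 1.0, n)
    dy = y[1] - y[0]
    u = np.zeros(n)
    u[-1] = u_wall                                  # moving upper wall
    for _ in range(steps):                          # stable: nu*dt/dy**2 < 0.5
        u[1:-1] += nu * dt / dy**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])
    return y, u

y, u = couette_steady_state()
assert np.allclose(u, y, atol=1e-3)                 # analytic steady profile
```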

  1. Balanced Reed-Solomon codes for all parameters

    KAUST Repository

    Halbawi, Wael; Liu, Zihan; Hassibi, Babak

    2016-01-01

    We construct balanced and sparsest generator matrices for cyclic Reed-Solomon codes with any length n and dimension k. By sparsest, we mean that each row has the least possible number of nonzeros, while balanced means that the number of nonzeros in any two columns differs by at most one. Codes allowing such encoding schemes are useful in distributed settings where computational load-balancing is critical. The problem was first studied by Dau et al. who showed, using probabilistic arguments, that there always exists an MDS code over a sufficiently large field such that its generator matrix is both sparsest and balanced. Motivated by the need for an explicit construction with efficient decoding, the authors of the current paper showed that the generator matrix of a cyclic Reed-Solomon code of length n and dimension k can always be transformed to one that is both sparsest and balanced, when n and k are such that k(n-k+1)/n is an integer. In this paper, we lift this condition and construct balanced and sparsest generator matrices for cyclic Reed-Solomon codes for any set of parameters.

  2. Balanced Reed-Solomon codes for all parameters

    KAUST Repository

    Halbawi, Wael

    2016-10-27

    We construct balanced and sparsest generator matrices for cyclic Reed-Solomon codes with any length n and dimension k. By sparsest, we mean that each row has the least possible number of nonzeros, while balanced means that the number of nonzeros in any two columns differs by at most one. Codes allowing such encoding schemes are useful in distributed settings where computational load-balancing is critical. The problem was first studied by Dau et al. who showed, using probabilistic arguments, that there always exists an MDS code over a sufficiently large field such that its generator matrix is both sparsest and balanced. Motivated by the need for an explicit construction with efficient decoding, the authors of the current paper showed that the generator matrix of a cyclic Reed-Solomon code of length n and dimension k can always be transformed to one that is both sparsest and balanced, when n and k are such that k(n-k+1)/n is an integer. In this paper, we lift this condition and construct balanced and sparsest generator matrices for cyclic Reed-Solomon codes for any set of parameters.
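
    The combinatorial core of the construction is a support pattern in which every row of the k × n generator matrix carries exactly n-k+1 nonzeros (the sparsest possible for an MDS code) while column weights differ by at most one. The Python sketch below builds such a cyclic support mask; the construction in the paper then fills the support with evaluations of Reed-Solomon codeword polynomials, which is not reproduced here.

```python
import numpy as np

def balanced_sparsest_support(n, k):
    """Cyclic support mask for a k x n generator matrix: every row has
    exactly n-k+1 nonzeros and column weights differ by at most one.
    Row i occupies positions i*w, ..., i*w + w - 1 (mod n), so the union
    of all rows covers positions 0..k*w-1 wrapped around n columns."""
    w = n - k + 1                       # nonzeros per row
    mask = np.zeros((k, n), dtype=int)
    for i in range(k):
        for j in range(w):
            mask[i, (i * w + j) % n] = 1
    return mask

m = balanced_sparsest_support(n=7, k=4)
assert (m.sum(axis=1) == 4).all()                   # rows: n-k+1 = 4 nonzeros
assert np.ptp(m.sum(axis=0)) <= 1                   # columns differ by <= 1
```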

  3. Puerto Rican kindergartners' self-worth as coded from the Attachment Story Completion Task: correlated with other self-evaluation measures and ratings of child behavior toward mothers and peers.

    Science.gov (United States)

    Gullón-Rivera, Ángel L

    2013-01-01

    This multi-method multi-informant study assessed 105 Puerto Rican kindergartners' sense of self-worth in family relationships as coded from their responses to the Attachment Story Completion Task (ASCT). The ASCT scores were compared with responses to two other age-appropriate self-evaluation measures (the Cassidy Puppet Interview and the Pictorial Scales of Social Acceptance). Correlations of children's scores on the three self-measures with maternal ratings of the mother-child relationship and teacher ratings of the child's prosocial behavior with peers were then compared. ASCT self-worth and Puppet Interview scores were strongly correlated with each other and both were modestly related to the pictorial social acceptance scales. All three measures were significantly associated with maternal and teacher reports of child behavior, but the strongest correlations were obtained with the ASCT. Coding the ASCT in terms of self-worth appears to be a promising approach for evaluating young children's (vicariously expressed) self-worth in family relationships.

  4. Novel Polynomial Basis with Fast Fourier Transform and Its Application to Reed-Solomon Erasure Codes

    KAUST Repository

    Lin, Sian-Jheng; Al-Naffouri, Tareq Y.; Han, Yunghsiang S.; Chung, Wei-Ho

    2016-01-01

    In this paper, we present a fast Fourier transform (FFT) algorithm over extension binary fields, where the polynomial is represented in a non-standard basis. The proposed Fourier-like transform requires O(h lg(h)) field operations, where h

  5. No Code Required Giving Users Tools to Transform the Web

    CERN Document Server

    Cypher, Allen; Lau, Tessa; Nichols, Jeffrey

    2010-01-01

    Revolutionary tools are emerging from research labs that enable all computer users to customize and automate their use of the Web without learning how to program. No Code Required takes cutting-edge material from academic and industry leaders -- the people creating these tools -- and presents the research, development, application, and impact of a variety of new and emerging systems. It is the first book since Web 2.0 that covers the latest research, development, and systems emerging from HCI research labs on end user programming tools, featuring contributions from the creators of Adobe's Zoet

  6. Qualification of the code d³f++

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, Anke; Gehrke, Anne; Kroehn, Klaus-Peter; Zhao, Hong

    2017-02-15

    The code d³f++ is a modern tool for modelling density-driven flow and nuclide transport in the far field of repositories for hazardous material in deep geological formations. It is applicable in porous media as well as in fractured rock or mudstone, for modelling salt and heat transport as well as a free groundwater surface. The objective of this work is to prove the capability of the code d³f++ to simulate correctly density-driven flow and pollutant transport in large-scale, complex geological situations, in order to improve confidence in groundwater modelling in general. The applications presented in this report are related to haline and thermohydraulic groundwater flow and transport in porous or fractured media. Among them are laboratory and field experiments as well as real site studies. The d³f++ results are verified by measurements or compared to the results of other density-driven flow codes. Three applications presented are related to Task 8, defined by the Task Force on Groundwater Flow and Transport of Solutes (TF GWFTS) of SKB to investigate the hydraulic interaction of the fractured granitic host rock and the bentonite clay buffer in a deep geological repository at the Hard Rock Laboratory (HRL) at Aespoe. Presented are the results from work on the Buffer-Rock-Interaction-Experiment (BRIE) in the framework of Tasks 8c and 8d and on the Prototype Repository in the framework of Task 8e. Another application refers to a thermal injection and storage experiment at the Borden field research site; this work is focused on heat flow and free surface modelling. A 2D benchmark based on a laboratory experiment concerning the formation and degradation of a freshwater lens gave the possibility to compare the results of various density-driven flow codes. The Waste Isolation Pilot Plant (WIPP) is a repository for transuranic waste in New Mexico, USA. A 6,000 km², basin-scale model of the WIPP-Site overburden is presented here with the objective to

  7. Listening to professional voices: draft 2 of the ACM code of ethics and professional conduct

    OpenAIRE

    Flick, Catherine; Brinkman, Bo; Gotterbarn, D. W.; Miller, Keith; Vazansky, Kate; Wolf, Marty J.

    2017-01-01

    For the first time since 1992, the ACM Code of Ethics and Professional Conduct (the Code) is being updated. The Code Update Task Force in conjunction with the Committee on Professional Ethics is seeking advice from ACM members on the update. We indicated many of the motivations for changing the Code when we shared Draft 1 of Code 2018 with the ...

  8. Experimental transport analysis code system in JT-60

    International Nuclear Information System (INIS)

    Hirayama, Toshio; Shimizu, Katsuhiro; Tani, Keiji; Shirai, Hiroshi; Kikuchi, Mitsuru

    1988-03-01

    Transport analysis codes have been developed in order to study confinement properties related to particle and energy balance in ohmically and neutral-beam heated plasmas of JT-60. The analysis procedure is divided into three steps as follows: 1) LOOK: The shape of the plasma boundary is identified with the fast boundary identification code FBI using magnetic data, and flux surfaces are calculated with the MHD equilibrium code SELENE. The diagnostic data are mapped onto flux surfaces for the neutral beam heating calculation and/or for the radial transport analysis. 2) OFMC: On the basis of the transformed data, the orbit-following Monte Carlo code OFMC calculates the profiles of power deposition and the particle source of the neutral beam injected into the plasma. 3) SCOOP: In the last stage, the one-dimensional transport code SCOOP solves the particle and energy balance for electrons and ions, in order to evaluate transport coefficients as well as global parameters such as the energy confinement time and the stored energy. The analysis results are provided to the DARTS data bank, which is used, together with the regression analysis code RAC, to obtain an overview of important considerations on confinement. (author)

  9. Learning and coding in biological neural networks

    Science.gov (United States)

    Fiete, Ila Rani

    How can large groups of neurons that locally modify their activities learn to collectively perform a desired task? Do studies of learning in small networks tell us anything about learning in the fantastically large collection of neurons that make up a vertebrate brain? What factors do neurons optimize by encoding sensory inputs or motor commands in the way they do? In this thesis I present a collection of four theoretical works: each of the projects was motivated by specific constraints and complexities of biological neural networks, as revealed by experimental studies; together, they aim to partially address some of the central questions of neuroscience posed above. We first study the role of sparse neural activity, as seen in the coding of sequential commands in a premotor area responsible for birdsong. We show that the sparse coding of temporal sequences in the songbird brain can, in a network where the feedforward plastic weights must translate the sparse sequential code into a time-varying muscle code, facilitate learning by minimizing synaptic interference. Next, we propose a biologically plausible synaptic plasticity rule that can perform goal-directed learning in recurrent networks of voltage-based spiking neurons that interact through conductances. Learning is based on the correlation of noisy local activity with a global reward signal; we prove that this rule performs stochastic gradient ascent on the reward. Thus, if the reward signal quantifies network performance on some desired task, the plasticity rule provably drives goal-directed learning in the network. To assess the convergence properties of the learning rule, we compare it with a known example of learning in the brain. Song-learning in finches is a clear example of a learned behavior, with detailed available neurophysiological data. With our learning rule, we train an anatomically accurate model birdsong network that drives a sound source to mimic an actual zebrafinch song. Simulation and

  10. Central Decoding for Multiple Description Codes based on Domain Partitioning

    Directory of Open Access Journals (Sweden)

    M. Spiertz

    2006-01-01

    Multiple Description Codes (MDC) can be used to trade redundancy against packet loss resistance for transmitting data over lossy diversity networks. In this work we focus on MD transform coding based on domain partitioning. Compared to Vaishampayan's quantizer-based MDC, domain-based MD coding is a simple approach for generating different descriptions by using different quantizers for each description. Commonly, only the highest-rate quantizer is used for reconstruction. In this paper we investigate the benefit of using the lower-rate quantizers to enhance the reconstruction quality at the decoder side. The comparison is done on artificial source data and on image data.
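
    The central-decoding idea can be illustrated with two staggered uniform quantizers in Python: each description carries one quantizer index, and the central decoder intersects the two cells instead of trusting the finer quantizer alone. The step sizes and reconstruction rule are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def encode(x, step):
    """Uniform quantizer index for one description."""
    return np.floor(x / step).astype(int)

def central_decode(i_fine, i_coarse, step_fine, step_coarse):
    """Reconstruct from both descriptions by intersecting quantizer cells.

    With non-nested step sizes the intersection is often narrower than the
    fine cell alone, so the coarse description still adds information."""
    lo = max(i_fine * step_fine, i_coarse * step_coarse)
    hi = min((i_fine + 1) * step_fine, (i_coarse + 1) * step_coarse)
    return 0.5 * (lo + hi)                          # midpoint of the intersection

x = 0.73
i1, i2 = encode(x, 0.10), encode(x, 0.15)           # the two descriptions
print(central_decode(i1, i2, 0.10, 0.15))           # 0.725, beats fine-only 0.75
```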

  11. Structured Low-Density Parity-Check Codes with Bandwidth Efficient Modulation

    Science.gov (United States)

    Cheng, Michael K.; Divsalar, Dariush; Duy, Stephanie

    2009-01-01

    In this work, we study the performance of structured Low-Density Parity-Check (LDPC) codes together with bandwidth-efficient modulations. We consider protograph-based LDPC codes that facilitate high-speed hardware implementations and have minimum distances that grow linearly with block sizes. We cover various higher-order modulations such as 8-PSK, 16-APSK, and 16-QAM. During demodulation, a demapper transforms the received in-phase and quadrature samples into reliability information that feeds the binary LDPC decoder. We compare various low-complexity demappers and provide simulation results for assorted coded-modulation combinations on the additive white Gaussian noise and independent Rayleigh fading channels.
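
    A common low-complexity demapper of the kind compared in such studies is the max-log-MAP approximation, sketched below in Python for an arbitrary labelled constellation; the demappers evaluated in the paper may differ in detail.

```python
import numpy as np

def maxlog_llrs(y, constellation, bit_labels, noise_var):
    """Max-log-MAP demapper: soft bit reliabilities from one I/Q sample.

    y: received complex sample; constellation: complex points (e.g. 8-PSK);
    bit_labels: (points, bits) 0/1 matrix giving each point's bit pattern.
    Returns one LLR per bit, suitable as input to a binary LDPC decoder.
    """
    metric = -np.abs(y - constellation) ** 2 / noise_var
    llrs = []
    for b in range(bit_labels.shape[1]):
        m0 = metric[bit_labels[:, b] == 0].max()    # best point with bit = 0
        m1 = metric[bit_labels[:, b] == 1].max()    # best point with bit = 1
        llrs.append(m0 - m1)
    return np.array(llrs)

# Example: Gray-labelled 8-PSK.
pts = np.exp(2j * np.pi * np.arange(8) / 8)
gray = np.array([[0,0,0],[0,0,1],[0,1,1],[0,1,0],[1,1,0],[1,1,1],[1,0,1],[1,0,0]])
print(maxlog_llrs(pts[0] + 0.1, pts, gray, noise_var=0.5))
```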

  12. Identification of tasks of maintenance centered in the reliability

    International Nuclear Information System (INIS)

    Torres V, A.; Rivero O, J.J.

    2004-01-01

    The methodology of Reliability Centered Maintenance (RCM) has become, given its demonstrated advantages, an objective of many industrial facilities seeking to optimize their maintenance. However, diverse subjective factors affect the determination of the parameters (predictive techniques to apply and times between interventions) that characterize RCM tasks. A method is presented to determine condition-monitoring tasks and the most suitable intervals for time-directed monitoring and failure-finding tasks, with a system-level focus. This methodology has been computerized in the code MOSEG Win Ver 1.0 and has been applied with success to the determination of RCM tasks in industrial facilities. (Author)

  13. A modified carrier-to-code leveling method for retrieving ionospheric observables and detecting short-term temporal variability of receiver differential code biases

    Science.gov (United States)

    Zhang, Baocheng; Teunissen, Peter J. G.; Yuan, Yunbin; Zhang, Xiao; Li, Min

    2018-03-01

    Sensing the ionosphere with the global positioning system involves two sequential tasks, namely the ionospheric observable retrieval and the ionospheric parameter estimation. A prominent source of error has long been identified as short-term variability in receiver differential code bias (rDCB). We modify the carrier-to-code leveling (CCL), a method commonly used to accomplish the first task, through assuming rDCB to be unlinked in time. Aside from the ionospheric observables, which are affected by, among others, the rDCB at one reference epoch, the Modified CCL (MCCL) can also provide the rDCB offsets with respect to the reference epoch as by-products. Two consequences arise. First, MCCL is capable of excluding the effects of time-varying rDCB from the ionospheric observables, which, in turn, improves the quality of ionospheric parameters of interest. Second, MCCL has significant potential as a means to detect between-epoch fluctuations experienced by rDCB of a single receiver.
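
    For orientation, the classic CCL step that the paper modifies can be sketched in Python: the precise but ambiguous geometry-free carrier combination is leveled onto the noisy but unambiguous code combination through an arc-averaged offset. Satellite and receiver DCBs remain absorbed in the leveled observable, and the sketch keeps rDCB constant over the arc, which is exactly the assumption MCCL drops.

```python
import numpy as np

def ccl_ionospheric_observable(P_GF, L_GF):
    """Classic carrier-to-code leveling over one continuous arc (sketch).

    P_GF = P2 - P1 (code) and L_GF = L1 - L2 (carrier), in metres, for the
    epochs of a single cycle-slip-free arc; both track the inter-frequency
    ionospheric delay, the carrier precisely but with an unknown bias.
    """
    offset = np.mean(L_GF - P_GF)       # arc-averaged ambiguity + DCB lump
    return L_GF - offset                # leveled ionospheric observable
```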

  14. Transformation procedures in 3D terrestrial coordinate systems

    Directory of Open Access Journals (Sweden)

    Sedlák Vladimír

    2001-12-01

    Transformation procedures belong to the main tasks of a surveyor working in the field of geodesy, for example in satellite geodesy or astronomical geodesy, where it is necessary to know transformation procedures in 3D terrestrial (Earth) coordinate systems. The rapidly growing application of satellite navigation systems such as GPS (Global Positioning System) in engineering surveying, real estate registration and other spheres of applied geodesy and geo-surveying (mine surveying) demands knowledge of these transformation procedures between coordinates in various coordinate systems. These tasks are common daily work for practical surveyors, not only for surveyors engaged in theoretical research. The Conventional Terrestrial System is a 3D coordinate system and the most important coordinate system in global geodesy; it is an approximation of the natural coordinate system of the Earth. The origin of this coordinate system is placed at the Earth's centre of gravity, at the centre of the geoid. The Conventional Terrestrial System is a Cartesian right-handed (i.e. positive) coordinate system. The Local Astronomical System is also a 3D coordinate system and an important coordinate system in geodesy from a practical point of view; many geodetic measurements are realized in this coordinate system. Its designation as an astronomical system expresses its attachment to the normal to an equipotential surface, i.e. to the vertical. The Local Astronomical System is a left-handed Cartesian coordinate system. Transformation procedures in 3D terrestrial coordinate systems, together with the theory of these systems, are presented in the paper. Transformation into the local astronomical coordinate system is common within the adjustment of various local geodetic networks. In the case of satellite measurements (GPS, satellite altimetry, etc.) transformation between local and
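
    A minimal Python sketch of the rotation from the Conventional Terrestrial (ECEF) system into the local astronomical (n, e, u) triad is given below, assuming geodetic latitude and longitude stand in for the astronomical ones (i.e. the deflection of the vertical is neglected).

```python
import numpy as np

def ecef_to_local(dx, lat, lon):
    """Rotate an ECEF baseline vector into the local (n, e, u) triad.

    dx: 3-vector (target minus station) in the terrestrial system;
    lat, lon: station latitude and longitude in radians. The (n, e, u)
    ordering yields the left-handed local astronomical system.
    """
    sl, cl = np.sin(lat), np.cos(lat)
    so, co = np.sin(lon), np.cos(lon)
    R = np.array([[-sl * co, -sl * so, cl],     # north
                  [-so,       co,      0.0],    # east
                  [ cl * co,  cl * so, sl]])    # up
    return R @ np.asarray(dx)
```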

  15. Cooling Characteristic Analysis of Transformer's Radiator

    International Nuclear Information System (INIS)

    Kim, Hyun Jae; Yang, Si Won; Kim, Won Seok; Kweon, Ki Yeoung; Lee, Min Jea

    2007-01-01

    A transformer is a device that changes current and voltage by electricity induced between a coil and core steel, and it is composed of metals and insulating materials. In the core of the transformer, a thermal load is generated by electric losses, and high temperature can break down the insulation, so the transformer must be cooled by external radiators. According to the usage of cooling fans, there are two cooling types, OA (Oil Natural, Air Natural) and FA (Oil Natural, Air Forced). For this study, we used Fluent 6.2 and analyzed the cooling characteristics of the radiator. We calculated a detailed 1-fin model, whose geometry is similar to a honeycomb structure, and a multi-fin (18-fin) model for the OA and FA types. For the sensitivity study, we considered different positions (side, under) of the cooling fans for the forced convection of the FA type. The calculation results were compared with the measurement data obtained from flowrate and temperature tests of a 135.45/69 kV ultra transformer. The aim of the study is to assess the Fluent code prediction of the radiator calculation and to use the data for optimizing transformer radiator design.

  16. Heading for Success: Three Case Studies of School Transformation through Capital Construction

    Science.gov (United States)

    Chen, Wen-Yan; Pan, Hui-Ling Wendy

    2016-01-01

    Utilizing capital as a construct to analyze leadership that triggers school transformation is a newly emerged perspective. This study employed the capital theory as the framework to explore how schools undertook the transformative tasks by multi-case study. Three secondary schools in Taiwan were recruited to investigate how leaders constructed the…

  17. Data model description for the DESCARTES and CIDER codes

    International Nuclear Information System (INIS)

    Miley, T.B.; Ouderkirk, S.J.; Nichols, W.E.; Eslinger, P.W.

    1993-01-01

    The primary objective of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate the radiation dose that individuals could have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. One of the major objectives of the HEDR Project is to develop several computer codes to model the airborne releases, transport and environmental accumulation of radionuclides resulting from Hanford operations from 1944 through 1972. In July 1992, the HEDR Project Manager determined that the computer codes being developed (DESCARTES, calculation of environmental accumulation from airborne releases, and CIDER, dose calculations from environmental accumulation) were not sufficient to create accurate models. A team of HEDR staff members developed a plan to assure that the computer codes would meet HEDR Project goals. The plan consists of five tasks: (1) code requirements definition, (2) scoping studies, (3) design specifications, (4) benchmarking, and (5) data modeling. This report defines the data requirements for the DESCARTES and CIDER codes

  18. Learning to Estimate Dynamical State with Probabilistic Population Codes.

    Directory of Open Access Journals (Sweden)

    Joseph G Makin

    2015-11-01

    Full Text Available Tracking moving objects, including one's own body, is a fundamental ability of higher organisms, playing a central role in many perceptual and motor tasks. While it is unknown how the brain learns to follow and predict the dynamics of objects, it is known that this process of state estimation can be learned purely from the statistics of noisy observations. When the dynamics are simply linear with additive Gaussian noise, the optimal solution is the well known Kalman filter (KF), the parameters of which can be learned via latent-variable density estimation (the EM algorithm). The brain does not, however, directly manipulate matrices and vectors, but instead appears to represent probability distributions with the firing rates of populations of neurons, "probabilistic population codes." We show that a recurrent neural network (a modified form of an exponential family harmonium, EFH) that takes a linear probabilistic population code as input can learn, without supervision, to estimate the state of a linear dynamical system. After observing a series of population responses (spike counts) to the position of a moving object, the network learns to represent the velocity of the object and forms nearly optimal predictions about the position at the next time-step. This result builds on our previous work showing that a similar network can learn to perform multisensory integration and coordinate transformations for static stimuli. The receptive fields of the trained network also make qualitative predictions about the developing and learning brain: tuning gradually emerges for higher-order dynamical states not explicitly present in the inputs, appearing as delayed tuning for the lower-order states.
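
    For readers unfamiliar with the estimator the abstract takes as its benchmark, the following minimal sketch implements the Kalman filter predict/update recursion for a linear position-velocity system; the matrices are assumed known here, whereas the paper's point is that a neural network can learn the equivalent computation from spike counts alone.

      import numpy as np

      A = np.array([[1.0, 1.0], [0.0, 1.0]])  # position-velocity dynamics
      C = np.array([[1.0, 0.0]])              # only position is observed
      Q, R = 0.01 * np.eye(2), np.array([[0.25]])

      def kalman_step(mu, P, y):
          mu_p, P_p = A @ mu, A @ P @ A.T + Q               # predict
          K = P_p @ C.T @ np.linalg.inv(C @ P_p @ C.T + R)  # Kalman gain
          mu_new = mu_p + K @ (y - C @ mu_p)                # update
          P_new = (np.eye(2) - K @ C) @ P_p
          return mu_new, P_new

      mu, P = np.zeros(2), np.eye(2)
      for y in [0.1, 1.2, 2.1, 2.9]:          # noisy positions over time
          mu, P = kalman_step(mu, P, np.array([y]))
      print(mu)                                # estimated position, velocity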

  19. Transforming Losses―A Major Task of Spiritually Integrated Psychotherapy

    Directory of Open Access Journals (Sweden)

    Eckhard Frick

    2011-11-01

    Full Text Available Since Freud’s “Mourning and Melancholia”, bereavement encompasses the dilemma between continuing versus relinquishing bonds to deceased persons. Mourning is the process of symbolizing the loss, of making sense by facing the conflict between the absence of the lost object and the continuing presence of an emotional relationship to that which is lost. Furthermore, mourning is not limited to bereaved persons but also concerns dying persons and, in a broader sense, our whole symbolic life, which is a playful coping with the rhythm of absence and presence. True consolation connects individual and archetypal mourning. Spiritually integrated psychotherapy may accompany this process by amplification. Christian mysticism takes its starting point from the experience of Jesus Christ’s lost body, and this may be understood as a model of spiritual transformation.

  20. Coding visual features extracted from video sequences.

    Science.gov (United States)

    Baroffio, Luca; Cesana, Matteo; Redondi, Alessandro; Tagliasacchi, Marco; Tubaro, Stefano

    2014-05-01

    Visual features are successfully exploited in several applications (e.g., visual search, object recognition and tracking, etc.) due to their ability to efficiently represent image content. Several visual analysis tasks require features to be transmitted over a bandwidth-limited network, thus calling for coding techniques to reduce the required bit budget, while attaining a target level of efficiency. In this paper, we propose, for the first time, a coding architecture designed for local features (e.g., SIFT, SURF) extracted from video sequences. To achieve high coding efficiency, we exploit both spatial and temporal redundancy by means of intraframe and interframe coding modes. In addition, we propose a coding mode decision based on rate-distortion optimization. The proposed coding scheme can be conveniently adopted to implement the analyze-then-compress (ATC) paradigm in the context of visual sensor networks. That is, sets of visual features are extracted from video frames, encoded at remote nodes, and finally transmitted to a central controller that performs visual analysis. This is in contrast to the traditional compress-then-analyze (CTA) paradigm, in which video sequences acquired at a node are compressed and then sent to a central unit for further processing. In this paper, we compare these coding paradigms using metrics that are routinely adopted to evaluate the suitability of visual features in the context of content-based retrieval, object recognition, and tracking. Experimental results demonstrate that, thanks to the significant coding gains achieved by the proposed coding scheme, ATC outperforms CTA with respect to all evaluation metrics.
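
    A toy sketch of the intraframe/interframe mode decision described above: each descriptor is coded either on its own or as a residual against the previous frame, and the mode with the lower Lagrangian cost D + λR wins. The quantizer and the rate proxy are simplifications for illustration, not the authors' coder.

      import numpy as np

      def encode_features(frames, lam=0.1, step=0.05):
          prev = None
          for desc in frames:
              candidates = {"intra": desc}
              if prev is not None:
                  candidates["inter"] = desc - prev      # temporal prediction
              best = None
              for mode, residual in candidates.items():
                  q = np.round(residual / step)
                  rate = np.count_nonzero(q)             # crude rate proxy
                  dist = np.sum((residual - q * step) ** 2)
                  cost = dist + lam * rate               # Lagrangian RD cost
                  if best is None or cost < best[0]:
                      best = (cost, mode, q)
              _, mode, q = best
              prev = (prev if mode == "inter" else 0) + q * step
              yield mode, q

      rng = np.random.default_rng(0)
      base = rng.normal(size=64)
      frames = [base + 0.01 * rng.normal(size=64) for _ in range(4)]
      print([mode for mode, _ in encode_features(frames)])
      # ['intra', 'inter', 'inter', 'inter'] for slowly varying descriptors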

  1. Generating performance portable geoscientific simulation code with Firedrake (Invited)

    Science.gov (United States)

    Ham, D. A.; Bercea, G.; Cotter, C. J.; Kelly, P. H.; Loriant, N.; Luporini, F.; McRae, A. T.; Mitchell, L.; Rathgeber, F.

    2013-12-01

    This presentation will demonstrate how a change in simulation programming paradigm can be exploited to deliver sophisticated simulation capability which is far easier to program than conventional models, is capable of exploiting different emerging parallel hardware, and is tailored to the specific needs of geoscientific simulation. Geoscientific simulation represents a grand-challenge computational task: many of the largest computers in the world are devoted to this field, and scientists' demands for resolution and complexity are far from being sated. However, single-thread performance has stalled, and sometimes even decreased, over the last decade, being replaced by ever more parallel systems: both conventional multicore CPUs and the emerging world of accelerators. At the same time, scientists' need to couple ever more complex dynamics and parametrisations into their models makes the model development task vastly more complex. The conventional approach of writing code in low-level languages such as Fortran or C/C++ and then hand-coding parallelism for different platforms by adding library calls and directives forces the intermingling of the numerical code with its implementation. This results in an almost impossible set of skill requirements for developers, who must simultaneously be domain science experts, numericists, software engineers and parallelisation specialists. Even more critically, it requires code to be essentially rewritten for each emerging hardware platform. Since new platforms are emerging constantly, and since code owners do not usually control the procurement of the supercomputers on which they must run, this represents an unsustainable development load. The Firedrake system, conversely, offers the developer the opportunity to write PDE discretisations in the high-level mathematical language UFL from the FEniCS project (http://fenicsproject.org). Non-PDE model components, such as parametrisations
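
    A minimal Firedrake-style sketch of the paradigm the presentation advocates, assuming the current Firedrake Python API: a Poisson problem is stated in UFL, and the low-level parallel kernels are generated by the framework rather than hand-coded.

      from firedrake import (UnitSquareMesh, FunctionSpace, TrialFunction,
                             TestFunction, Function, SpatialCoordinate,
                             DirichletBC, dx, grad, dot, sin, pi, solve)

      mesh = UnitSquareMesh(32, 32)
      V = FunctionSpace(mesh, "CG", 1)         # piecewise-linear elements
      u, v = TrialFunction(V), TestFunction(V)
      x, y = SpatialCoordinate(mesh)
      f = Function(V).interpolate(sin(pi * x) * sin(pi * y))

      a = dot(grad(u), grad(v)) * dx           # weak form of -div(grad u) = f
      L = f * v * dx
      u_h = Function(V)
      solve(a == L, u_h, bcs=[DirichletBC(V, 0.0, "on_boundary")])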

  2. Multitasking the three-dimensional transport code TORT on CRAY platforms

    International Nuclear Information System (INIS)

    Azmy, Y.Y.

    1996-01-01

    The multitasking options in the three-dimensional neutral particle transport code TORT, originally implemented for Cray's CTSS operating system, are revived and extended to run on Cray Y/MP and C90 computers using the UNICOS operating system. These include two coarse-grained domain decompositions: across octants, and across directions within an octant, termed Octant Parallel (OP) and Direction Parallel (DP), respectively. Parallel performance of the DP is significantly enhanced by increasing the task grain size and reducing load imbalance via dynamic scheduling of the discrete angles among the participating tasks. Substantial wall-clock speedup factors, approaching 4.5 using 8 tasks, have been measured in a time-sharing environment; they generally depend on the test problem specifications, the number of tasks, and machine loading during execution
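
    The load-balancing idea behind the Direction Parallel option can be illustrated in a few lines: instead of statically assigning discrete angles to tasks, each task pulls the next unprocessed angle from a shared queue. The Python threads below are only a stand-in for the Cray macrotasking the paper actually uses.

      import queue, threading, time

      def sweep(angle):
          time.sleep(0.01 * (1 + angle % 3))   # unequal work per angle

      def worker(angles, done):
          while True:
              try:
                  a = angles.get_nowait()      # pull next unprocessed angle
              except queue.Empty:
                  return
              sweep(a)
              done.append(a)

      angles = queue.Queue()
      for a in range(24):                      # e.g. 24 discrete angles
          angles.put(a)
      done = []
      tasks = [threading.Thread(target=worker, args=(angles, done))
               for _ in range(8)]
      for t in tasks: t.start()
      for t in tasks: t.join()
      print(len(done), "angles swept")         # all 24, dynamically balanced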

  3. Colors and geometric forms in the work process information coding

    Directory of Open Access Journals (Sweden)

    Čizmić Svetlana

    2006-01-01

    Full Text Available The aim of the research was to establish the meaning of colors and geometric shapes in transmitting information in the work process. A sample of 100 students connected 50 situations, which could be associated with regular tasks in the work process, with 12 colors and 4 geometric forms in a previously chosen color. Based on the chosen color-geometric shape-situation assignments, the idea of the research was to find regularities in the coding of information, to examine whether those regularities can provide meaningful data assigned to each individual code, and to explain which codes are better and more applicable representations of the examined situations.

  4. Quantitative code accuracy evaluation of ISP33

    Energy Technology Data Exchange (ETDEWEB)

    Kalli, H.; Miwrrin, A. [Lappeenranta Univ. of Technology (Finland)]; Purhonen, H. [VTT Energy, Lappeenranta (Finland)]; [and others]

    1995-09-01

    Aiming at quantifying code accuracy, a methodology based on the Fast Fourier Transform has been developed at the University of Pisa, Italy. The paper gives a short presentation of the methodology and its application to pre-test and post-test calculations submitted to the International Standard Problem ISP33. This was a double-blind natural circulation exercise with a stepwise reduced primary coolant inventory, performed in the PACTEL facility in Finland. PACTEL is a 1/305 volumetrically scaled, full-height simulator of the Russian type VVER-440 pressurized water reactor, with horizontal steam generators and loop seals in both cold and hot legs. Fifteen foreign organizations participated in ISP33, with 21 blind calculations and 20 post-test calculations; altogether 10 different thermal hydraulic codes and code versions were used. The results of the application of the methodology to nine selected measured quantities are summarized.
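
    The core quantity of the Pisa FFT-based method is commonly formulated as an average amplitude AA = Σ|F(Fcalc − Fexp)| / Σ|F(Fexp)|, i.e. the spectrum of the code-experiment discrepancy normalized by the spectrum of the experimental signal. A minimal sketch, with the acceptability threshold noted as an assumption rather than a value taken from this paper:

      import numpy as np

      def average_amplitude(exp, calc):
          # Spectrum of the code-experiment error, normalized by the
          # spectrum of the experimental signal; smaller is better.
          err = np.abs(np.fft.rfft(np.asarray(calc) - np.asarray(exp)))
          ref = np.abs(np.fft.rfft(np.asarray(exp)))
          return err.sum() / ref.sum()

      t = np.linspace(0.0, 100.0, 1024)
      exp = 15.0 * np.exp(-t / 40.0)           # a measured depressurization
      calc = 15.5 * np.exp(-t / 37.0)          # a code prediction of it
      # AA <= 0.3 is often quoted as very good agreement (an assumption
      # about the usual FFT-method thresholds, not from this paper).
      print(f"AA = {average_amplitude(exp, calc):.3f}")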

  5. Assessing the Role of Place and Timing Cues in Coding Frequency and Amplitude Modulation as a Function of Age.

    Science.gov (United States)

    Whiteford, Kelly L; Kreft, Heather A; Oxenham, Andrew J

    2017-08-01

    Natural sounds can be characterized by their fluctuations in amplitude and frequency. Ageing may affect sensitivity to some forms of fluctuations more than others. The present study used individual differences across a wide age range (20-79 years) to test the hypothesis that slow-rate, low-carrier frequency modulation (FM) is coded by phase-locked auditory-nerve responses to temporal fine structure (TFS), whereas fast-rate FM is coded via rate-place (tonotopic) cues, based on amplitude modulation (AM) of the temporal envelope after cochlear filtering. Using a low (500 Hz) carrier frequency, diotic FM and AM detection thresholds were measured at slow (1 Hz) and fast (20 Hz) rates in 85 listeners. Frequency selectivity and TFS coding were assessed using forward masking patterns and interaural phase disparity tasks (slow dichotic FM), respectively. Comparable interaural level disparity tasks (slow and fast dichotic AM and fast dichotic FM) were measured to control for effects of binaural processing not specifically related to TFS coding. Thresholds in FM and AM tasks were correlated, even across tasks thought to use separate peripheral codes. Age was correlated with slow and fast FM thresholds in both diotic and dichotic conditions. The relationship between age and AM thresholds was generally not significant. Once accounting for AM sensitivity, only diotic slow-rate FM thresholds remained significantly correlated with age. Overall, results indicate stronger effects of age on FM than AM. However, because of similar effects for both slow and fast FM when not accounting for AM sensitivity, the effects cannot be unambiguously ascribed to TFS coding.

  6. Dynamic-ETL: a hybrid approach for health data extraction, transformation and loading.

    Science.gov (United States)

    Ong, Toan C; Kahn, Michael G; Kwan, Bethany M; Yamashita, Traci; Brandt, Elias; Hosokawa, Patrick; Uhrich, Chris; Schilling, Lisa M

    2017-09-13

    Electronic health records (EHRs) contain detailed clinical data stored in proprietary formats with non-standard codes and structures. Participating in multi-site clinical research networks requires EHR data to be restructured and transformed into a common format and standard terminologies, and optimally linked to other data sources. The expertise and scalable solutions needed to transform data to conform to network requirements are beyond the scope of many health care organizations, and there is a need for practical tools that lower the barriers of data contribution to clinical research networks. We designed and implemented a health data transformation and loading approach, which we refer to as Dynamic ETL (Extraction, Transformation and Loading) (D-ETL), that automates part of the process through use of scalable, reusable and customizable code, while retaining manual aspects of the process that require knowledge of complex coding syntax. This approach provides the flexibility required for the ETL of heterogeneous data, variations in semantic expertise, and transparency of transformation logic that are essential to implement ETL conventions across clinical research sharing networks. Processing workflows are directed by the ETL specifications guideline, developed by ETL designers with extensive knowledge of the structure and semantics of health data (i.e., "health data domain experts") and the target common data model. D-ETL was implemented to perform ETL operations that load data from various sources with different database schema structures into the Observational Medical Outcome Partnership (OMOP) common data model. The results showed that ETL rule composition methods and the D-ETL engine offer a scalable solution for health data transformation via automatic query generation to harmonize source datasets. D-ETL supports a flexible and transparent process to transform and load health data into a target data model. This approach offers a solution that lowers technical
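
    A hedged sketch of what rule-driven automatic query generation can look like; the rule format below is hypothetical and far simpler than the actual D-ETL specifications guideline, but it illustrates how a declarative mapping can be expanded into an INSERT ... SELECT statement targeting an OMOP table.

      def build_insert(rule):
          # Expand one (hypothetical) mapping rule into INSERT ... SELECT.
          cols = ", ".join(rule["mappings"])
          exprs = ", ".join(f"{src} AS {dst}"
                            for dst, src in rule["mappings"].items())
          sql = (f"INSERT INTO {rule['target']} ({cols})\n"
                 f"SELECT {exprs}\nFROM {rule['source']}")
          if rule.get("where"):
              sql += f"\nWHERE {rule['where']}"
          return sql + ";"

      rule = {
          "target": "omop.person",
          "source": "ehr.patients",
          "mappings": {
              "person_id": "patient_id",
              "gender_concept_id": "CASE WHEN sex = 'M' THEN 8507 "
                                   "WHEN sex = 'F' THEN 8532 ELSE 0 END",
              "year_of_birth": "EXTRACT(YEAR FROM birth_date)",
          },
          "where": "birth_date IS NOT NULL",
      }
      print(build_insert(rule))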

  7. Deaf Children's Use of Phonological Coding: Evidence from Reading, Spelling, and Working Memory

    Science.gov (United States)

    Harris, Margaret; Moreno, Constanza

    2004-01-01

    Two groups of deaf children, aged 8 and 14 years, were presented with a number of tasks designed to assess their reliance on phonological coding. Their performance was compared with that of hearing children of the same chronological age (CA) and reading age (RA). Performance on the first task, short-term recall of pictures, showed that the deaf…

  8. Striatal dopamine release codes uncertainty in pathological gambling

    DEFF Research Database (Denmark)

    Linnet, Jakob; Mouridsen, Kim; Peterson, Ericka

    2012-01-01

    Two mechanisms of midbrain and striatal dopaminergic projections may be involved in pathological gambling: hypersensitivity to reward and sustained activation toward uncertainty. The midbrain-striatal dopamine system distinctly codes reward and uncertainty, where dopaminergic activation is a linear...... function of expected reward and an inverse U-shaped function of uncertainty. In this study, we investigated the dopaminergic coding of reward and uncertainty in 18 pathological gambling sufferers and 16 healthy controls. We used positron emission tomography (PET) with the tracer [11C]raclopride to measure...... dopamine release, and we used performance on the Iowa Gambling Task (IGT) to determine overall reward and uncertainty. We hypothesized that we would find a linear function between dopamine release and IGT performance, if dopamine release coded reward in pathological gambling. If, on the other hand...

  10. Transformational leadership in nursing: a concept analysis.

    Science.gov (United States)

    Fischer, Shelly A

    2016-11-01

    To analyse the concept of transformational leadership in the nursing context. Tasked with improving patient outcomes while decreasing the cost of care provision, nurses need strategies for implementing reform in health care, and one promising strategy is transformational leadership. Exploration and greater understanding of transformational leadership and the potential it holds are integral to performance improvement and patient safety. Concept analysis using Walker and Avant's (2005) method. PubMed, CINAHL and PsychINFO. This report draws on extant literature on transformational leadership, management, and nursing to effectively analyze the concept of transformational leadership in the nursing context. This report proposes a new operational definition for transformational leadership and identifies model cases and defining attributes that are specific to the nursing context. The influence of transformational leadership on organizational culture and patient outcomes is evident. Of particular interest is the finding that transformational leadership can be defined as a set of teachable competencies. However, the mechanism by which transformational leadership influences patient outcomes remains unclear. Transformational leadership in nursing has been associated with high-performing teams and improved patient care, but rarely has it been considered as a set of competencies that can be taught. Also, further research is warranted to strengthen empirical referents; this can be done by improving the operational definition, reducing ambiguity in key constructs and exploring the specific mechanisms by which transformational leadership influences healthcare outcomes to validate subscale measures. © 2016 John Wiley & Sons Ltd.

  11. WAVEMOTH-FAST SPHERICAL HARMONIC TRANSFORMS BY BUTTERFLY MATRIX COMPRESSION

    International Nuclear Information System (INIS)

    Seljebotn, D. S.

    2012-01-01

    We present Wavemoth, an experimental open source code for computing scalar spherical harmonic transforms (SHTs). Such transforms are ubiquitous in astronomical data analysis. Our code performs substantially better than existing publicly available codes owing to improvements on two fronts. First, the computational core is made more efficient by using small amounts of pre-computed data, as well as paying attention to CPU instruction pipelining and cache usage. Second, Wavemoth makes use of a fast and numerically stable algorithm based on compressing a set of linear operators in a pre-computation step. The resulting SHT scales as O(L² log² L) for the resolution range of practical interest, where L denotes the spherical harmonic truncation degree. For low- and medium-range resolutions, Wavemoth tends to be twice as fast as libpsht, which is the current state-of-the-art implementation for the HEALPix grid. At the resolution of the Planck experiment, L ∼ 4000, Wavemoth is between three and six times faster than libpsht, depending on the computer architecture and the required precision. Because of the experimental nature of the project, only spherical harmonic synthesis is currently supported, although adding support for spherical harmonic analysis should be trivial.
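
    For orientation, spherical harmonic synthesis and analysis of the kind Wavemoth accelerates can be exercised with the standard healpy package (Wavemoth itself, per the abstract, supports synthesis only); a small example, assuming healpy's usual API:

      import numpy as np
      import healpy as hp

      lmax, nside = 512, 256
      cl = 1.0 / (np.arange(lmax + 1) + 1.0) ** 2  # toy power spectrum
      alm = hp.synalm(cl, lmax=lmax)               # random harmonic coeffs
      m = hp.alm2map(alm, nside)                   # synthesis: alm -> map
      alm_back = hp.map2alm(m, lmax=lmax)          # analysis: map -> alm
      print(m.size, hp.nside2npix(nside))          # 786432 786432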

  12. Utilization of MCNP code in the research and design for China advanced research reactor

    International Nuclear Information System (INIS)

    Shen Feng

    2006-01-01

    MCNP, an internationally used neutronics code, is applied to nuclear research and design for the China Advanced Research Reactor (CARR). MCNP is an important neutronics code in the research and design for CARR, since many calculation tasks can be undertaken with it. Many nuclear parameters of the reactor core, design and optimization studies for many reactor utilizations, and much verification of other nuclear calculation codes are carried out with the help of MCNP. (author)

  13. Comprehensive Report For Proposed Elevated Temperature Elastic Perfectly Plastic (EPP) Code Cases Representative Example Problems

    Energy Technology Data Exchange (ETDEWEB)

    Hollinger, Greg L. [Idaho National Lab. (INL), Idaho Falls, ID (United States)]

    2014-06-01

    Background: The current rules in the nuclear section of the ASME Boiler and Pressure Vessel (B&PV) Code, Section III, Subsection NH for the evaluation of strain limits and creep-fatigue damage using simplified methods based on elastic analysis have been deemed inappropriate for Alloy 617 at temperatures above 1200°F (650°C) [1]. To address this issue, proposed code rules have been developed which are based on the use of elastic-perfectly plastic (E-PP) analysis methods and which are expected to be applicable to very high temperatures. The proposed rules for strain limits and creep-fatigue evaluation were initially documented in the technical literature [2, 3], and have been recently revised to incorporate comments and simplify their application. The revised code cases have been developed. Task Objectives: The goal of the Sample Problem task is to exercise these code cases through example problems to demonstrate their feasibility and, also, to identify potential corrections and improvements should problems be encountered. This will provide input to the development of technical background documents for consideration by the applicable B&PV committees considering these code cases for approval. This task has been performed by Hollinger and Pease of Becht Engineering Co., Inc., Nuclear Services Division, and a report detailing the results of the E-PP analyses conducted on example problems per the procedures of the E-PP strain limits and creep-fatigue draft code cases is enclosed as Enclosure 1. Conclusions: The feasibility of the application of the E-PP code cases has been demonstrated through example problems that consist of realistic geometry (a nozzle attached to a semi-hemispheric shell with a circumferential weld), loads (pressure; pipe reaction load applied at the end of the nozzle, including axial and shear forces, bending and torsional moments; through-wall transient temperature gradient) and design and operating conditions (Levels A, B and C).

  14. Perceptual Coding of Audio Signals Using Adaptive Time-Frequency Transform

    Directory of Open Access Journals (Sweden)

    Umapathy Karthikeyan

    2007-01-01

    Full Text Available Wide band digital audio signals have a very high data-rate associated with them due to their complex nature and demand for high-quality reproduction. Although recent technological advancements have significantly reduced the cost of bandwidth and miniaturized storage facilities, the rapid increase in the volume of digital audio content constantly compels the need for better compression algorithms. Over the years various perceptually lossless compression techniques have been introduced, and transform-based compression techniques have made a significant impact in recent years. In this paper, we propose one such transform-based compression technique, where the joint time-frequency (TF) properties of the nonstationary nature of the audio signals were exploited in creating a compact energy representation of the signal in fewer coefficients. The decomposition coefficients were processed and perceptually filtered to retain only the relevant coefficients. Perceptual filtering (psychoacoustics) was applied in a novel way by analyzing and performing TF-specific psychoacoustics experiments. An added advantage of the proposed technique is that, due to its signal adaptive nature, it does not need predetermined segmentation of audio signals for processing. Eight stereo audio signal samples of different varieties were used in the study. Subjective (mean opinion score, MOS) listening tests were performed and the subjective difference grades (SDG) were used to compare the performance of the proposed coder with MP3, AAC, and HE-AAC encoders. Compression ratios in the range of 8 to 40 were achieved by the proposed technique with subjective difference grades (SDG) ranging from –0.53 to –2.27.

  15. Performance simulation of an absorption heat transformer operating with partially miscible mixtures

    Energy Technology Data Exchange (ETDEWEB)

    Alonso, D.; Cachot, T.; Hornut, J.M. [LSGC-CNRS-ENSIC, Nancy (France); Univ. Henri Poincare, Nancy (France). IUT]

    2002-07-08

    This paper studies the thermodynamic performance of a new absorption heat-transformer cycle, where the separation step is obtained by cooling and settling of a partially miscible mixture at low temperature. This new cycle has been called an absorption-demixing heat transformer (ADHT) cycle. A numerical simulation code has been written and has allowed us to evaluate the temperature lift and thermal yield of two working pairs. High qualitative and quantitative performance was obtained, demonstrating the feasibility of, and industrial interest in, such a cycle. Moreover, a comparison of the simulation results with performance actually obtained on an experimental ADHT has confirmed the pertinence of the simulation code. (author)

  16. Task exposures in an office environment: a comparison of methods.

    Science.gov (United States)

    Van Eerd, Dwayne; Hogg-Johnson, Sheilah; Mazumder, Anjali; Cole, Donald; Wells, Richard; Moore, Anne

    2009-10-01

    Task-related factors such as frequency and duration are associated with musculoskeletal disorders in office settings. The primary objective was to compare various task recording methods as measures of exposure in an office workplace. A total of 41 workers from different jobs were recruited from a large urban newspaper (71% female, mean age 41 years, SD 9.6). Questionnaire, task diary, direct observation and video methods were used to record tasks, with a common set of task codes used across methods. The different methods yielded different estimates of task duration, number of tasks and task transitions. Self-report methods did not consistently result in longer task duration estimates. Methodological issues could explain some of the differences in estimates observed between methods. It was concluded that different task recording methods result in different estimates of exposure, likely because they capture different exposure constructs. This work addresses issues of exposure measurement in office environments. It is of relevance to ergonomists and researchers interested in how best to assess the risk of injury among office workers. The paper discusses the trade-offs between precision, accuracy and burden in the collection of computer task-based exposure measures and the different underlying constructs captured by each method.

  17. Transforming the Subject Matter: Examining the Intellectual Roots of Pedagogical Content Knowledge

    Science.gov (United States)

    Deng, Zongyi

    2007-01-01

    This article questions the basic assumptions of pedagogical content knowledge by analyzing the ideas of Jerome Bruner, Joseph Schwab, and John Dewey concerning transforming the subject matter. It argues that transforming the subject matter is not only a pedagogical but also a complex curricular task in terms of developing a school subject or a…

  18. Using Transformations in the Implementation of Higher-order Functions

    DEFF Research Database (Denmark)

    Nielson, Hanne Riis; Nielson, Flemming

    1991-01-01

    Many different techniques are needed in an optimising compiler: constant folding, program transformation, semantic analysis and optimisation, code generation and so on. The authors propose a uniform framework for all these activities. Every computation in the program to be compiled is classified

  19. Artificial Intelligence for Pathologists Is Not Near--It Is Here: Description of a Prototype That Can Transform How We Practice Pathology Tomorrow.

    Science.gov (United States)

    Ye, Jay J

    2015-07-01

    Pathologists' daily tasks consist of both the professional interpretation of slides and the secretarial tasks of translating these interpretations into final pathology reports, the latter of which is a time-consuming endeavor for most pathologists. To describe an artificial intelligence that performs secretarial tasks, designated as Secretary-Mimicking Artificial Intelligence (SMILE). The underlying implementation of SMILE is a collection of computer programs that work in concert to "listen to" voice commands and to "watch for" the window changes caused by slide bar-code scanning; SMILE responds to these inputs by acting upon PowerPath Client windows (Sunquest Information Systems, Tucson, Arizona) and its Microsoft Word (Microsoft, Redmond, Washington) Add-In window, eventuating in the reports being typed and finalized. Secretary-Mimicking Artificial Intelligence also communicates relevant information to the pathologist via the computer speakers and a message box on the screen. Secretary-Mimicking Artificial Intelligence performs many secretarial tasks intelligently and semiautonomously, with rapidity and consistency, thus enabling pathologists to focus on slide interpretation, which results in a marked increase in productivity, a decrease in errors, and a reduction of stress in daily practice. Secretary-Mimicking Artificial Intelligence undergoes encounter-based learning continually, resulting in a continuous improvement in its knowledge-based intelligence. Artificial intelligence for pathologists is both feasible and powerful. The future widespread use of artificial intelligence in our profession is certainly going to transform how we practice pathology.

  20. Investigating the structure preserving encryption of high efficiency video coding (HEVC)

    Science.gov (United States)

    Shahid, Zafar; Puech, William

    2013-02-01

    This paper presents a novel method for the real-time protection of the new, emerging High Efficiency Video Coding (HEVC) standard. Structure-preserving selective encryption is performed in the CABAC entropy coding module of HEVC, which is significantly different from the CABAC entropy coding of H.264/AVC. In the CABAC of HEVC, exponential Golomb coding is replaced by truncated Rice (TR) coding, up to a specific value, for the binarization of transform coefficients. Selective encryption is performed using the AES cipher in cipher feedback mode on a plaintext of binstrings, in a context-aware manner. The encrypted bitstream has exactly the same bit-rate and is format compliant. Experimental evaluation and security analysis of the proposed algorithm are performed on several benchmark video sequences containing different combinations of motion, texture and objects.
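
    The format-compliance argument rests on AES in cipher feedback (CFB) mode acting as a stream cipher, so ciphertext length equals plaintext length. A toy sketch using the Python cryptography package; the byte packing of "encryptable" bins is invented for illustration, since real bin selection follows the CABAC bitstream syntax:

      import os
      from cryptography.hazmat.primitives.ciphers import (Cipher, algorithms,
                                                          modes)

      key, iv = os.urandom(16), os.urandom(16)

      def encrypt_bins(binstring: bytes) -> bytes:
          # CFB turns AES into a stream cipher: ciphertext length equals
          # plaintext length, a prerequisite for an unchanged bit-rate.
          enc = Cipher(algorithms.AES(key), modes.CFB(iv)).encryptor()
          return enc.update(binstring) + enc.finalize()

      # Bytes standing in for packed suffix bins of transform coefficients.
      plain = bytes([0b10110010, 0b01101110])
      cipher_bins = encrypt_bins(plain)
      assert len(cipher_bins) == len(plain)    # length (bit-rate) preserved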

  1. Row Reduction Applied to Decoding of Rank Metric and Subspace Codes

    DEFF Research Database (Denmark)

    Puchinger, Sven; Nielsen, Johan Sebastian Rosenkilde; Li, Wenhui

    2017-01-01

    We show that decoding of ℓ-Interleaved Gabidulin codes, as well as list-ℓ decoding of Mahdavifar–Vardy (MV) codes can be performed by row reducing skew polynomial matrices. Inspired by row reduction of F[x] matrices, we develop a general and flexible approach of transforming matrices over skew...... polynomial rings into a certain reduced form. We apply this to solve generalised shift register problems over skew polynomial rings which occur in decoding ℓ-Interleaved Gabidulin codes. We obtain an algorithm with complexity O(ℓμ²) where μ measures the size of the input problem and is proportional...... to the code length n in the case of decoding. Further, we show how to perform the interpolation step of list-ℓ-decoding MV codes in complexity O(ℓn²), where n is the number of interpolation constraints....

  2. Robust Learning Control Design for Quantum Unitary Transformations.

    Science.gov (United States)

    Wu, Chengzhi; Qi, Bo; Chen, Chunlin; Dong, Daoyi

    2017-12-01

    Robust control design for quantum unitary transformations has been recognized as a fundamental and challenging task in the development of quantum information processing due to unavoidable decoherence or operational errors in the experimental implementation of quantum operations. In this paper, we extend the systematic methodology of the sampling-based learning control (SLC) approach with a gradient flow algorithm for the design of robust quantum unitary transformations. The SLC approach first uses a "training" process to find an optimal control strategy robust against certain ranges of uncertainties. Then a number of randomly selected samples are tested and the performance is evaluated according to their average fidelity. The approach is applied to three typical examples of robust quantum transformation problems: robust quantum transformations in a three-level quantum system, in a superconducting quantum circuit, and in a spin chain system. Numerical results demonstrate the effectiveness of the SLC approach and show its potential applications in various implementations of quantum unitary transformations.
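
    The training/testing structure of the SLC approach can be conveyed with a toy single-qubit example: choose the control amplitude that maximizes gate fidelity averaged over sampled detuning uncertainties, then evaluate on fresh samples. A coarse scan stands in for the paper's gradient flow, and all numbers are illustrative:

      import numpy as np
      from scipy.linalg import expm

      sx = np.array([[0, 1], [1, 0]], dtype=complex)
      sz = np.array([[1, 0], [0, -1]], dtype=complex)
      U_target = expm(-1j * np.pi * sx / 2)    # an X rotation (up to phase)

      def fidelity(omega, delta):
          U = expm(-1j * (omega * sx + delta * sz) / 2)
          return abs(np.trace(U_target.conj().T @ U)) / 2

      # "Training": maximize fidelity averaged over sampled detunings.
      train = np.random.default_rng(1).uniform(-0.3, 0.3, size=20)
      omegas = np.linspace(2.5, 3.8, 200)
      avg = [np.mean([fidelity(w, d) for d in train]) for w in omegas]
      omega_star = omegas[int(np.argmax(avg))]

      # "Testing": evaluate the found control on fresh uncertainty samples.
      test = np.random.default_rng(2).uniform(-0.3, 0.3, size=50)
      print(omega_star, np.mean([fidelity(omega_star, d) for d in test]))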

  3. Fourier spectral of PalmCode as descriptor for palmprint recognition

    NARCIS (Netherlands)

    Ruan, Qiuqi; Spreeuwers, Lieuwe Jan; Veldhuis, Raymond N.J.; Mu, Meiru

    Automatic person recognition by palmprint is currently a hot research topic. In this paper, we propose a novel palmprint recognition method that transforms the typical palmprint phase code feature into the Fourier frequency domain. The resulting real-valued Fourier spectral features are further

  4. An Efficient SF-ISF Approach for the Slepian-Wolf Source Coding Problem

    Directory of Open Access Journals (Sweden)

    Tu Zhenyu

    2005-01-01

    Full Text Available A simple but powerful scheme exploiting the binning concept for asymmetric lossless distributed source coding is proposed. The novelty in the proposed scheme is the introduction of a syndrome former (SF) in the source encoder and an inverse syndrome former (ISF) in the source decoder to efficiently exploit an existing linear channel code without the need to modify the code structure or the decoding strategy. For most channel codes, the construction of SF-ISF pairs is a light task. For parallelly and serially concatenated codes, and particularly parallel and serial turbo codes where this appears less obvious, an efficient way of constructing linear-complexity SF-ISF pairs is demonstrated. It is shown that the proposed SF-ISF approach is simple, provably optimal, and generally applicable to any linear channel code. Simulation using conventional and asymmetric turbo codes demonstrates a compression rate that is only 0.06 bit/symbol from the theoretical limit, which is among the best results reported so far.
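
    A minimal concrete instance of the SF-ISF idea, using a (7,4) Hamming code rather than the turbo codes of the paper: the encoder transmits only the 3-bit syndrome of each 7-bit source block, and the decoder combines it with correlated side information, under the assumption that source and side information differ in at most one bit.

      import numpy as np

      # Parity-check matrix of the (7,4) Hamming code; column j is the
      # binary representation of j, so a single-bit error at position j
      # has syndrome j.
      H = np.array([[0, 0, 0, 1, 1, 1, 1],
                    [0, 1, 1, 0, 0, 1, 1],
                    [1, 0, 1, 0, 1, 0, 1]])

      def syndrome(x):
          return H @ x % 2                     # syndrome former: 7 -> 3 bits

      def decode(s, y):
          # Inverse syndrome former plus correction: the syndrome of the
          # pattern x XOR y locates the single differing bit.
          diff = (s + syndrome(y)) % 2
          idx = int("".join(map(str, diff)), 2)
          x_hat = y.copy()
          if idx:
              x_hat[idx - 1] ^= 1
          return x_hat

      rng = np.random.default_rng(3)
      x = rng.integers(0, 2, 7)                # source block
      y = x.copy(); y[rng.integers(7)] ^= 1    # correlated side information
      assert np.array_equal(decode(syndrome(x), y), x)  # 7 bits sent as 3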

  5. Optimal codes as Tanner codes with cyclic component codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Pinero, Fernando; Zeng, Peng

    2014-01-01

    In this article we study a class of graph codes with cyclic code component codes as affine variety codes. Within this class of Tanner codes we find some optimal binary codes. We use a particular subgraph of the point-line incidence plane of A(2,q) as the Tanner graph, and we are able to describe ...

  6. Task Requirements Influence Sensory Integration during Grasping in Humans

    Science.gov (United States)

    Safstrom, Daniel; Edin, Benoni B.

    2004-01-01

    The sensorimotor transformations necessary for generating appropriate motor commands depend on both current and previously acquired sensory information. To investigate the relative impact (or weighting) of visual and haptic information about object size during grasping movements, we let normal subjects perform a task in which, unbeknownst to the…

  7. Comparison of BrainTool to other UML modeling and model transformation tools

    Science.gov (United States)

    Nikiforova, Oksana; Gusarovs, Konstantins

    2017-07-01

    In the last 30 years, numerous model-based software generation systems have been offered, targeting problems of development productivity and resulting software quality. CASE tools developed to date are advertised as having "complete code-generation capabilities". Nowadays the Object Management Group (OMG) makes similar claims regarding Unified Modeling Language (UML) models at different levels of abstraction, arguing that software development automation using CASE tools enables a significant level of automation. Today's CASE tools usually offer a combination of several features, starting with a model editor and a model repository for the traditional ones, and ending with a code generator (which may use a scripting or domain-specific (DSL) language), a transformation tool to produce new artifacts from manually created ones, and a transformation definition editor to define new transformations for the most advanced ones. The present paper contains the results of a comparison of CASE tools (mainly UML editors) against the level of automation they offer.

  8. Selective Extraction of Entangled Textures via Adaptive PDE Transform

    Directory of Open Access Journals (Sweden)

    Yang Wang

    2012-01-01

    Full Text Available Texture and feature extraction is an important research area with a wide range of applications in science and technology. Selective extraction of entangled textures is a challenging task due to spatial entanglement, orientation mixing, and high-frequency overlapping. The partial differential equation (PDE) transform is an efficient method for functional mode decomposition. The present work introduces an adaptive PDE transform algorithm that appropriately thresholds the statistical variance of the local variation of functional modes. The proposed adaptive PDE transform is applied to the selective extraction of entangled textures. Successful separations of human faces, clothes, background, natural landscape, text, forest, camouflaged sniper and neuron skeletons have validated the proposed method.

  9. Remote one-qubit information concentration and decoding of operator quantum error-correction codes

    International Nuclear Information System (INIS)

    Hsu Liyi

    2007-01-01

    We propose a general scheme of remote one-qubit information concentration. To achieve the task, Bell-correlated mixed states are exploited. In addition, non-remote one-qubit information concentration is equivalent to the decoding of a quantum error-correction code. Here we propose how to decode stabilizer codes. In particular, the proposed scheme can be used for operator quantum error-correction codes. The encoded state can be recreated on the errorless qubit, regardless of how many bit-flip and phase-flip errors have occurred

  10. Implementation of the SCDAP/RELAP5 Mod. 3.3 and MAAP/VVER codes

    International Nuclear Information System (INIS)

    Duspiva, J.; Vokac, P.; Dienstbier, J.

    2001-05-01

    The SR5 code was installed on a Hewlett-Packard workstation, and the test problems supplied with the software were solved. Finally, a tool for graphical processing of the calculation results was prepared and tested. The MAAP/VVER code was installed on an HP J210 workstation and, in particular, on a PC. The code was tested on two problems supplied with the software. The transformation of the MAAP/VVER output to graphical format was carried out using the support tools obtained with the code, as well as tools that have been in use at the Institute with other severe accident analysis codes. (P.A.)

  11. Code Betal to calculation Alpha/Beta activities in environmental samples

    International Nuclear Information System (INIS)

    Romero, L.; Travesi, A.

    1983-01-01

    A code, BETAL, written in FORTRAN IV, was developed to automate the calculation and presentation of the results of total alpha-beta activity measurements in environmental samples. This code performs the calculations necessary to transform the activities measured in total counts into pCi/l, taking into account the efficiency of the detector used and the other necessary parameters. Furthermore, it estimates the standard deviation of the result and calculates the lower limit of detection for each measurement. The code works interactively through a screen-operator dialogue, requesting via screen prompts the data necessary to compute the activity in each case. The code can be executed from any screen-and-keyboard terminal whose computer accepts Fortran IV, with a printer connected to the computer. (Author) 5 refs
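
    The kind of computation BETAL automates can be sketched as follows; the efficiency-corrected unit conversion is standard, while the lower limit of detection uses the common Currie formulation as an assumption, since the record does not spell out the code's exact formulas.

      import math

      def activity_pci_per_l(gross, bkg, t, eff, vol_l):
          # 1 pCi = 0.037 disintegrations per second.
          scale = 1.0 / t / eff / 0.037 / vol_l
          act = (gross - bkg) * scale              # efficiency-corrected
          sigma = math.sqrt(gross + bkg) * scale   # 1-sigma counting error
          # Currie's detection limit in counts: 2.71 + 4.65*sqrt(background).
          lld = (2.71 + 4.65 * math.sqrt(bkg)) * scale
          return act, sigma, lld

      act, sigma, lld = activity_pci_per_l(gross=480, bkg=120, t=3600,
                                           eff=0.25, vol_l=1.0)
      print(f"{act:.2f} +/- {sigma:.2f} pCi/l (LLD {lld:.2f} pCi/l)")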

  12. Stroop-like effects in a new-code learning task: A cognitive load theory perspective.

    Science.gov (United States)

    Hazan-Liran, Batel; Miller, Paul

    2017-09-01

    To determine whether and how learning is biased by competing task-irrelevant information that creates extraneous cognitive load, we assessed the efficiency of university students with a learning paradigm in two experiments. The paradigm asked participants to learn associations between eight words and eight digits. We manipulated congruity of the digits' ink colour with the words' semantics. In Experiment 1 word stimuli were colour words (e.g., blue, yellow) and in Experiment 2 colour-related word concepts (e.g., sky, banana). Marked benefits and costs on learning due to variation in extraneous cognitive load originating from processing task-irrelevant information were evident. Implications for cognitive load theory and schooling are discussed.

  14. When what's left is right: visuomotor transformations in an aged population.

    Directory of Open Access Journals (Sweden)

    Lee A Baugh

    Full Text Available BACKGROUND: There has been little consensus as to whether age-related visuomotor adaptation effects are readily observable. Some studies have found slower adaptation and/or reduced overall levels. In contrast, other methodologically similar studies have found no such evidence of aging effects on visuomotor adaptation. A crucial early step in successful adaptation is the ability to perform the transformation necessary to complete the task at hand. The present study describes the use of a viewing window paradigm to examine the effects of aging in a visuomotor transformation task. METHODS: Two groups of participants, a young adult control group (age range 18-33 years, mean age = 22) and an older adult group (age range 62-74 years, mean age = 68), completed a viewing window task that was controlled by the user via a computer touchscreen. Four visuomotor "flip" conditions were created by varying the relationship between the participant's movement and the resultant on-screen movement of the viewing window: (1) no flip; (2) flip-XY, in which X- and Y-axis body movements both resulted in the opposite direction of movement of the viewing window; and (3) flip-X and (4) flip-Y, in which the solitary X- or Y-axis was reversed. Response times and movement of the window were recorded. CONCLUSIONS: Older participants demonstrated impairments in performing a required visuomotor transformation, as evidenced by more complex scanning patterns and longer scanning times when compared with younger control participants. These results provide additional evidence that the mechanisms involved in visuomotor transformation are negatively affected by age.

  15. Igo - A Monte Carlo Code For Radiotherapy Planning

    International Nuclear Information System (INIS)

    Goldstein, M.; Regev, D.

    1999-01-01

    The goal of radiation therapy is to deliver a lethal dose to the tumor while minimizing the dose to normal tissues and vital organs. To carry out this task, it is critical to correctly calculate the 3-D dose delivered. Monte Carlo transport methods (especially the adjoint Monte Carlo) have the potential to provide more accurate predictions of the 3-D dose than the currently used methods. IG0 is a Monte Carlo code derived from the general Monte Carlo program MCNP, tailored specifically for calculating the effects of radiation therapy. This paper describes the IG0 transport code, the PIG0 interface and some preliminary results

  16. An Optimization Algorithm for Multipath Parallel Allocation for Service Resource in the Simulation Task Workflow

    Directory of Open Access Journals (Sweden)

    Zhiteng Wang

    2014-01-01

    Full Text Available Service-oriented modeling and simulation are hot issues in the field of modeling and simulation, and service resources must be called when a simulation task workflow is running. How to optimize the service resource allocation so that the task is completed effectively is an important issue in this area. In the field of military modeling and simulation, it is important to improve the probability of success and the timeliness of simulation task workflows. Therefore, this paper proposes an optimization algorithm for multipath parallel allocation of service resources, in which a multipath service resource parallel allocation model is built and a quantum optimization algorithm with a multiple-chain coding scheme is used for its solution. The multiple-chain coding scheme extends the parallel search space to improve search efficiency. Through simulation experiments, this paper investigates the effect of different optimization algorithms, service allocation strategies, and path numbers on the probability of success of the simulation task workflow. The simulation results show that the proposed optimization algorithm for multipath parallel allocation of service resources is an effective method to improve the probability of success and timeliness of the simulation task workflow.

  17. PERMUTATION-BASED POLYMORPHIC STEGO-WATERMARKS FOR PROGRAM CODES

    Directory of Open Access Journals (Sweden)

    Denys Samoilenko

    2016-06-01

    Full Text Available Purpose: One of the most topical trends in program code protection is code marking. The problem consists in creating digital "watermarks" which allow distinguishing different copies of the same program code. Such marks could be useful for authorship protection, for numbering code copies, for monitoring program propagation, and for information security purposes in client-server communication processes. Methods: We used methods of digital steganography adapted for program codes as text objects. The same-shape-symbols method was transformed into a same-semantic-element method, owing to features of program codes that make them different from ordinary texts. We use a dynamic principle of mark formation, making the codes polymorphic. Results: We examined the combinatorial capacity of the permutations possible in program codes. As a result, it was shown that a set of 5-7 polymorphic variables is suitable for most modern network applications. Mark creation and restoration algorithms were proposed and discussed. The main algorithm is based on full and partial permutations of variable names and their declaration order. The algorithm for partial permutation enumeration was optimized for computational complexity. PHP code fragments that realize the algorithms are listed. Discussion: The method proposed in this work allows distinguishing each client-server connection. If a clone of some network resource is found, the method can yield information about the included marks, and thereby data on the IP address, date and time, and authentication information of the client that copied the resource. Usage of polymorphic stego-watermarks should improve information security indicators in network communications.
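
    The combinatorial core of such marking can be illustrated by encoding a copy identifier as a permutation of variable declaration order via the factorial number system (Lehmer code); with 7 variables there are 7! = 5040 distinguishable copies, consistent with the 5-7 variable range cited above. The sketch below is in Python for uniformity (the paper's fragments are PHP), and the variable names are invented:

      from math import factorial

      def id_to_permutation(ident, items):
          # Factorial-number-system (Lehmer) encoding of an integer id,
          # 0 <= id < len(items)!, as an ordering of the items.
          items, perm = list(items), []
          for k in range(len(items), 0, -1):
              d, ident = divmod(ident, factorial(k - 1))
              perm.append(items.pop(d))        # pick the d-th remaining item
          return perm

      def permutation_to_id(perm, items):
          # Inverse mapping: recover the id from an observed order.
          items, ident = list(items), 0
          for name in perm:
              d = items.index(name)
              ident += d * factorial(len(items) - 1)
              items.pop(d)
          return ident

      names = ["$user", "$data", "$conn", "$tmp", "$resp"]  # 5! = 120 copies
      order = id_to_permutation(42, names)     # declaration order of copy 42
      assert permutation_to_id(order, names) == 42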

  18. Inclusion of models to describe severe accident conditions in the fuel simulation code DIONISIO

    Energy Technology Data Exchange (ETDEWEB)

    Lemes, Martín; Soba, Alejandro [Sección Códigos y Modelos, Gerencia Ciclo del Combustible Nuclear, Comisión Nacional de Energía Atómica, Avenida General Paz 1499, 1650 San Martín, Provincia de Buenos Aires (Argentina)]; Daverio, Hernando [Gerencia Reactores y Centrales Nucleares, Comisión Nacional de Energía Atómica, Avenida General Paz 1499, 1650 San Martín, Provincia de Buenos Aires (Argentina)]; Denis, Alicia [Sección Códigos y Modelos, Gerencia Ciclo del Combustible Nuclear, Comisión Nacional de Energía Atómica, Avenida General Paz 1499, 1650 San Martín, Provincia de Buenos Aires (Argentina)]

    2017-04-15

    The simulation of fuel rod behavior is a complex task that demands not only accurate models to describe the numerous phenomena occurring in the pellet, cladding and internal rod atmosphere but also an adequate interconnection between them. In recent years several models have been incorporated into the DIONISIO code with the purpose of increasing its precision and reliability. After the regrettable events at Fukushima, the need for codes capable of simulating nuclear fuels under accident conditions has come to the fore. Heat removal occurs in a quite different way than during normal operation, and this fact determines a completely new set of conditions for the fuel materials. A detailed description of the different regimes the coolant may exhibit in such a wide variety of scenarios requires a thermal-hydraulic formulation not suitable for inclusion in a fuel performance code. Moreover, there exist a number of reliable and well-known codes that perform this task. Nevertheless, keeping in mind the purpose of building a code focused on fuel behavior, a subroutine was developed for the DIONISIO code that performs a simplified analysis of the coolant in a PWR, restricted to the more representative situations, and provides the fuel simulation with the boundary conditions necessary to reproduce accident situations. In the present work this subroutine is described, and the results of several comparisons with experimental data and with thermal-hydraulic codes are offered. It is verified that, in spite of its comparative simplicity, the predictions of this module of DIONISIO do not differ significantly from those of the specific, complex codes.

  19. Hello Ruby adventures in coding

    CERN Document Server

    Liukas, Linda

    2015-01-01

    "Code is the 21st century literacy and the need for people to speak the ABCs of Programming is imminent." --Linda Liukas Meet Ruby--a small girl with a huge imagination. In Ruby's world anything is possible if you put your mind to it. When her dad asks her to find five hidden gems Ruby is determined to solve the puzzle with the help of her new friends, including the Wise Snow Leopard, the Friendly Foxes, and the Messy Robots. As Ruby stomps around her world kids will be introduced to the basic concepts behind coding and programming through storytelling. Learn how to break big problems into small problems, repeat tasks, look for patterns, create step-by-step plans, and think outside the box. With hands-on activities included in every chapter, future coders will be thrilled to put their own imaginations to work.

  20. Implementing a modular system of computer codes

    International Nuclear Information System (INIS)

    Vondy, D.R.; Fowler, T.B.

    1983-07-01

    A modular computation system has been developed for nuclear reactor core analysis. The codes can be applied repeatedly in blocks without extensive user input data, as needed for reactor history calculations. The primary control options over the calculational paths and task assignments within the codes are blocked separately from other instructions, admitting ready access by user input instructions or directions from automated procedures, and promoting flexible and diverse applications at minimum application cost. Data interfacing is done under formal specifications, with data files manipulated by an informed manager. This report emphasizes the system aspects and the development of useful capability, and is hopefully informative and useful to anyone developing a modular code system of comparable sophistication. Overall, this report summarizes in a general way the many factors and difficulties that are faced in making reactor core calculations, based on the experience of the authors. It provides the background against which work on HTGR reactor physics is being carried out

  1. MARE2DEM: a 2-D inversion code for controlled-source electromagnetic and magnetotelluric data

    Science.gov (United States)

    Key, Kerry

    2016-10-01

    This work presents MARE2DEM, a freely available code for 2-D anisotropic inversion of magnetotelluric (MT) data and frequency-domain controlled-source electromagnetic (CSEM) data from onshore and offshore surveys. MARE2DEM parametrizes the inverse model using a grid of arbitrarily shaped polygons, where unstructured triangular or quadrilateral grids are typically used due to their ease of construction. Unstructured grids provide significantly more geometric flexibility and parameter efficiency than the structured rectangular grids commonly used by most other inversion codes. Transmitter and receiver components located on topographic slopes can be tilted parallel to the boundary so that the simulated electromagnetic fields accurately reproduce the real survey geometry. The forward solution is implemented with a goal-oriented adaptive finite-element method that automatically generates and refines unstructured triangular element grids that conform to the inversion parameter grid, ensuring accurate responses as the model conductivity changes. This dual-grid approach is significantly more efficient than the conventional use of a single grid for both the forward and inverse meshes since the more detailed finite-element meshes required for accurate responses do not increase the memory requirements of the inverse problem. Forward solutions are computed in parallel with a highly efficient scaling by partitioning the data into smaller independent modeling tasks consisting of subsets of the input frequencies, transmitters and receivers. Non-linear inversion is carried out with a new Occam inversion approach that requires fewer forward calls. Dense matrix operations are optimized for memory and parallel scalability using the ScaLAPACK parallel library. Free parameters can be bounded using a new non-linear transformation that leaves the transformed parameters nearly the same as the original parameters within the bounds, thereby reducing non-linear smoothing effects. Data

  2. A seismic data compression system using subband coding

    Science.gov (United States)

    Kiely, A. B.; Pollara, F.

    1995-01-01

    This article presents a study of seismic data compression techniques and a compression algorithm based on subband coding. The algorithm includes three stages: a decorrelation stage, a quantization stage that introduces a controlled amount of distortion to allow for high compression ratios, and a lossless entropy coding stage based on a simple but efficient arithmetic coding method. Subband coding methods are particularly suited to the decorrelation of nonstationary processes such as seismic events. Adaptivity to the nonstationary behavior of the waveform is achieved by dividing the data into separate blocks that are encoded separately with an adaptive arithmetic encoder. This is done with high efficiency due to the low overhead introduced by the arithmetic encoder in specifying its parameters. The technique could be used as a progressive transmission system, where successive refinements of the data can be requested by the user. This allows seismologists to first examine a coarse version of waveforms with minimal usage of the channel and then decide where refinements are required. Rate-distortion performance results are presented and comparisons are made with two block transform methods.
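
    As a hedged sketch of the three stages described above, the following uses a one-level Haar split to stand in for the subband filter bank, uniform scalar quantization as the controlled-distortion stage, and a first-order entropy estimate to stand in for the adaptive arithmetic coder; all of these stand-ins are illustrative simplifications, not the authors' algorithm:

        import numpy as np

        def haar_split(x):
            """Decorrelation stage: one-level split into low/high subbands."""
            e, o = x[0::2], x[1::2]
            return (e + o) / np.sqrt(2), (e - o) / np.sqrt(2)

        def quantize(band, step):
            """Distortion-controlling stage: uniform scalar quantization."""
            return np.round(band / step).astype(int)

        def entropy_bits(symbols):
            """Stand-in for the entropy-coding stage: first-order entropy."""
            _, counts = np.unique(symbols, return_counts=True)
            p = counts / counts.sum()
            return -np.sum(p * np.log2(p)) * symbols.size

        rng = np.random.default_rng(0)
        x = np.cumsum(rng.standard_normal(1024))   # toy nonstationary signal
        lo, hi = haar_split(x)
        bits = entropy_bits(quantize(lo, 0.5)) + entropy_bits(quantize(hi, 0.5))
        print(f"~{bits / x.size:.2f} bits/sample after subband coding")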

  3. Method of local pointed function reduction of original shape in Fourier transformation

    International Nuclear Information System (INIS)

    Dosch, H.; Slavyanov, S.Yu.

    2002-01-01

    A method is proposed for the analytical reconstruction of the original shape in the one-dimensional Fourier transformation from the modulus of its Fourier image. The basic concept of the method is to represent the model shape as a sum of local peak functions. Eigenfunctions generated by linear differential equations with polynomial coefficients are selected as these peak functions, which makes it possible to handle the Fourier transformation without numerical integration. This reduces the inverse task to a nonlinear regression with a small number of estimated parameters and to a numerical or asymptotic study of the model peak functions - the eigenfunctions of the differential tasks - and their Fourier images.
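
    As a rough illustration of this reduction (using Gaussians, which are eigenfunctions of the Fourier transform, in place of the more general eigenfunctions considered in the paper; all data here are synthetic), recovering peak parameters from the Fourier modulus becomes a small nonlinear regression with no numerical integration:

        import numpy as np
        from scipy.optimize import curve_fit

        def modulus_model(q, a1, s1, a2, s2):
            # |F| of a sum of two centred Gaussians: each transforms analytically.
            return np.abs(a1 * np.exp(-(s1 * q) ** 2 / 2)
                          + a2 * np.exp(-(s2 * q) ** 2 / 2))

        q = np.linspace(0, 5, 200)
        data = modulus_model(q, 1.0, 0.8, 0.4, 2.5)   # synthetic Fourier modulus
        data += 0.01 * np.random.default_rng(1).standard_normal(q.size)
        popt, _ = curve_fit(modulus_model, q, data, p0=[1, 1, 0.5, 2])
        print(popt)   # recovered peak amplitudes and widths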

  4. Developing and modifying behavioral coding schemes in pediatric psychology: a practical guide.

    Science.gov (United States)

    Chorney, Jill MacLaren; McMurtry, C Meghan; Chambers, Christine T; Bakeman, Roger

    2015-01-01

    To provide a concise and practical guide to the development, modification, and use of behavioral coding schemes for observational data in pediatric psychology. This article provides a review of relevant literature and experience in developing and refining behavioral coding schemes. A step-by-step guide to developing and/or modifying behavioral coding schemes is provided. Major steps include refining a research question, developing or refining the coding manual, piloting and refining the coding manual, and implementing the coding scheme. Major tasks within each step are discussed, and pediatric psychology examples are provided throughout. Behavioral coding can be a complex and time-intensive process, but the approach is invaluable in allowing researchers to address clinically relevant research questions in ways that would not otherwise be possible. © The Author 2014. Published by Oxford University Press on behalf of the Society of Pediatric Psychology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  5. Steady-State Calculation of the ATLAS Test Facility Using the SPACE Code

    International Nuclear Information System (INIS)

    Kim, Hyoung Tae; Choi, Ki Yong; Kim, Kyung Doo

    2011-01-01

    The Korean nuclear industry is developing a thermalhydraulic analysis code for safety analysis of pressurized water reactors (PWRs). The new code is called the Safety and Performance Analysis Code for Nuclear Power Plants (SPACE). Several research and industrial organizations including KAERI (Korea Atomic Energy Research Institute) are participating in the collaboration for the development of the SPACE code. One of the main tasks of KAERI is to carry out separate effect tests (SET) and integral effect tests (IET) for code verification and validation (V and V). The IET has been performed with ATLAS (Advanced Thermalhydraulic Test Loop for Accident Simulation) based on the design features of the APR1400 (Advanced Power Reactor of 1400 MWe). In the present work, the SPACE code input deck for ATLAS is developed and used to simulate the steady-state conditions of ATLAS as preliminary work for IET V and V of the SPACE code.

  6. GOOD GOVERNANCE AND TRANSFORMATION

    Directory of Open Access Journals (Sweden)

    Hans-Jürgen WAGENER

    2005-12-01

    Transformation of a totalitarian, basically administratively coordinated system into a democratic one that is coordinated predominantly by markets and competition has been triggered by, among other things, the perception of a serious deficit in welfare and happiness. Public policy has a special task: transforming the economic order by liberalisation, privatisation, stabilisation and the installation of institutions that are supportive of competition. Fifteen years after transformation began, there are sufficiently differentiated success stories to test the hypothesis that good governance is responsible for success and bad governance for failure. The empirical results support the "Lorenzetti hypothesis": where freedom, security and trust prevail, the economy flourishes; where they are lacking, the costs of long-term investment are too high. The initial conditions of transition countries seem to be quite similar; nevertheless, even there one can discern good and bad governance. The extent of socialist lawfulness, planning security, cronyism and corruption differed widely between East Berlin and Tashkent. And a good deal of such variation can be found in the pre-socialist history of these countries. However, the main conclusion is the co-evolution hypothesis: welfare and good governance go together.

  7. Complex neural codes in rat prelimbic cortex are stable across days on a spatial decision task

    Directory of Open Access Journals (Sweden)

    Nathaniel J. Powell

    2014-04-01

    The rodent prelimbic cortex has been shown to play an important role in cognitive processing, and has been implicated in encoding many different parameters relevant to solving decision-making tasks. However, it is not known how the prelimbic cortex represents all these disparate variables, and whether they are simultaneously represented when the task requires it. In order to investigate this question, we trained rats to run the Multiple-T Left Right Alternate (MT-LRA) task and recorded multi-unit ensembles from their prelimbic regions. Significant populations of cells in the prelimbic cortex represented the strategy controlling reward receipt on a given lap, whether the animal chose to go right or left on a given lap, and whether the animal made a correct decision or an error on a given lap. These populations overlapped in the cells recorded, with several cells demonstrating differential firing to all three variables. The spatial and strategic firing patterns of individual prelimbic cells were highly conserved across several days of running this task, indicating that each cell encoded the same information across days.

  8. Remembering to learn: independent place and journey coding mechanisms contribute to memory transfer.

    Science.gov (United States)

    Bahar, Amir S; Shapiro, Matthew L

    2012-02-08

    The neural mechanisms that integrate new episodes with established memories are unknown. When rats explore an environment, CA1 cells fire in place fields that indicate locations. In goal-directed spatial memory tasks, some place fields differentiate behavioral histories ("journey-dependent" place fields) while others do not ("journey-independent" place fields). To investigate how these signals inform learning and memory for new and familiar episodes, we recorded CA1 and CA3 activity in rats trained to perform a "standard" spatial memory task in a plus maze and in two new task variants. A "switch" task exchanged the start and goal locations in the same environment; an "altered environment" task contained unfamiliar local and distal cues. In the switch task, performance was mildly impaired, new firing maps were stable, but the proportion and stability of journey-dependent place fields declined. In the altered environment, overall performance was strongly impaired, new firing maps were unstable, and stable proportions of journey-dependent place fields were maintained. In both tasks, memory errors were accompanied by a decline in journey codes. The different dynamics of place and journey coding suggest that they reflect separate mechanisms and contribute to distinct memory computations. Stable place fields may represent familiar relationships among environmental features that are required for consistent memory performance. Journey-dependent activity may correspond with goal-directed behavioral sequences that reflect expectancies that generalize across environments. The complementary signals could help link current events with established memories, so that familiarity with either a behavioral strategy or an environment can inform goal-directed learning.

  9. The use of the codes from MCU family for calculations of WWER type reactors

    International Nuclear Information System (INIS)

    Abagijan, L.P.; Alexeyev, N.I.; Bryzgalov, V.I.; Gomin, E.A.; Glushkov, A.E.; Gorodkov, S.S.; Gurevich, M.I.; Kalugin, M.A.; Marin, S.V.; Shkarovsky, D.A.; Yudkevich, M.S.

    2000-01-01

    The MCU-RFFI/A and MCU-REA codes developed within the framework of the long-term MCU project are widely used for calculations of neutron physics characteristics of WWER type reactors. Complete descriptions of the codes are available in both Russian and English. The codes are verified and validated by means of comparison of calculated results with experimental data and mathematical benchmarks. The codes are licensed by the Russian Nuclear and Criticality Safety Regulatory Body (Gosatomnadzor RF) (Code Passports: N 61 of 17.10.1966 and N 115 of 02.03.2000, respectively). The report gives examples of WWER reactor physics tasks important for practice solved using the codes from the MCU family. Some calculational results are given too. (Authors)

  10. Compression of seismic data: filter banks and extended transforms, synthesis and adaptation; Compression de donnees sismiques: bancs de filtres et transformees etendues, synthese et adaptation

    Energy Technology Data Exchange (ETDEWEB)

    Duval, L.

    2000-11-01

    Wavelet and wavelet packet transforms are the most commonly used algorithms for seismic data compression. Wavelet coefficients are generally quantized and encoded by classical entropy coding techniques. We first propose in this work a compression algorithm based on the wavelet transform, used together with a zero-tree type coding for the first time in seismic applications. Classical wavelet transforms nevertheless yield a quite rigid approach, since it is often desirable to adapt the transform stage to the properties of each type of signal. We thus propose a second algorithm using, instead of wavelets, a set of so-called 'extended transforms'. These transforms, originating from filter bank theory, are parameterized. Classical examples are Malvar's Lapped Orthogonal Transforms (LOT) or de Queiroz et al.'s Generalized Lapped Orthogonal Transforms (GenLOT). We propose several optimization criteria to build 'extended transforms' adapted to the properties of seismic signals. We further show that these transforms can be used with the same zero-tree type coding technique as used with wavelets. Both proposed algorithms provide exact compression rate choice, block-wise compression (in the case of extended transforms) and partial decompression for quality control or visualization. Performances are tested on a set of actual seismic data and evaluated for several quality measures. We also compare them to other seismic compression algorithms. (author)

  11. FY16 ASME High Temperature Code Activities

    Energy Technology Data Exchange (ETDEWEB)

    Swindeman, M. J. [Chromtech Inc., Oak Ridge, TN (United States); Jetter, R. I. [R. I Jetter Consulting, Pebble Beach, CA (United States); Sham, T. -L. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-09-01

    One of the objectives of the ASME high temperature Code activities is to develop and validate both improvements and the basic features of Section III, Division 5, Subsection HB, Subpart B (HBB). The overall scope of this task is to develop a computer program to be used to assess whether or not a specific component under specified loading conditions will satisfy the elevated temperature design requirements for Class A components in Section III, Division 5, Subsection HB, Subpart B (HBB). There are many features and alternative paths of varying complexity in HBB. The initial focus of this task is a basic path through the various options for a single reference material, 316H stainless steel. However, the program will be structured for eventual incorporation of all the features and permitted materials of HBB. Since this task has recently been initiated, this report focuses on the description of the initial path forward and an overall description of the approach to computer program development.

  12. Transport code and nuclear data in intermediate energy region

    Energy Technology Data Exchange (ETDEWEB)

    Hasegawa, Akira; Odama, Naomitsu [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Maekawa, F.; Ueki, K.; Kosaka, K.; Oyama, Y.

    1998-11-01

    We briefly reviewed the problems of intermediate-energy nuclear data files and transport codes in connection with processing of the data. This is a summary from our group in the task force on JENDL High Energy File Integral Evaluation (JHEFIE). In this article we stress the necessity of producing an intermediate-energy evaluated nuclear data file up to 3 GeV for the application of accelerator-driven transmutation (ADT) systems. We also state the necessity, from a strategic point of view, of having our own transport code system to calculate radiation fields using these evaluated files, in order to keep our development of ADT technology completely free from outside conditions such as imported codes and data with poor maintenance or unknown accuracy. (author)

  13. Transport code and nuclear data in intermediate energy region

    International Nuclear Information System (INIS)

    Hasegawa, Akira; Odama, Naomitsu; Maekawa, F.; Ueki, K.; Kosaka, K.; Oyama, Y.

    1998-01-01

    We briefly reviewed the problems of intermediate-energy nuclear data files and transport codes in connection with processing of the data. This is a summary from our group in the task force on JENDL High Energy File Integral Evaluation (JHEFIE). In this article we stress the necessity of producing an intermediate-energy evaluated nuclear data file up to 3 GeV for the application of accelerator-driven transmutation (ADT) systems. We also state the necessity, from a strategic point of view, of having our own transport code system to calculate radiation fields using these evaluated files, in order to keep our development of ADT technology completely free from outside conditions such as imported codes and data with poor maintenance or unknown accuracy. (author)

  14. Partitioning the Metabolic Cost of Human Running: A Task-by-Task Approach

    Science.gov (United States)

    Arellano, Christopher J.; Kram, Rodger

    2014-01-01

    Compared with other species, humans can be very tractable and thus an ideal “model system” for investigating the metabolic cost of locomotion. Here, we review the biomechanical basis for the metabolic cost of running. Running has been historically modeled as a simple spring-mass system whereby the leg acts as a linear spring, storing, and returning elastic potential energy during stance. However, if running can be modeled as a simple spring-mass system with the underlying assumption of perfect elastic energy storage and return, why does running incur a metabolic cost at all? In 1980, Taylor et al. proposed the “cost of generating force” hypothesis, which was based on the idea that elastic structures allow the muscles to transform metabolic energy into force, and not necessarily mechanical work. In 1990, Kram and Taylor then provided a more explicit and quantitative explanation by demonstrating that the rate of metabolic energy consumption is proportional to body weight and inversely proportional to the time of foot-ground contact for a variety of animals ranging in size and running speed. With a focus on humans, Kram and his colleagues then adopted a task-by-task approach and initially found that the metabolic cost of running could be “individually” partitioned into body weight support (74%), propulsion (37%), and leg-swing (20%). Summing all these biomechanical tasks leads to a paradoxical overestimation of 131%. To further elucidate the possible interactions between these tasks, later studies quantified the reductions in metabolic cost in response to synergistic combinations of body weight support, aiding horizontal forces, and leg-swing-assist forces. This synergistic approach revealed that the interactive nature of body weight support and forward propulsion comprises ∼80% of the net metabolic cost of running. The task of leg-swing at most comprises ∼7% of the net metabolic cost of running and is independent of body weight support and forward

  15. Discrete cosine and sine transforms general properties, fast algorithms and integer approximations

    CERN Document Server

    Britanak, Vladimir; Rao, K R; Rao, K R

    2006-01-01

    The Discrete Cosine Transform (DCT) is used in many applications by the scientific, engineering and research communities and in data compression in particular. Fast algorithms and applications of the DCT Type II (DCT-II) have become the heart of many established international image/video coding standards. Since then other forms of the DCT and Discrete Sine Transform (DST) have been investigated in detail. This new edition presents the complete set of DCT and DST discrete trigonometric transforms, including their definitions, general mathematical properties, and relations to the optimal Karhunen-Loève transform.

  16. Sustainability and transformation plans: translating the perspectives.

    Science.gov (United States)

    Thakrar, Sonali V; Bell, Diane

    2017-10-02

    Each local health economy has been tasked with producing a sustainability and transformation plan. A health economy is a system that controls and contributes to health-care resource and the effects of health services on its population. This includes commissioners, acute providers, primary care providers, community services, public health and the voluntary sector. Sustainability and transformation plans represent a shift in the way health care is planned for in England. The aim of each sustainability and transformation plan is to deliver care within existing resource limits by improving quality of care, developing new models of care and improving efficiency of care provision. The tight timescales for production of sustainability and transformation plans mean that in most cases there has been limited clinical engagement; as a result many clinicians have limited sight, understanding or ownership of the proposals within sustainability and transformation plans. As sustainability and transformation plans move into the implementation phase, this article explores the role of the clinician in the ongoing design and delivery of the local sustainability and transformation plans. By finding the common ground between the perspectives of the clinician, the commissioner and system leaders, the motivation of clinicians can be aligned with the ambitions of the sustainability and transformation plan. The common goal of a sustainability and transformation plan and the necessary collaboration required to make it successful is discussed. Ultimately, such translation is essential: clinicians are intelligent, adaptive and motivated individuals who must have a lead role in constructing and implementing plans that transform health and social care.

  17. Neural coding in graphs of bidirectional associative memories.

    Science.gov (United States)

    Bouchain, A David; Palm, Günther

    2012-01-24

    In the last years we have developed large neural network models for the realization of complex cognitive tasks in a neural network architecture that resembles the network of the cerebral cortex. We have used networks of several cortical modules that contain two populations of neurons (one excitatory, one inhibitory). The excitatory populations in these so-called "cortical networks" are organized as a graph of Bidirectional Associative Memories (BAMs), where edges of the graph correspond to BAMs connecting two neural modules and nodes of the graph correspond to excitatory populations with associative feedback connections (and inhibitory interneurons). The neural code in each of these modules consists essentially of the firing pattern of the excitatory population, where mainly it is the subset of active neurons that codes the contents to be represented. The overall activity can be used to distinguish different properties of the patterns that are represented which we need to distinguish and control when performing complex tasks like language understanding with these cortical networks. The most important pattern properties or situations are: exactly fitting or matching input, incomplete information or partially matching pattern, superposition of several patterns, conflicting information, and new information that is to be learned. We show simple simulations of these situations in one area or module and discuss how to distinguish these situations based on the overall internal activation of the module. This article is part of a Special Issue entitled "Neural Coding". Copyright © 2011 Elsevier B.V. All rights reserved.

  18. SPI Project Criticality Task Force initial review and assessment

    International Nuclear Information System (INIS)

    McKinley, K.B.; Cannon, J.W.; Marsden, R.S.; Worle, H.A.

    1980-03-01

    The Slagging Pyrolysis Incinerator (SPI) Facility is being developed to process transuranic waste stored and buried at the Idaho National Engineering Laboratory (INEL) into a chemically inert, physically stable, basalt-like residue acceptable for a Federal Repository. A task force was established by the SPI Project Division to review and assess all aspects of criticality safety for the SPI Facility. This document presents the initial review, evaluations, and recommendations of the task force and includes the following: background information on waste characterization and criticality control approaches and philosophies; a description of the SPI Facility Waste Processing Building; a review and assessment of potentially relevant codes and regulations; a review and assessment of the present state of criticality and assaying/monitoring studies; and recommendations for changes in and additions to these studies. The review and assessment of potentially relevant codes and regulations indicate that ERDAM 0530, Nuclear Criticality Safety, should be the controlling document for criticality safety for the SPI Project. In general, the criticality control approaches and philosophies for the SPI Project comply with this document.

  19. Entangled cloning of stabilizer codes and free fermions

    Science.gov (United States)

    Hsieh, Timothy H.

    2016-10-01

    Though the no-cloning theorem [Wootters and Zurek, Nature (London) 299, 802 (1982), 10.1038/299802a0] prohibits exact replication of arbitrary quantum states, there are many instances in quantum information processing and entanglement measurement in which a weaker form of cloning may be useful. Here, I provide a construction for generating an "entangled clone" for a particular but rather expansive and rich class of states. Given a stabilizer code or free fermion Hamiltonian, this construction generates an exact entangled clone of the original ground state, in the sense that the entanglement between the original and the exact copy can be tuned to be arbitrarily small but finite, or large, and the relation between the original and the copy can also be modified to some extent. For example, this Rapid Communication focuses on generating time-reversed copies of stabilizer codes and particle-hole transformed ground states of free fermion systems, although untransformed clones can also be generated. The protocol leverages entanglement to simulate a transformed copy of the Hamiltonian without having to physically implement it and can potentially be realized in superconducting qubits or ultracold atomic systems.

  20. Transformer core modeling for magnetizing inrush current investigation

    Directory of Open Access Journals (Sweden)

    A.Yahiou

    2014-03-01

    The inrush currents generated during energization of a power transformer can reach very high values and may cause many problems in the power system. This magnetizing inrush current, which occurs at the time of energization of a transformer, is due to temporary overfluxing in the transformer core. Its magnitude mainly depends on switching parameters such as the resistance of the primary winding and the point on the voltage wave (switching angle). This paper describes a system for measuring the inrush current which is composed principally of an acquisition card (EAGLE) and LabVIEW code. The system is also capable of presetting various combinations of switching parameters for the energization of a 2 kVA transformer via an electronic card. Moreover, an algorithm for calculating the saturation curve is presented, taking the iron core reactive losses into account, thereby producing a nonlinear inductance. This curve is used to simulate the magnetizing inrush current using the ATP-EMTP software.
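
    As a simplified sketch of how a saturation (flux-current) curve can be extracted from recorded waveforms, the flux linkage can be obtained by integrating the voltage minus the resistive drop and pairing it with the current samples; the paper's correction for iron-core reactive losses is omitted, and the waveforms and winding resistance below are illustrative assumptions:

        import numpy as np

        def saturation_curve(v, i, r_primary, dt):
            """Return (current, flux-linkage) samples defining lambda(i)."""
            flux = np.cumsum(v - r_primary * i) * dt   # rectangle-rule integral
            flux -= flux.mean()                        # remove integration offset
            return i, flux

        fs = 10_000.0                                      # assumed sampling rate
        t = np.arange(0.0, 0.1, 1 / fs)
        v = 311 * np.sin(2 * np.pi * 50 * t)               # toy 50 Hz excitation
        i = 0.5 * np.sin(2 * np.pi * 50 * t - np.pi / 2)   # toy magnetizing current
        cur, lam = saturation_curve(v, i, r_primary=1.2, dt=1 / fs)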

  1. Merits and difficulties in adopting codes, standards and nuclear regulations

    International Nuclear Information System (INIS)

    El-Saiedi, A.F.; Morsy, S.; Mariy, A.

    1978-01-01

    Developing countries planning for introducing nuclear power plants as a source of energy have to develop or adopt sound regulatory practices. These are necessary to help governmental authorities to assess the safety of nuclear power plants and to perform inspections needed to confirm the established safe and sound limits. The first requirement is to form an independent regulatory body capable of setting up and enforcing proper safety regulations. The formation of this body is governed by several considerations related to local conditions in the developing countries, which may not always be favourable. It is quite impractical for countries with limited experience in the nuclear power field to develop their own codes, standards and regulations required for the nuclear regulatory body to perform its tasks. A practical way is to adopt codes, standards and regulations of a well-developed country. This has merits as well as drawbacks. The latter are related to problems of personnel, software, equipment and facilities. The difficulties involved in forming a nuclear regulatory body, and the merits and difficulties in adopting foreign codes, standards and regulations required for such body to perform its tasks, are discussed in this paper. Discussions are applicable to many developing countries and particular emphasis is given to the conditions and practices in Egypt. (author)

  2. Finite Countermodel Based Verification for Program Transformation (A Case Study

    Directory of Open Access Journals (Sweden)

    Alexei P. Lisitsa

    2015-12-01

    Both automatic program verification and program transformation are based on program analysis. In the past decade a number of approaches using various automatic general-purpose program transformation techniques (partial deduction, specialization, supercompilation) for verification of unreachability properties of computing systems were introduced and demonstrated. On the other hand, semantics-based unfold/fold program transformation methods pose diverse kinds of reachability tasks and try to solve them, aiming at improving the semantics tree of the program being transformed. That means some general-purpose verification methods may be used for strengthening program transformation techniques. This paper considers how the finite-countermodel method for safety verification might be used in Turchin's supercompilation. We extract a number of supercompilation sub-algorithms that try to solve reachability problems and demonstrate the use of an external countermodel finder for solving some of them.

  3. IPv4 to IPv6 Transformation Schemes

    Science.gov (United States)

    Miyakawa, Shin

    According to recent observations of the IPv4 (Internet Protocol version 4) address allocation status, the free address pool will run out within a few years. Consequently, to ensure the continuous extension of Internet operation, introducing the IPv6 (Internet Protocol version 6) protocol is surely needed. But at the same time, such a transformation must be “smooth” for every Internet user and compatible with today's IPv4-based practices. This paper describes several techniques and usage scenarios which are discussed mainly in the IETF — Internet Engineering Task Force — and have been implemented as prototype products to transform today's Internet towards an IPv6-based one.

  4. Compression and channel-coding algorithms for high-definition television signals

    Science.gov (United States)

    Alparone, Luciano; Benelli, Giuliano; Fabbri, A. F.

    1990-09-01

    In this paper, results of investigations into the effects of channel errors on the transmission of images compressed by means of techniques based on the Discrete Cosine Transform (DCT) and Vector Quantization (VQ) are presented. Since compressed images are heavily degraded by noise in the transmission channel, more seriously in the case of VQ-coded images, theoretical studies and simulations are presented in order to define and evaluate this degradation. Some channel-coding schemes are proposed in order to protect the information during transmission. Hamming codes (7,4), (15,11) and (31,26) have been used for DCT-compressed images, and more powerful codes such as the Golay (23,12) code for VQ-compressed images. Performances attainable with soft-decoding techniques are also evaluated; better-quality images have been obtained than with classical hard-decoding techniques. All tests have been carried out to simulate the transmission of a digital image from an HDTV signal over an AWGN channel with PSK modulation.
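
    As a small channel-coding illustration (the standard textbook construction, not necessarily the authors' implementation), the Hamming (7,4) code cited above encodes four data bits with three parity bits and corrects any single bit error:

        import numpy as np

        G = np.array([[1, 0, 0, 0, 1, 1, 0],   # generator: 4 data + 3 parity bits
                      [0, 1, 0, 0, 1, 0, 1],
                      [0, 0, 1, 0, 0, 1, 1],
                      [0, 0, 0, 1, 1, 1, 1]])
        H = np.array([[1, 1, 0, 1, 1, 0, 0],   # parity-check matrix matching G
                      [1, 0, 1, 1, 0, 1, 0],
                      [0, 1, 1, 1, 0, 0, 1]])

        def encode(data4):
            return data4 @ G % 2

        def correct(word7):
            syndrome = H @ word7 % 2
            if syndrome.any():
                # the syndrome equals the column of H at the flipped position
                col = np.where((H.T == syndrome).all(axis=1))[0][0]
                word7 = word7.copy()
                word7[col] ^= 1
            return word7

        word = encode(np.array([1, 0, 1, 1]))
        word[2] ^= 1                            # inject one channel error
        assert (correct(word) == encode(np.array([1, 0, 1, 1]))).all()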

  5. Transformations. I. The Effect of DAF on Sentence Generation

    Science.gov (United States)

    Salter, David

    1976-01-01

    A hypothesis based on the psycholinguistic derivation of sentences was tested. The task required that sentences temporarily stored in memory be transformed and spoken with delayed auditory feedback. Available from Plenum Publishing Corp., 227 W. 17th St., New York, NY 10011. (Author/RM)

  6. Visual perception of complex shape-transforming processes.

    Science.gov (United States)

    Schmidt, Filipp; Fleming, Roland W

    2016-11-01

    Morphogenesis-or the origin of complex natural form-has long fascinated researchers from practically every branch of science. However, we know practically nothing about how we perceive and understand such processes. Here, we measured how observers visually infer shape-transforming processes. Participants viewed pairs of objects ('before' and 'after' a transformation) and identified points that corresponded across the transformation. This allowed us to map out in spatial detail how perceived shape and space were affected by the transformations. Participants' responses were strikingly accurate and mutually consistent for a wide range of non-rigid transformations including complex growth-like processes. A zero-free-parameter model based on matching and interpolating/extrapolating the positions of high-salience contour features predicts the data surprisingly well, suggesting observers infer spatial correspondences relative to key landmarks. Together, our findings reveal the operation of specific perceptual organization processes that make us remarkably adept at identifying correspondences across complex shape-transforming processes by using salient object features. We suggest that these abilities, which allow us to parse and interpret the causally significant features of shapes, are invaluable for many tasks that involve 'making sense' of shape. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  7. Finite element simulation of piezoelectric transformers.

    Science.gov (United States)

    Tsuchiya, T; Kagawa, Y; Wakatsuki, N; Okamura, H

    2001-07-01

    Piezoelectric transformers are nothing but ultrasonic resonators with two pairs of electrodes provided on the surface of a piezoelectric substrate in which electrical energy is carried in the mechanical form. The input and output electrodes are arranged to provide the impedance transformation, which results in the voltage transformation. As they are operated at a resonance, the electrical equivalent circuit approach has traditionally been developed in a rather empirical way and has been used for analysis and design. The present paper deals with the analysis of piezoelectric transformers based on three-dimensional finite element modelling. The PIEZO3D code that we have developed is modified to include the external loading conditions. The finite element approach is now available for a wide variety of electrical boundary conditions. The equivalent circuit of lumped parameters can also be derived from the finite element method (FEM) solution if required. The simulation of the present transformers is made for low intensity operation and compared with the experimental results. Demonstration is made for basic Rosen-type transformers, in which the longitudinal mode of a plate plays an important role and for which the equivalent circuit of lumped constants has been used. However, there are many modes of vibration associated with the plate, the effect of which cannot always be ignored. In the experiment, double resonances are sometimes observed in the vicinity of the operating frequency. The simulation demonstrates that this is due to the coupling of the longitudinal mode with the flexural mode. Thus, the simulation provides an invaluable guideline to the transformer design.

  8. Machine-Checked Sequencer for Critical Embedded Code Generator

    Science.gov (United States)

    Izerrouken, Nassima; Pantel, Marc; Thirioux, Xavier

    This paper presents the development of a correct-by-construction block sequencer for GeneAuto, a qualifiable (according to the DO178B/ED12B recommendation) automatic code generator that transforms Simulink models into MISRA C code for safety-critical systems. Our approach, which combines a classical development process with formal specification and verification using proof assistants, led to preliminary fruitful exchanges with certification authorities. We present parts of the classical user and tool requirements and the derived formal specifications, implementation and verification for the correctness and termination of the block sequencer. This sequencer has been successfully applied to real-size industrial use cases from various transportation-domain partners, leading to the detection of requirement errors and to a correct-by-construction implementation.

  9. Verification of RESRAD-build computer code, version 3.1

    International Nuclear Information System (INIS)

    2003-01-01

    RESRAD-BUILD is a computer model for analyzing the radiological doses resulting from the remediation and occupancy of buildings contaminated with radioactive material. It is part of a family of codes that includes RESRAD, RESRAD-CHEM, RESRAD-RECYCLE, RESRAD-BASELINE, and RESRAD-ECORISK. The RESRAD-BUILD models were developed and codified by Argonne National Laboratory (ANL); version 1.5 of the code and the user's manual were publicly released in 1994. The original version of the code was written for the Microsoft DOS operating system, while subsequent versions were written for the Microsoft Windows operating system. The purpose of the present verification task (which includes validation as defined in the standard) is to provide an independent review of the latest version of RESRAD-BUILD under the guidance provided by ANSI/ANS-10.4 for verification and validation of existing computer programs. This approach consists of an a posteriori V and V review which takes advantage of available program development products as well as user experience. The purpose, as specified in ANSI/ANS-10.4, is to determine whether the program produces valid responses when used to analyze problems within a specific domain of applications, and to document the level of verification. The culmination of these efforts is the production of this formal Verification Report. The first step in performing the verification of an existing program was the preparation of a Verification Review Plan. The review plan consisted of identifying: reason(s) why a posteriori verification is to be performed; scope and objectives for the level of verification selected; development products to be used for the review; availability and use of user experience; and actions to be taken to supplement missing or unavailable development products. The purpose, scope and objectives for the level of verification selected are described in this section of the Verification Report. The development products that were used

  10. Full-frame compression of discrete wavelet and cosine transforms

    Science.gov (United States)

    Lo, Shih-Chung B.; Li, Huai; Krasner, Brian; Freedman, Matthew T.; Mun, Seong K.

    1995-04-01

    At the foreground of computerized radiology and the filmless hospital are the possibilities for easy image retrieval, efficient storage, and rapid image communication. This paper represents the authors' continuing efforts in compression research on the full-frame discrete wavelet transform (FFDWT) and the full-frame discrete cosine transform (FFDCT) for medical image compression. Prior to the coding, it is important to evaluate the global entropy in the decomposed space, because maximum compression efficiency is achieved at minimum entropy. In this study, each image was split into a top three most significant bit (MSB) image and a remaining remapped least significant bit (RLSB) image. The 3MSB image was compressed by an error-free contour coding and achieved an average of 0.1 bit/pixel. The RLSB image was transformed to either a multi-channel wavelet or the cosine transform domain for entropy evaluation. Ten x-ray chest radiographs and ten mammograms were randomly selected from our clinical database and were used for the study. Our results indicated that the coding scheme in the FFDCT domain performed better than in the FFDWT domain for high-resolution digital chest radiographs and mammograms. From this study, we found that decomposition efficiency in the DCT domain for relatively smooth images is higher than that in the DWT. However, both schemes worked just as well for low resolution digital images. We also found that the image characteristics of the 'Lena' image commonly used in the compression literature are very different from those of radiological images. The compression outcome of radiological images cannot be extrapolated from compression results based on 'Lena.'
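
    As a hedged sketch of the bit-plane split described above (assuming 12-bit pixels; the paper's remapping of the least significant bits is not reproduced), the top three MSB planes go to the contour-coded image and the remaining planes form the image that feeds the transform stage:

        import numpy as np

        def split_planes(img12bit):
            msb3 = img12bit >> 9        # top 3 most significant bit planes (0..7)
            lsb9 = img12bit & 0x1FF     # remaining 9 least significant bit planes
            return msb3, lsb9

        img = np.random.default_rng(2).integers(0, 4096, (8, 8), dtype=np.uint16)
        msb3, lsb9 = split_planes(img)
        assert (((msb3.astype(np.uint16) << 9) | lsb9) == img).all()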

  11. Comparisons of coded aperture imaging using various apertures and decoding methods

    International Nuclear Information System (INIS)

    Chang, L.T.; Macdonald, B.; Perez-Mendez, V.

    1976-07-01

    The utility of coded aperture γ camera imaging of radioisotope distributions in Nuclear Medicine is in its ability to give depth information about a three dimensional source. We have calculated imaging with Fresnel zone plate and multiple pinhole apertures to produce coded shadows and reconstruction of these shadows using correlation, Fresnel diffraction, and Fourier transform deconvolution. Comparisons of the coded apertures and decoding methods are made by evaluating their point response functions both for in-focus and out-of-focus image planes. Background averages and standard deviations were calculated. In some cases, background subtraction was made using combinations of two complementary apertures. Results using deconvolution reconstruction for finite numbers of events are also given

  12. IEA Wind Task 23 Offshore Wind Technology and Deployment. Subtask 1 Experience with Critical Deployment Issues. Final Technical Report

    DEFF Research Database (Denmark)

    Lemming, Jørgen Kjærgaard

    The final report for IEA Wind Task 23, Offshore Wind Energy Technology and Deployment, is made up of two separate reports: Subtask 1: Experience with Critical Deployment Issues and Subtask 2: Offshore Code Comparison Collaborative (OC3). The Subtask 1 report included here provides background information and objectives of Task 23. It specifically discusses ecological issues and regulation, electrical system integration and offshore wind, external conditions, and key conclusions for Subtask 1. The Subtask 2 report covers OC3 background information and objectives of the task, OC3 benchmark exercises of aero-elastic offshore wind turbine codes, monopile foundation modeling, tripod support structure modeling, and Phase IV results regarding floating wind turbine modeling.

  13. Verification of aero-elastic offshore wind turbine design codes under IEA Wind Task XXIII

    DEFF Research Database (Denmark)

    Vorpahl, Fabian; Strobel, Michael; Jonkman, Jason M.

    2014-01-01

    ... with the incident waves, sea current, hydrodynamics and foundation dynamics of the support structure. A large set of time series simulation results such as turbine operational characteristics, external conditions, and load and displacement outputs was compared and interpreted. Load cases were defined and run ... to differences in the model fidelity, aerodynamic implementation, hydrodynamic load discretization and numerical difficulties within the codes. The comparisons resulted in a more thorough understanding of the modeling techniques and better knowledge of when various approximations are not valid. More importantly ... is to summarize the lessons learned and present results that code developers can compare to. The set of benchmark load cases defined and simulated during the course of this project—the raw data for this paper—is available to the offshore wind turbine simulation community and is already being used for testing ...

  14. Los Alamos and Lawrence Livermore National Laboratories Code-to-Code Comparison of Inter Lab Test Problem 1 for Asteroid Impact Hazard Mitigation

    Energy Technology Data Exchange (ETDEWEB)

    Weaver, Robert P. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Miller, Paul [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Howley, Kirsten [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ferguson, Jim Michael [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Gisler, Galen Ross [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Plesko, Catherine Suzanne [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Managan, Rob [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Owen, Mike [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Wasem, Joseph [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Bruck-Syal, Megan [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-01-15

    The NNSA Laboratories have entered into an interagency collaboration with the National Aeronautics and Space Administration (NASA) to explore strategies for prevention of Earth impacts by asteroids. Assessment of such strategies relies upon use of sophisticated multi-physics simulation codes. This document describes the task of verifying and cross-validating, between Lawrence Livermore National Laboratory (LLNL) and Los Alamos National Laboratory (LANL), modeling capabilities and methods to be employed as part of the NNSA-NASA collaboration. The approach has been to develop a set of test problems and then to compare and contrast results obtained by use of a suite of codes, including MCNP, RAGE, Mercury, Ares, and Spheral. This document provides a short description of the codes, an overview of the idealized test problems, and discussion of the results for deflection by kinetic impactors and stand-off nuclear explosions.

  15. Prodeto, a computer code for probabilistic fatigue design

    Energy Technology Data Exchange (ETDEWEB)

    Braam, H [ECN-Solar and Wind Energy, Petten (Netherlands); Christensen, C J; Thoegersen, M L [Risoe National Lab., Roskilde (Denmark); Ronold, K O [Det Norske Veritas, Hoevik (Norway)

    1999-03-01

    A computer code for structural reliability analyses of wind turbine rotor blades subjected to fatigue loading is presented. With pre-processors that can transform measured and theoretically predicted load series into load range distributions by rain-flow counting, and with a family of generic distribution models for parametric representation of these distributions, this computer program is available for carrying through probabilistic fatigue analyses of rotor blades. (au)
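
    As a hedged sketch of the rain-flow pre-processing mentioned above, a load series can be reduced to its turning points and closed cycles extracted with the classic three-point (stack) rule; the half-cycle bookkeeping of the full ASTM E1049 procedure is simplified away here:

        import numpy as np

        def turning_points(series):
            """Keep only the local extrema of the load series."""
            d = np.diff(series)
            keep = np.where(d[1:] * d[:-1] < 0)[0] + 1
            return np.concatenate(([series[0]], series[keep], [series[-1]]))

        def rainflow_ranges(series):
            """Extract closed-cycle load ranges with the three-point rule."""
            stack, ranges = [], []
            for p in turning_points(series):
                stack.append(p)
                while len(stack) >= 3:
                    x = abs(stack[-1] - stack[-2])
                    y = abs(stack[-2] - stack[-3])
                    if x < y:
                        break
                    ranges.append(y)      # closed cycle of range y
                    del stack[-3:-1]      # drop the two points forming it
            return ranges                 # residual half cycles ignored here

        load = np.array([0, 5, -3, 4, -2, 6, -4, 2, 0], dtype=float)
        print(rainflow_ranges(load))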

  16. Operation REDWING Commander Task Group 7.3, Operation Plan Number 1-56.

    Science.gov (United States)

    1956-01-24

    TABS: A - Radio Circuit Description; B - Radio Frequency Plan; C - Aircraft Radio Frequency List; D - Radio Circuit Diagrams. Appendix 2 - Radio Call Signs and Code Words. Joint Task Force SEVEN Task

  17. Do you write secure code?

    CERN Multimedia

    Computer Security Team

    2011-01-01

    At CERN, we are excellent at producing software, such as complex analysis jobs, sophisticated control programs, extensive monitoring tools, interactive web applications, etc. This software is usually highly functional, and fulfils the needs and requirements as defined by its author. However, due to time constraints or unintentional ignorance, security aspects are often neglected. Subsequently, it was even more embarrassing for the author to find out that his code was flawed and was used to break into CERN computers, web pages or to steal data…   Thus, if you have the pleasure or task of producing software applications, take some time beforehand to familiarize yourself with good programming practices. They should not only prevent basic security flaws in your code, but also improve its readability, maintainability and efficiency. Basic rules for good programming, as well as essential books on proper software development, can be found in the section for software developers on our security we...

  18. Processing moldable tasks on the grid: Late job binding with lightweight user-level overlay

    CERN Document Server

    Moscicki, J T; Sloot, P M A; Lamanna, M

    2011-01-01

    Independent observations and everyday user experience indicate that performance and reliability of large grid infrastructures may suffer from large and unpredictable variations. In this paper we study the impact of the job queuing time on the processing of moldable tasks, which are commonly found in large-scale production grids. We use the mean value and variance of the makespan as the quality-of-service indicators. We develop a general task processing model to provide a quantitative comparison between two models: early and late job binding in a user-level overlay applied to the EGEE Grid infrastructure. We find that the late-binding model effectively defines a transformation of the distribution of makespan according to the Central Limit Theorem. As demonstrated by Monte Carlo simulations using real job traces, this transformation allows a substantial reduction of the mean value and variance of the makespan. For certain classes of applications task granularity may be adjusted such that a speedup of an order of magnitude or m...
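
    The queueing distributions below are illustrative stand-ins, not the real job traces used in the paper, but a small Monte Carlo sketch shows the effect: with late binding, tasks are pulled by whichever pilot job becomes free first, so the makespan no longer waits for the slowest-queued job:

        import heapq
        import numpy as np

        rng = np.random.default_rng(3)
        n_tasks, n_workers, trials, exec_t = 200, 50, 300, 60.0

        def early_binding():
            # one grid job per task; makespan is set by the slowest-queued job
            queue = rng.lognormal(mean=5.0, sigma=1.0, size=n_tasks)
            return (queue + exec_t).max()

        def late_binding():
            # pilots queue once; each task goes to the earliest free pilot
            pilots = list(rng.lognormal(mean=5.0, sigma=1.0, size=n_workers))
            heapq.heapify(pilots)
            finish = 0.0
            for _ in range(n_tasks):
                t = heapq.heappop(pilots) + exec_t
                finish = max(finish, t)
                heapq.heappush(pilots, t)
            return finish

        for strategy in (early_binding, late_binding):
            m = np.array([strategy() for _ in range(trials)])
            print(f"{strategy.__name__}: mean={m.mean():.0f}s, std={m.std():.0f}s")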

  19. Criticality qualification of a new Monte Carlo code for reactor core analysis

    International Nuclear Information System (INIS)

    Catsaros, N.; Gaveau, B.; Jaekel, M.; Maillard, J.; Maurel, G.; Savva, P.; Silva, J.; Varvayanni, M.; Zisis, Th.

    2009-01-01

    In order to accurately simulate Accelerator Driven Systems (ADS), the utilization of at least two computational tools is necessary (the thermal-hydraulic problem is not considered in the frame of this work), namely: (a) A High Energy Physics (HEP) code system dealing with the 'Accelerator part' of the installation, i.e. the computation of the spectrum, intensity and spatial distribution of the neutrons source created by (p, n) reactions of a proton beam on a target and (b) a neutronics code system, handling the 'Reactor part' of the installation, i.e. criticality calculations, neutron transport, fuel burn-up and fission products evolution. In the present work, a single computational tool, aiming to analyze an ADS in its integrity and also able to perform core analysis for a conventional fission reactor, is proposed. The code is based on the well qualified HEP code GEANT (version 3), transformed to perform criticality calculations. The performance of the code is tested against two qualified neutronics code systems, the diffusion/transport SCALE-CITATION code system and the Monte Carlo TRIPOLI code, in the case of a research reactor core analysis. A satisfactory agreement was exhibited by the three codes.

  20. Emotion regulation and conflict transformation in multi-team systems

    NARCIS (Netherlands)

    Curseu, P.L.; Meeus, M.T.H.

    2014-01-01

    Purpose - The aim of this paper is to test the moderating role of emotion regulation in the transformation of both task and process conflict into relationship conflict. Design/methodology/approach - A field study of multi-team systems, in which (94) respondents are engaged in interpersonal and

  1. Fiber Bragg grating for spectral phase optical code-division multiple-access encoding and decoding

    Science.gov (United States)

    Fang, Xiaohui; Wang, Dong-Ning; Li, Shichen

    2003-08-01

    A new method for realizing spectral phase optical code-division multiple-access (OCDMA) coding based on step chirped fiber Bragg gratings (SCFBGs) is proposed and the corresponding encoder/decoder is presented. With this method, a mapping code is introduced for the m-sequence address code and the phase shift can be inserted into the subgratings of the SCFBG according to the mapping code. The transfer matrix method together with Fourier transform is used to investigate the characteristics of the encoder/decoder. The factors that influence the correlation property of the encoder/decoder, including index modulation and bandwidth of the subgrating, are identified. The system structure is simple and good correlation output can be obtained. The performance of the OCDMA system based on SCFBGs has been analyzed.

  2. Time-domain modeling of electromagnetic diffusion with a frequency-domain code

    NARCIS (Netherlands)

    Mulder, W.A.; Wirianto, M.; Slob, E.C.

    2007-01-01

    We modeled time-domain EM measurements of induction currents for marine and land applications with a frequency-domain code. An analysis of the computational complexity of a number of numerical methods shows that frequency-domain modeling followed by a Fourier transform is an attractive choice if a

  3. Quantum decoration transformation for spin models

    Energy Technology Data Exchange (ETDEWEB)

    Braz, F.F.; Rodrigues, F.C.; Souza, S.M. de; Rojas, Onofre, E-mail: ors@dfi.ufla.br

    2016-09-15

    The extension of the decoration transformation to quantum spin models is quite relevant, since most real materials can be well described by Heisenberg-type models. Here we propose an exact quantum decoration transformation and also show interesting properties such as the persistence of symmetry and symmetry breaking during this transformation. The proposed transformation, in principle, cannot be used to map a quantum spin lattice model exactly into another quantum spin lattice model, since the operators are non-commutative. However, the mapping is possible in the “classical” limit, establishing an equivalence between both quantum spin lattice models. To study the validity of this approach for quantum spin lattice models, we use the Zassenhaus formula, and we verify how the correction could influence the decoration transformation. This correction could be useless for improving the quantum decoration transformation, because it involves second-nearest-neighbor and further-neighbor couplings, which makes it a cumbersome task to establish the equivalence between both lattice models. The correction also gives us valuable information about its contribution: for most Heisenberg-type models it could be irrelevant, at least up to the third-order term of the Zassenhaus formula. The transformation is applied to a finite-size Heisenberg chain and compared with exact numerical results; our result is consistent for weak xy-anisotropy coupling. We also apply it to the bond-alternating Ising–Heisenberg chain model, obtaining an accurate result in the limit of the quasi-Ising chain.

  4. Quantum decoration transformation for spin models

    International Nuclear Information System (INIS)

    Braz, F.F.; Rodrigues, F.C.; Souza, S.M. de; Rojas, Onofre

    2016-01-01

    The extension of the decoration transformation to quantum spin models is quite relevant, since most real materials can be well described by Heisenberg-type models. Here we propose an exact quantum decoration transformation and also show interesting properties such as the persistence of symmetry and symmetry breaking during this transformation. The proposed transformation, in principle, cannot be used to map a quantum spin lattice model exactly into another quantum spin lattice model, since the operators are non-commutative. However, the mapping is possible in the “classical” limit, establishing an equivalence between both quantum spin lattice models. To study the validity of this approach for quantum spin lattice models, we use the Zassenhaus formula, and we verify how the correction could influence the decoration transformation. This correction could be useless for improving the quantum decoration transformation, because it involves second-nearest-neighbor and further-neighbor couplings, which makes it a cumbersome task to establish the equivalence between both lattice models. The correction also gives us valuable information about its contribution: for most Heisenberg-type models it could be irrelevant, at least up to the third-order term of the Zassenhaus formula. The transformation is applied to a finite-size Heisenberg chain and compared with exact numerical results; our result is consistent for weak xy-anisotropy coupling. We also apply it to the bond-alternating Ising–Heisenberg chain model, obtaining an accurate result in the limit of the quasi-Ising chain.

  5. Program Transformation to Identify List-Based Parallel Skeletons

    Directory of Open Access Journals (Sweden)

    Venkatesh Kannan

    2016-07-01

    Algorithmic skeletons are used as building-blocks to ease the task of parallel programming by abstracting the details of parallel implementation from the developer. Most existing libraries provide implementations of skeletons that are defined over flat data types such as lists or arrays. However, skeleton-based parallel programming is still very challenging as it requires intricate analysis of the underlying algorithm and often uses inefficient intermediate data structures. Further, the algorithmic structure of a given program may not match those of list-based skeletons. In this paper, we present a method to automatically transform any given program to one that is defined over a list and is more likely to contain instances of list-based skeletons. This facilitates the parallel execution of a transformed program using existing implementations of list-based parallel skeletons. Further, by using an existing transformation called distillation in conjunction with our method, we produce transformed programs that contain fewer inefficient intermediate data structures.
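
    A toy illustration of the payoff (not the paper's transformation algorithm): once a computation over a nested structure is re-expressed over a flat list, it matches list-based skeletons such as map and reduce, for which parallel implementations exist off the shelf:

        from functools import reduce

        # Original: recursion over a nested structure, no skeleton instance.
        def sum_squares(tree):
            if isinstance(tree, list):
                return sum(sum_squares(t) for t in tree)
            return tree * tree

        # Transformed: flatten to a list, then apply map/reduce skeletons.
        def flatten(tree):
            if isinstance(tree, list):
                return [x for t in tree for x in flatten(t)]
            return [tree]

        tree = [1, [2, 3], [[4], 5]]
        assert sum_squares(tree) == reduce(lambda a, b: a + b,
                                           map(lambda x: x * x, flatten(tree)))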

  6. Trading speed and accuracy by coding time: a coupled-circuit cortical model.

    Directory of Open Access Journals (Sweden)

    Dominic Standage

    2013-04-01

    Full Text Available Our actions take place in space and time, but despite the role of time in decision theory and the growing acknowledgement that the encoding of time is crucial to behaviour, few studies have considered the interactions between neural codes for objects in space and for elapsed time during perceptual decisions. The speed-accuracy trade-off (SAT) provides a window into spatiotemporal interactions. Our hypothesis is that temporal coding determines the rate at which spatial evidence is integrated, controlling the SAT by gain modulation. Here, we propose that local cortical circuits are inherently suited to the relevant spatial and temporal coding. In simulations of an interval estimation task, we use a generic local-circuit model to encode time by 'climbing' activity, seen in cortex during tasks with a timing requirement. The model is a network of simulated pyramidal cells and inhibitory interneurons, connected by conductance synapses. A simple learning rule enables the network to quickly produce new interval estimates, which show signature characteristics of estimates by experimental subjects. Analysis of network dynamics formally characterizes this generic, local-circuit timing mechanism. In simulations of a perceptual decision task, we couple two such networks. Network function is determined only by spatial selectivity and NMDA receptor conductance strength; all other parameters are identical. To trade speed and accuracy, the timing network simply learns longer or shorter intervals, driving the rate of downstream decision processing by spatially non-selective input, an established form of gain modulation. Like the timing network's interval estimates, decision times show signature characteristics of those by experimental subjects. Overall, we propose, demonstrate and analyse a generic mechanism for timing, a generic mechanism for modulation of decision processing by temporal codes, and we make predictions for experimental verification.

  7. Depth distribution analysis of martensitic transformations in Xe implanted austenitic stainless steel

    International Nuclear Information System (INIS)

    Johnson, E.; Johansen, A.; Sarholt-Kristensen, L.; Chechenin, N.G.; Grabaek, L.; Bohr, J.

    1988-01-01

    In this work we present results from a depth distribution analysis of the martensitic phase change occurring in Xe implanted single crystals of austenitic stainless steel. Analysis was done by 'in situ' RBS/channeling analysis, X-ray diffraction and cross-section transmission electron microscopy (XTEM) of the implanted surface. It is found that the martensitic transformation of the surface layer occurs for fluences above 1x10^20 m^-2. The thickness of the transformed layer increases with fluence to ≅150 nm at 1x10^21 m^-2, which far exceeds the range plus straggling of the implanted Xe as calculated by the TRIM computer simulation code. Simulations using the MARLOWE code indicate that the thickness of the transformed layer coincides with the range of the small fraction of ions channeled under random implantation conditions. Using cross-sectional TEM on the Xe implanted crystals, the depth distribution of gas inclusions and defects can be directly observed. Using X-ray diffraction on implanted single crystals, the solid epitaxial nature of the Xe inclusions, induced prior to the martensitic transformation, was established. The lattice constant obtained from the broad diffraction peak indicates that the pressure in the inclusions is ≅5 GPa. (orig./BHO)

  8. National Society of Genetic Counselors Code of Ethics: Explication of 2017 Revisions.

    Science.gov (United States)

    Senter, Leigha; Bennett, Robin L; Madeo, Anne C; Noblin, Sarah; Ormond, Kelly E; Schneider, Kami Wolfe; Swan, Kelli; Virani, Alice

    2018-02-01

    The Code of Ethics (COE) of the National Society of Genetic Counselors (NSGC) was adopted in 1992 and was later revised and adopted in 2006. In 2016, the NSGC Code of Ethics Review Task Force (COERTF) was convened to review the COE. The COERTF reviewed ethical codes written by other professional organizations and suggested changes that would better reflect the current and evolving nature of the genetic counseling profession. The COERTF received input from the society's legal counsel, Board of Directors, and members-at-large. A revised COE was proposed to the membership and approved and adopted in April 2017. The revisions and rationale for each are presented.

  9. Design of time-pulse coded optoelectronic neuronal elements for nonlinear transformation and integration

    Science.gov (United States)

    Krasilenko, Vladimir G.; Nikolsky, Alexander I.; Lazarev, Alexander A.; Lazareva, Maria V.

    2008-03-01

    In this paper we show the relevance of neurophysiologically motivated neuron arrays with flexibly programmable functions and operations, with the possibility of selecting the required accuracy and type of nonlinear transformation and learning. We consider the neuron design and simulation results for multichannel spatio-temporal algebraic accumulation and integration of optical signals, and show the advantages for nonlinear transformation and summation-integration. The proposed circuits are simple and can have intelligent properties such as learning and adaptation. The integrator-neuron is based on CMOS current mirrors and comparators. Performance figures: power consumption 100...500 μW, signal period 0.1...1 ms, input optical signal power 0.2...20 μW, time delays below 1 μs, number of optical signals 2...10, integration time 10...100 signal periods, accuracy (integration error) about 1%. Various modifications of the neuron-integrators, with improved performance and for different applications, are considered in the paper.

  10. Novel Iris Biometric Watermarking Based on Singular Value Decomposition and Discrete Cosine Transform

    Directory of Open Access Journals (Sweden)

    Jinyu Lu

    2014-01-01

    Full Text Available A novel iris biometric watermarking scheme is proposed that focuses on iris recognition instead of a traditional watermark, increasing the security of digital products. The iris image is first preprocessed to generate an iris biometric template from a person's eye images. The template is then passed through the discrete cosine transform, and the resulting coefficients are encoded with BCH error-control coding. The host image is correspondingly divided into four equal areas, and the BCH codes are embedded in the singular values of each area's coefficients, obtained through the discrete cosine transform (DCT). Numerical results reveal that the proposed method can extract the watermark effectively and illustrate its security and robustness.
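
    A simplified numerical sketch of the embedding step (our illustration of the general DCT/SVD idea under assumed parameters, not the authors' implementation, and omitting the iris template and BCH stages):

      # Hedged sketch: embed bits into the singular values of a host
      # block's DCT coefficients; alpha is an illustrative strength.
      import numpy as np
      from scipy.fft import dctn, idctn

      def embed(block, bits, alpha=2.0):
          coeffs = dctn(block, norm="ortho")      # 2-D DCT of the host block
          u, s, vt = np.linalg.svd(coeffs)        # SVD of the coefficients
          n = min(len(bits), len(s))
          s[:n] += alpha * np.asarray(bits[:n], dtype=float)  # additive marking
          return idctn(u @ np.diag(s) @ vt, norm="ortho")     # back to pixels

      host = np.random.rand(8, 8) * 255.0
      marked = embed(host, bits=[1, 0, 1, 1])
      print(np.abs(marked - host).max())          # embedding distortion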

  11. Simulations of linear and Hamming codes using SageMath

    Science.gov (United States)

    Timur, Tahta D.; Adzkiya, Dieky; Soleha

    2018-03-01

    Digital data transmission over a noisy channel can distort the message being transmitted. The goal of coding theory is to ensure data integrity, that is, to find out if and where this noise has distorted the message and what the original message was. Data transmission consists of three stages: encoding, transmission, and decoding. Linear and Hamming codes are the codes discussed in this work; the encoding algorithms use the parity-check and generator matrices, and the decoding algorithms are nearest-neighbor and syndrome decoding. We aim to show that these processes can be simulated using the SageMath software, which has built-in classes for coding theory in general and linear codes in particular. First we consider the message as a binary vector of size k. This message is then encoded into a vector of size n using the given algorithms. A noisy channel with a particular error probability is then created, where the transmission takes place. The last task is decoding, which corrects and reverts the received message back to the original message whenever possible, that is, whenever the number of errors that occurred is smaller than or equal to the correcting radius of the code. In this paper we use two types of data for the simulations, namely vector and text data.
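
    The same encode/transmit/decode pipeline can be reproduced outside SageMath; the following plain-Python sketch (our illustration, not the authors' Sage session) runs the classic [7,4] Hamming code through a single-bit-error channel and corrects the error by syndrome decoding:

      # Minimal [7,4] Hamming code: generator-matrix encoding and
      # syndrome decoding over GF(2), illustrating the three stages above.
      import numpy as np

      G = np.array([[1,0,0,0,1,1,0],   # generator matrix (systematic form)
                    [0,1,0,0,1,0,1],
                    [0,0,1,0,0,1,1],
                    [0,0,0,1,1,1,1]])
      H = np.array([[1,1,0,1,1,0,0],   # parity-check matrix, G @ H.T = 0 mod 2
                    [1,0,1,1,0,1,0],
                    [0,1,1,1,0,0,1]])

      msg = np.array([1, 0, 1, 1])
      codeword = msg @ G % 2

      received = codeword.copy()
      received[2] ^= 1                 # the "noisy channel": flip one bit

      syndrome = received @ H.T % 2    # a nonzero syndrome locates the error
      for pos in range(7):
          if np.array_equal(syndrome, H[:, pos]):
              received[pos] ^= 1       # correct the flipped bit
      print(received[:4])              # recovered message: [1 0 1 1]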

  12. Parallel iterative decoding of transform domain Wyner-Ziv video using cross bitplane correlation

    DEFF Research Database (Denmark)

    Luong, Huynh Van; Huang, Xin; Forchhammer, Søren

    2011-01-01

    In recent years, Transform Domain Wyner-Ziv (TDWZ) video coding has been proposed as an efficient Distributed Video Coding (DVC) solution, which fully or partly exploits the source statistics at the decoder to reduce the computational burden at the encoder. In this paper, a parallel iterative LDPC decoding scheme is proposed to improve the coding efficiency of TDWZ video codecs. The proposed parallel iterative LDPC decoding scheme is able to utilize cross bitplane correlation during decoding, by iteratively refining the soft-input, updating a modeled noise distribution and thereafter enhancing...

  13. Developmental Transformations Art Therapy: An Embodied, Interactional Approach

    Science.gov (United States)

    Rosen, Marni; Pitre, Renée; Johnson, David Read

    2016-01-01

    A new method of art therapy is described, based on Developmental Transformations, in which the therapist participates in joint art making with a client. The therapist's task is to present a graduated set of interpersonal demands on the client through the artwork, helping the client find adaptive responses to accommodations required by others, as…

  14. Study plan for the sensitivity analysis of the Terrain-Responsive Atmospheric Code (TRAC)

    International Nuclear Information System (INIS)

    Restrepo, L.F.; Deitesfeld, C.A.

    1987-01-01

    Rocky Flats Plant, Golden, Colorado is presently developing a computer code to model the dispersion of potential or actual releases of radioactive or toxic materials to the environment, along with the public consequences from these releases. The model, the Terrain-Responsive Atmospheric Code (TRAC), considers several complex features which could affect the overall dispersion and consequences. To help validate TRAC, a sensitivity analysis is being planned to determine how sensitive the model's solutions are to input variables. This report contains a brief description of the code, along with a list of tasks and resources needed to complete the sensitivity analysis

  15. The Unified English Braille Code: Examination by Science, Mathematics, and Computer Science Technical Expert Braille Readers

    Science.gov (United States)

    Holbrook, M. Cay; MacCuspie, P. Ann

    2010-01-01

    Braille-reading mathematicians, scientists, and computer scientists were asked to examine the usability of the Unified English Braille Code (UEB) for technical materials. They had little knowledge of the code prior to the study. The research included two reading tasks, a short tutorial about UEB, and a focus group. The results indicated that the…

  16. DYMEL code for prediction of dynamic stability limits in boilers

    International Nuclear Information System (INIS)

    Deam, R.T.

    1980-01-01

    Theoretical and experimental studies of Hydrodynamic Instability in boilers were undertaken to resolve the uncertainties of the existing predictive methods at the time the first Advanced Gas Cooled Reactor (AGR) plant was commissioned. The experiments were conducted on a full scale electrical simulation of an AGR boiler and revealed inadequacies in existing methods. As a result a new computer code called DYMEL was developed based on linearisation and Fourier/Laplace Transformation of the one-dimensional boiler equations in both time and space. Besides giving good agreement with local experimental data, the DYMEL code has since shown agreement with stability data from the plant, sodium heated helical tubes, a gas heated helical tube and an electrically heated U-tube. The code is now used widely within the U.K. (author)

  17. Coding Local and Global Binary Visual Features Extracted From Video Sequences

    Science.gov (United States)

    Baroffio, Luca; Canclini, Antonio; Cesana, Matteo; Redondi, Alessandro; Tagliasacchi, Marco; Tubaro, Stefano

    2015-11-01

    Binary local features represent an effective alternative to real-valued descriptors, leading to comparable results for many visual analysis tasks, while being characterized by significantly lower computational complexity and memory requirements. When dealing with large collections, a more compact representation based on global features is often preferred, which can be obtained from local features by means of, e.g., the Bag-of-Visual-Word (BoVW) model. Several applications, including for example visual sensor networks and mobile augmented reality, require visual features to be transmitted over a bandwidth-limited network, thus calling for coding techniques that aim at reducing the required bit budget, while attaining a target level of efficiency. In this paper we investigate a coding scheme tailored to both local and global binary features, which aims at exploiting both spatial and temporal redundancy by means of intra- and inter-frame coding. In this respect, the proposed coding scheme can be conveniently adopted to support the Analyze-Then-Compress (ATC) paradigm. That is, visual features are extracted from the acquired content, encoded at remote nodes, and finally transmitted to a central controller that performs visual analysis. This is in contrast with the traditional approach, in which visual content is acquired at a node, compressed and then sent to a central unit for further processing, according to the Compress-Then-Analyze (CTA) paradigm. In this paper we experimentally compare ATC and CTA by means of rate-efficiency curves in the context of two different visual analysis tasks: homography estimation and content-based retrieval. Our results show that the novel ATC paradigm based on the proposed coding primitives can be competitive with CTA, especially in bandwidth limited scenarios.

  18. The determination of frequency response function of the RSG Gas by laplace transform analysis

    International Nuclear Information System (INIS)

    Tukiran, S.; Surian, P.; Jujuratisbela, U.

    1997-01-01

    The response of the RSG-GAS reactor system to reactivity perturbations needs to be analyzed because of its interrelation with the reliability and safety of reactor operation. The response depends on the power frequency response function H(s), while H(s) depends on the zero-power frequency response function Z(s) and the dynamic power coefficient of reactivity Kp(s). Determination of the frequency response function of the RSG-GAS reactor was done by the Fourier transform analysis method: Z(s) was obtained by transforming P(t) and Cj(t) in the point kinetics equations into P(s) and Cj(s). The second-order Simpson rule was used for the numerical integration. The LTMPR (Laplace Transform for Multipurpose Reactor) code was then written in the FORTRAN 77 language on a VAX 8550 system. The LTMPR code is able to determine the frequency response function and the period-reactivity relation of the RSG-GAS reactor by the rod-drop method. The power profile during a rod drop at zero power (without reactivity feedback) was used to determine the frequency response of the RSG-GAS reactor. The calculated results are in good agreement with the experimental results, so the LTMPR code can be used to analyze the frequency response of the RSG-GAS reactor
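
    For context, the zero-power frequency response follows from linearizing the point kinetics equations about steady power; the standard form (our addition for reference, not quoted from the record) is

      Z(s) = \frac{1}{s\left[\Lambda + \sum_{i=1}^{6} \frac{\beta_i}{s + \lambda_i}\right]}

    where \Lambda is the prompt neutron generation time and \beta_i, \lambda_i are the delayed-neutron fractions and decay constants; this is presumably the Z(s) computed by the LTMPR code.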

  19. Non-Coding RNAs: Multi-Tasking Molecules in the Cell

    Directory of Open Access Journals (Sweden)

    Anita Quintal Gomes

    2013-07-01

    Full Text Available In recent years it has become increasingly clear that the mammalian transcriptome is highly complex and includes a large number of small non-coding RNAs (sncRNAs) and long non-coding RNAs (lncRNAs). Here we review the biogenesis pathways of the three classes of sncRNAs, namely short interfering RNAs (siRNAs), microRNAs (miRNAs) and PIWI-interacting RNAs (piRNAs). These ncRNAs have been extensively studied and are involved in pathways leading, for example, to specific gene silencing and the protection of genomes against viruses and transposons. Also, lncRNAs have emerged as pivotal molecules for the transcriptional and post-transcriptional regulation of gene expression, which is supported by their tissue-specific expression patterns, subcellular distribution, and developmental regulation. Therefore, we also focus our attention on their role in differentiation and development. SncRNAs and lncRNAs play critical roles in defining DNA methylation patterns, as well as in chromatin remodeling, thus having a substantial effect on epigenetics. The identification of some overlaps in their biogenesis pathways and functional roles raises the hypothesis that these molecules play concerted functions in vivo, creating complex regulatory networks where cooperation with regulatory proteins is necessary. We also highlight the implications of deregulated biogenesis and gene expression of sncRNAs and lncRNAs in human diseases like cancer.

  20. PSpectRe: a pseudo-spectral code for (P)reheating

    International Nuclear Information System (INIS)

    Easther, Richard; Finkel, Hal; Roth, Nathaniel

    2010-01-01

    PSpectRe is a C++ program that uses Fourier-space pseudo-spectral methods to evolve interacting scalar fields in an expanding universe. PSpectRe is optimized for the analysis of parametric resonance in the post-inflationary universe and provides an alternative to finite differencing codes, such as Defrost and LatticeEasy. PSpectRe has both second- (Velocity-Verlet) and fourth-order (Runge-Kutta) time integrators. Given the same number of spatial points and/or momentum modes, PSpectRe is not significantly slower than finite differencing codes, despite the need for multiple Fourier transforms at each timestep, and exhibits excellent energy conservation. Further, by computing the post-resonance equation of state, we show that in some circumstances PSpectRe obtains reliable results while using substantially fewer points than a finite differencing code. PSpectRe is designed to be easily extended to other problems in early-universe cosmology, including the generation of gravitational waves during phase transitions and pre-inflationary bubble collisions. Specific applications of this code will be described in future work
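
    The core pseudo-spectral trick — evaluating spatial derivatives by multiplication in Fourier space — can be sketched in a few lines (a generic one-dimensional illustration, not PSpectRe's C++ internals):

      # Spectral derivative: transform, multiply by ik, transform back.
      # PSpectRe applies the same idea in 3-D with Verlet/Runge-Kutta steps.
      import numpy as np

      N, L = 128, 2 * np.pi
      x = np.linspace(0, L, N, endpoint=False)
      f = np.sin(3 * x)                            # test field, periodic grid

      k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)   # angular wavenumbers
      df = np.fft.ifft(1j * k * np.fft.fft(f)).real

      print(np.abs(df - 3 * np.cos(3 * x)).max())  # ~1e-13: spectral accuracy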

  1. Image encryption using the fractional wavelet transform

    International Nuclear Information System (INIS)

    Vilardy, Juan M; Useche, J; Torres, C O; Mattos, L

    2011-01-01

    In this paper a technique for the coding of digital images is developed using the Fractional Wavelet Transform (FWT) and random phase masks (RPMs). The digital image to be encrypted is transformed with the FWT; the resulting coefficients (approximation and the horizontal, vertical and diagonal details) are each multiplied by a different, statistically independent RPM, and an Inverse Wavelet Transform (IWT) is applied to the result, yielding the encrypted digital image. The decryption technique is the encryption technique applied in the reverse sense. This technique provides immediate security advantages compared to conventional techniques: the mother wavelet family and the fractional orders associated with the FWT are additional keys that make access to the information difficult for an unauthorized person (besides the RPMs used), so the level of encryption security is extraordinarily increased. The mathematical support for the use of the FWT in the computational encryption algorithm is also developed in this work.
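
    A toy version of the pipeline, with a single-level Haar transform standing in for the FWT (the fractional orders and the mother-wavelet choice, which the paper uses as extra keys, are deliberately omitted here):

      # Hedged sketch of encrypt/decrypt with an ordinary Haar transform in
      # place of the FWT; the RPMs are unit-modulus complex masks.
      import numpy as np

      def haar2(x):   # one-level 2-D Haar: approximation + 3 detail subbands
          a = (x[0::2, 0::2] + x[0::2, 1::2] + x[1::2, 0::2] + x[1::2, 1::2]) / 2
          h = (x[0::2, 0::2] + x[0::2, 1::2] - x[1::2, 0::2] - x[1::2, 1::2]) / 2
          v = (x[0::2, 0::2] - x[0::2, 1::2] + x[1::2, 0::2] - x[1::2, 1::2]) / 2
          d = (x[0::2, 0::2] - x[0::2, 1::2] - x[1::2, 0::2] + x[1::2, 1::2]) / 2
          return a, h, v, d

      def ihaar2(a, h, v, d):        # exact inverse of haar2
          x = np.zeros((2 * a.shape[0], 2 * a.shape[1]), dtype=complex)
          x[0::2, 0::2] = (a + h + v + d) / 2
          x[0::2, 1::2] = (a + h - v - d) / 2
          x[1::2, 0::2] = (a - h + v - d) / 2
          x[1::2, 1::2] = (a - h - v + d) / 2
          return x

      rng = np.random.default_rng(seed=42)         # the seed acts as a key
      img = rng.random((8, 8))
      masks = [np.exp(2j * np.pi * rng.random((4, 4))) for _ in range(4)]

      enc = ihaar2(*[s * m for s, m in zip(haar2(img), masks)])        # encrypt
      dec = ihaar2(*[s * m.conj() for s, m in zip(haar2(enc), masks)]).real
      print(np.allclose(dec, img))                 # True: exact recovery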

  2. Balancing Human-machine Interface (HMI) Design in Complex Supervisory Tasks

    International Nuclear Information System (INIS)

    Ha, Junsu; Kim, Arryum; Jang, Inseok; Seong, Poonghyun

    2013-01-01

    Human performance aspects such as plant performance, personnel task performance, situation awareness, cognitive workload, teamwork, and anthropomorphic/physiological factors are evaluated with the HUPESS. Even though the HUPESS provides evaluation results for each of the performance aspects for the integrated system validation (ISV), additional research has been needed to develop methods for finding the design deficiencies that lead to poor performance and for providing solutions for design improvement in the HMI. The authors have developed a method of HMI design improvement for monitoring and detection tasks, named DEMIS (Difficulty Evaluation Method in Information Searching). The DEMIS is an HMI evaluation method which bridges poor performance and design improvement. Lessons learned from the existing studies lead to a question about how to optimize the whole HMI design. Human factors principles provide the foundation for the guidelines of various codes and standards used in designing HMIs. Also, in NPPs, many guidelines, taken directly from various codes and standards or derived from various research and development projects, are available for designing MCR HMIs. In this study, a balancing principle and two relevant measures for HMI design optimization are proposed for use in the HMI design of complex supervisory tasks in NPPs. The balancing principle is that an HMI element (e. g., an indicator or a push button) should be designed according to its importance

  3. Temporal motifs reveal collaboration patterns in online task-oriented networks

    Science.gov (United States)

    Xuan, Qi; Fang, Huiting; Fu, Chenbo; Filkov, Vladimir

    2015-05-01

    Real networks feature layers of interactions and complexity. In them, different types of nodes can interact with each other via a variety of events. Examples of this complexity are task-oriented social networks (TOSNs), where teams of people share tasks towards creating a quality artifact, such as academic research papers or software development in commercial or open source environments. Accomplishing those tasks involves both work, e.g., writing the papers or code, and communication, to discuss and coordinate. Taking into account the different types of activities and how they alternate over time can result in much more precise understanding of the TOSNs behaviors and outcomes. That calls for modeling techniques that can accommodate both node and link heterogeneity as well as temporal change. In this paper, we report on methodology for finding temporal motifs in TOSNs, limited to a system of two people and an artifact. We apply the methods to publicly available data of TOSNs from 31 Open Source Software projects. We find that these temporal motifs are enriched in the observed data. When applied to software development outcome, temporal motifs reveal a distinct dependency between collaboration and communication in the code writing process. Moreover, we show that models based on temporal motifs can be used to more precisely relate both individual developer centrality and team cohesion to programmer productivity than models based on aggregated TOSNs.

  4. Balancing Human-machine Interface (HMI) Design in Complex Supervisory Tasks

    Energy Technology Data Exchange (ETDEWEB)

    Ha, Junsu [Khalifa Univ. of Science, Abu Dhabi (United Arab Emirates); Kim, Arryum; Jang, Inseok; Seong, Poonghyun [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2013-05-15

    Human performance aspects such as plant performance, personnel task performance, situation awareness, cognitive workload, teamwork, and anthropomorphic/physiological factors are evaluated with the HUPESS. Even though the HUPESS provides evaluation results for each of the performance aspects for the integrated system validation (ISV), additional research has been needed to develop methods for finding the design deficiencies that lead to poor performance and for providing solutions for design improvement in the HMI. The authors have developed a method of HMI design improvement for monitoring and detection tasks, named DEMIS (Difficulty Evaluation Method in Information Searching). The DEMIS is an HMI evaluation method which bridges poor performance and design improvement. Lessons learned from the existing studies lead to a question about how to optimize the whole HMI design. Human factors principles provide the foundation for the guidelines of various codes and standards used in designing HMIs. Also, in NPPs, many guidelines, taken directly from various codes and standards or derived from various research and development projects, are available for designing MCR HMIs. In this study, a balancing principle and two relevant measures for HMI design optimization are proposed for use in the HMI design of complex supervisory tasks in NPPs. The balancing principle is that an HMI element (e. g., an indicator or a push button) should be designed according to its importance.

  5. Computational modelling and analysis of hippocampal-prefrontal information coding during a spatial decision-making task

    Directory of Open Access Journals (Sweden)

    Thomas eJahans-Price

    2014-03-01

    Full Text Available We introduce a computational model describing rat behaviour and the interactions of neural populations processing spatial and mnemonic information during a maze-based, decision-making task. The model integrates sensory input and implements a working memory to inform decisions at a choice point, reproducing rat behavioural data and predicting the occurrence of turn- and memory-dependent activity in neuronal networks supporting task performance. We tested these model predictions using a new software toolbox (Maze Query Language, MQL) to analyse activity of medial prefrontal cortical (mPFC) and dorsal hippocampal (dCA1) neurons recorded from 6 adult rats during task performance. The firing rates of dCA1 neurons discriminated context (i.e. the direction of the previous turn), whilst a subset of mPFC neurons was selective for current turn direction or context, with some conjunctively encoding both. mPFC turn-selective neurons displayed a ramping of activity on approach to the decision turn and turn-selectivity in mPFC was significantly reduced during error trials. These analyses complement data from neurophysiological recordings in non-human primates indicating that firing rates of cortical neurons correlate with integration of sensory evidence used to inform decision-making.

  6. Communicating pictures a course in image and video coding

    CERN Document Server

    Bull, David R

    2014-01-01

    Communicating Pictures starts with a unique historical perspective of the role of images in communications and then builds on this to explain the applications and requirements of a modern video coding system. It draws on the author's extensive academic and professional experience of signal processing and video coding to deliver a text that is algorithmically rigorous, yet accessible, relevant to modern standards, and practical. It offers a thorough grounding in visual perception, and demonstrates how modern image and video compression methods can be designed in order to meet the rate-quality performance levels demanded by today's applications, networks and users. With this book you will learn: Practical issues when implementing a codec, such as picture boundary extension and complexity reduction, with particular emphasis on efficient algorithms for transforms, motion estimators and error resilience Conflicts between conventional video compression, based on variable length coding and spatiotemporal prediction,...

  7. An Efficient Integer Coding and Computing Method for Multiscale Time Segment

    Directory of Open Access Journals (Sweden)

    TONG Xiaochong

    2016-12-01

    Full Text Available This article focuses on the problems and status of current time segment coding and proposes a new approach: multi-scale time segment integer coding (MTSIC). The approach exploits the tree structure and the size ordering formed among integers to reflect the relationships among multi-scale time segments (order, inclusion/containment, intersection, etc.), achieving a unified integer coding for multi-scale time. On this foundation, the research also studies methods for computing the time relationships of MTSIC, to support efficient calculation and querying based on time segments, and preliminarily discusses the application and prospects of MTSIC. Tests indicate that the implementation of MTSIC is convenient and reliable, that transformation between MTSIC and the traditional method is convenient, and that it achieves very high efficiency in querying and calculation.
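
    The record does not reproduce the coding itself; a common way to realize such a scheme, and our purely hypothetical reconstruction of it, is a heap-style numbering of dyadic time segments, in which order and containment reduce to integer arithmetic:

      # Hypothetical reconstruction: code segment (level, index) as the
      # integer 2**level + index, so containment becomes a shift-and-compare.
      def code(level, index):
          return (1 << level) + index        # unique positive integer

      def level_of(c):
          return c.bit_length() - 1

      def contains(c1, c2):
          # True if segment c1 contains segment c2 (c2 is in c1's subtree).
          d = level_of(c2) - level_of(c1)
          return d >= 0 and (c2 >> d) == c1

      # A span split into halves, then quarters: code(0, 0) is the whole span.
      whole, first_half, q2 = code(0, 0), code(1, 0), code(2, 1)
      print(contains(whole, q2), contains(first_half, q2), contains(q2, whole))
      # True True False: the 2nd quarter lies inside the whole span and its
      # first half, but does not itself contain the whole span.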

  8. Final Report on ITER Task Agreement 81-18

    Energy Technology Data Exchange (ETDEWEB)

    Brad J. Merrill

    2008-02-01

    During 2007, the US International Thermonuclear Experimental Reactor (ITER) Project Office (USIPO) entered into a Task Agreement (TA) with the ITER International Organization (IO) to conduct Research and Development activity and/or Design activity in the area of Safety Analyses. There were four tasks within this TA, which were to provide the ITER IO with: 1) Quality Assurance (QA) documentation for the MELCOR 1.8.2 Fusion code, 2) a pedigreed version of MELCOR 1.8.2, 3) assistance in MELCOR input deck development and accident analyses, and 4) support and assistance in the operation of the MELCOR 1.8.2. This report, which is the final report for this agreement, documents the completion of the work scope under this ITER TA, designated as TA 81-18.

  9. CFRX, a one-and-a-quarter-dimensional transport code for field-reversed configuration studies

    International Nuclear Information System (INIS)

    Hsiao Mingyuan

    1989-01-01

    A one-and-a-quarter-dimensional transport code, which includes radial as well as some two-dimensional effects for field-reversed configurations, is described. The set of transport equations is transformed to a set of new independent and dependent variables and is solved as a coupled initial-boundary value problem. The code simulation includes both the closed and open field regions. The axial effects incorporated include global axial force balance, axial losses in the open field region, and flux surface averaging over the closed field region. A typical example of the code results is also given. (orig.)

  10. Enhancing parallelism of tile bidiagonal transformation on multicore architectures using tree reduction

    KAUST Repository

    Ltaief, Hatem

    2012-01-01

    The objective of this paper is to enhance the parallelism of the tile bidiagonal transformation using tree reduction on multicore architectures. First introduced by Ltaief et al. [LAPACK Working Note #247, 2011], the bidiagonal transformation using tile algorithms with a two-stage approach has shown very promising results on square matrices. However, for tall and skinny matrices, the inherent problem of processing the panel in a domino-like fashion generates unnecessary sequential tasks. By using tree reduction, the panel is horizontally split, which creates another dimension of parallelism and engenders many concurrent tasks to be dynamically scheduled on the available cores. The results reported in this paper are very encouraging. The new tile bidiagonal transformation, targeting tall and skinny matrices, outperforms the state-of-the-art numerical linear algebra libraries LAPACK V3.2 and Intel MKL ver. 10.3 by up to 29-fold speedup and the standard two-stage PLASMA BRD by up to 20-fold speedup, on an eight-socket hexa-core AMD Opteron multicore shared-memory system. © 2012 Springer-Verlag.

  11. A radiological characterization extension for the DORIAN code - Summer Student Report

    CERN Document Server

    van Hoorn, Isabelle

    2016-01-01

    During my stay at CERN as a summer student I was working in the Radiation Protection group. The primary task of my project was to expand the functionality of the DORIAN code that is used for the prediction and analysis of residual dose rates due to accelerator radiation induced activation. With the guidance of my supervisor I extended the framework of the DORIAN code to include a radiological classification scheme that is able to compute mass specific activities for a given irradiation profile and cool-down time and compare these specific activities to given waste characterization limit sets. Additionally, the DORIAN code extension can compute the cool-down time required to stay within a certain limit set threshold for a fixed irradiation profile

  12. The Role of the International Code Council in the U.S. Building Regulation System and Green Building Construction

    OpenAIRE

    David Walls

    2015-01-01

    This paper will address the components of the International Code Council (ICC), one of the most important organizations in terms of developing the model building codes for the US: the International Codes. This membership-driven organization has the task of providing the building industry and all its stakeholders with the necessary regulatory documents, training, certification, plan check, product evaluation, and accreditation services to achieve safer and more sustainable building construction.

  13. From concatenated codes to graph codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Høholdt, Tom

    2004-01-01

    We consider codes based on simple bipartite expander graphs. These codes may be seen as the first step leading from product type concatenated codes to more complex graph codes. We emphasize constructions of specific codes of realistic lengths, and study the details of decoding by message passing...

  14. Empirical validation of the triple-code model of numerical processing for complex math operations using functional MRI and group Independent Component Analysis of the mental addition and subtraction of fractions.

    Science.gov (United States)

    Schmithorst, Vincent J; Brown, Rhonda Douglas

    2004-07-01

    The suitability of a previously hypothesized triple-code model of numerical processing, involving analog magnitude, auditory verbal, and visual Arabic codes of representation, was investigated for the complex mathematical task of the mental addition and subtraction of fractions. Functional magnetic resonance imaging (fMRI) data from 15 normal adult subjects were processed using exploratory group Independent Component Analysis (ICA). Separate task-related components were found with activation in bilateral inferior parietal, left perisylvian, and ventral occipitotemporal areas. These results support the hypothesized triple-code model corresponding to the activated regions found in the individual components and indicate that the triple-code model may be a suitable framework for analyzing the neuropsychological bases of the performance of complex mathematical tasks. Copyright 2004 Elsevier Inc.

  15. Cell-assembly coding in several memory processes.

    Science.gov (United States)

    Sakurai, Y

    1998-01-01

    The present paper discusses why the cell assembly, i.e., an ensemble population of neurons with flexible functional connections, is a tenable view of the basic code for information processes in the brain. The main properties indicating the reality of cell-assembly coding are neuron overlaps among different assemblies and connection dynamics within and among the assemblies. The former can be detected as multiple functions of individual neurons in processing different kinds of information. Individual neurons appear to be involved in multiple information processes. The latter can be detected as changes of functional synaptic connections in processing different kinds of information. Correlations of activity among some of the recorded neurons appear to change in multiple information processes. Recent experiments have compared several different memory processes (tasks) and detected these two main properties, indicating cell-assembly coding of memory in the working brain. The first experiment compared different types of processing of identical stimuli, i.e., working memory and reference memory of auditory stimuli. The second experiment compared identical processes of different types of stimuli, i.e., discriminations of simple auditory, simple visual, and configural auditory-visual stimuli. The third experiment compared identical processes of different types of stimuli with or without temporal processing of stimuli, i.e., discriminations of elemental auditory, configural auditory-visual, and sequential auditory-visual stimuli. Some possible features of the cell-assembly coding, especially "dual coding" by individual neurons and cell assemblies, are discussed for future experimental approaches. Copyright 1998 Academic Press.

  16. Subband Coding Methods for Seismic Data Compression

    Science.gov (United States)

    Kiely, A.; Pollara, F.

    1995-01-01

    This paper presents a study of seismic data compression techniques and a compression algorithm based on subband coding. The compression technique described could be used as a progressive transmission system, where successive refinements of the data can be requested by the user. This allows seismologists to first examine a coarse version of waveforms with minimal usage of the channel and then decide where refinements are required. Rate-distortion performance results are presented and comparisons are made with two block transform methods.

  17. Coding the presence of visual objects in a recurrent neural network of visual cortex.

    Science.gov (United States)

    Zwickel, Timm; Wachtler, Thomas; Eckhorn, Reinhard

    2007-01-01

    Before we can recognize a visual object, our visual system has to segregate it from its background. This requires a fast mechanism for establishing the presence and location of objects independently of their identity. Recently, border-ownership neurons were recorded in monkey visual cortex which might be involved in this task [Zhou, H., Friedmann, H., von der Heydt, R., 2000. Coding of border ownership in monkey visual cortex. J. Neurosci. 20 (17), 6594-6611]. In order to explain the basic mechanisms required for fast coding of object presence, we have developed a neural network model of visual cortex consisting of three stages. Feed-forward and lateral connections support coding of Gestalt properties, including similarity, good continuation, and convexity. Neurons of the highest area respond to the presence of an object and encode its position, invariant of its form. Feedback connections to the lowest area facilitate orientation detectors activated by contours belonging to potential objects, and thus generate the experimentally observed border-ownership property. This feedback control acts fast and significantly improves the figure-ground segregation required for the consecutive task of object recognition.

  18. CEA distribution transformer purchasing specifications (DTWG-01,02,03)

    International Nuclear Information System (INIS)

    Fischer, M.

    1999-01-01

    Purchasing specifications for three types of distribution transformers are presented. The specifications were compiled by the Canadian Electricity Association at the suggestion made in 1989 by the Canadian Utilities Material Management Group. The specifications cover pole mounted single phase distribution transformers (DTWG-01), low-profile, single phase, dead-front pad-mounted distribution transformers (DTWG-02), and three phase, dead-front pad-mounted distribution transformers (DTWG-03). The specifications were compiled by a task force of CEA member utilities, using CSA standards as the governing standards in all three cases. The first edition of the purchasing specifications was issued in 1993. The second edition, consisting mainly of revisions based on experiences learned from using the first edition, and the addition of the appropriate clauses of CSA standards, was published in 1998. Based on a three-year average of the number of transformers purchased annually (about 58,000) at an estimated total cost of $ 120 million, use of the Purchasing Specifications is said to have resulted in savings of about 7 per cent or $ 8.4 million

  19. Application of an accurate thermal hydraulics solver in VTT's reactor dynamics codes

    International Nuclear Information System (INIS)

    Rajamaeki, M.; Raety, H.; Kyrki-Rajamaeki, R.; Eskola, M.

    1998-01-01

    VTT's reactor dynamics codes are developed further and new more detailed models are created for tasks related to increased safety requirements. For thermal hydraulics calculations an accurate general flow model based on a new solution method PLIM has been developed. It has been applied in VTT's one-dimensional TRAB and three-dimensional HEXTRAN codes. Results of a demanding international boron dilution benchmark defined by VTT are given and compared against results of other codes with original or improved boron tracking. The new PLIM method not only allows the accurate modelling of a propagating boron dilution front, but also the tracking of a temperature front, which is missed by the special boron tracking models. (orig.)

  20. Unitary Application of the Quantum Error Correction Codes

    International Nuclear Information System (INIS)

    You Bo; Xu Ke; Wu Xiaohua

    2012-01-01

    For applying the perfect code to transmit quantum information over a noisy channel, the standard protocol contains four steps: the encoding, the noise channel, the error-correction operation, and the decoding. In the present work, we show that this protocol can be simplified. The error-correction operation is not necessary if the decoding is realized by the so-called complete unitary transformation. We also offer a quantum circuit, which can correct arbitrary single-qubit errors.

  1. Military involvement in post-conflict transformation in African peace ...

    African Journals Online (AJOL)

    Post-conflict transformation is a difficult task, since renewed violence frequently flares up after peace treaties have been signed. Failure to end conflict often results from misinterpretations of the roots of the conflict or from an inability to create suitable exit strategies for military forces. Reintegration of soldiers and non-state armed ...

  2. Operational Transformation In Co-Operative Editing

    Directory of Open Access Journals (Sweden)

    Mandeep Kaur

    2015-08-01

    Full Text Available Real-time cooperative editing systems allow a virtual team to view and edit a shared document at the same time. The shared document must be synchronized in order to ensure consistency for all the participants. This paper describes Operational Transformation: the evolution of its techniques, its various applications, and its major issues and achievements. In addition, this paper presents the working of a platform where two users can edit a programming code file at the same time.
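
    The heart of Operational Transformation is the inclusion-transformation function that adjusts one user's operation against a concurrent one so that both replicas converge; a minimal character-level sketch for two concurrent insertions (our simplification of the textbook algorithm, not the platform's code) is:

      # Transform a concurrent insert against another insert; ties at the
      # same position are broken deterministically by a site identifier.
      def transform_insert(op, against):
          pos, ch, site = op
          a_pos, _, a_site = against
          if a_pos < pos or (a_pos == pos and a_site < site):
              return (pos + 1, ch, site)   # shift right past the other insert
          return op

      def apply(doc, op):
          pos, ch, _ = op
          return doc[:pos] + ch + doc[pos:]

      doc = "abc"
      op1 = (1, "X", 0)    # site 0 inserts "X" at position 1
      op2 = (1, "Y", 1)    # site 1 concurrently inserts "Y" at position 1

      # Each site applies its own op, then the transformed remote op.
      site0 = apply(apply(doc, op1), transform_insert(op2, op1))
      site1 = apply(apply(doc, op2), transform_insert(op1, op2))
      print(site0, site1, site0 == site1)   # aXYbc aXYbc True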

  3. A Cognitive Analysis of Armor Procedural Task Training

    Science.gov (United States)

    1982-03-01

    Verbal Behavior, 8, 323-343. Craik, F. I. M., & Lockhart, R. S. (1972). Levels of processing: A framework for memory research. Journal of Verbal Learning... (conceptual or meaningful) coding of the task to be learned (e.g., Bjork, 1975; Craik & Lockhart, 1972; Melton & Martin, 1972). In order to remember a... were several serious problems with applying this approach in the context of entry-level military training. In particular, the soldier did not always

  4. Multiple Object Permanence Tracking: Maintenance, Retrieval and Transformation of Dynamic Object Representations

    OpenAIRE

    Saiki, Jun

    2008-01-01

    The multiple object permanence tracking (MOPT) task revealed that our ability to maintain and transform multiple representations of complex feature-bound objects is limited to handling only 1-2 objects. The often-reported capacity of 3-5 objects likely reflects memory for partial representations of objects and simple cases such as just colors and their locations. Also, performance in the multiple object tracking (MOT) task is likely mediated by spatiotemporal indices, not by feature-bound object repre...

  5. The structure of affective action representations: temporal binding of affective response codes.

    Science.gov (United States)

    Eder, Andreas B; Müsseler, Jochen; Hommel, Bernhard

    2012-01-01

    Two experiments examined the hypothesis that preparing an action with a specific affective connotation involves the binding of this action to an affective code reflecting this connotation. This integration into an action plan should lead to a temporary occupation of the affective code, which should impair the concurrent representation of affectively congruent events, such as the planning of another action with the same valence. This hypothesis was tested with a dual-task setup that required a speeded choice between approach- and avoidance-type lever movements after having planned and before having executed an evaluative button press. In line with the code-occupation hypothesis, slower lever movements were observed when the lever movement was affectively compatible with the prepared evaluative button press than when the two actions were affectively incompatible. Lever movements related to approach and avoidance and evaluative button presses thus seem to share a code that represents affective meaning. A model of affective action control that is based on the theory of event coding is discussed.

  6. LabVIEW Task Manager v. 1.10.0

    Energy Technology Data Exchange (ETDEWEB)

    2016-12-21

    LabVIEW Task Manager is a debugging tool for use during code development in the National Instruments (NI) LabVIEW® IDE. While providing a dynamic & big-picture view of running code, an expandable/collapsible tree diagram displays detailed information (both static and dynamic) on all VIs in memory, belonging to a selected project/target. It allows for interacting with single or multiple selected VIs at a time, providing significant benefits while troubleshooting, and has the following features: Look & Feel similar to Windows® Task Manager; Selection of project/target; Lists all VIs in memory, grouped by class/library; Searches for and enumerates clones in memory; DropIn VI for including dynamically referenced clones (Clone Beacon); 'Refresh Now' (F5) re-reads all VIs in memory and adds new ones to the tree; Displays VI name, owning class/library, state, path, data size & code size; Displays VI FP Behavior, Reentrant?, Reentrancy Type, Paused? & Highlight?; Sort by any column, including by library name; Filter by item types vi, ctl, and vit/ctt; Filter out vi.lib and global VIs; Tracking of, and ability to toggle, execution highlighting on multiple selected VIs; Tracking of paused VIs with ability to Pause/Resume/TogglePause multiple selected VIs; DropIn VI for pausing on a condition; If a clone initiates a pause, a different pause symbol is used for all clones of that same reentrant original VI; Select multiple VIs and open or close their FPs or BDs; Double Click a VI from the tree to bring the BD (first choice) or FP to front, if already open; and Select multiple top-level VIs and Abort them.

  7. Correction of the significance level when attempting multiple transformations of an explanatory variable in generalized linear models

    Science.gov (United States)

    2013-01-01

    Background In statistical modeling, finding the most favorable coding for an explanatory quantitative variable involves many tests. This process involves multiple-testing problems and requires correction of the significance level. Methods For each coding, a test of the nullity of the coefficient associated with the newly coded variable is computed. The selected coding is the one associated with the largest test statistic (or, equivalently, the smallest p-value). In the context of the Generalized Linear Model, Liquet and Commenges (Stat Probability Lett, 71:33–38, 2005) proposed an asymptotic correction of the significance level. This procedure, based on the score test, was developed for dichotomous and Box-Cox transformations. In this paper, we suggest the use of resampling methods to estimate the significance level for categorical transformations with more than two levels and, by definition, those that involve more than one parameter in the model. The categorical transformation is a more flexible way to explore the unknown shape of the effect between an explanatory and a dependent variable. Results The simulations we ran in this study showed good performance of the proposed methods. These methods were illustrated using data from a study of the relationship between cholesterol and dementia. Conclusion The algorithms were implemented in R, and the associated CPMCGLM R package is available on the CRAN. PMID:23758852
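
    The resampling idea can be illustrated with a generic max-statistic permutation correction (our sketch of the principle, not the CPMCGLM implementation): the best observed statistic over all candidate codings is referred to the permutation distribution of that same maximum.

      # Trying several codings of x inflates type-I error; refer the maximum
      # statistic to its permutation distribution to correct the level.
      import numpy as np

      rng = np.random.default_rng(seed=1)
      n = 200
      x = rng.normal(size=n)
      y = rng.normal(size=n)                   # null: y is unrelated to x

      codings = [x, x > 0, np.digitize(x, [-1.0, 0.0, 1.0])]

      def max_stat(y_vec):
          # largest absolute correlation over all candidate codings
          return max(abs(np.corrcoef(np.asarray(c, float), y_vec)[0, 1])
                     for c in codings)

      observed = max_stat(y)
      perm = np.array([max_stat(rng.permutation(y)) for _ in range(999)])
      p_adj = (1 + np.sum(perm >= observed)) / (1 + len(perm))
      print(round(p_adj, 3))                   # level-corrected p-value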

  8. Transforming Systems Engineering through Model-Centric Engineering

    Science.gov (United States)

    2018-02-28

    Contract No. HQ0034-13-D-0004; Research Tasks: 48, 118, 141, 157, 170; Report No. SERC-2018-TR-103. Transforming Systems Engineering through Model-Centric Engineering, Technical Report SERC-2018-TR-103, February 28, 2018. Principal Investigator: Dr. Mark Blackburn, Stevens Institute of Technology, Systems Engineering Research Center. This material is based upon work supported, in whole or in part, by the U.S. Department of Defense through the...

  9. Progress of laser-plasma interaction simulations with the particle-in-cell code

    International Nuclear Information System (INIS)

    Sakagami, Hitoshi; Kishimoto, Yasuaki; Sentoku, Yasuhiko; Taguchi, Toshihiro

    2005-01-01

    As the laser-plasma interaction is a non-equilibrium, non-linear and relativistic phenomenon, we must introduce a microscopic method, namely, the relativistic electromagnetic PIC (Particle-In-Cell) simulation code. The PIC code requires a huge number of particles to validate simulation results, and its task is very computation-intensive. Thus simulation research with the PIC code has been progressing along with advances in computer technology. Recently, parallel computers with tremendous computational power have become available, and thus we can perform three-dimensional PIC simulations of the laser-plasma interaction to investigate laser fusion. Some simulation results are shown with figures. We discuss a recent trend toward large-scale PIC simulations that enable direct comparison between experimental facts and computational results. We also describe discharge/lightning simulations performed with the extended PIC code, which includes various atomic and relaxation processes. (author)

  10. Coding Local and Global Binary Visual Features Extracted From Video Sequences.

    Science.gov (United States)

    Baroffio, Luca; Canclini, Antonio; Cesana, Matteo; Redondi, Alessandro; Tagliasacchi, Marco; Tubaro, Stefano

    2015-11-01

    Binary local features represent an effective alternative to real-valued descriptors, leading to comparable results for many visual analysis tasks while being characterized by significantly lower computational complexity and memory requirements. When dealing with large collections, a more compact representation based on global features is often preferred, which can be obtained from local features by means of, e.g., the bag-of-visual word model. Several applications, including, for example, visual sensor networks and mobile augmented reality, require visual features to be transmitted over a bandwidth-limited network, thus calling for coding techniques that aim at reducing the required bit budget while attaining a target level of efficiency. In this paper, we investigate a coding scheme tailored to both local and global binary features, which aims at exploiting both spatial and temporal redundancy by means of intra- and inter-frame coding. In this respect, the proposed coding scheme can conveniently be adopted to support the analyze-then-compress (ATC) paradigm. That is, visual features are extracted from the acquired content, encoded at remote nodes, and finally transmitted to a central controller that performs the visual analysis. This is in contrast with the traditional approach, in which visual content is acquired at a node, compressed and then sent to a central unit for further processing, according to the compress-then-analyze (CTA) paradigm. In this paper, we experimentally compare the ATC and the CTA by means of rate-efficiency curves in the context of two different visual analysis tasks: 1) homography estimation and 2) content-based retrieval. Our results show that the novel ATC paradigm based on the proposed coding primitives can be competitive with the CTA, especially in bandwidth limited scenarios.

  11. Exploring item and higher order factor structure with the Schmid-Leiman solution: syntax codes for SPSS and SAS.

    Science.gov (United States)

    Wolff, Hans-Georg; Preising, Katja

    2005-02-01

    To ease the interpretation of higher order factor analysis, the direct relationships between variables and higher order factors may be calculated by the Schmid-Leiman solution (SLS; Schmid & Leiman, 1957). This simple transformation of higher order factor analysis orthogonalizes first-order and higher order factors and thereby allows the interpretation of the relative impact of factor levels on variables. The Schmid-Leiman solution may also be used to facilitate theorizing and scale development. The rationale for the procedure is presented, supplemented by syntax codes for SPSS and SAS, since the transformation is not part of most statistical programs. Syntax codes may also be downloaded from www.psychonomic.org/archive/.
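
    Since the SPSS and SAS syntax itself lives in the archive, only the core arithmetic is sketched here (our numpy rendering of the standard SLS formulas, assuming one second-order factor and the usual orthogonality conventions):

      # Schmid-Leiman: split each variable's loading into a direct general-
      # factor part and an orthogonalized residual first-order part.
      import numpy as np

      F1 = np.array([[0.7, 0.0],     # variables x first-order factors
                     [0.6, 0.0],
                     [0.0, 0.8],
                     [0.0, 0.5]])
      F2 = np.array([0.6, 0.7])      # first-order x second-order loadings

      general  = F1 @ F2                       # direct loadings on g
      residual = F1 * np.sqrt(1.0 - F2**2)     # residualized first-order part

      # With a simple-structure F1, the transformation preserves communalities:
      print(np.allclose(general**2 + (residual**2).sum(axis=1),
                        (F1**2).sum(axis=1)))  # True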

  12. MRI to X-ray mammography intensity-based registration with simultaneous optimisation of pose and biomechanical transformation parameters.

    Science.gov (United States)

    Mertzanidou, Thomy; Hipwell, John; Johnsen, Stian; Han, Lianghao; Eiben, Bjoern; Taylor, Zeike; Ourselin, Sebastien; Huisman, Henkjan; Mann, Ritse; Bick, Ulrich; Karssemeijer, Nico; Hawkes, David

    2014-05-01

    Determining corresponding regions between an MRI and an X-ray mammogram is a clinically useful task that is challenging for radiologists due to the large deformation that the breast undergoes between the two image acquisitions. In this work we propose an intensity-based image registration framework, where the biomechanical transformation model parameters and the rigid-body transformation parameters are optimised simultaneously. Patient-specific biomechanical modelling of the breast derived from diagnostic, prone MRI has been previously used for this task. However, the high computational time associated with breast compression simulation using commercial packages, did not allow the optimisation of both pose and FEM parameters in the same framework. We use a fast explicit Finite Element (FE) solver that runs on a graphics card, enabling the FEM-based transformation model to be fully integrated into the optimisation scheme. The transformation model has seven degrees of freedom, which include parameters for both the initial rigid-body pose of the breast prior to mammographic compression, and those of the biomechanical model. The framework was tested on ten clinical cases and the results were compared against an affine transformation model, previously proposed for the same task. The mean registration error was 11.6±3.8mm for the CC and 11±5.4mm for the MLO view registrations, indicating that this could be a useful clinical tool. Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.

  13. Breaking new ground in the mind: an initial study of mental brittle transformation and mental rigid rotation in science experts.

    Science.gov (United States)

    Resnick, Ilyse; Shipley, Thomas F

    2013-05-01

    The current study examines the spatial skills employed in different spatial reasoning tasks, by asking how science experts who are practiced in different types of visualizations perform on different spatial tasks. Specifically, the current study examines the varieties of mental transformations. We hypothesize that there may be two broad classes of mental transformations: rigid body mental transformations and non-rigid mental transformations. We focus on the disciplines of geology and organic chemistry because different types of transformations are central to the two disciplines: While geologists and organic chemists may both confront rotation in the practice of their profession, only geologists confront brittle transformations. A new instrument was developed to measure mental brittle transformation (visualizing breaking). Geologists and organic chemists performed similarly on a measure of mental rotation, while geologists outperformed organic chemists on the mental brittle transformation test. The differential pattern of skill on the two tests for the two groups of experts suggests that mental brittle transformation and mental rotation are different spatial skills. The roles of domain general cognitive resources (attentional control, spatial working memory, and perceptual filling in) and strategy in completing mental brittle transformation are discussed. The current study illustrates how ecological and interdisciplinary approaches complement traditional cognitive science to offer a comprehensive approach to understanding the nature of spatial thinking.

  14. Facial Expression Recognition via Non-Negative Least-Squares Sparse Coding

    Directory of Open Access Journals (Sweden)

    Ying Chen

    2014-05-01

    Full Text Available Sparse coding is an active research subject in signal processing, computer vision, and pattern recognition. A novel method of facial expression recognition via non-negative least squares (NNLS) sparse coding is presented in this paper. The NNLS sparse coding is used to form a facial expression classifier. To test the performance of the presented method, local binary patterns (LBP) and raw pixels are extracted for facial feature representation. Facial expression recognition experiments are conducted on the Japanese Female Facial Expression (JAFFE) database. Compared with other widely used methods such as linear support vector machines (SVM), sparse representation-based classifier (SRC), nearest subspace classifier (NSC), K-nearest neighbor (KNN) and radial basis function neural networks (RBFNN), the experimental results indicate that the presented NNLS method performs better than the other methods on facial expression recognition tasks.
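
    The abstract does not spell out the decision rule, but an NNLS sparse-coding classifier is commonly built by coding a test sample over the matrix of training features with a non-negativity constraint and assigning the class with the smallest reconstruction residual. A minimal sketch under that assumption, using scipy.optimize.nnls:

    ```python
    import numpy as np
    from scipy.optimize import nnls

    def nnls_classify(D, labels, y):
        """D: (d, n) matrix whose columns are training feature vectors,
        labels: length-n numpy array of class ids, y: test feature vector."""
        coeffs, _ = nnls(D, y)               # non-negative least-squares code
        residuals = {}
        for c in np.unique(labels):
            mask = labels == c
            # reconstruct y using only the atoms belonging to class c
            recon = D[:, mask] @ coeffs[mask]
            residuals[c] = np.linalg.norm(y - recon)
        return min(residuals, key=residuals.get)   # smallest residual wins
    ```

    This mirrors the SRC-style decision rule mentioned in the comparison, with the non-negativity constraint playing the role of the sparsity penalty.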

  15. Contributions to the validation of the ASTEC V1 code

    International Nuclear Information System (INIS)

    Constantin, Marin; Rizoiu, Andrei; Turcu, Ilie

    2004-01-01

    In the frame of the PHEBEN2 project (Validation of severe accident codes for application to nuclear power plants, based on the PHEBUS FP experiments), a project developed within the EU Research Framework Programme 5 (FP5), the INR-Pitesti team received the task of determining the sensitivity of the ASTEC code. The PHEBEN2 project was initiated in 1998 and gathered 13 partners from 6 EU member states; 4 partners from 3 candidate states (Hungary, Bulgaria and Romania) joined the project later. The work was contracted with the European Commission (under the FIKS-CT1999-00009 contract), which financially supports up to about 50% of the research effort. According to the contract provisions, the INR team participated in developing Working Package 1 (WP1), which refers to the validation of the integral computation codes against the PHEBUS experimental data, and Working Package 3 (WP3), which refers to the evaluation of the codes to be applied in nuclear power plants for risk evaluation, nuclear safety margin evaluation, and determination/evaluation of the measures to be adopted in case of a severe accident. The present work continues the effort to preliminarily validate the ASTEC code. The focus is on stand-alone sensitivity analyses applied to the two most important modules of the code, namely DIVA and SOPHAEROS.

  16. Agrobacterium-mediated transformation of Easter lily (Lilium longiflorum cv. Nellie White)

    Science.gov (United States)

    Conditions were optimized for transient transformation of Lilium longiflorum cv. Nellie White using Agrobacterium tumefaciens. Bulb scale and basal meristem explants were inoculated with A. tumefaciens strain AGL1 containing the binary vector pCAMBIA 2301 which has the uidA gene that codes for β-gl...

  17. ANNarchy: a code generation approach to neural simulations on parallel hardware

    Science.gov (United States)

    Vitay, Julien; Dinkelbach, Helge Ü.; Hamker, Fred H.

    2015-01-01

    Many modern neural simulators focus on the simulation of networks of spiking neurons on parallel hardware. Another important framework in computational neuroscience, rate-coded neural networks, is mostly difficult or impossible to implement using these simulators. We present here the ANNarchy (Artificial Neural Networks architect) neural simulator, which allows one to easily define and simulate rate-coded and spiking networks, as well as combinations of both. The interface in Python has been designed to be close to the PyNN interface, while the definition of neuron and synapse models can be specified using an equation-oriented mathematical description similar to the Brian neural simulator. This information is used to generate C++ code that will efficiently perform the simulation on the chosen parallel hardware (multi-core system or graphical processing unit). Several numerical methods are available to transform ordinary differential equations into efficient C++ code. We compare the parallel performance of the simulator to existing solutions. PMID:26283957
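
    To make the equation-oriented style concrete, here is a sketch of a small rate-coded network in the spirit of ANNarchy's Python interface. The class and argument names follow the published description of the simulator, but versions may differ, so treat this as illustrative rather than canonical:

    ```python
    from ANNarchy import Neuron, Population, Projection, compile, simulate

    # Leaky-integrator rate neuron: the equation string is what ANNarchy
    # transforms into C++ update code ('pos' rectifies the potential).
    LeakyIntegrator = Neuron(
        parameters="tau = 10.0",
        equations="""
            tau * dmp/dt + mp = sum(exc)
            r = pos(mp)
        """,
    )

    inp = Population(geometry=100, neuron=LeakyIntegrator)
    out = Population(geometry=10, neuron=LeakyIntegrator)
    Projection(pre=inp, post=out, target="exc").connect_all_to_all(weights=0.1)

    compile()        # generate and build the C++ simulation code
    simulate(100.0)  # advance the network by 100 ms
    ```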

  18. Lossless Image Compression Based on Multiple-Tables Arithmetic Coding

    Directory of Open Access Journals (Sweden)

    Rung-Ching Chen

    2009-01-01

    Full Text Available This paper presents a lossless image compression method based on multiple-tables arithmetic coding (MTAC) to encode a gray-level image f. First, the MTAC method employs a median edge detector (MED) to reduce the entropy rate of f, exploiting the fact that the gray levels of two adjacent pixels in an image are usually similar. A base-switching transformation approach is then used to reduce the spatial redundancy of the image, exploiting the fact that the gray levels of some pixels in an image are more common than those of others. Finally, arithmetic encoding is applied to reduce the coding redundancy of the image. To promote high performance of the arithmetic encoding, the MTAC method first classifies the data and then encodes each cluster of data using a distinct code table. The experimental results show that, in most cases, the MTAC method provides a higher efficiency in use of storage space than lossless JPEG2000 does.
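
    The MED predictor referred to here is, as far as the abstract indicates, the median edge detector familiar from JPEG-LS: each pixel is predicted from its left (a), upper (b), and upper-left (c) neighbours, and only the prediction residuals are entropy coded. A minimal sketch of that predictor:

    ```python
    import numpy as np

    def med_residuals(img):
        """Return the MED (median edge detector) prediction residuals of a
        gray-level image; residuals have lower entropy than raw pixels."""
        img = img.astype(np.int32)
        pred = np.zeros_like(img)
        h, w = img.shape
        for y in range(h):
            for x in range(w):
                a = img[y, x - 1] if x > 0 else 0                 # left
                b = img[y - 1, x] if y > 0 else 0                 # above
                c = img[y - 1, x - 1] if x > 0 and y > 0 else 0   # above-left
                if c >= max(a, b):
                    pred[y, x] = min(a, b)    # edge detected, pick the low side
                elif c <= min(a, b):
                    pred[y, x] = max(a, b)    # edge detected, pick the high side
                else:
                    pred[y, x] = a + b - c    # smooth region: planar prediction
        return img - pred                     # residuals passed to the coder
    ```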

  19. Exploring Asynchronous Many-Task Runtime Systems toward Extreme Scales

    Energy Technology Data Exchange (ETDEWEB)

    Knight, Samuel [O8953]; Baker, Gavin Matthew; Gamell, Marc [Rutgers U]; Hollman, David [08953]; Sjaardema, Gregor [SNL]; Kolla, Hemanth [SNL]; Teranishi, Keita; Wilke, Jeremiah J; Slattengren, Nicole [SNL]; Bennett, Janine Camille

    2015-10-01

    Major exascale computing reports indicate a number of software challenges to meet the dramatic change of system architectures in the near future. While a several-orders-of-magnitude increase in parallelism is the most commonly cited of these, hurdles also include performance heterogeneity of compute nodes across the system, increased imbalance between computational capacity and I/O capabilities, frequent system interrupts, and complex hardware architectures. Asynchronous task-parallel programming models show great promise in addressing these issues, but are not yet fully understood nor developed sufficiently for computational science and engineering application codes. We address these knowledge gaps through quantitative and qualitative exploration of leading candidate solutions in the context of engineering applications at Sandia. In this poster, we evaluate the MiniAero code ported to three leading candidate programming models (Charm++, Legion and UINTAH) to examine the feasibility of inserting new programming-model elements into an existing code base.
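
    The runtimes named above are C++ frameworks, but the core idea of asynchronous many-task execution, expressing work as a graph of tasks driven by data availability rather than by bulk-synchronous phases, can be sketched with Python's standard library. This is purely illustrative; none of it is from the poster:

    ```python
    from concurrent.futures import ThreadPoolExecutor

    # A toy task graph: two independent "flux" computations feed an
    # "update" task; ready work overlaps instead of waiting at a barrier.
    def flux(cell_block):
        return sum(cell_block) * 0.5          # stand-in for real physics

    def update(left_flux, right_flux):
        return left_flux - right_flux

    with ThreadPoolExecutor() as pool:
        f_left = pool.submit(flux, [1.0, 2.0, 3.0])    # runs asynchronously
        f_right = pool.submit(flux, [4.0, 5.0, 6.0])   # runs concurrently
        # The dependent task fires only once its inputs resolve.
        result = update(f_left.result(), f_right.result())
        print(result)
    ```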

  20. Toric Varieties and Codes, Error-correcting Codes, Quantum Codes, Secret Sharing and Decoding

    DEFF Research Database (Denmark)

    Hansen, Johan Peder

    We present toric varieties and associated toric codes and their decoding. Toric codes are applied to construct Linear Secret Sharing Schemes (LSSS) with strong multiplication by the Massey construction. Asymmetric quantum codes are obtained from toric codes by the A.R. Calderbank, P.W. Shor and A.M. Steane construction of stabilizer codes (CSS) from linear codes containing their dual codes.
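
    A toric code in this sense is an evaluation code: fix a lattice polytope P, take the monomials x^a y^b for lattice points (a, b) in P, and evaluate them at all points of the torus (F_q*)^2. A minimal sketch of building such a generator matrix over a prime field (the polytope and field size below are arbitrary choices for illustration):

    ```python
    import numpy as np

    def toric_generator_matrix(p, lattice_points):
        """Generator matrix of the toric code over F_p obtained by evaluating
        the monomials x^a * y^b, (a, b) in lattice_points, on (F_p*)^2."""
        torus = [(x, y) for x in range(1, p) for y in range(1, p)]
        G = np.zeros((len(lattice_points), len(torus)), dtype=np.int64)
        for i, (a, b) in enumerate(lattice_points):
            for j, (x, y) in enumerate(torus):
                G[i, j] = (pow(x, a, p) * pow(y, b, p)) % p
        return G

    # Example: lattice points of the triangle with vertices (0,0), (2,0),
    # (0,2), giving a code of length (7-1)^2 = 36 and dimension 6 over F_7.
    points = [(a, b) for a in range(3) for b in range(3) if a + b <= 2]
    G = toric_generator_matrix(7, points)
    print(G.shape)   # (6, 36)
    ```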

  1. An Infrastructure for UML-Based Code Generation Tools

    Science.gov (United States)

    Wehrmeister, Marco A.; Freitas, Edison P.; Pereira, Carlos E.

    The use of Model-Driven Engineering (MDE) techniques in the domain of distributed embedded real-time systems is gaining importance as a means to cope with the increasing design complexity of such systems. This paper discusses an infrastructure created to build GenERTiCA, a flexible tool that supports an MDE approach and uses aspect-oriented concepts to handle non-functional requirements from the embedded and real-time systems domain. GenERTiCA generates source code from UML models, and also performs weaving of aspects, which have been specified within the UML model. Additionally, this paper discusses the Distributed Embedded Real-Time Compact Specification (DERCS), a PIM created to support UML-based code generation tools. Some heuristics to transform UML models into DERCS, which have been implemented in GenERTiCA, are also discussed.
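
    GenERTiCA's mapping scripts are not reproduced in the abstract, so the following is only a generic illustration of the underlying idea, template-based generation of source code from a model, with a toy dictionary standing in for a UML/DERCS model:

    ```python
    # Toy model-to-code generator: a dict stands in for the PIM, and a
    # string template stands in for the mapping rules of a real MDE tool.
    model = {
        "class": "SpeedSensor",
        "attributes": [("limit", "float"), ("period_ms", "int")],
    }

    def generate(model):
        fields = "\n".join(
            f"    {typ} {name};" for name, typ in model["attributes"]
        )
        return f"class {model['class']} {{\npublic:\n{fields}\n}};\n"

    print(generate(model))   # emits a C++ class skeleton from the model
    ```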

  2. Minimizing embedding impact in steganography using trellis-coded quantization

    Science.gov (United States)

    Filler, Tomáš; Judas, Jan; Fridrich, Jessica

    2010-01-01

    In this paper, we propose a practical approach to minimizing embedding impact in steganography based on syndrome coding and trellis-coded quantization and contrast its performance with bounds derived from appropriate rate-distortion bounds. We assume that each cover element can be assigned a positive scalar expressing the impact of making an embedding change at that element (single-letter distortion). The problem is to embed a given payload with minimal possible average embedding impact. This task, which can be viewed as a generalization of matrix embedding or writing on wet paper, has been approached using heuristic and suboptimal tools in the past. Here, we propose a fast and very versatile solution to this problem that can theoretically achieve performance arbitrarily close to the bound. It is based on syndrome coding using linear convolutional codes with the optimal binary quantizer implemented using the Viterbi algorithm run in the dual domain. The complexity and memory requirements of the embedding algorithm are linear w.r.t. the number of cover elements. For practitioners, we include detailed algorithms for finding good codes and their implementation. Finally, we report extensive experimental results for a large set of relative payloads and for different distortion profiles, including the wet paper channel.
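
    The syndrome-trellis construction itself needs a Viterbi pass, but the simpler scheme it generalizes, matrix embedding, fits in a few lines: with the parity-check matrix of the [7,4] Hamming code, any 3-bit message can be embedded into 7 cover bits by flipping at most one of them. A sketch of that baseline (not the paper's trellis-coded method):

    ```python
    import numpy as np

    # Parity-check matrix of the [7,4] Hamming code: column j is j+1 in binary.
    H = np.array([[0, 0, 0, 1, 1, 1, 1],
                  [0, 1, 1, 0, 0, 1, 1],
                  [1, 0, 1, 0, 1, 0, 1]])

    def embed(cover, message):
        """Modify at most one of 7 cover bits so their syndrome equals message."""
        syndrome = (H @ cover) % 2
        diff = (syndrome + message) % 2
        out = cover.copy()
        if diff.any():
            # The column of H equal to `diff` identifies the single bit to flip.
            col = int("".join(map(str, diff)), 2) - 1
            out[col] ^= 1
        return out

    def extract(stego):
        return (H @ stego) % 2

    cover = np.array([1, 0, 1, 1, 0, 0, 1])
    msg = np.array([1, 0, 1])
    assert (extract(embed(cover, msg)) == msg).all()
    ```

    This embeds 3 bits per 7 cover bits (about 0.43 bpp) at no more than one change per block; the paper's trellis codes push this payload-distortion tradeoff close to the rate-distortion bound for arbitrary distortion profiles.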

  3. Measuring and test equipment control through bar-code technology

    International Nuclear Information System (INIS)

    Crockett, J.D.; Carr, C.C.

    1993-01-01

    Over the past several years, the use, tracking, and documentation of measuring and test equipment (M&TE) has become a major issue. New regulations are forcing companies to develop new policies for providing use history, traceability, and accountability of M&TE. This paper discusses how the Fast Flux Test Facility (FFTF), operated by Westinghouse Hanford Company and located at the Hanford site in Richland, Washington, overcame these obstacles by using a computerized system exercising bar-code technology. A database was developed to identify M&TE, containing 33 separate fields, such as manufacturer, model, range, bar-code number, and other pertinent information. A bar-code label was attached to each piece of M&TE. A second database was created to identify the employee using the M&TE. The fields contained pertinent user information such as name, location, and payroll number. Each employee's payroll number was bar coded and attached to the back of their identification badge. A computer program was developed to automate certain tasks previously performed and tracked by hand. Bar-code technology was combined with this computer program to control the input and distribution of information, eliminate common mistakes, electronically store information, and reduce the time required to check out the M&TE for use.
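
    The two databases described above amount to an equipment table keyed by bar-code number and a user table keyed by payroll number, with a check-out event linking the two scans. A minimal sketch of that structure; all table and column names are invented for illustration:

    ```python
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE equipment (barcode TEXT PRIMARY KEY,
                                manufacturer TEXT, model TEXT, meas_range TEXT);
        CREATE TABLE employee  (payroll TEXT PRIMARY KEY,
                                name TEXT, location TEXT);
        CREATE TABLE checkout  (barcode TEXT REFERENCES equipment,
                                payroll TEXT REFERENCES employee,
                                checked_out TIMESTAMP DEFAULT CURRENT_TIMESTAMP);
    """)
    # Two scans, the instrument label and the badge, record one checkout.
    con.execute("INSERT INTO equipment VALUES ('MTE-0042', 'Fluke', '87V', '0-1000V')")
    con.execute("INSERT INTO employee VALUES ('P-1234', 'J. Doe', 'Bldg 405')")
    con.execute("INSERT INTO checkout (barcode, payroll) VALUES ('MTE-0042', 'P-1234')")
    ```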

  4. Transformation of a Spatial Map across the Hippocampal-Lateral Septal Circuit.

    Science.gov (United States)

    Tingley, David; Buzsáki, György

    2018-05-15

    The hippocampus constructs a map of the environment. How this "cognitive map" is utilized by other brain regions to guide behavior remains unexplored. To examine how neuronal firing patterns in the hippocampus are transmitted and transformed, we recorded neurons in its principal subcortical target, the lateral septum (LS). We observed that LS neurons carry reliable spatial information in the phase of action potentials, relative to hippocampal theta oscillations, while the firing rates of LS neurons remained uninformative. Furthermore, this spatial phase code had an anatomical microstructure within the LS and was bound to the hippocampal spatial code by synchronous gamma frequency cell assemblies. Using a data-driven model, we show that rate-independent spatial tuning arises through the dynamic weighting of CA1 and CA3 cell assemblies. Our findings demonstrate that transformation of the hippocampal spatial map depends on higher-order theta-dependent neuronal sequences. Copyright © 2018 Elsevier Inc. All rights reserved.

  5. Reward Anticipation in Ventral Striatum and Individual Sensitivity to Reward: A Pilot Study of a Child-Friendly fMRI Task

    NARCIS (Netherlands)

    van Hulst, Branko M; de Zeeuw, Patrick; Lupas, Kellina; Bos, Dienke J; Neggers, Sebastiaan F W; Durston, Sarah

    2015-01-01

    Reward processing has been implicated in developmental disorders. However, the classic task to probe reward anticipation, the monetary incentive delay task, has an abstract coding of reward and no storyline and may therefore be less appropriate for use with developmental populations. We modified the

  6. Observational Measures of Parenting in Anxious and Nonanxious Mothers: Does Type of Task Matter?

    Science.gov (United States)

    Ginsburg, Golda S.; Grover, Rachel L.; Cord, Jennalee J.; Ialongo, Nick

    2012-01-01

    This study examined the relation between type of parent–child interaction task and parenting behaviors among a predominantly African American community-based sample. Twenty-five anxious and matched nonanxious (N = 50) mothers were videotaped with their children (mean age = 5.8 years) engaging in both a structured and an unstructured task. Blind raters coded 3 parent behaviors hypothesized to play a role in the development of child anxiety: overcontrol, anxious behavior, and criticism. Results indicated that higher levels of overcontrol, anxious behavior, and criticism were found in the structured compared to the unstructured task. Levels of criticism, among anxious mothers only, were significantly correlated across tasks. Results suggest that situation-specific aspects of parent–child interaction tasks may influence parenting behaviors. These findings help explain variations in observational research in the anxiety literature and highlight the need for careful selection of parent–child tasks in future research. PMID:16597228

  7. Automatic coding method of the ACR Code

    International Nuclear Information System (INIS)

    Park, Kwi Ae; Ihm, Jong Sool; Ahn, Woo Hyun; Baik, Seung Kook; Choi, Han Yong; Kim, Bong Gi

    1993-01-01

    The authors developed a computer program for automatic coding of the ACR (American College of Radiology) code. Automatic coding of the ACR code is essential for computerization of the data in the department of radiology. This program was written in the FoxBASE language and has been used for automatic coding of diagnoses in the Department of Radiology, Wallace Memorial Baptist Hospital, since May 1992. The ACR dictionary files consisted of 11 files, one for the organ codes and the others for the pathology codes. The organ code was obtained by typing the organ name or the code number itself among the upper- and lower-level codes of the selected one, which were simultaneously displayed on the screen. According to the first digit of the selected organ code, the corresponding pathology code file was chosen automatically. In a fashion similar to the organ code selection, the proper pathology code was obtained. An example of an obtained ACR code is '131.3661'. This procedure was reproducible regardless of the number of fields of data. Because this program was written in 'User's Defined Function' form, decoding of the stored ACR code was achieved by the same program, and incorporation of this program into another data processing program was possible. This program has the merits of simple operation, accurate and detailed coding, and easy adjustment for another program. Therefore, this program can be used for automation of routine work in the department of radiology.
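
    The described flow, pick an organ code and let its first digit select the pathology dictionary, is a two-stage table lookup. A toy sketch of that logic; the dictionary entries below are invented except for the organ/pathology split around the decimal point:

    ```python
    # Hypothetical fragments of the ACR dictionary files: one organ table,
    # several pathology tables selected by the organ code's first digit.
    organ_codes = {"abdomen": "131", "chest": "6"}     # illustrative values
    pathology_tables = {
        "1": {"calculus": "3661"},                     # table for codes 1xx
        "6": {"pneumonia": "2041"},                    # table for codes 6xx
    }

    def acr_code(organ, pathology):
        organ_code = organ_codes[organ]
        table = pathology_tables[organ_code[0]]        # first digit picks the file
        return f"{organ_code}.{table[pathology]}"

    print(acr_code("abdomen", "calculus"))             # -> 131.3661
    ```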

  8. TUTANK a two-dimensional neutron kinetics code

    International Nuclear Information System (INIS)

    Watts, M.G.; Halsall, M.J.; Fayers, F.J.

    1975-04-01

    TUTANK is a two-dimensional neutron kinetics code which treats two neutron energy groups and up to six groups of delayed neutron precursors. A 'theta differencing' method is used to integrate the time dependence of the equations. A position-dependent exponential transformation on the time variable is available as an option, which in many circumstances can remove much of the time dependence and thereby allow longer time steps to be taken. A further manipulation is made to separate the solutions of the neutron fluxes and the precursor concentrations. The spatial equations are based on standard diffusion theory, and their solution is obtained from alternating direction sweeps with a transverse buckling - the so-called ADI-B² method. Other features of the code include an elementary temperature feedback and heat removal treatment, automatic time step adjustment, a flexible method of specifying cross-section and heat transfer coefficient variations during a transient, and a restart facility which requires a minimal data specification. Full details of the code input are given. An example of the solution of a NEACRP benchmark for an LWR control rod withdrawal is given. (author)
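
    Theta differencing is generic: du/dt = f(u) is advanced by u_{n+1} = u_n + dt[(1-theta) f(u_n) + theta f(u_{n+1})], with theta = 1/2 giving Crank-Nicolson and theta = 1 backward Euler. A sketch on a point-kinetics analogue of TUTANK's equations, one flux amplitude and one precursor group; the parameter values are arbitrary demo numbers:

    ```python
    import numpy as np

    # Point-kinetics analogue: flux amplitude n and one precursor group C.
    rho, beta, Lam, lam = 0.001, 0.0065, 1e-4, 0.08   # arbitrary demo values
    A = np.array([[(rho - beta) / Lam, lam],
                  [beta / Lam,        -lam]])

    def theta_step(u, dt, theta=0.5):
        """One theta-differenced step for du/dt = A u:
        (I - theta*dt*A) u_new = (I + (1-theta)*dt*A) u_old."""
        I = np.eye(2)
        return np.linalg.solve(I - theta * dt * A,
                               (I + (1 - theta) * dt * A) @ u)

    u = np.array([1.0, beta / (Lam * lam)])   # start at precursor equilibrium
    for _ in range(100):
        u = theta_step(u, dt=0.01)
    print(u[0])   # flux amplitude after 1 s of the transient
    ```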

  9. Nonoccurrence of Negotiation of Meaning in Task-Based Synchronous Computer-Mediated Communication

    Science.gov (United States)

    Van Der Zwaard, Rose; Bannink, Anne

    2016-01-01

    This empirical study investigated the occurrence of meaning negotiation in an interactive synchronous computer-mediated second language (L2) environment. Sixteen dyads (N = 32) consisting of nonnative speakers (NNSs) and native speakers (NSs) of English performed 2 different tasks using videoconferencing and written chat. The data were coded and…

  10. Polarity Correspondence: A General Principle for Performance of Speeded Binary Classification Tasks

    Science.gov (United States)

    Proctor, Robert W.; Cho, Yang Seok

    2006-01-01

    Differences in performance with various stimulus-response mappings are among the most prevalent findings for binary choice reaction tasks. The authors show that perceptual or conceptual similarity is not necessary to obtain mapping effects; a type of structural similarity is sufficient. Specifically, stimulus and response alternatives are coded as…

  11. On-line monitoring and inservice inspection in codes; Betriebsueberwachung und wiederkehrende Pruefungen in den Regelwerken

    Energy Technology Data Exchange (ETDEWEB)

    Bartonicek, J.; Zaiss, W. [Gemeinschaftskernkraftwerk Neckar GmbH, Neckarwestheim (Germany); Bath, H.R. [Bundesamt fuer Strahlenschutz, Salzgitter (Germany). Geschaeftsstelle des Kerntechnischen Ausschusses (KTA)

    1999-08-01

    The relevant regulatory codes determine the ISI tasks and the time intervals for recurrent component testing for evaluation of operation-induced damage or ageing, in order to ensure component integrity on the basis of the last available quality data. In-service quality monitoring is carried out through on-line monitoring and recurrent testing. The requirements defined by the engineering codes elaborated by various institutions are comparable, with the KTA nuclear engineering and safety codes being the most complete provisions for quality evaluation and assurance after different, defined service periods. German conventional codes for assuring component integrity provide exclusively for recurrent inspection regimes (mainly pressure tests and visual testing). The requirements defined in the KTA codes, however, have always demanded more specific inspections relying on recurrent testing as well as on-line monitoring. Foreign codes for ensuring component integrity concentrate on NDE tasks at regular time intervals, with the time intervals and scope of testing activities being defined on the basis of the ASME code, Section XI. (orig./CB) [Deutsch, translated:] To ensure component integrity, the damage mechanisms must be covered with the margin required by the codes. The actually existing (as-is) quality is the decisive starting point. This existing quality is safeguarded during further operation by suitable on-line monitoring and recurrent inspections. The requirements of the codes are comparable, whereby determining the existing quality after a given operating period, and safeguarding it during further operation, is most completely possible on the basis of the KTA codes. In German conventional codes, the assurance of component integrity in operation rests only on recurrent inspections (mainly pressure tests and visual inspections). The KTA codes have always demanded qualified

  12. Application of Chimera Navier-Stokes Code for High Speed Flows

    Science.gov (United States)

    Ajmani, Kumud

    1997-01-01

    The primary task for this year was performed in support of the "Trailblazer" project. The purpose of the task was to perform an extensive CFD study of the shock boundary-layer interaction between the engine diverters and the primary body surfaces of the Trailblazer vehicle. Information gathered from this study would be used to determine the effectiveness of the diverters in preventing the boundary layer coming off of the vehicle forebody from entering the main engines. The PEGSUS code was used to define the "holes" and "boundaries" for each grid. Two sets of CFD calculations were performed. Extensive post-processing of the results was performed.

  13. Analyzing Array Manipulating Programs by Program Transformation

    Science.gov (United States)

    Cornish, J. Robert M.; Gange, Graeme; Navas, Jorge A.; Schachte, Peter; Sondergaard, Harald; Stuckey, Peter J.

    2014-01-01

    We explore a transformational approach to the problem of verifying simple array-manipulating programs. Traditionally, verification of such programs requires intricate analysis machinery to reason with universally quantified statements about symbolic array segments, such as "every data item stored in the segment A[i] to A[j] is equal to the corresponding item stored in the segment B[i] to B[j]." We define a simple abstract machine which allows for set-valued variables and we show how to translate programs with array operations to array-free code for this machine. For the purpose of program analysis, the translated program remains faithful to the semantics of array manipulation. Based on our implementation in LLVM, we evaluate the approach with respect to its ability to extract useful invariants and the cost in terms of code size.
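
    As a flavour of the translation, an array-initialization loop can be rewritten so that the array is replaced by a set-valued variable that over-approximates its contents; a scalar analyzer can then infer the quantified invariant without reasoning about indices. A hand-worked Python illustration of the idea (the paper's actual translation targets an abstract machine via LLVM, not Python):

    ```python
    # Original program: every cell of A is set to 0.
    def original(n):
        A = [None] * n
        for i in range(n):
            A[i] = 0
        return A

    # Translated, array-free version: the set-valued variable A_contents
    # summarizes the values stored in the initialized segment A[0..i-1].
    def translated(n):
        A_contents = set()
        for i in range(n):
            A_contents |= {0}      # the store A[i] = 0, abstracted
        # A scalar analysis can now conclude A_contents is a subset of {0},
        # i.e. the quantified invariant "every initialized cell equals 0".
        return A_contents
    ```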

  14. Hardware-efficient bosonic quantum error-correcting codes based on symmetry operators

    Science.gov (United States)

    Niu, Murphy Yuezhen; Chuang, Isaac L.; Shapiro, Jeffrey H.

    2018-03-01

    We establish a symmetry-operator framework for designing quantum error-correcting (QEC) codes based on fundamental properties of the underlying system dynamics. Based on this framework, we propose three hardware-efficient bosonic QEC codes that are suitable for χ(2)-interaction based quantum computation in multimode Fock bases: the χ(2) parity-check code, the χ(2) embedded error-correcting code, and the χ(2) binomial code. All of these QEC codes detect photon-loss or photon-gain errors by means of photon-number parity measurements, and then correct them via χ(2) Hamiltonian evolutions and linear-optics transformations. Our symmetry-operator framework provides a systematic procedure for finding QEC codes that are not stabilizer codes, and it enables convenient extension of a given encoding to higher-dimensional qudit bases. The χ(2) binomial code is of special interest because, with m ≤ N identified from channel monitoring, it can correct m-photon-loss errors, or m-photon-gain errors, or (m-1)th-order dephasing errors using logical qudits that are encoded in O(N) photons. In comparison, other bosonic QEC codes require O(N²) photons to correct the same degree of bosonic errors. Such improved photon efficiency underscores the additional error-correction power that can be provided by channel monitoring. We develop quantum Hamming bounds for photon-loss errors in the code subspaces associated with the χ(2) parity-check code and the χ(2) embedded error-correcting code, and we prove that these codes saturate their respective bounds. Our χ(2) QEC codes exhibit hardware efficiency in that they address the principal error mechanisms and exploit the available physical interactions of the underlying hardware, thus reducing the physical resources required for implementing their encoding, decoding, and error-correction operations and their universal encoded-basis gate sets.

  15. How and Why Do Number-Space Associations Co-Vary in Implicit and Explicit Magnitude Processing Tasks?

    Directory of Open Access Journals (Sweden)

    Carrie Georges

    2017-12-01

    Full Text Available Evidence for number-space associations in implicit and explicit magnitude processing tasks comes from the parity and magnitude SNARC effects, respectively. Different spatial accounts have been suggested to underlie these spatial-numerical associations (SNAs), with some inconsistencies in the literature. To determine whether the parity and magnitude SNAs arise from a single predominant account or from task-dependent coding mechanisms, we adopted an individual differences approach to study their correlation and the extent of their association with arithmetic performance, spatial visualization ability, and visualization profile. Additionally, we performed moderation analyses to determine whether the relation between these SNAs depended on individual differences in those cognitive factors. The parity and magnitude SNAs did not correlate and were differentially predicted by arithmetic performance and visualization profile, respectively. These variables, however, also moderated the relation between the SNAs: while positive correlations were observed in object-visualizers with lower arithmetic performance, correlations were negative in spatial-visualizers with higher arithmetic performance. This suggests the predominance of a single account for both implicit and explicit SNAs in the two types of visualizers; however, the spatial nature of the account differs between object- and spatial-visualizers. No relation occurred in mixed-visualizers, indicating the activation of task-dependent coding mechanisms. Individual differences in arithmetic performance and visualization profile thus determined whether SNAs in implicit and explicit tasks co-varied and supposedly relied on similar or unrelated spatial coding mechanisms. This explains some inconsistencies in the literature regarding SNAs and highlights the usefulness of moderation analyses for understanding how the relation between different numerical concepts varies between individuals.

  16. Fast Sparse Coding for Range Data Denoising with Sparse Ridges Constraint

    Directory of Open Access Journals (Sweden)

    Zhi Gao

    2018-05-01

    Full Text Available Light detection and ranging (LiDAR) sensors have been widely deployed on intelligent systems such as unmanned ground vehicles (UGVs) and unmanned aerial vehicles (UAVs) to perform localization, obstacle detection, and navigation tasks. Thus, research into range data processing with competitive performance in terms of both accuracy and efficiency has attracted increasing attention. Sparse coding has revolutionized signal processing and led to state-of-the-art performance in a variety of applications. However, dictionary learning, which plays the central role in sparse coding techniques, is computationally demanding, resulting in its limited applicability in real-time systems. In this study, we propose sparse coding algorithms with a fixed pre-learned ridge dictionary to realize range data denoising via leveraging the regularity of laser range measurements in man-made environments. Experiments on both synthesized data and real data demonstrate that our method obtains accuracy comparable to that of sophisticated sparse coding methods, but with much higher computational efficiency.

  17. Fast Sparse Coding for Range Data Denoising with Sparse Ridges Constraint.

    Science.gov (United States)

    Gao, Zhi; Lao, Mingjie; Sang, Yongsheng; Wen, Fei; Ramesh, Bharath; Zhai, Ruifang

    2018-05-06

    Light detection and ranging (LiDAR) sensors have been widely deployed on intelligent systems such as unmanned ground vehicles (UGVs) and unmanned aerial vehicles (UAVs) to perform localization, obstacle detection, and navigation tasks. Thus, research into range data processing with competitive performance in terms of both accuracy and efficiency has attracted increasing attention. Sparse coding has revolutionized signal processing and led to state-of-the-art performance in a variety of applications. However, dictionary learning, which plays the central role in sparse coding techniques, is computationally demanding, resulting in its limited applicability in real-time systems. In this study, we propose sparse coding algorithms with a fixed pre-learned ridge dictionary to realize range data denoising via leveraging the regularity of laser range measurements in man-made environments. Experiments on both synthesized data and real data demonstrate that our method obtains accuracy comparable to that of sophisticated sparse coding methods, but with much higher computational efficiency.
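
    The central claim, that a fixed dictionary removes the dictionary-learning bottleneck, can be illustrated with a small sparse-coding loop: build a fixed dictionary once, then denoise each incoming scan by greedy sparse approximation. The sketch below uses a DCT dictionary and orthogonal matching pursuit as generic stand-ins for the paper's ridge dictionary and solver:

    ```python
    import numpy as np

    def dct_dictionary(n, k):
        """Fixed overcomplete DCT dictionary with k unit-norm atoms of length n."""
        t = np.arange(n)
        D = np.cos(np.pi * np.outer(t + 0.5, np.arange(k)) / k)
        return D / np.linalg.norm(D, axis=0)

    def omp_denoise(y, D, sparsity=5):
        """Greedy orthogonal matching pursuit: approximate y with few atoms."""
        residual, support = y.copy(), []
        for _ in range(sparsity):
            support.append(int(np.argmax(np.abs(D.T @ residual))))
            coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
            residual = y - D[:, support] @ coef
        return D[:, support] @ coef

    # Fixed dictionary built once; the per-scan cost is only the sparse solve.
    D = dct_dictionary(n=64, k=128)
    scan = np.linspace(2.0, 3.0, 64) + 0.05 * np.random.randn(64)  # noisy ranges
    clean = omp_denoise(scan, D, sparsity=5)
    ```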

  18. The CHEASE code for toroidal MHD equilibria

    International Nuclear Information System (INIS)

    Luetjens, H.

    1996-03-01

    CHEASE solves the Grad-Shafranov equation for the MHD equilibrium of a Tokamak-like plasma with pressure and current profiles specified by analytic forms or sets of data points. Equilibria marginally stable to ballooning modes or with a prescribed fraction of bootstrap current can be computed. The code provides a mapping to magnetic flux coordinates, suitable for MHD stability calculations or global wave propagation studies. The code computes equilibrium quantities for the stability codes ERATO, MARS, PEST, NOVA-W and XTOR and for the global wave propagation codes LION and PENN. The two-dimensional MHD equilibrium (Grad-Shafranov) equation is solved in variational form. The discretization uses bicubic Hermite finite elements with continuous first-order derivatives for the poloidal flux function Ψ. The nonlinearity of the problem is handled by Picard iteration. The mapping to flux coordinates is carried out with a method which conserves the accuracy of the cubic finite elements. The code uses routines from the CRAY libsci.a program library; however, all these routines are included in the CHEASE package itself. If CHEASE computes equilibrium quantities for MARS with fast Fourier transforms, the NAG library is required. CHEASE is written in standard FORTRAN-77, except for the use of the input facility NAMELIST. CHEASE uses variable names with up to 8 characters, and therefore violates the ANSI standard. CHEASE transfers plot quantities through an external disk file to a plot program named PCHEASE using the UNIRAS or the NCAR plot package. (author) figs., tabs., 34 refs
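
    Picard iteration, as used here for the nonlinear Grad-Shafranov problem, repeatedly solves the linear problem obtained by freezing the nonlinear right-hand side at the previous iterate until the solution stops changing. A one-dimensional toy analogue (a nonlinear two-point boundary-value problem solved by finite differences; nothing here is CHEASE-specific):

    ```python
    import numpy as np

    # Toy analogue: solve -u'' = exp(u) on (0,1) with u(0) = u(1) = 0 by
    # repeatedly solving the linear problem -u'' = exp(u_prev).
    n = 100
    h = 1.0 / n
    main = 2.0 * np.ones(n - 1) / h**2            # 1D Laplacian, interior nodes
    A = (np.diag(main)
         - np.diag(np.ones(n - 2) / h**2, 1)
         - np.diag(np.ones(n - 2) / h**2, -1))

    u = np.zeros(n - 1)
    for it in range(50):
        rhs = np.exp(u)                           # nonlinearity frozen at u_prev
        u_new = np.linalg.solve(A, rhs)
        if np.max(np.abs(u_new - u)) < 1e-10:
            break
        u = u_new
    print(it, u.max())   # converged iterate of the Bratu-type problem
    ```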

  19. The CHEASE code for toroidal MHD equilibria

    Energy Technology Data Exchange (ETDEWEB)

    Luetjens, H. [Ecole Polytechnique, 91 - Palaiseau (France). Centre de Physique Theorique; Bondeson, A. [Chalmers Univ. of Technology, Goeteborg (Sweden). Inst. for Electromagnetic Field Theory and Plasma Physics; Sauter, O. [ITER-San Diego, La Jolla, CA (United States)

    1996-03-01

    CHEASE solves the Grad-Shafranov equation for the MHD equilibrium of a Tokamak-like plasma with pressure and current profiles specified by analytic forms or sets of data points. Equilibria marginally stable to ballooning modes or with a prescribed fraction of bootstrap current can be computed. The code provides a mapping to magnetic flux coordinates, suitable for MHD stability calculations or global wave propagation studies. The code computes equilibrium quantities for the stability codes ERATO, MARS, PEST, NOVA-W and XTOR and for the global wave propagation codes LION and PENN. The two-dimensional MHD equilibrium (Grad-Shafranov) equation is solved in variational form. The discretization uses bicubic Hermite finite elements with continuous first-order derivatives for the poloidal flux function Ψ. The nonlinearity of the problem is handled by Picard iteration. The mapping to flux coordinates is carried out with a method which conserves the accuracy of the cubic finite elements. The code uses routines from the CRAY libsci.a program library; however, all these routines are included in the CHEASE package itself. If CHEASE computes equilibrium quantities for MARS with fast Fourier transforms, the NAG library is required. CHEASE is written in standard FORTRAN-77, except for the use of the input facility NAMELIST. CHEASE uses variable names with up to 8 characters, and therefore violates the ANSI standard. CHEASE transfers plot quantities through an external disk file to a plot program named PCHEASE using the UNIRAS or the NCAR plot package. (author) figs., tabs., 34 refs.

  20. Development of a parallelization strategy for the VARIANT code

    International Nuclear Information System (INIS)

    Hanebutte, U.R.; Khalil, H.S.; Palmiotti, G.; Tatsumi, M.

    1996-01-01

    The VARIANT code solves the multigroup steady-state neutron diffusion and transport equation in three-dimensional Cartesian and hexagonal geometries using the variational nodal method. VARIANT consists of four major parts that must be executed sequentially: input handling, calculation of response matrices, solution algorithm (i.e. inner-outer iteration), and output of results. The objective of the parallelization effort was to reduce the overall computing time by distributing the work of the two computationally intensive (sequential) tasks, the coupling coefficient calculation and the iterative solver, equally among a group of processors. This report describes the code's calculations and gives performance results on one of the benchmark problems used to test the code. The performance analysis on the IBM SPx system shows good efficiency for well-load-balanced programs. Even for relatively small problem sizes, respectable efficiencies are seen for the SPx. An extension to achieve a higher degree of parallelism will be addressed in future work. 7 refs., 1 tab
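
    The parallelization described, distributing independent response-matrix computations evenly across processors, maps naturally onto a data-parallel worker pool. A schematic sketch of the work-distribution pattern only; the actual code is Fortran on an IBM SPx, and the per-node computation below is a stand-in:

    ```python
    from multiprocessing import Pool
    import numpy as np

    def response_matrix(node_id):
        """Stand-in for the per-node response-matrix computation; each
        node's matrix is independent, so the work parallelizes trivially."""
        rng = np.random.default_rng(node_id)
        return node_id, rng.random((4, 4))

    if __name__ == "__main__":
        nodes = range(1000)              # one task per spatial node
        with Pool() as pool:             # work split evenly across workers
            matrices = dict(pool.map(response_matrix, nodes))
        print(len(matrices))
    ```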