Ternary Tree and Clustering Based Huffman Coding Algorithm
Directory of Open Access Journals (Sweden)
Pushpa R. Suri
2010-09-01
Full Text Available In this study, the focus was on the use of a ternary tree over a binary tree. A new two-pass algorithm for encoding Huffman ternary-tree codes was implemented, which determines the codeword length of each symbol. Huffman encoding is a two-pass problem: the first pass collects the letter frequencies, and that information is used to build the Huffman tree. Note that char values range from -128 to 127, so they would need to be cast; the data were instead stored as unsigned chars, so the range becomes 0 to 255. The encoder opens the output file and writes the frequency table to it, then opens the input file, reads characters from it, looks up their codes, and writes the encoding to the output file. Once a Huffman code has been generated, data may be encoded simply by replacing each symbol with its code. To reduce memory size and speed up finding the codeword length of a symbol in a Huffman tree, we propose a memory-efficient data structure to represent the codeword lengths of the Huffman ternary tree.
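The two-pass procedure sketched in this abstract (count frequencies, build the tree, then replace each symbol with its codeword) can be illustrated for the ordinary binary case; the paper's ternary variant would merge the three lightest subtrees per step instead of two. This is a minimal sketch under those assumptions, not the authors' implementation, and the function names are illustrative:

```python
import heapq
from collections import Counter
from itertools import count

def huffman_codes(data: bytes) -> dict[int, str]:
    """First pass: collect byte frequencies, then build the Huffman tree
    and derive a codeword for every symbol."""
    freq = Counter(data)
    if len(freq) == 1:                      # degenerate one-symbol input
        return {next(iter(freq)): "0"}
    tiebreak = count()                      # keeps heap entries comparable
    heap = [(f, next(tiebreak), sym) for sym, f in freq.items()]
    heapq.heapify(heap)
    while len(heap) > 1:                    # merge the two lightest subtrees
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, next(tiebreak), (left, right)))
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):         # internal node: recurse on children
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:                               # leaf: record finished codeword
            codes[node] = prefix
    walk(heap[0][2], "")
    return codes

def encode(data: bytes) -> str:
    """Second pass: replace each symbol with its codeword."""
    codes = huffman_codes(data)
    return "".join(codes[b] for b in data)
```

Working on bytes (values 0 to 255) rather than signed chars sidesteps the casting issue the abstract mentions.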
The number of Huffman codes, compact trees, and sums of unit fractions
Elsholtz, Christian; Prodinger, Helmut
2011-01-01
The number of "nonequivalent" Huffman codes of length r over an alphabet of size t has been studied frequently. Equivalently, the number of "nonequivalent" complete t-ary trees has been examined. We first survey the literature, unifying several independent approaches to the problem. Then, improving on earlier work we prove a very precise asymptotic result on the counting function, consisting of two main terms and an error term.
Hybrid threshold adaptable quantum secret sharing scheme with reverse Huffman-Fibonacci-tree coding
Lai, Hong; Zhang, Jun; Luo, Ming-Xing; Pan, Lei; Pieprzyk, Josef; Xiao, Fuyuan; Orgun, Mehmet A.
2016-08-01
With prevalent attacks on communication, sharing a secret between communicating parties is an ongoing challenge. Moreover, it is important to integrate quantum solutions with classical secret sharing schemes with low computational cost for real-world use. This paper proposes a novel hybrid threshold adaptable quantum secret sharing scheme, using an m-bonacci orbital angular momentum (OAM) pump, Lagrange interpolation polynomials, and reverse Huffman-Fibonacci-tree coding. To be exact, we employ entangled states prepared by m-bonacci sequences to detect eavesdropping. Meanwhile, we encode m-bonacci sequences in Lagrange interpolation polynomials to generate the shares of a secret with reverse Huffman-Fibonacci-tree coding. The advantages of the proposed scheme are that it can detect eavesdropping without joint quantum operations, and that it permits secret sharing for an arbitrary (but no fewer than the threshold value) number of classical participants with much lower bandwidth. Also, in comparison with existing quantum secret sharing schemes, it still works when there are dynamic changes, such as the unavailability of some quantum channels, the arrival of new participants and the departure of participants. Finally, we provide a security analysis of the new hybrid quantum secret sharing scheme and discuss its useful features for modern applications.
Improved Construction Algorithms for the Huffman Tree and Huffman Code
Institute of Scientific and Technical Information of China (English)
刘帮涛; 罗敏
2008-01-01
By applying the quicksort algorithm to the data to be ordered, the time complexity of the Huffman algorithm is reduced from O(n^2) to O(n*log2(n)). When the number of nodes used to construct the Huffman tree is large, this considerably improves the program's running time.
Bounds on Generalized Huffman Codes
Baer, Michael B
2007-01-01
New lower and upper bounds are obtained for the compression of optimal binary prefix codes according to various nonlinear codeword length objectives. Like the coding bounds for Huffman coding (which concern the traditional linear objective of minimizing average codeword length), these are in terms of a form of entropy and the probability of the most probable input symbol. As in Huffman coding, some upper bounds can be found using sufficient conditions for the codeword corresponding to the most probable symbol being one bit long. Whereas having probability no less than 0.4 is a tight sufficient condition for this to be the case in Huffman coding, other penalties differ: some have a tighter condition, some a looser condition, and others no such sufficient condition. The objectives explored here are ones for which optimal codes can be found using a generalized form of Huffman coding. These objectives include one related to queueing (an increasing exponential average), one related to single-shot c...
Ternary Tree and Memory-Efficient Huffman Decoding Algorithm
Directory of Open Access Journals (Sweden)
Pushpa R. Suri
2011-01-01
Full Text Available In this study, the focus was on the use of a ternary tree over a binary tree. A new one-pass algorithm for decoding adaptive Huffman ternary-tree codes was implemented. To reduce memory size and speed up the search for a symbol in a Huffman tree, we exploited the properties of the encoded symbols and proposed a memory-efficient data structure to represent the codeword lengths of the Huffman ternary tree. The first algorithm finds the starting and ending address of a code to determine its length; a second algorithm then decodes the ternary-tree code using binary search.
Maximal codeword lengths in Huffman codes
Abu-Mostafa, Y. S.; Mceliece, R. J.
1992-01-01
The following question about Huffman coding, an important technique for compressing data from a discrete source, is considered. If p is the smallest source probability, how long, in terms of p, can the longest Huffman codeword be? It is shown that if p lies in the range 0 < p <= 1/2, and if K is the unique index such that 1/F(K+3) < p <= 1/F(K+2), where F(K) denotes the Kth Fibonacci number, then the longest Huffman codeword for a source whose least probability is p is at most K, and no better bound is possible. Asymptotically, this implies the surprising fact that for small values of p, a Huffman code's longest codeword can be as much as 44 percent longer than that of the corresponding Shannon code.
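The bound can be checked numerically: build a Huffman code, then compare its maximum depth with the Fibonacci index K determined by the smallest probability. A sketch (not from the paper; the helper names are ours):

```python
import heapq
from itertools import count

def huffman_depths(probs):
    """Codeword lengths of a binary Huffman code for the given probabilities."""
    tie = count()
    heap = [(p, next(tie), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    depths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, a = heapq.heappop(heap)
        p2, _, b = heapq.heappop(heap)
        for i in a + b:
            depths[i] += 1          # each merge pushes these leaves one level down
        heapq.heappush(heap, (p1 + p2, next(tie), a + b))
    return depths

def fib_bound(p):
    """K such that 1/F(K+3) < p <= 1/F(K+2), with F(1) = F(2) = 1."""
    fibs = [1, 1]                   # fibs[i] holds F(i+1)
    while fibs[-1] <= 1 / p:
        fibs.append(fibs[-1] + fibs[-2])
    return len(fibs) - 3            # last index j has F(j+1) > 1/p, so K = j - 2
```

For probabilities proportional to Fibonacci numbers, (1, 1, 2, 3, 5)/12, the smallest probability is 1/12, the formula gives K = 4, and the Huffman code's maximum length is exactly 4, illustrating that the bound is tight.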
A quantum analog of Huffman coding
Braunstein, Samuel L.; Fuchs, Christopher A.; Gottesman, Daniel; Lo, Hoi-Kwong
1998-01-01
We analyse a generalization of Huffman coding to the quantum case. In particular, we notice various difficulties in using instantaneous codes for quantum communication. However, for the storage of quantum information, we have succeeded in constructing a Huffman-coding inspired quantum scheme. The number of computational steps in the encoding and decoding processes of N quantum signals can be made to be polynomial in log N by a massively parallel implementation of a quantum gate array. This is to be compared with the N^3 computational steps required in the sequential implementation by Cleve and DiVincenzo of the well-known quantum noiseless block coding scheme by Schumacher. The powers and limitations in using this scheme in communication are also discussed.
Canonical Huffman code based full-text index
Institute of Scientific and Technical Information of China (English)
Yi Zhang; Zhili Pei; Jinhui Yang; Yanchun Liang
2008-01-01
Full-text indices are data structures that can be used to find any substring of a given string. Many full-text indices require space larger than the original string. In this paper, we introduce the canonical Huffman code to the wavelet tree of a string T[1...n]. Compared with the Huffman-code-based wavelet tree, the memory space used to represent the shape of the wavelet tree is not needed. In the case of a large alphabet, this part of memory is not negligible. The operations of the wavelet tree are also simpler and more efficient due to the canonical Huffman code. Based on the resulting structure, the multi-key rank and select functions can be performed using at most nH0 + |Σ|(lg lg n + lg n - lg |Σ|) + O(nH0) bits and in O(H0) time for average cases, where H0 is the zeroth-order empirical entropy of T. Finally, we present an efficient construction algorithm for this index, which is on-line and linear.
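The space saving comes from the defining property of canonical Huffman codes: the codewords are recoverable from the codeword lengths alone, so the tree shape never needs to be stored. A standard construction of canonical codes from lengths (a general sketch, not the paper's wavelet-tree code):

```python
def canonical_codes(lengths: dict[str, int]) -> dict[str, str]:
    """Assign canonical Huffman codewords from lengths alone: symbols are
    sorted by (length, symbol) and codes are consecutive binary integers,
    left-shifted with zero padding whenever the length increases."""
    ordered = sorted(lengths.items(), key=lambda kv: (kv[1], kv[0]))
    codes, code, prev_len = {}, 0, 0
    for sym, length in ordered:
        code <<= (length - prev_len)    # pad with zeros when length grows
        codes[sym] = format(code, f"0{length}b")
        code += 1
        prev_len = length
    return codes
```

For lengths {a: 1, b: 2, c: 3, d: 3} this yields 0, 10, 110, 111: the decoder can regenerate the same table from the four lengths alone.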
Estimating the size of Huffman code preambles
Mceliece, R. J.; Palmatier, T. H.
1993-01-01
Data compression via block-adaptive Huffman coding is considered. The compressor consecutively processes blocks of N data symbols, estimates source statistics by computing the relative frequencies of each source symbol in the block, and then synthesizes a Huffman code based on these estimates. In order to let the decompressor know which Huffman code is being used, the compressor must begin the transmission of each compressed block with a short preamble or header file. This file is an encoding of the list n = (n1, n2, ..., nm), where ni is the length of the Huffman codeword associated with the ith source symbol. A simple method of doing this encoding is to individually encode each ni into a fixed-length binary word of length log2(l), where l is an a priori upper bound on the codeword length. This method produces a maximum preamble length of m*log2(l) bits. The object is to show that, in most cases, no substantially shorter header of any kind is possible.
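The fixed-length header scheme described above is straightforward to sketch: each codeword length ni is packed into a binary field just wide enough for the a priori bound l. These are hypothetical helpers for illustration (using ceil(log2(l+1)) bits so that the value l itself is representable):

```python
from math import ceil, log2

def encode_preamble(lengths, max_len):
    """Pack each codeword length into a fixed-width binary field."""
    width = ceil(log2(max_len + 1))   # enough bits for values 0..max_len
    return "".join(format(n, f"0{width}b") for n in lengths)

def decode_preamble(bits, m, max_len):
    """Recover the m codeword lengths from the fixed-width header."""
    width = ceil(log2(max_len + 1))
    return [int(bits[i * width:(i + 1) * width], 2) for i in range(m)]
```

With m symbols the header is always exactly m*width bits, matching the maximum preamble length stated in the abstract.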
Joint compression and encryption using chaotically mutated Huffman trees
Hermassi, Houcemeddine; Rhouma, Rhouma; Belghith, Safya
2010-10-01
This paper introduces a new scheme for joint compression and encryption using the Huffman codec. A basic tree is first generated for a given message; then, based on a keystream generated from a chaotic map and depending on the input message, the basic tree is mutated without changing the statistical model. Hence a symbol can be coded by more than one codeword of the same length. The security of the scheme is tested against the known-plaintext attack and the brute-force attack. A performance analysis, including encryption/decryption speed, additional computational complexity and compression ratio, is given.
Sequential adaptive compressed sampling via Huffman codes
Aldroubi, Akram; Zarringhalam, Kourosh
2008-01-01
There are two main approaches in compressed sensing: the geometric approach and the combinatorial approach. In this paper we introduce an information theoretic approach and use results from the theory of Huffman codes to construct a sequence of binary sampling vectors to determine a sparse signal. Unlike other approaches, our approach is adaptive in the sense that each sampling vector depends on the previous sample. The number of measurements we need for a k-sparse vector in n-dimensional space is no more than O(k log n) and the reconstruction is O(k).
AN APPLICATION OF PLANAR BINARY BITREES TO PREFIX AND HUFFMAN PREFIX CODE
Erjavec, Zlatko
2004-01-01
In this paper we construct a prefix code in which the use of planar binary trees is replaced by the use of planar binary bitrees. In addition, we apply the planar binary bitrees to the Huffman prefix code. Finally, we encode the English alphabet in such a way that characters have codewords different from already established ones.
Difference-Huffman Coding of Multidimensional Databases
Szépkúti, István
2011-01-01
A new compression method called difference-Huffman coding (DHC) is introduced in this paper. It is verified empirically that DHC results in a smaller multidimensional physical representation than those for other previously published techniques (single count header compression, logical position compression, base-offset compression and difference sequence compression). The article examines how caching influences the expected retrieval time of the multidimensional and table representations of relations. A model is proposed for this, which is then verified with empirical data. Conclusions are drawn, based on the model and the experiment, about when one physical representation outperforms another in terms of retrieval time. Over the tested range of available memory, the performance for the multidimensional representation was always much quicker than for the table representation.
Entropy-Based Bounds On Redundancies Of Huffman Codes
Smyth, Padhraic J.
1992-01-01
Report presents extension of theory of redundancy of binary prefix code of Huffman type which includes derivation of variety of bounds expressed in terms of entropy of source and size of alphabet. Recent developments yielded bounds on redundancy of Huffman code in terms of probabilities of various components in source alphabet. In practice, redundancies of optimal prefix codes are often closer to 0 than to 1.
Short Huffman Codes Producing 1s Half of the Time
Altenbach, Fabian; Mathar, Rudolf
2011-01-01
The design of the channel part of a digital communication system (e.g., error correction, modulation) is heavily based on the assumption that the data to be transmitted forms a fair bit stream. However, simple source encoders such as short Huffman codes generate bit streams that poorly match this assumption. As a result, the channel input distribution does not match the original design criteria. In this work, a simple method called half Huffman coding (halfHc) is developed. halfHc transforms a Huffman code into a source code whose output is more similar to a fair bit stream. This is achieved by permuting the codewords such that the frequency of 1s at the output is close to 0.5. The permutations are such that the optimality in terms of achieved compression ratio is preserved. halfHc is applied in a practical example, and the resulting overall system performs better than when conventional Huffman coding is used.
Generalized Huffman Tree and Its Application in Chinese Character Coding
Institute of Scientific and Technical Information of China (English)
游洪跃; 汪建武; 陶郁
2000-01-01
The authors present the concept of a generalized Huffman tree (GHT) and prove pertinent theorems and conclusions. An algorithm for constructing GHTs is designed, and an application to Chinese character coding is given.
Loss less DNA Solidity Using Huffman and Arithmetic Coding
Directory of Open Access Journals (Sweden)
Lakshmi Mythri Dasari
2014-07-01
Full Text Available The DNA sequences making up any organism comprise its blueprint, so understanding and analyzing the genes within those sequences has become an exceptionally important task. Biologists produce huge volumes of DNA sequences every day, making genome sequence databases grow exponentially. Databases such as GenBank hold millions of DNA sequences filling many thousands of gigabytes of storage capacity. Compression of genomic sequences can reduce storage requirements and increase transmission speed. In this paper we compare two lossless compression algorithms, Huffman and arithmetic coding. In Huffman coding, individual bases are coded and assigned specific binary codewords; in arithmetic coding, the entire DNA sequence is encoded into a single fractional number to which a binary word is assigned. The compression ratio is compared for both methods, and we conclude that arithmetic coding is the better of the two.
M-ary Anti - Uniform Huffman Codes for Infinite Sources With Geometric Distribution
Tarniceriu, Daniela; Munteanu, Valeriu; Zaharia, Gheorghe
2013-01-01
International audience; In this paper we consider the class of generalized anti-uniform Huffman (AUH) codes for sources with infinite alphabet and geometric distribution. This distribution leads to infinite anti-uniform sources for some ranges of its parameters. Huffman coding of these sources results in AUH codes. We perform a generalization of binary Huffman encoding, using an M-letter code alphabet, and prove that as a result of this encoding, sources with memory are obtained. For these sour...
Huffman-based code compression techniques for embedded processors
Bonny, Mohamed Talal
2010-09-01
The size of embedded software is increasing at a rapid pace. It is often challenging and time consuming to fit an amount of required software functionality within a given hardware resource budget. Code compression is a means to alleviate the problem by providing substantial savings in terms of code size. In this article we introduce a novel and efficient hardware-supported compression technique that is based on Huffman coding. Our technique reduces the size of the generated decoding table, which takes a large portion of the memory. It combines our previous techniques, the Instruction Splitting Technique and the Instruction Re-encoding Technique, into a new one called the Combined Compression Technique, to improve the final compression ratio by taking advantage of both previous techniques. The Instruction Splitting Technique is instruction set architecture (ISA)-independent. It splits the instructions into portions of varying size (called patterns) before Huffman coding is applied. This technique improves the final compression ratio by more than 20% compared to other known schemes based on Huffman coding. The average compression ratios achieved using this technique are 48% and 50% for ARM and MIPS, respectively. The Instruction Re-encoding Technique is ISA-dependent. It investigates the benefits of re-encoding unused bits (we call them re-encodable bits) in the instruction format for a specific application to improve the compression ratio. Re-encoding those bits can reduce the size of decoding tables by up to 40%. Using this technique, we improve the final compression ratios in comparison to the first technique to 46% and 45% for ARM and MIPS, respectively (including all incurred overhead). The Combined Compression Technique improves the compression ratio to 45% and 42% for ARM and MIPS, respectively. In our compression technique, we have conducted evaluations using a representative set of applications and have applied each technique to two major embedded processor architectures.
A Dynamic Programming Approach To Length-Limited Huffman Coding
Golin, Mordecai
2008-01-01
The "state-of-the-art" in length-limited Huffman coding algorithms is the $\Theta(ND)$-time, $\Theta(N)$-space one of Hirschberg and Larmore, where $D \le N$ is the length restriction on the code. This is a very clever, very problem-specific technique. In this note we show that there is a simple dynamic-programming (DP) method that solves the problem with the same time and space bounds. The fact that there was a $\Theta(ND)$-time DP algorithm was previously known; it is a straightforward DP with the Monge property (which permits an order-of-magnitude speedup). It was not interesting, though, because it also required $\Theta(ND)$ space. The main result of this paper is the technique developed for reducing the space. It is quite simple and applicable to many other problems modeled by DPs with the Monge property. We illustrate this with examples from web-proxy design and wireless mobile paging.
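For context, the classical alternative to the DP route is the package-merge algorithm, which also finds optimal length-limited codes in O(ND) time but, in its naive form, O(ND) space (precisely the cost the paper's technique removes). A compact sketch of naive package-merge, not taken from the paper:

```python
def limited_lengths(freqs, L):
    """Package-merge: optimal codeword lengths with every length <= L.
    Naive version: O(N*L) time and O(N*L) space."""
    n = len(freqs)
    assert (1 << L) >= n, "L too small to encode n symbols"
    leaves = sorted((f, (i,)) for i, f in enumerate(freqs))
    current = list(leaves)
    for _ in range(L - 1):
        # package adjacent pairs, then merge with a fresh copy of the leaves
        packages = [
            (current[j][0] + current[j + 1][0], current[j][1] + current[j + 1][1])
            for j in range(0, len(current) - 1, 2)
        ]
        current = sorted(leaves + packages)
    # the 2n-2 cheapest items at the top level determine the depths:
    # a symbol's codeword length is the number of selected items containing it
    lengths = [0] * n
    for _, syms in current[: 2 * n - 2]:
        for s in syms:
            lengths[s] += 1
    return lengths
```

With a generous limit the result matches plain Huffman lengths; with L equal to ceil(log2(n)) it degrades gracefully to a fixed-length code.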
On constructing symmetrical reversible variable-length codes independent of the Huffman code
Institute of Scientific and Technical Information of China (English)
HUO Jun-yan; CHANG Yi-lin; MA Lin-hua; LUO Zhong
2006-01-01
Reversible variable length codes (RVLCs) have received much attention due to their excellent error-resilience capabilities. In this paper, a novel construction algorithm for symmetrical RVLCs is proposed which is independent of the Huffman code. The proposed algorithm's codeword assignment is based only on symbol occurrence probability. It has advantages over existing symmetrical construction algorithms, including easier realization and better code performance. In addition, the proposed algorithm simplifies the codeword selection mechanism dramatically.
JOINT SOURCE-CHANNEL DECODING OF HUFFMAN CODES WITH LDPC CODES
Institute of Scientific and Technical Information of China (English)
Mei Zhonghui; Wu Lenan
2006-01-01
In this paper, we present a Joint Source-Channel Decoding algorithm (JSCD) for Low-Density Parity Check (LDPC) codes by modifying the Sum-Product Algorithm (SPA) to account for the source redundancy, which results from the neighbouring Huffman coded bits. Simulations demonstrate that in the presence of source redundancy, the proposed algorithm gives better performance than the Separate Source and Channel Decoding algorithm (SSCD).
Hakim, P. R.; Permala, R.
2017-01-01
LAPAN-A3/IPB satellite is the latest Indonesian experimental microsatellite with remote sensing and earth surveillance missions. The satellite has three optical payloads: a multispectral push-broom imager, a digital matrix camera and a video camera. To increase data transmission efficiency, the multispectral imager data can be compressed using either lossy or lossless compression methods. This paper aims to analyze the Differential Pulse Code Modulation (DPCM) method and the Huffman coding used in LAPAN-IPB satellite image lossless compression. Based on several simulations and analyses, the current LAPAN-IPB lossless compression algorithm has moderate performance. Several aspects of the current configuration can be improved: the type of DPCM code used, the type of Huffman entropy-coding scheme, and the use of a sub-image compression method. The key result of this research is that at least two neighboring pixels should be used in the DPCM calculation to increase compression performance. Meanwhile, varying Huffman tables with a sub-image approach could also increase performance if the on-board computer can support a more complicated algorithm. These results can be used as references in designing the Payload Data Handling System (PDHS) for the upcoming LAPAN-A4 satellite.
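The key recommendation above (use at least two neighboring pixels in the DPCM prediction) can be sketched as a minimal lossless DPCM stage whose residuals would then feed the Huffman entropy coder. This is our illustration of the idea, not the LAPAN-IPB flight code:

```python
def dpcm_encode(img):
    """2-neighbour DPCM: predict each pixel as the mean of its left and top
    neighbours (0 outside the image) and emit the prediction residual."""
    h, w = len(img), len(img[0])
    res = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            left = img[y][x - 1] if x else 0
            top = img[y - 1][x] if y else 0
            res[y][x] = img[y][x] - (left + top) // 2
    return res

def dpcm_decode(res):
    """Invert the predictor exactly, so the round trip is lossless."""
    h, w = len(res), len(res[0])
    img = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            left = img[y][x - 1] if x else 0
            top = img[y - 1][x] if y else 0
            img[y][x] = res[y][x] + (left + top) // 2
    return img
```

On smooth imagery the residuals cluster near zero, which is what makes the subsequent Huffman stage effective.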
Load Balancing Scheme on the Basis of Huffman Coding for P2P Information Retrieval
Kurasawa, Hisashi; Takasu, Atsuhiro; Adachi, Jun
Although a distributed index on a distributed hash table (DHT) enables efficient document query processing in Peer-to-Peer information retrieval (P2P IR), such an index is costly to construct and tends to be managed unfairly because of the unbalanced term frequency distribution. We devised a new distributed index, named Huffman-DHT, for P2P IR. The new index uses an algorithm similar to Huffman coding, with a modification to the DHT structure based on the term distribution. In a Huffman-DHT, a frequent term is assigned a short ID and allocated a large space in the node ID space of the DHT. Through this ID management, the Huffman-DHT balances the index registration accesses among peers and reduces load concentrations. Huffman-DHT is the first approach to adapt concepts of coding theory and term frequency distribution to load balancing. We evaluated this approach in experiments using a document collection and assessed its load-balancing capabilities in P2P IR. The experimental results indicated that it is most effective when the P2P system consists of about 30,000 nodes and contains many documents. Moreover, we showed that a Huffman-DHT can be constructed easily by estimating the probability distribution of term occurrence from a small number of sample documents.
An Upper Limit of AC Huffman Code Length in JPEG Compression
Horie, Kenichi
2009-01-01
A strategy for computing upper code-length limits of AC Huffman codes for an 8x8 block in JPEG Baseline coding is developed. The method is based on a geometric interpretation of the DCT, and the calculated limits are as close as 14% to the maximum code-lengths. The proposed strategy can be adapted to other transform coding methods, e.g., MPEG 2 and 4 video compressions, to calculate close upper code length limits for the respective processing blocks.
Huffman Coding with Letter Costs: A Linear-Time Approximation Scheme
Golin, Mordecai; Mathieu, Claire; Young, Neal E.
2002-01-01
We give a polynomial-time approximation scheme for the generalization of Huffman Coding in which codeword letters have non-uniform costs (as in Morse code, where the dash is twice as long as the dot). The algorithm computes a (1+epsilon)-approximate solution in time O(n + f(epsilon) log^3 n), where n is the input size.
Design and performance of Huffman sequences in medical ultrasound coded excitation.
Polpetta, Alessandro; Banelli, Paolo
2012-04-01
This paper deals with coded-excitation techniques for ultrasound medical echography. Specifically, linear Huffman coding is proposed as an alternative approach to other widely established techniques, such as complementary Golay coding and linear frequency modulation. The code design is guided by an optimization procedure that boosts the signal-to-noise ratio gain (GSNR) and, interestingly, also makes the code robust in pulsed-Doppler applications. The paper capitalizes on a thorough analytical model that can be used to design any linear coded-excitation system. This model highlights that the performance in frequency-dependent attenuating media mostly depends on the pulse-shaping waveform when the codes are characterized by almost ideal (i.e., Kronecker delta) autocorrelation. In this framework, different pulse shapers and different code lengths are considered to identify coded signals that optimize the contrast resolution at the output of the receiver pulse compression. Computer simulations confirm that the proposed Huffman codes are particularly effective, and that there are scenarios in which they may be preferable to the other established approaches, both in attenuating and non-attenuating media. Specifically, for a single scatterer at 150 mm in a 0.7-dB/(MHz·cm) attenuating medium, the proposed Huffman design achieves a main-to-side lobe ratio (MSR) equal to 65 dB, whereas tapered linear frequency modulation and classical complementary Golay codes achieve 35 and 45 dB, respectively.
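The MSR figures quoted above measure the receiver's pulse compression output: the ratio between the autocorrelation main lobe and its largest sidelobe. A small sketch of the metric, using a Barker-13 sequence as a stand-in for a designed Huffman sequence (both aim at near-delta autocorrelation); the Huffman designs themselves are in the paper:

```python
from math import log10

def msr_db(code):
    """Main-to-sidelobe ratio (dB) of a sequence's aperiodic autocorrelation."""
    n = len(code)
    acf = [sum(code[i] * code[i + k] for i in range(n - k)) for k in range(n)]
    main = abs(acf[0])                       # zero-lag peak
    side = max(abs(v) for v in acf[1:])      # largest sidelobe
    return 20 * log10(main / side)

# Barker-13: all aperiodic autocorrelation sidelobes have magnitude <= 1
barker13 = [1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1]
```

For Barker-13 the main lobe is 13 and the worst sidelobe is 1, so the MSR is 20*log10(13), about 22.3 dB, far below the 65 dB the optimized Huffman design reaches in simulation.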
Analysis and Research on Adaptive Huffman Coding
Institute of Scientific and Technical Information of China (English)
彭文艺
2012-01-01
As an efficient and simple variable-length code, Huffman coding is commonly used in source coding. However, existing Huffman coding algorithms are not very efficient, and their application is subject to some limitations. This paper therefore proposes an adaptive Huffman coding algorithm that, compared with other Huffman codes, is more efficient and has a wider range of applications.
Tight Bounds on the Average Length, Entropy, and Redundancy of Anti-Uniform Huffman Codes
Mohajer, Soheil
2007-01-01
In this paper we consider the class of anti-uniform Huffman codes and derive tight lower and upper bounds on the average length, entropy, and redundancy of such codes in terms of the alphabet size of the source. The Fibonacci distributions are introduced which play a fundamental role in AUH codes. It is shown that such distributions maximize the average length and the entropy of the code for a given alphabet size. Another previously known bound on the entropy for given average length follows immediately from our results.
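The role of Fibonacci distributions can be seen directly: source weights proportional to consecutive Fibonacci numbers force every Huffman merge to combine the running package with a single leaf, producing the fully skewed, anti-uniform tree with lengths n-1, n-1, n-2, ..., 2, 1. A quick check (our sketch, not the paper's derivation):

```python
import heapq
from itertools import count

def huffman_lengths(weights):
    """Codeword lengths of a binary Huffman code for the given weights."""
    tie = count()
    heap = [(w, next(tie), [i]) for i, w in enumerate(weights)]
    heapq.heapify(heap)
    depth = [0] * len(weights)
    while len(heap) > 1:
        w1, _, a = heapq.heappop(heap)
        w2, _, b = heapq.heappop(heap)
        for i in a + b:
            depth[i] += 1           # each merge deepens the leaves it contains
        heapq.heappush(heap, (w1 + w2, next(tie), a + b))
    return depth

# Fibonacci weights 1, 1, 2, 3, 5, 8: each partial sum equals the next
# weight minus one, so the package is always merged with a single leaf.
skewed = huffman_lengths([1, 1, 2, 3, 5, 8])
```

The resulting length vector is the anti-uniform pattern 5, 5, 4, 3, 2, 1 for n = 6.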
Research on a Data Compression Method Based on an Improved Huffman Coding Algorithm
Institute of Scientific and Technical Information of China (English)
张红军; 徐超
2014-01-01
As a lossless compression method, Huffman coding has important applications in data compression. The classic algorithm derives the Huffman code bottom-up from the Huffman tree. By analyzing the Huffman algorithm, this paper gives an improved Huffman data compression algorithm that uses a queue structure to proceed from the root node of the Huffman tree toward the leaf nodes. During coding, each leaf node is scanned only once to obtain its Huffman code. Experiments show that the improved algorithm not only achieves a higher compression ratio than the classic algorithm but also ensures the security and confidentiality of the resulting compressed file.
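The queue-driven, root-to-leaf assignment described in this abstract can be sketched as a breadth-first traversal over an already-built Huffman tree; this is our reconstruction of the idea, not the authors' code:

```python
from collections import deque

class Node:
    """Huffman tree node: leaves carry a symbol, internal nodes carry children."""
    def __init__(self, sym=None, left=None, right=None):
        self.sym, self.left, self.right = sym, left, right

def assign_codes(root):
    """Breadth-first, queue-driven code assignment from the root downward:
    each node is visited once, so every leaf is scanned exactly once."""
    codes = {}
    queue = deque([(root, "")])
    while queue:
        node, prefix = queue.popleft()
        if node.sym is not None:            # leaf: its prefix is the codeword
            codes[node.sym] = prefix or "0"
        else:                               # internal: extend prefix per branch
            queue.append((node.left, prefix + "0"))
            queue.append((node.right, prefix + "1"))
    return codes
```

Unlike the classic bottom-up scheme, no parent pointers or per-leaf upward walks are needed; the prefix is carried down with the queue entry.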
An Improved Huffman Coding to Increase the Information Capacity of QR Codes
Institute of Scientific and Technical Information of China (English)
邹敏; 张瑞林; 吴桐树; 王啸
2015-01-01
QR codes used to store information are easily limited by their storage capacity. To address the relatively low storage capacity of QR codes, an improved Huffman coding is proposed to expand their information capacity. First, the data to be encoded are sorted with Shell sort and a Huffman tree is constructed to obtain the Huffman code; the compressed data are then QR-encoded, yielding a QR code of the compressed data. When the QR code is scanned and decoded, the coding properties of the Huffman tree are used to decode the data, recovering the original data that were compressed and encoded. Experimental results show that the algorithm can increase the information storage capacity of QR codes.
Chouakri, S. A.; Djaafri, O.; Taleb-Ahmed, A.
2013-08-01
We present in this work an algorithm for electrocardiogram (ECG) signal compression aimed at its transmission via a telecommunication channel. Basically, the proposed ECG compression algorithm is built on the use of the wavelet transform, leading to low/high frequency component separation; high-order-statistics-based thresholding, using a level-adjusted kurtosis value, to denoise the ECG signal; and a linear predictive coding filter applied to the wavelet coefficients, producing a lower-variance signal. This latter signal is coded using Huffman encoding, yielding an optimal coding length in terms of the average number of bits per sample. At the receiver end, under the assumption of an ideal communication channel, the inverse processes are carried out, namely Huffman decoding, the inverse linear predictive coding filter and the inverse discrete wavelet transform, leading to the estimated version of the ECG signal. The proposed ECG compression algorithm is tested on a set of ECG records extracted from the MIT-BIH Arrhythmia Database, including different cardiac anomalies as well as normal ECG signals. The obtained results are evaluated in terms of compression ratio and mean square error, which are, respectively, around 1:8 and 7%. Besides the numerical evaluation, visual inspection demonstrates the high quality of the reconstructed ECG signal, where the different ECG waves are recovered correctly.
Writing on the Facade of RWTH ICT Cubes: Cost Constrained Geometric Huffman Coding
Böcherer, Georg; Malsbender, Martina; Mathar, Rudolf
2011-01-01
In this work, a coding technique called cost constrained Geometric Huffman coding (ccGhc) is developed. ccGhc minimizes the Kullback-Leibler distance between a dyadic probability mass function (pmf) and a target pmf subject to an affine inequality constraint. An analytical proof is given that when ccGhc is applied to blocks of symbols, the optimum is asymptotically achieved when the blocklength goes to infinity. The derivation of ccGhc is motivated by the problem of encoding a text to a sequence of slats subject to architectural design criteria. For the considered architectural problem, for a blocklength of 3, the codes found by ccGhc match the design criteria. For communications channels with average cost constraints, ccGhc can be used to efficiently find prefix-free modulation codes that are provably capacity achieving.
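The building block behind ccGhc is plain Geometric Huffman coding: choosing the dyadic pmf d (every mass a power of 1/2, Kraft sum exactly one) that minimizes the Kullback-Leibler distance D(d||p) to a target pmf p. For tiny alphabets the optimum can be found by brute force over length vectors, ignoring the paper's additional affine cost constraint; this sketch is ours, not the paper's algorithm:

```python
from itertools import product
from math import log2

def best_dyadic(p, max_len=8):
    """Brute-force the dyadic pmf d (d_i = 2^-l_i, Kraft sum exactly 1)
    minimizing D(d || p) = sum_i d_i * log2(d_i / p_i)."""
    best, best_kl = None, float("inf")
    for ls in product(range(1, max_len + 1), repeat=len(p)):
        if abs(sum(2.0 ** -l for l in ls) - 1.0) > 1e-12:
            continue                        # keep only complete codes
        d = [2.0 ** -l for l in ls]
        kl = sum(di * log2(di / pi) for di, pi in zip(d, p))
        if kl < best_kl:
            best, best_kl = d, kl
    return best, best_kl
```

If the target pmf is already dyadic, the minimizer is the target itself with zero divergence; otherwise the search returns the nearest dyadic approximation, which is exactly what a prefix-free modulation code can realize.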
HUFFMAN-BASED GROUP KEY ESTABLISHMENT SCHEME WITH LOCATION-AWARE
Institute of Scientific and Technical Information of China (English)
Gu Xiaozhuo; Yang Jianzu; Lan Julong
2009-01-01
Time efficiency of key establishment and update is one of the major problems contributory key management schemes strive to address. To achieve better time efficiency in key establishment, we propose a Location-based Huffman (L-Huffman) scheme. First, users are separated into several small groups to minimize communication cost when they are distributed over large networks. Second, both the users' computation differences and the message transmission delay are taken into consideration when Huffman coding is employed to form the optimal key tree. Third, the combined weights in the Huffman tree are placed higher in the key tree to reduce the variance of the average key generation time and to minimize the longest key generation time. Simulations demonstrate that L-Huffman performs much better than the Huffman scheme in wide area networks and slightly better in local area networks.
Efficient Data Compression Scheme using Dynamic Huffman Code Applied on Arabic Language
Directory of Open Access Journals (Sweden)
Sameh Ghwanmeh
2006-01-01
Full Text Available The development of an efficient compression scheme for the Arabic language is a difficult task. This paper applies dynamic Huffman coding with variable-length bit coding to data compression of Arabic text. Experimental tests have been performed on both Arabic and English text. A comparison was made of the efficiency of the compression results on the two languages, and another between the compression rate and the size of the file to be compressed. It has been found that as the file size increases, the compression ratio decreases for both Arabic and English text. The experimental results show that the average message length and the compression efficiency on Arabic text were better than on English text. The results also show that the main factor significantly affecting the compression ratio and the average message length is the frequency of the symbols in the text.
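The "average message length" that the experiments measure is the frequency-weighted mean Huffman codeword length. A minimal sketch of how it is computed (standard textbook Huffman over symbol frequencies, not the paper's implementation):

```python
import heapq
from collections import Counter

def huffman_lengths(text):
    """Codeword length per symbol via the classic Huffman merge, using a
    min-heap of (weight, tiebreak, symbol-list) entries."""
    freq = Counter(text)
    if len(freq) == 1:                       # degenerate one-symbol alphabet
        return {next(iter(freq)): 1}
    lengths = {s: 0 for s in freq}
    heap = [(w, i, [s]) for i, (s, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        w1, _, s1 = heapq.heappop(heap)
        w2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:                    # every merged symbol sinks a level
            lengths[s] += 1
        heapq.heappush(heap, (w1 + w2, tie, s1 + s2))
        tie += 1
    return lengths

def average_length(text):
    """Average Huffman codeword length in bits per symbol."""
    lengths = huffman_lengths(text)
    freq = Counter(text)
    return sum(freq[s] * lengths[s] for s in freq) / len(text)

print(average_length("abracadabra"))
```

Skewed symbol frequencies, as the abstract notes for Arabic text, are exactly what drives this average below the fixed-length cost of log2(alphabet size) bits per symbol.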
Gutman, Igor; Wulich, Dov
2009-01-01
Multiple input multiple output (MIMO) precoding is an efficient scheme that may significantly enhance the communication link. However, this enhancement comes at a cost: many precoding schemes require channel knowledge at the transmitter, obtained through feedback from the receiver. Focusing on the natural combination of orthogonal frequency division multiplexing (OFDM) and MIMO, we exploit the channel correlation in the frequency and spatial domains to reduce the required feedback rate in a frequency division duplex (FDD) system. The proposed feedback method is based on Huffman coding and is employed here for the single-stream case. The method leads to a significant reduction in the required feedback rate without any loss in performance, and may be extended to the multi-stream case.
Raeiatibanadkooki, Mahsa; Quchani, Saeed Rahati; KhalilZade, MohammadMahdi; Bahaadinbeigy, Kambiz
2016-03-01
In mobile health care monitoring, compression is an essential tool for solving storage and transmission problems, and the key issue is being able to recover the original signal from the compressed one. The main purpose of this paper is to compress the ECG signal with no loss of essential data and also to encrypt the signal to keep it confidential from everyone except physicians. Mobile processors are used, with no need for any computer to serve this purpose. After initial preprocessing such as removal of baseline noise and Gaussian noise, peak detection, and determination of heart rate, the ECG signal is compressed. In the compression stage, thresholding techniques are applied after three levels of wavelet transform (db04); then Huffman coding with chaos is used for compression and encryption of the ECG signal. The compression rate of the proposed algorithm is 97.72%. The ECG signals are then sent to a telemedicine center over the TCP/IP protocol to obtain a specialist diagnosis.
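The thresholding step after the wavelet stage can be sketched generically. Soft thresholding, shown below, is one common rule; the paper does not specify which rule it uses, so this is an assumption for illustration:

```python
def soft_threshold(coeffs, t):
    """Soft thresholding of wavelet detail coefficients: zero everything
    with magnitude <= t and shrink the rest toward zero by t, which
    suppresses small noise-dominated coefficients before entropy coding."""
    return [0.0 if abs(c) <= t else (c - t if c > 0 else c + t)
            for c in coeffs]

noisy_details = [3.0, -0.5, 1.2, -4.0, 0.3]
print(soft_threshold(noisy_details, 1.0))
```

Zeroed coefficients also make the subsequent Huffman stage more effective, since long runs of zeros concentrate the symbol distribution.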
Jin, Xin; Nie, Rencan; Zhou, Dongming; Yao, Shaowen; Chen, Yanyan; Yu, Jiefu; Wang, Quan
2016-11-01
A novel method for calculating DNA sequence similarity is proposed, based on a simplified pulse-coupled neural network (S-PCNN) and Huffman coding. A coding scheme built on Huffman coding uses the triplet code as the code unit to transform a DNA sequence into a numerical sequence, and the firing characteristics of S-PCNN neurons over that sequence are used to extract features; the method can handle DNA sequences of different lengths. First, according to the characteristics of the S-PCNN and the DNA primary sequence, the latter is encoded with the Huffman coding method; then the oscillation time sequence (OTS) of the encoded DNA sequence is extracted and the relevant features are obtained; finally, the similarities or dissimilarities of the DNA sequences are determined by Euclidean distance. Different data sets were used to verify the accuracy of the method, and the experimental results show that it is effective.
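The triplet-to-number mapping can be sketched as follows. This is a hedged reconstruction, not the authors' exact encoding: converting each triplet's Huffman codeword to its integer value is an assumption made here for illustration.

```python
import heapq
from collections import Counter

def huffman_codes(freq):
    """Classic Huffman: map each symbol to a binary codeword string."""
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:                      # degenerate single-symbol case
        return {s: "0" for s in heap[0][2]}
    tie = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (w1 + w2, tie, merged))
        tie += 1
    return heap[0][2]

def encode_dna(seq):
    """Cut the sequence into non-overlapping triplets (codons), Huffman-code
    the triplets, and return the integer value of each codeword, giving a
    numerical sequence suitable as S-PCNN input."""
    triplets = [seq[i:i + 3] for i in range(0, len(seq) - len(seq) % 3, 3)]
    codes = huffman_codes(Counter(triplets))
    return [int(codes[t], 2) for t in triplets]

print(encode_dna("ATGATGCGT"))
```

More frequent codons receive shorter codewords, so the numerical sequence reflects the triplet statistics of the input rather than a fixed alphabet mapping.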
Institute of Scientific and Technical Information of China (English)
卢冰; 刘兴海
2013-01-01
By analyzing the idea behind the Huffman algorithm, an improved Huffman data compression algorithm is proposed. Addressing the shortcomings of the classic Huffman algorithm, heap sort is used to build the Huffman tree and obtain the Huffman codes; this reduces the number of memory reads and writes and improves system response time. Through a second mapping, every 8 bits of the encoded binary file are converted into a corresponding character, which improves the file compression ratio and helps keep the resulting compressed file confidential. Finally, three text files were used to test the improved algorithm; experiments show that its compression ratio is slightly better than that of the classic algorithm.
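The "second mapping" described above, converting every 8 encoded bits into one byte, can be sketched like this (`pack_bits` and `unpack_bits` are names chosen here for illustration; the padding convention is an assumption):

```python
def pack_bits(bitstring):
    """Convert every 8 encoded bits into one byte, padding the tail with
    zeros. The pad length must be stored alongside the bytes so the
    decoder can strip it again."""
    pad = (-len(bitstring)) % 8
    bitstring += "0" * pad
    data = bytes(int(bitstring[i:i + 8], 2)
                 for i in range(0, len(bitstring), 8))
    return data, pad

def unpack_bits(data, pad):
    """Inverse mapping: bytes back to the original bitstring."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return bits[:len(bits) - pad] if pad else bits

packed, pad = pack_bits("1100101011")
print(packed, pad)  # 10 bits become 2 bytes with 6 padding bits
```

Writing the compressed stream as raw bytes rather than the characters "0" and "1" is what yields the eight-fold size reduction of the encoded file.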
Coded Splitting Tree Protocols
DEFF Research Database (Denmark)
Sørensen, Jesper Hemming; Stefanovic, Cedomir; Popovski, Petar
2013-01-01
This paper presents a novel approach to multiple access control called the coded splitting tree protocol. The approach builds on the known tree splitting protocols, code structure, and successive interference cancellation (SIC). Several instances of the tree splitting protocol are initiated; each instance is terminated prematurely and subsequently iterated. The combined set of leaves from all the tree instances can then be viewed as a graph code, which is decodable using belief propagation. The main design problem is determining the order of splitting, which enables successful decoding as early as possible. Evaluations show that the proposed protocol provides considerable gains over the standard tree splitting protocol applying SIC. The improvement comes at the expense of increased feedback and receiver complexity.
基于四叉树的嵌入式平台Huffman解码优化%Embedded Platform Huffman Decoding Optimization Algorithm Based on Quad-Tree
Institute of Scientific and Technical Information of China (English)
鲁云飞; 何明华
2012-01-01
Considering the limited resources of embedded devices, a Huffman decoding optimization algorithm based on a quad tree is proposed. During decoding, the Huffman code table is first expressed as a quad-tree structure, which is then rebuilt as a one-dimensional array, making full use of numerical calculation in place of comparison and jump operations. To test its decoding performance, the algorithm was applied to real-time embedded MP3 decoding. The results show that the algorithm has a small memory footprint, fast decoding speed, and low complexity; compared with other optimization algorithms, it is more suitable for embedded devices.
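The idea of flattening the code table into a quad tree, where each lookup consumes two bits and a leaf reached after only one bit records how many bits it actually used, can be sketched as follows. This is a simplified illustration of the technique, not the paper's MP3 decoder:

```python
from collections import deque

def build_trie(codes):
    """Binary trie for a prefix-free code set; a leaf is the symbol itself."""
    root = {}
    for sym, code in codes.items():
        node = root
        for b in code[:-1]:
            node = node.setdefault(b, {})
        node[code[-1]] = sym
    return root

def build_quad_table(trie):
    """Flatten the binary trie into 4-entry nodes: table[n][v] for the
    2-bit value v is ('leaf', symbol, bits_used) or ('node', next_index).
    A leaf reached after one bit stores bits_used=1 so the decoder can
    give the second bit back."""
    table, index, queue = [], {id(trie): 0}, deque([trie])
    while queue:
        node = queue.popleft()
        entries = []
        for b1 in "01":
            for b2 in "01":
                child = node.get(b1)
                if not isinstance(child, dict):      # leaf after one bit
                    entries.append(("leaf", child, 1))
                    continue
                grand = child.get(b2)
                if not isinstance(grand, dict):      # leaf after two bits
                    entries.append(("leaf", grand, 2))
                else:                                # descend two levels
                    if id(grand) not in index:
                        index[id(grand)] = len(index)
                        queue.append(grand)
                    entries.append(("node", index[id(grand)]))
        table.append(entries)
    return table

def decode(table, bits, n_symbols):
    """Walk the flat table two bits at a time using only array indexing.
    The caller supplies the number of symbols to decode."""
    out, pos, node = [], 0, 0
    while len(out) < n_symbols:
        chunk = bits[pos:pos + 2].ljust(2, "0")      # pad a trailing lone bit
        entry = table[node][int(chunk, 2)]
        if entry[0] == "leaf":
            out.append(entry[1])
            pos += entry[2]
            node = 0
        else:
            pos += 2
            node = entry[1]
    return out

codes = {"a": "0", "b": "10", "c": "11"}
print(decode(build_quad_table(build_trie(codes)), "010110", 4))
```

Halving the number of table lookups per codeword, and replacing per-bit branching with indexed access into a flat array, is what makes this attractive on resource-constrained decoders.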
Institute of Scientific and Technical Information of China (English)
龙敏; 谭丽
2011-01-01
Using chaos-based weight variation of the Huffman tree, an image/video encryption algorithm is proposed in this paper. In the entropy coding process, the DC coefficients are encrypted by weight variation of the Huffman tree driven by a double Logistic chaotic sequence (the tree structure is left unchanged while the path values are varied), and the AC coefficients are encrypted through the codeword indexes. The security, complexity, and compression ratio of the algorithm are analyzed. Simulation results show that the algorithm has essentially no impact on compression efficiency and has low computational complexity, high security, and good real-time performance. It is therefore suitable for real-time image services on networks.
Huffman编码在矢量地图压缩中的应用%Huffman Coding and Applications in Compression for Vector Maps
Institute of Scientific and Technical Information of China (English)
刘兴科; 陈轲; 于晓光
2014-01-01
Huffman coding is a statistical coding method widely used in lossless data compression. This paper studies the principle and implementation of Huffman coding and applies it to the compression of vector map data. Considering the characteristics of vector maps, a concrete Huffman coding algorithm and the steps for compression and decompression are proposed, and the desirable properties of the algorithm for vector map compression are discussed. An experiment illustrates the principle and process of Huffman data compression, and tests on a set of real vector maps verify that the proposed algorithm effectively compresses vector map data, with the advantages of being lossless, efficient, high in compression ratio, and widely applicable.
Research on Packet Marking Algorithm Based on Huffman Code%基于 Huffman 编码的包标记算法研究
Institute of Scientific and Technical Information of China (English)
李明珍; 覃运初; 唐凤仙
2015-01-01
The key to defending against DDoS attacks is locating the attack source, and packet marking is a hot topic in attack source location research. Aiming at the problems of traditional probabilistic packet marking, an improved algorithm is proposed. It chooses the options field of the IPv4 datagram header as the marking area and uses Huffman coding to compress the marking information, reducing the number of marked packets required for path reconstruction. Using the IPv6 tunnel mode, a copy operation is added when passing from an IPv4 to an IPv6 network, transferring the marking information to the IPv6 hop-by-hop extension header and thus widening the algorithm's applicability. Experimental results show that the improved algorithm is fast, accurate, and efficient: it can complete path reconstruction with a single datagram and is applicable to both IPv4 and IPv6 networks.
Datta, Jinia; Chowdhuri, Sumana; Bera, Jitendranath
2016-12-01
This paper presents a novel scheme for remote condition monitoring of a multi-machine system, in which secured and coded induction machine data with different parameters are communicated between state-of-the-art dedicated hardware units (DHUs) installed at the machine terminals and centralized PC-based machine data management (MDM) software. The DHUs are built to acquire different parameters from their respective machines, and are therefore placed at nearby panels so as to acquire the parameters cost-effectively during running condition. The MDM software collects these data through a communication channel in which all the DHUs are networked using the RS485 protocol. Before transmission, the parameter data are processed with differential pulse code modulation (DPCM) and Huffman coding, and further encrypted with a private key, a different key being used for each DHU. In this way data security is maintained along the communication channel to prevent third-party attacks. The hybrid of DPCM and Huffman coding is chosen to reduce the data packet length. A MATLAB-based simulation and a practical implementation using DHUs at three machine terminals (one healthy three-phase, one healthy single-phase, and one faulty three-phase machine) prove the scheme's efficacy and usefulness for condition-based maintenance of a multi-machine system. The data at the central control room are decrypted and decoded using the MDM software. In this work it is observed that channel efficiency with respect to the different parameter measurements increases considerably.
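Why the DPCM-plus-Huffman hybrid shortens packets can be seen from the empirical entropy of the differenced signal. A minimal sketch under assumed data (the `readings` values are invented here; the paper's parameters differ):

```python
import math
from collections import Counter
from itertools import accumulate

def dpcm(samples):
    """Differential pulse-code modulation: keep the first sample as a
    reference, then store successive differences. Slowly varying signals
    give small, highly repetitive differences."""
    return [samples[0]] + [b - a for a, b in zip(samples, samples[1:])]

def undo_dpcm(diffs):
    """Inverse DPCM: cumulative sum restores the original samples."""
    return list(accumulate(diffs))

def entropy(values):
    """Empirical entropy in bits/sample, a lower bound (within 1 bit)
    on the average Huffman codeword length."""
    freq, n = Counter(values), len(values)
    return -sum(c / n * math.log2(c / n) for c in freq.values())

readings = [100, 101, 102, 102, 103, 104, 105, 105, 106, 107]
diffs = dpcm(readings)
# Exclude the raw reference sample when comparing entropies.
print(entropy(readings), entropy(diffs[1:]))
```

The differences concentrate on a few small values, so the Huffman stage assigns them short codewords and the packet shrinks; DPCM alone changes no sizes, and Huffman alone gains little on near-uniform raw readings.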
Khina, Anatoly
2016-08-15
We consider the problem of stabilizing an unstable plant driven by bounded noise over a digital noisy communication link, a scenario at the heart of networked control. To stabilize such a plant, one needs real-time encoding and decoding with an error probability profile that decays exponentially with the decoding delay. The works of Schulman and Sahai over the past two decades have developed the notions of tree codes and anytime capacity, and provided the theoretical framework for studying such problems. Nonetheless, there has been little practical progress in this area due to the absence of explicit constructions of tree codes with efficient encoding and decoding algorithms. Recently, linear time-invariant tree codes were proposed to achieve the desired result under maximum-likelihood decoding. In this work, we take one more step towards practicality, by showing that these codes can be efficiently decoded using sequential decoding algorithms, up to some loss in performance (and with some practical complexity caveats). We supplement our theoretical results with numerical simulations that demonstrate the effectiveness of the decoder in a control system setting.
哈夫曼算法及其应用研究%Research and Application of Huffman Algorithm
Institute of Scientific and Technical Information of China (English)
张荣梅
2013-01-01
This paper first analyzes the Huffman algorithm and gives an implementation method for it. Applications of the Huffman algorithm to compression coding, decision trees, and the optimal merge tree in external file sorting are then discussed.
Institute of Scientific and Technical Information of China (English)
杨多星; 刘蕴红
2011-01-01
Most widely used compression encoding methods are implemented via a Huffman tree, which entails many operations centered on the tree. To simplify the encoding process, a variable-length optimal encoding method that needs no Huffman tree is proposed: through a probability compensation process, the optimal code length of every source symbol can be obtained directly. Once the code lengths and probabilities are known, the final codes can also be determined without a Huffman tree, and the result can be proved to satisfy the variable-length optimal coding theorem and to be a prefix code. Tests show that the method obtains the variable-length optimal coding quickly and effectively and simplifies the computation and storage involved in variable-length encoding.
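That codewords can be assigned from code lengths alone, with no tree, is the idea behind canonical Huffman codes. The sketch below shows the standard canonical assignment (this is the textbook construction, not necessarily the probability-compensation method of the paper):

```python
def canonical_codes(lengths):
    """Assign codewords from code lengths alone: sort symbols by
    (length, symbol) and hand out consecutive binary values,
    left-shifting whenever the length increases. The result is
    prefix-free whenever the lengths satisfy the Kraft inequality."""
    code, prev_len, out = 0, 0, {}
    for sym, ln in sorted(lengths.items(), key=lambda kv: (kv[1], kv[0])):
        code <<= ln - prev_len          # move down to the next level
        out[sym] = format(code, f"0{ln}b")
        code += 1
        prev_len = ln
    return out

print(canonical_codes({"a": 1, "b": 3, "c": 3, "d": 3, "e": 3}))
```

Because the whole code set is recoverable from the length list, only the lengths need be stored or transmitted, which is exactly the memory saving the abstract describes.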
On adaptive Huffman coding based on Look-up table%基于查找表的自适应Huffman编码算法
Institute of Scientific and Technical Information of China (English)
雒莎; 葛海波
2011-01-01
Considering that existing Huffman coding algorithms are not efficient, an adaptive Huffman coding algorithm based on a look-up table is proposed, which encodes the data according to dynamically changing tables. With this algorithm, a character appearing for the first time is encoded with the codeword of a special "KEY" symbol, after which "KEY" is moved down to await the next first-appearing character. Compared with other algorithms, the proposed algorithm makes Huffman coding run more efficiently.
与Huffman码相结合的卷积码软判决译码方案%Soft Decoding Scheme of Convolution Code Combined with Huffman Coding
Institute of Scientific and Technical Information of China (English)
郭东亮; 陈小蔷; 吴乐南
2002-01-01
This paper proposes a modification of the soft output Viterbi algorithm (SOVA) that combines convolutional coding with Huffman coding. The idea is to extract bit probability information from the Huffman coding and use it to compute a priori source information, which can be exploited when the channel environment is bad, i.e., when the branch metrics of Viterbi decoding are too close for reliable decisions. The suggested scheme requires no changes on the transmitter side. Compared with separate source and channel decoding, the gain in signal-to-noise ratio is about 0.5-1.0 dB with little added complexity. Simulation results show that the suggested algorithm is effective.
DEFF Research Database (Denmark)
Martins, Bo; Forchhammer, Søren
1998-01-01
Presently, sequential tree coders are the best general-purpose bilevel image coders and the best coders of halftoned images. The current ISO standard, Joint Bilevel Image Experts Group (JBIG), is a good example. A sequential tree coder encodes the data by feeding estimates of conditional probabilities to an arithmetic coder. The conditional probabilities are estimated from co-occurrence statistics of past pixels, and the statistics are stored in a tree. By organizing the code length calculations properly, a vast number of possible models (trees) reflecting different pixel orderings can […] The coder, though one order of magnitude slower than JBIG, obtains excellent and highly robust compression performance. A multipass free tree coding scheme produces superior compression results for all test images. A multipass free template coding scheme produces significantly better results than JBIG for difficult […]
Modified adaptive Huffman coding algorithm for wireless sensor network%无线传感网改进型自适应Huffman编码算法
Institute of Scientific and Technical Information of China (English)
许磊; 李千目; 朱保平
2013-01-01
To reduce the volume of transmitted data, a modified adaptive Huffman coding algorithm is proposed for wireless sensor network (WSN) nodes with limited computational resources. Two groups of test data from Porcupines, provided with the tailored-tree adaptive Huffman coding algorithm, were selected as experimental data. Simulation tests were run on TOSSIM, provided by TinyOS, with the algorithm implemented in C++. The results show that, compared with the tailored-tree adaptive Huffman coding algorithm, memory usage is the same, but the compression ratios on the two data sets improve by 8% and 12%, respectively.
Institute of Scientific and Technical Information of China (English)
施鹏; 李敏; 于涛; 赵利强; 王建林
2013-01-01
An XML data compression method based on Huffman coding is proposed for the problem that, under a given network bandwidth, a production process report system accesses large data sources at a low rate. A data processing class is constructed to extract the frequently repeated node units from the XML document; Huffman coding encodes these node units, and the encoded document is then compressed with the LZMA algorithm. The resulting Huffman-LZMA algorithm avoids the traditional XML compressors' dependence on a document type definition and an XML parser, and achieves a good compression effect. Applied to the design of a production process report system, it reached a compression ratio of about 88% on the report data, effectively saving network bandwidth and storage space and improving the report access rate.
Implementation of Huffman Decoder on Fpga
Directory of Open Access Journals (Sweden)
Safia Amir Dahri
2016-01-01
Full Text Available Lossless data compression algorithms are the most widely used algorithms in data transmission, reception, and storage systems, increasing data rate and speed and saving space on storage devices. Nowadays, different algorithms are implemented in hardware to gain the benefits of hardware realization. Hardware implementation of algorithms, digital signal processing algorithms, and filter realizations is done on programmable devices, i.e., FPGAs. Among lossless data compression algorithms, the Huffman algorithm is the most widely used because of its variable-length coding and many other benefits. Huffman algorithms are used in many software applications, e.g., Zip and Unzip, communication, etc. In this paper, a Huffman decoder is implemented on a Xilinx Spartan 3E board. The FPGA is programmed with the Xilinx tool Xilinx ISE 8.2i. The design is written in VHDL, and text data previously encoded by a Huffman algorithm is decoded by the Huffman decoder on the hardware board. To visualize the output clearly in waveforms, the same code is simulated in ModelSim v6.4. The Huffman decoder is also implemented in MATLAB to verify its operation. The FPGA is a configurable device that is efficient in all aspects. Huffman algorithms are applied in text applications, image processing, video streaming, and many other applications.
哈夫曼树Huffman构成原理应用及其数学证明%Application of the Huffman Tree Principle and its Mathematical Proof
Institute of Scientific and Technical Information of China (English)
江忠
2016-01-01
The Huffman tree, also known as the optimal binary tree, is the binary tree with the shortest weighted path length for a given set of weights. The weighted path length of a tree is the sum, over all leaf nodes, of each leaf's weight multiplied by its path length to the root (if the root is at level 0, a leaf's path length to the root equals its level number). The weighted path length of a binary tree can be written as WPL = W1*L1 + W2*L2 + W3*L3 + … + Wn*Ln, where the n weights Wi (i = 1, 2, …, n) form a binary tree with n leaf nodes whose corresponding path lengths are Li (i = 1, 2, …, n). It can be proved that the WPL of the Huffman tree is the smallest.
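The WPL can be computed without materializing the tree: during Huffman construction, each merged node's combined weight is counted exactly once per level its leaves sink, so summing the merged weights yields the WPL directly. A short sketch of this standard identity:

```python
import heapq

def huffman_wpl(weights):
    """WPL = sum of w_i * l_i over all leaves. Each Huffman merge adds
    its combined weight once, and a leaf's weight is included in exactly
    l_i merges, so the running total of merged weights equals the WPL."""
    heap = list(weights)          # copy; don't mutate the caller's list
    heapq.heapify(heap)
    wpl = 0
    while len(heap) > 1:
        w = heapq.heappop(heap) + heapq.heappop(heap)
        wpl += w
        heapq.heappush(heap, w)
    return wpl

print(huffman_wpl([5, 2, 2, 1, 1]))
```

For the weights 5, 2, 2, 1, 1 the merges are 1+1=2, 2+2=4, 4+2=6, 6+5=11, giving WPL = 2+4+6+11 = 23, which matches the leaf-depth formula for the corresponding tree.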
Evaluation of Huffman and Arithmetic Algorithms for Multimedia Compression Standards
Shahbahrami, Asadollah; Rostami, Mobin Sabbaghi; Mobarhan, Mostafa Ayoubi
2011-01-01
Compression is a technique to reduce the quantity of data without excessively reducing the quality of the multimedia data. The transmission and storage of compressed multimedia data is much faster and more efficient than that of the original uncompressed data. There are various techniques and standards for multimedia data compression, especially for image compression, such as the JPEG and JPEG2000 standards. These standards consist of different functions such as color space conversion and entropy coding. Arithmetic and Huffman coding are normally used in the entropy coding phase. In this paper we try to answer the following question: which entropy coding, arithmetic or Huffman, is more suitable from the compression ratio, performance, and implementation points of view? We have implemented and tested Huffman and arithmetic algorithms. Our results show that the compression ratio of arithmetic coding is better than that of Huffman coding, while the performance of Huffman coding is higher than that of arithmetic coding.
A high capacity MP3 steganography based on Huffman coding%基于Huffman编码的大容量MP3隐写算法
Institute of Scientific and Technical Information of China (English)
严迪群; 王让定; 张力光
2011-01-01
A high-capacity steganography method for MP3 audio is proposed in this paper. According to the characteristics of Huffman coding, the codewords in the Huffman tables are first classified to ensure that the embedding operation does not change the bitstream structure of the MP3 standard. Secret data are then embedded by replacing the corresponding codewords, with an embedding strategy based on a multiple-base notation system. The bitstream structure and the size of the cover audio are kept unchanged after embedding. Simulation results under both binary and multiple-base modes show that the multiple-base mode achieves a higher hiding capacity and better efficiency than the binary case, while imperceptibility is also well maintained.
Lia, Cesario; Carraro, Giovanni; Chiosi, Cesare; Voli, Marco
1998-01-01
In this report we describe a parallel implementation of a Tree-SPH code realized using the SHMEM libraries on the Cray T3E supercomputer at CINECA. We show the results of a 3D test that checks the code's performance against its scalar version. Finally, we compare the load balancing and scalability of the code with PTreeSPH (Davé et al. 1997), the only other parallel Tree-SPH code in the literature.
Institute of Scientific and Technical Information of China (English)
Nishikanta Khandai; J. S. Bagla
2009-01-01
We discuss the performance characteristics of using the modification of the tree code suggested by Barnes in the context of the TreePM code. The optimization involves identifying groups of particles and using only one tree walk to compute the force for all the particles in the group. This modification has been in use in our implementation of the TreePM code for some time, and has also been used by others in codes that make use of tree structures. We present the first detailed study of the performance characteristics of this optimization. We show that the modification, if tuned properly, can speed up the TreePM code by a significant amount. We also combine this modification with the use of individual time steps and indicate how to combine these two schemes in an optimal fashion. We find that the combination is at least a factor of two faster than the modified TreePM without individual time steps. Overall performance is often faster by a larger factor because the scheme for the groups optimizes the use of cache for large simulations.
BTREE: A FORTRAN Code for B+ Tree.
2014-09-26
BTREE: A FORTRAN Code for a B+ Tree. Final report, Fiscal Year 85 (NSWC/TR 85-5), Naval Surface Weapons Center, Silver Spring, MD. Keywords: B+ Tree, Database Manager, Node, Leaf, Root.
Institute of Scientific and Technical Information of China (English)
黄荣辉; 周明天; 曾家智
2000-01-01
This thesis analyzes the characteristics of data packets in a power-line computer network and discusses data compression methods currently studied abroad. After briefly describing the different Huffman coding algorithms, it presents data compression results obtained by testing data packets in a power-line computer network. The results show that the advanced dynamic Huffman coding method is the better choice for power-line computer networks. Finally, methods for improving its operation in engineering practice are proposed.
Institute of Scientific and Technical Information of China (English)
李伟生; 李域; 王涛
2005-01-01
Huffman coding, as an efficient variable-length coding technique, is being applied ever more widely in text, image, and video compression, in communications, in cryptography, and in other fields. To use memory more effectively and simplify the encoding steps and related operations, this paper first studies the information needed to rebuild a Huffman tree and proposes obtaining that information through operations on a one-dimensional structure array. Using this information, together with the coding properties of the canonical Huffman tree defined here, the Huffman codes can then be obtained directly. Compared with the traditional Huffman algorithm and the improved algorithms proposed in recent domestic and foreign literature, this method does not construct a Huffman tree, which greatly reduces memory requirements and makes the encoding steps and related operations more concise; it is therefore easier to implement and port. More importantly, this approach opens a new avenue for research on and development of the Huffman algorithm.
How to Make up the Unique Huffman Tree and Huffman Code%如何构造唯一的huffman树及唯一的huffman编码
Institute of Scientific and Technical Information of China (English)
王森
2003-01-01
This paper discusses how, under certain special conditions, to construct a Huffman tree and make that tree unique, and how to derive Huffman codes from the unique Huffman tree so that each Huffman code represents a unique unit of information.
Institute of Scientific and Technical Information of China (English)
李芳; 熊英; 唐斌
2013-01-01
A new method is presented to improve the identification rate of radar jamming, identifying radar pull-off (deception) jamming on the basis of a Huffman tree and a backward cloud model. First, a library of effective identification feature parameters is built from the jamming library; then an identification model based on a Huffman tree is established. Finally, degrees of membership derived from the backward cloud model are used to classify the jamming at each node of the tree. Compared with traditional methods, the presented method deals well with the randomness and fuzziness of jamming caused by noise, and identifies jamming effectively even when the parameter value ranges partially overlap.
Bédorf, Jeroen; Zwart, Simon Portegies
2012-01-01
We present a gravitational hierarchical N-body code that is designed to run efficiently on Graphics Processing Units (GPUs). All parts of the algorithm are executed on the GPU which eliminates the need for data transfer between the Central Processing Unit (CPU) and the GPU. Our tests indicate that the gravitational tree-code outperforms tuned CPU code for all parts of the algorithm and show an overall performance improvement of more than a factor 20, resulting in a processing rate of more than 2.8 million particles per second.
基于Huffman编码的多媒体加密技术研究%Research of Multimedia Encryption Based on Huffman Codeing
Institute of Scientific and Technical Information of China (English)
李莉萍; 吴蒙
2011-01-01
With the increasingly wide use of multimedia information on mobile and handheld devices, research has turned to multimedia encryption techniques of low complexity and modest hardware requirements. Since many audio and video file formats (such as MPEG4, JPEG, and MP3) use Huffman coding, low-complexity multimedia encryption based on Huffman coding has gradually come into view. This paper first introduces the earliest proposed encryption technique based on multiple Huffman tables, then analyzes its security against ciphertext-only, known-plaintext, and chosen-plaintext attacks, and finally, in view of its security problems, suggests an improved Huffman encryption scheme.
Institute of Scientific and Technical Information of China (English)
魏佳圆; 温媛媛; 周诠
2015-01-01
This paper proposes a lossless compression algorithm for binary images that combines run-length coding and Huffman coding. The algorithm was tested on different images, and the experimental results indicate that it performs well on images with clear blocks and little texture. Moreover, the algorithm is easy to implement and has practical value in binary image applications.
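The run-length stage can be sketched as follows; feeding the resulting run lengths to a Huffman coder completes the scheme. The white-run-first convention below is an assumption borrowed from fax-style coders, not necessarily the paper's:

```python
from itertools import groupby

def rle(bits):
    """Run-length encode a binary row. Each row is assumed to start with
    a white (0) run, so only run lengths need be stored; a 0-length
    first run marks rows that begin with black (1)."""
    runs = [len(list(g)) for _, g in groupby(bits)]
    if bits and bits[0] == 1:
        runs = [0] + runs
    return runs

def rld(runs):
    """Run-length decode, alternating colours starting from white."""
    out, colour = [], 0
    for r in runs:
        out += [colour] * r
        colour ^= 1
    return out

row = [0, 0, 0, 1, 1, 0, 0, 0, 0, 1]
print(rle(row))  # [3, 2, 4, 1]
```

On images with large uniform blocks the runs are long and few, and their skewed length distribution is exactly what the subsequent Huffman stage exploits; heavily textured images produce many short runs and compress poorly, matching the abstract's observation.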
MP3 Steganalysis Based on Huffman Code Table Index%基于Huffman码表索引的MP3Stego隐写分析方法
Institute of Scientific and Technical Information of China (English)
陈益如; 王让定; 严迪群
2012-01-01
MP3Stego is a typical steganographic algorithm for MP3 audio. By analyzing the influence of MP3Stego on the inner loop of the MP3 encoder, it is found that the Huffman table index values change to different degrees after embedding. On this basis, the Huffman table index values are extracted from the decoding parameters of the MP3 audio under test, and their second-order difference is calculated as the steganalysis feature; an SVM classifier is then used to separate cover and stego MP3 audio. Experimental results show that the extracted feature effectively reveals the traces left by MP3Stego at different embedding rates.
New code match strategy for wideband code division multiple access code tree management
Institute of Scientific and Technical Information of China (English)
Anonymous
2006-01-01
Orthogonal variable spreading factor (OVSF) channelization codes are widely used to provide variable data rates for supporting different bandwidth requirements in wideband code division multiple access (WCDMA) systems. A new code match scheme for WCDMA code tree management is proposed. The scheme is similar to the existing crowded-first scheme, but when choosing a code for a user it compares only the layer immediately above the allocated codes, unlike the crowded-first scheme, which may compare all upper layers. The code match operation is therefore simple, and the average time delay is decreased by 5.1%. The simulation results also show that the code match strategy decreases the average code blocking probability by 8.4%.
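For background, the OVSF code tree itself can be generated recursively: each code of spreading factor SF spawns two mutually orthogonal children of factor 2*SF. A minimal sketch (illustrative only; the proposed match scheme is not reproduced here):

```python
def ovsf_children(code):
    """Each OVSF code c spawns the two children (c, c) and (c, -c)."""
    return [code + code, code + [-x for x in code]]

def ovsf_layer(sf):
    """All OVSF codes of spreading factor sf (sf must be a power of two)."""
    layer = [[1]]
    while len(layer[0]) < sf:
        layer = [child for code in layer for child in ovsf_children(code)]
    return layer

codes = ovsf_layer(4)
# all codes at one layer are mutually orthogonal
for i, a in enumerate(codes):
    for b in codes[i + 1:]:
        assert sum(x * y for x, y in zip(a, b)) == 0
```

Code allocation schemes such as crowded-first operate on this tree, since assigning a code blocks its ancestors and descendants.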
Modified 8×8 quantization table and Huffman encoding steganography
Guo, Yongning; Sun, Shuliang
2014-10-01
A new secure steganography method, based on Huffman encoding and modified quantized discrete cosine transform (DCT) coefficients, is presented in this paper. First, the cover image is segmented into 8×8 blocks and a modified DCT transformation is applied to each block. Huffman encoding is applied to the secret image before embedding. The DCT coefficients are quantized by a modified quantization table, the inverse DCT (IDCT) is conducted on each block, and all blocks are combined to produce the final stego image. The experiments show that the proposed method outperforms the DCT-based method and Mahender Singh's method in PSNR and capacity.
KODE HUFFMAN UNTUK KOMPRESI PESAN (Huffman Codes for Message Compression)
Directory of Open Access Journals (Sweden)
Erna Zuni Astuti
2013-05-01
Full Text Available In data communication, a message sent to someone is often too large, and so requires correspondingly large storage. A large message also takes longer to transmit than a relatively smaller one. Both problems can be addressed by encoding the message so that its content is represented as compactly as possible, making transmission relatively faster and storage relatively more efficient than before encoding. From experiments applying and evaluating Huffman codes, it can be concluded that Huffman coding can reduce the load, i.e. compress the data, by more than 50%. Keywords: Huffman codes, message compression, communication
Performance Improvement Of Bengali Text Compression Using Transliteration And Huffman Principle
Directory of Open Access Journals (Sweden)
Md. Mamun Hossain
2016-09-01
Full Text Available In this paper, we propose a new compression technique based on transliteration of Bengali text to English. Compared to Bengali, English is a less symbolic language, so transliteration of Bengali text to English reduces the number of characters to be coded. Huffman coding is well known for producing optimal compression. When the Huffman principle is applied to the transliterated text, significant performance improvement is achieved in terms of decoding speed and space requirements compared to Unicode compression.
A SORT-ONCE AND DYNAMIC ENCODING (SODE) BASED HUFFMAN CODING ALGORITHM%基于一次排序动态编码的Huffman编码算法
Institute of Scientific and Technical Information of China (English)
刘燕清; 龚声蓉
2009-01-01
Huffman coding, an efficient variable-length coding technique, is increasingly applied to text, image and video compression, storage and communication. To improve time and space efficiency and to simplify the coding procedure, this paper first reviews the traditional Huffman coding algorithm and then proposes a Huffman coding algorithm based on a single sort and dynamic encoding (SODE). Compared with the traditional Huffman algorithm and the improved algorithms reported in the recent literature, the method simplifies tree construction to linear encoding. With similar space complexity, it clearly reduces time complexity, and the coding steps and related operations are simpler, making the algorithm easier to implement and port. Experimental results verify the effectiveness of the algorithm.
Bi-level image compression with tree coding
DEFF Research Database (Denmark)
Martins, Bo; Forchhammer, Søren
1996-01-01
Presently, tree coders are the best bi-level image coders. The current ISO standard, JBIG, is a good example. By organising code length calculations properly a vast number of possible models (trees) can be investigated within reasonable time prior to generating code. Three general-purpose coders...... version that without sacrificing speed brings it close to the multi-pass coders in compression performance...
Efficient coding of wavelet trees and its applications in image coding
Zhu, Bin; Yang, En-hui; Tewfik, Ahmed H.; Kieffer, John C.
1996-02-01
We propose in this paper a novel lossless tree coding algorithm. The technique is a direct extension of the bisection method, the simplest case of the complexity reduction method proposed recently by Kieffer and Yang, that has been used for lossless data string coding. A reduction rule is used to obtain the irreducible representation of a tree, and this irreducible tree is entropy-coded instead of the input tree itself. This reduction is reversible, and the original tree can be fully recovered from its irreducible representation. More specifically, we search for equivalent subtrees from top to bottom. When equivalent subtrees are found, a special symbol is appended to the value of the root node of the first equivalent subtree, the root node of the second subtree is assigned the index which points to the first subtree, and all other nodes in the second subtree are removed. This procedure is repeated until the tree cannot be reduced further, which yields the irreducible tree or irreducible representation of the original tree. The proposed method can effectively remove the redundancy in an image and results in more efficient compression. It is proved that when the tree size approaches infinity, the proposed method offers the optimal compression performance, and it is generally more efficient in practice than direct coding of the input tree. The proposed method can be directly applied to code wavelet trees in non-iterative wavelet-based image coding schemes. A modified method is also proposed for coding wavelet zerotrees in embedded zerotree wavelet (EZW) image coding. Although its coding efficiency is slightly reduced, the modified version maintains exact control of bit rate and the scalability of the bit stream in EZW coding.
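The detection of equivalent subtrees can be illustrated with a canonical serialization; this toy sketch shows only the detection idea, not the authors' reduction rule or entropy coder:

```python
def canon(node):
    """Canonical form of a (label, children) tree: equal subtrees
    map to equal tuples, so they can be found with a set or dict."""
    label, children = node
    return (label, tuple(canon(c) for c in children))

def count_unique_subtrees(node, seen=None):
    """Number of distinct subtrees; repeats need be coded only once."""
    if seen is None:
        seen = set()
    seen.add(canon(node))
    for child in node[1]:
        count_unique_subtrees(child, seen)
    return len(seen)

# two identical subtrees under the root: only one copy carries information
t = ('r', [('a', [('b', []), ('b', [])]),
           ('a', [('b', []), ('b', [])])])
assert count_unique_subtrees(t) == 3   # the root, the 'a' subtree, the 'b' leaf
```

In the paper's scheme the second occurrence is replaced by an index pointing at the first, which is what makes the representation irreducible yet reversible.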
A Very Fast and Momentum-Conserving Tree Code
Dehnen, W
2000-01-01
The tree code for the approximate evaluation of gravitational forces is extended and substantially accelerated by including mutual cell-cell interactions. These are computed by a Taylor series in Cartesian coordinates and in a completely symmetric fashion, such that Newton's third law is satisfied by construction and hence momentum is exactly conserved. The computational effort is further reduced by exploiting the mutual symmetry of the interactions. For typical astrophysical problems with N=10^5 and at the same level of accuracy, the new code is about four times faster than the tree code. For large N, the computational costs are found to scale almost linearly with N, which can also be supported by a theoretical argument, and the advantage over the tree code increases with ever larger N.
A Parallel Tree-SPH code for Galaxy Formation
Lia, Cesario; Carraro, Giovanni
1999-01-01
We describe a new implementation of a parallel Tree-SPH code with the aim to simulate Galaxy Formation and Evolution. The code has been parallelized using SHMEM, a Cray proprietary library to handle communications between the 256 processors of the Silicon Graphics T3E massively parallel supercomputer hosted by the Cineca Supercomputing Center (Bologna, Italy). The code combines the Smoothed Particle Hydrodynamics (SPH) method to solve hydro-dynamical equations with the popular Barnes and Hut (1986) tree-code to perform gravity calculation with a NlogN scaling, and it is based on the scalar Tree-SPH code developed by Carraro et al (1998) [MNRAS 297, 1021]. Parallelization is achieved by distributing particles along processors according to a work-load criterion. Benchmarks of the code, in terms of load-balance and scalability, are analyzed and critically discussed against the adiabatic collapse of an isothermal gas sphere test using 20,000 particles on 8 processors. The code is balanced at more than the 95% level. ...
Institute of Scientific and Technical Information of China (English)
杨敬锋; 张南峰; 李勇; 薛月菊; 吕伟; 何堃
2014-01-01
To solve the transmission problem of agricultural machinery operation-state data in poor communication environments caused by the unbalanced coverage of mobile communication base stations, a data filtering and compression method based on an improved Huffman coding technique was proposed for data selection, compression, transmission, parsing, and decompression. The agricultural machinery operation data types, exchange mode, and compression mode were first defined, and data collection and exchange were realized with a Compass/GPS dual-mode state-data collection terminal. Compression and decompression tests show that, with a data collection period of 5 s and a data length of 918.38 kb, the improved Huffman algorithm compressed the data to 412.56 kb, 86 kb less than the 498.56 kb of the traditional Huffman algorithm under the same conditions, raising the compression ratio from 45.71% (traditional) to 55.08% (improved). With the traditional Huffman algorithm, the transmission error rate and packet loss rate were 2.47% and 4.18%, while under the same transmission requirements the filtered compressed transmission reduced them to 2.06% and 0.78%. The method meets the compression and transmission requirements of agricultural machinery operation-state data; with little data per packet and short transmission times it achieves low error and packet-loss rates, requires little computation, and has high compression efficiency, making it suitable for data transmission in agricultural machinery operating areas.
Adaptive zero-tree structure for curved wavelet image coding
Zhang, Liang; Wang, Demin; Vincent, André
2006-02-01
We investigate the issue of efficient data organization and representation of the curved wavelet coefficients [curved wavelet transform (WT)]. We present an adaptive zero-tree structure that exploits the cross-subband similarity of the curved wavelet transform. Whereas in the embedded zerotree wavelet (EZW) coder and in set partitioning in hierarchical trees (SPIHT) the parent-child relationship is defined in such a way that a parent has four children restricted to a square of 2×2 pixels, the parent-child relationship in the adaptive zero-tree structure varies according to the curves along which the curved WT is performed. Five child patterns were determined based on different combinations of curve orientation. A new image coder was then developed based on this adaptive zero-tree structure and the set-partitioning technique. Experimental results using synthetic and natural images showed the effectiveness of the proposed adaptive zero-tree structure for encoding of the curved wavelet coefficients. The coding gain of the proposed coder can be up to 1.2 dB in terms of peak SNR (PSNR) compared to the SPIHT coder. Subjective evaluation shows that the proposed coder preserves lines and edges better than the SPIHT coder.
Parallel TREE code for two-component ultracold plasma analysis
Jeon, Byoungseon; Kress, Joel D.; Collins, Lee A.; Grønbech-Jensen, Niels
2008-02-01
The TREE method has been widely used for long-range interaction N-body problems. We have developed a parallel TREE code for two-component classical plasmas with open boundary conditions and highly non-uniform charge distributions. The program efficiently handles millions of particles evolved over long relaxation times requiring millions of time steps. Appropriate domain decomposition and dynamic data management were employed, and large-scale parallel processing was achieved using an intermediate level of granularity of domain decomposition and ghost TREE communication. Even though the computational load is not fully distributed in fine grains, high parallel efficiency was achieved for ultracold plasma systems of charged particles. As an application, we performed simulations of an ultracold neutral plasma with a half million particles and a half million time steps. For the long temporal trajectories of relaxation between heavy ions and light electrons, large configurations of ultracold plasmas can now be investigated, which was not possible in past studies.
Video Coding Using 3D Dual-Tree Wavelet Transform
Directory of Open Access Journals (Sweden)
Vetro Anthony
2007-01-01
Full Text Available This work investigates the use of the 3D dual-tree discrete wavelet transform (DDWT) for video coding. The 3D DDWT is an attractive video representation because it isolates image patterns with different spatial orientations and motion directions and speeds in separate subbands. However, it is an overcomplete transform with 4:1 redundancy when only real parts are used. We apply the noise-shaping algorithm proposed by Kingsbury to reduce the number of coefficients. To code the remaining significant coefficients, we propose two video codecs. The first one applies separate 3D set partitioning in hierarchical trees (SPIHT) on each subset of the DDWT coefficients (each forming a standard isotropic tree). The second codec exploits the correlation between redundant subbands, and codes the subbands jointly. Both codecs do not require motion compensation and provide better performance than the 3D SPIHT codec using the standard DWT, both objectively and subjectively. Furthermore, both codecs provide full scalability in spatial, temporal, and quality dimensions. Besides the standard isotropic decomposition, we propose an anisotropic DDWT, which extends the superiority of the normal DDWT with more directional subbands without adding to the redundancy. This anisotropic structure requires significantly fewer coefficients to represent a video after noise shaping. Finally, we also explore the benefits of combining the 3D DDWT with the standard DWT to capture a wider set of orientations.
基于Matlab文本文件哈夫曼编解码仿真%Simulation of Huffman codec of text based on Matlab
Institute of Scientific and Technical Information of China (English)
王向鸿
2013-01-01
According to the current state of data compression technology, the Huffman variable-length compression codec method is described in this paper. To illustrate the process and characteristics of Huffman encoding and decoding, a Matlab simulation converts a priority queue into a binary tree and builds the encoding and decoding code table. Huffman codec simulation of a random English text file yields the probability, codeword, entropy, average length, redundancy, and the encoded and decoded sequences for each letter, demonstrating clear compression behavior.
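The priority-queue-to-binary-tree construction described above is the standard Huffman procedure; a minimal sketch in Python (the paper's simulation is in Matlab):

```python
import heapq

def huffman_code_table(freqs):
    """Build a Huffman tree from a {symbol: weight} dict with a priority
    queue, then read off the code table by walking root to leaves."""
    heap = [(w, i, sym) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    n = len(heap)
    if n == 1:                       # degenerate single-symbol alphabet
        return {heap[0][2]: '0'}
    while len(heap) > 1:             # merge the two lightest nodes
        w1, _, left = heapq.heappop(heap)
        w2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (w1 + w2, n, (left, right)))
        n += 1
    def walk(node, prefix, table):
        if isinstance(node, tuple):  # internal node: recurse with 0/1
            walk(node[0], prefix + '0', table)
            walk(node[1], prefix + '1', table)
        else:
            table[node] = prefix
        return table
    return walk(heap[0][2], '', {})

table = huffman_code_table({'e': 5, 't': 3, 'a': 1, 'o': 1})
assert len(table['e']) <= len(table['a'])   # frequent symbols get short codes
```

From such a table the average length, entropy and redundancy reported in the paper follow directly from the symbol probabilities.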
Potent Tree Codes and their applications: Coding for Interactive Communication, revisited
Gelles, Ran
2011-01-01
We study the fundamental problem of reliable interactive communication over a noisy channel. In a breakthrough sequence of papers published in 1992 and 1993, Schulman gave non-constructive proofs of the existence of general methods to emulate any two-party interactive protocol such that: (1) the emulation protocol takes a constant-factor longer than the original protocol, and (2) if the emulation protocol is executed over a noisy channel, then the probability that the emulation protocol fails is exponentially small in the total length of the protocol. Unfortunately, Schulman's emulation procedures either only work in a model with a large amount of shared randomness, or are non-constructive in that they rely on the existence of good tree codes. The only known proofs of the existence of good tree codes are non-constructive, and finding an explicit construction remains an important open problem. Indeed, randomly generated tree codes are not good tree codes with overwhelming probability. In this work, we revisit ...
A modified tree code: Don't laugh; It runs
Barnes, Joshua E.
1990-03-01
I describe a modification of the Barnes-Hut tree algorithm together with a series of numerical tests of this method. The basic idea is to improve the performance of the code on heavily vector-oriented machines such as the Cyber 205 by exploiting the fact that nearby particles tend to have very similar interaction lists. By building an interaction list good everywhere within a cell containing a modest number of particles and reusing this interaction list for each particle in the cell in turn, the balance of computation can be shifted from recursive descent to force summation. Instead of vectorizing tree descent, this scheme simply avoids it in favor of force summation, which is quite easy to vectorize. A welcome side-effect of this modification is that the force calculation, which now treats a larger fraction of the local interactions exactly, is significantly more accurate than the unmodified method.
Directory of Open Access Journals (Sweden)
Vinay U. Kale
2010-05-01
Full Text Available This paper proposes a technique for image compression which uses the Wavelet-based Image/Texture Coding Hybrid (WITCH) scheme [1] in combination with a Huffman encoder. It implements a hybrid coding approach, while nevertheless preserving the features of progressive and lossless coding. The hybrid scheme was designed to encode the structural image information by the Embedded Zerotree Wavelet (EZW) encoding algorithm [2] and the stochastic texture in a model-based manner, and this encoded data is then compressed using a Huffman encoder. The scheme proposed here achieves superior subjective quality while increasing the compression ratio by more than a factor of three or even four. With this technique, it is possible to achieve compression ratios as high as 10 to 12, but with some minor distortions in the encoded image.
Prefix Codes: Equiprobable Words, Unequal Letter Costs
Golin, Mordecai; Young, Neal E.
2002-01-01
Describes a near-linear-time algorithm for a variant of Huffman coding, in which the letters may have non-uniform lengths (as in Morse code), but with the restriction that each word to be encoded has equal probability. [See also ``Huffman Coding with Unequal Letter Costs'' (2002).]
Bode, Paul; Ostriker, Jeremiah P.
2003-01-01
An improved implementation of an N-body code for simulating collisionless cosmological dynamics is presented. TPM (Tree-Particle-Mesh) combines the PM method on large scales with a tree code to handle particle-particle interactions at small separations. After the global PM forces are calculated, spatially distinct regions above a given density contrast are located; the tree code calculates the gravitational interactions inside these denser objects at higher spatial and temporal resolution. The new implementation includes individual particle time steps within trees, an improved treatment of tidal forces on trees, new criteria for higher force resolution and choice of time step, and parallel treatment of large trees. TPM is compared to P^3M and a tree code (GADGET) and is found to give equivalent results in significantly less time. The implementation is highly portable (requiring a Fortran compiler and MPI) and efficient on parallel machines. The source code can be found at http://astro.princeton.edu/~bode/TPM/
Exploration of Extreme Mass Ratio Inspirals with a Tree Code
Miller, Michael
Extreme mass ratio inspirals (EMRIs), in which a stellar-mass object spirals into a supermassive black hole, are critical gravitational wave sources for the Laser Interferometer Space Antenna (LISA) because of their potential as precise probes of strong gravity. They are also thought to contribute to the flares observed in a few active galactic nuclei that have been attributed to tidal disruption of stars. There are, however, large uncertainties about the rates and properties of EMRIs. The reason is that their galactic nuclear environments contain millions of stars around a central massive object, and their paths must be integrated with great precision to include properly effects such as secular resonances, which accumulate over many orbits. Progress is being made on all fronts, but current numerical options are either profoundly computationally intensive (direct N-body integrators, which in addition do not currently have the needed long-term accuracy) or require special symmetry or other simplifications that may compromise the realism of the results (Monte Carlo and Fokker-Planck codes). We propose to undertake extensive simulations of EMRIs using tree codes that we have adapted to the problem. Tree codes are much faster than direct N-body simulations, yet they are powerful and flexible enough to include nonideal physics such as triaxiality, arbitrary mass spectra, post-Newtonian corrections, and secular evolutionary effects such as resonant relaxation and Kozai oscillations in the equations of motion. We propose to extend our codes to include these effects and to allow separate tracking of special particles that will represent binaries, thus allowing us to follow their interactions and evolution. In our development we will compare our results for a few tens of thousands of particles with a state-of-the-art direct N-body integrator, to evaluate the accuracy of our code and discern systematic effects. This will allow detailed yet fast examinations of large-N systems.
A Multi-coordinate Linkage Interpolation Method with Huffman Code Tree%用Huffman树实现的多坐标联动插补算法
Institute of Scientific and Technical Information of China (English)
李志勇; 赵万生; 张勇
2003-01-01
The relative coordinate movements of a multi-axis linkage interpolation command are taken as the node weights of a tree, and an interpolation tree is built with the Huffman algorithm; each interpolation step searches the tree once using the point-by-point comparison method. The coordinate grouping based on the dynamic Huffman coding tree is optimal and gives the fastest interpolation. Taking the number of linked axes as the input, the time complexity of the algorithm is logarithmic. The algorithm is used in electrical discharge machining of integral shrouded turbine blades for aviation and rocket engines.
Efficient coding and decoding algorithm based on generalized Huffman tree%基于广义规范Huffman树的高效编解码算法
Institute of Scientific and Technical Information of China (English)
郭建光; 张卫杰; 杨健; 安文韬; 熊涛
2009-01-01
To reduce the time and space consumed in encoding and so support real-time processing, an efficient data compression algorithm based on a generalized canonical Huffman tree is proposed. The algorithm uses the level order and the probability-table order to guarantee unique encoding and decoding, replaces searching with sorted insertion, builds an index table to simplify the sorting operation, and incorporates the idea of balanced coding. A corresponding decoding algorithm is also presented. Verification on real data shows that, compared with the traditional Huffman algorithm, the algorithm improves time and space efficiency and yields more balanced codewords.
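For background, plain canonical Huffman coding assigns codewords from the code lengths alone, so only the lengths need be stored or transmitted; a standard sketch (the paper's generalized variant is not reproduced):

```python
def canonical_codes(lengths):
    """Assign canonical Huffman codewords from a {symbol: code length}
    dict. Symbols are ordered by (length, symbol); successive codes
    increment by one and shift left when the length grows."""
    syms = sorted(lengths, key=lambda s: (lengths[s], s))
    code, prev_len, table = 0, 0, {}
    for s in syms:
        code <<= (lengths[s] - prev_len)
        table[s] = format(code, '0{}b'.format(lengths[s]))
        prev_len = lengths[s]
        code += 1
    return table

assert canonical_codes({'a': 2, 'b': 2, 'c': 3, 'd': 3}) == \
    {'a': '00', 'b': '01', 'c': '100', 'd': '101'}
```

Because the codeword set is a deterministic function of the lengths, the decoder can rebuild the whole table from a few bytes, which is the memory saving that canonical schemes exploit.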
TreePM: A Code for Cosmological N-Body Simulations
Indian Academy of Sciences (India)
J. S. Bagla
2002-09-01
We describe the TreePM method for carrying out large N-Body simulations to study formation and evolution of the large scale structure in the Universe. This method is a combination of Barnes and Hut tree code and Particle-Mesh code. It combines the automatic inclusion of periodic boundary conditions of PM simulations with the high resolution of tree codes. This is done by splitting the gravitational force into a short range and a long range component. We describe the splitting of force between these two parts. We outline the key differences between TreePM and some other N-Body methods.
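The short/long-range split can be illustrated with the erfc-based splitting used in TreePM-style codes; the splitting scale r_s and the exact form below follow the common convention and are meant as a sketch, not as the paper's precise expressions:

```python
import math

def split_force(r, r_s):
    """Split the 1/r^2 force magnitude into a short-range part (handled
    by the tree) and a long-range part (handled by the PM grid).
    r_s is the splitting scale; the two parts sum to the full force."""
    total = 1.0 / r**2
    short = (math.erfc(r / (2.0 * r_s))
             + (r / (r_s * math.sqrt(math.pi)))
             * math.exp(-r**2 / (4.0 * r_s**2))) / r**2
    return short, total - short

short, long_ = split_force(1.0, 1.0)
assert abs(short + long_ - 1.0) < 1e-12    # the split is exact
assert split_force(10.0, 1.0)[0] < 1e-6    # tree part dies off at large r
```

Because the short-range part vanishes rapidly beyond a few r_s, the tree walk can be truncated locally while the smooth remainder is cheap to compute on the mesh.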
Institute of Scientific and Technical Information of China (English)
毕智超
2011-01-01
The optimal binary tree is a very important data structure. This paper first analyzes the optimal binary tree, the Huffman tree, and gives a description of the algorithm. Quicksort is then applied to the data to be sorted, reducing the time complexity of the Huffman algorithm. Finally, based on the application of Huffman trees to coding problems, the Huffman code, the storage structure of Huffman coding is improved with a brief explanation.
Directory of Open Access Journals (Sweden)
H. Prashantha Kumar
2011-09-01
Full Text Available Low density parity check (LDPC) codes are capacity-approaching codes, which means that practical constructions exist that allow the noise threshold to be set very close to the theoretical Shannon limit for a memoryless channel. LDPC codes are finding increasing use in applications like LTE networks, digital television, high density data storage systems, deep space communication systems, etc. Several algebraic and combinatorial methods are available for constructing LDPC codes. In this paper we discuss a novel low complexity algebraic method for constructing regular LDPC-like codes derived from full rank codes. We demonstrate that by employing these codes over AWGN channels, coding gains in excess of 2 dB over uncoded systems can be realized when soft iterative decoding using a parity check tree is employed.
Directory of Open Access Journals (Sweden)
Syed Mahamud Hossein
2013-09-01
Full Text Available Storing, transmitting and securing DNA sequences are well-known research challenges. The problem has been magnified by the increasing discovery and availability of DNA sequences. We present a DNA sequence compression algorithm based on a Dynamic Look-Up Table (DLUT) and a modified Huffman technique. The DLUT consists of 4^3 (64) sub-strings, each three bases long. Each sub-string is individually coded by a single ASCII code from 33 (!) to 96 (`) and vice versa. Encoding depends on an encryption key chosen by the user from the four bases {a, t, g, c}, and decoding requires the decryption key provided by the encoding user; decoding must authenticate the input used to encode the data. The sub-strings are combined in a DLUT-based pre-coding routine. The algorithm is tested on the reverse, complement and reverse complement of DNA sequences, and also on artificial DNA sequences of equivalent length. Speed of encryption and security level are two important measures for evaluating any encryption system. With the proliferation of ubiquitous computing systems, where digital content is accessible through resource-constrained biological databases, security is a very important issue. Much research has been devoted to finding an encryption system that can run effectively in such biological databases. Information security is the most challenging question in protecting data from unauthorized users, and the proposed method may protect the data from hackers. It provides three-tier security: tier one is the ASCII code, tier two is the nucleotide (a, t, g or c) chosen by the user, and tier three is the change of label or node position in the Huffman tree. Compression of genome sequences will help to increase the efficiency of their use. The greatest advantages of this algorithm are fast execution, small memory occupation and easy implementation, since the program implementing the technique was written originally in the C language
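The DLUT stage can be sketched as follows; the triplet ordering is an assumption for illustration, and the key handling described in the abstract is omitted:

```python
from itertools import product

# hypothetical lookup table: the 4^3 = 64 base triplets map to ASCII 33..96
TRIPLETS = [''.join(t) for t in product('atgc', repeat=3)]
ENCODE = {trip: chr(33 + i) for i, trip in enumerate(TRIPLETS)}
DECODE = {ch: trip for trip, ch in ENCODE.items()}

def dna_encode(seq):
    """Code each 3-base substring as one ASCII character
    (sequence length must be a multiple of 3 in this sketch)."""
    return ''.join(ENCODE[seq[i:i + 3]] for i in range(0, len(seq), 3))

def dna_decode(text):
    return ''.join(DECODE[ch] for ch in text)

seq = 'atgcgtacc'
assert dna_decode(dna_encode(seq)) == seq
assert len(dna_encode(seq)) == len(seq) // 3   # 3:1 pre-coding reduction
```

The 3:1 reduction of the pre-coding stage is what the modified Huffman coder then compresses further.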
A novel technique for image steganography based on Block-DCT and Huffman Encoding
Directory of Open Access Journals (Sweden)
A.Nag
2010-06-01
Full Text Available Image steganography is the art of hiding information in a cover image. This paper presents a novel technique for image steganography based on block DCT, where the DCT is used to transform original (cover) image blocks from the spatial domain to the frequency domain. First, a gray-level image of size M × N is divided into disjoint 8 × 8 blocks and a two-dimensional discrete cosine transform (2-D DCT) is performed on each of the P = MN/64 blocks. Huffman encoding is then performed on the secret message/image before embedding, and each bit of the Huffman code of the secret message/image is embedded in the frequency domain by altering the least significant bit of each of the DCT coefficients of the cover image blocks. The experimental results show that the algorithm has a high capacity and good invisibility. Moreover, the PSNR of the cover image with the stego image shows better results in comparison with other existing steganography approaches. Furthermore, satisfactory security is maintained since the secret message/image cannot be extracted without knowing the decoding rules and the Huffman table.
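The embedding step, altering the least significant bit of each DCT coefficient, can be sketched on a list of quantized coefficients (the values are illustrative; the DCT, quantization, and Huffman stages are omitted):

```python
def embed_bits(coeffs, bits):
    """Set the least significant bit of successive integer DCT
    coefficients to the payload bits."""
    out = list(coeffs)
    for i, b in enumerate(bits):
        out[i] = (out[i] & ~1) | b
    return out

def extract_bits(coeffs, n):
    """Read back the first n embedded bits."""
    return [c & 1 for c in coeffs[:n]]

payload = [1, 0, 1, 1]
stego = embed_bits([12, 7, -3, 40, 5], payload)
assert extract_bits(stego, 4) == payload
```

Each coefficient changes by at most one quantization step, which is why the distortion stays small and the PSNR high.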
WASTK: A Weighted Abstract Syntax Tree Kernel Method for Source Code Plagiarism Detection
Directory of Open Access Journals (Sweden)
Deqiang Fu
2017-01-01
Full Text Available In this paper, we introduce a source code plagiarism detection method, named WASTK (Weighted Abstract Syntax Tree Kernel, for computer science education. Different from other plagiarism detection methods, WASTK takes some aspects other than the similarity between programs into account. WASTK firstly transfers the source code of a program to an abstract syntax tree and then gets the similarity by calculating the tree kernel of two abstract syntax trees. To avoid misjudgment caused by trivial code snippets or frameworks given by instructors, an idea similar to TF-IDF (Term Frequency-Inverse Document Frequency in the field of information retrieval is applied. Each node in an abstract syntax tree is assigned a weight by TF-IDF. WASTK is evaluated on different datasets and, as a result, performs much better than other popular methods like Sim and JPlag.
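The TF-IDF weighting idea can be sketched on node-type counts; the node names and toy corpus are invented, and WASTK's actual method applies the weights inside a tree kernel:

```python
import math

def node_weights(target, corpus):
    """TF-IDF-style weight for each node type of one AST (given as a
    node-type -> count dict), against a corpus of such dicts."""
    total = sum(target.values())
    weights = {}
    for node, count in target.items():
        df = sum(1 for doc in corpus if node in doc)  # document frequency
        weights[node] = (count / total) * math.log(len(corpus) / df)
    return weights

corpus = [{'For': 2, 'Assign': 3},
          {'For': 1, 'Call': 2},
          {'For': 1, 'Assign': 1}]
w = node_weights(corpus[0], corpus)
assert w['For'] == 0.0     # present in every AST: no discriminative weight
assert w['Assign'] > 0.0
```

Node types shared by every submission (e.g. instructor-supplied frameworks) get weight zero, which is how trivial snippets are prevented from triggering false plagiarism matches.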
Structural Analysis and Visualization of C++ Code Evolution using Syntax Trees
Chevalier, Fanny; Auber, David; Telea, Alexandru
2007-01-01
We present a method to detect and visualize evolution patterns in C++ source code. Our method consists of three steps. First, we extract an annotated syntax tree (AST) from each version of a given C++ source code. Next, we hash the extracted syntax nodes based on a metric combining structure and type information, and construct matches (correspondences) between similar-hash subtrees. Our technique detects code fragments which have not changed, or changed little, during ...
STACK DECODING OF LINEAR BLOCK CODES FOR DISCRETE MEMORYLESS CHANNEL USING TREE DIAGRAM
Directory of Open Access Journals (Sweden)
H. Prashantha Kumar
2012-03-01
Full Text Available The boundaries between block and convolutional codes have become diffused after recent advances in the understanding of the trellis structure of block codes and the tail-biting structure of some convolutional codes. Therefore, decoding algorithms traditionally proposed for decoding convolutional codes have been applied to decoding certain classes of block codes. This paper presents the decoding of block codes using a tree structure. Many good block codes are presently known, and several of them have been used in applications ranging from deep space communication to error control in storage systems. The primary difficulty with applying the Viterbi or BCJR algorithms to the decoding of block codes is that, even though they are optimum decoding methods, the promised bit error rates are not achieved in practice at data rates close to capacity. This is because the decoding effort is fixed and grows with block length, and thus only short block length codes can be used. Therefore, an important practical question is whether a suboptimal realizable soft decision decoding method can be found for block codes. A noteworthy result which provides a partial answer to this question is described in the following sections. This result of near optimum decoding will be used as motivation for the investigation of different soft decision decoding methods for linear block codes which can lead to the development of efficient decoding algorithms. The code tree can be treated as an expanded version of the trellis, where every path is totally distinct from every other path. We have derived the tree structure for the (8,4) and (16,11) extended Hamming codes and have succeeded in implementing the soft decision stack algorithm to decode them. For the discrete memoryless channel, gains in excess of 1.5 dB at a bit error rate of 10^-5 with respect to conventional hard decision decoding are demonstrated for these codes.
Some possible codes for encrypting data in DNA.
Smith, Geoff C; Fiddes, Ceridwyn C; Hawkins, Jonathan P; Cox, Jonathan P L
2003-07-01
Three codes are reported for storing written information in DNA. We refer to these codes as the Huffman code, the comma code and the alternating code. The Huffman code was devised using Huffman's algorithm for constructing economical codes. The comma code uses a single base to punctuate the message, creating an automatic reading frame and DNA which is obviously artificial. The alternating code comprises an alternating sequence of purines and pyrimidines, again creating DNA that is clearly artificial. The Huffman code would be useful for routine, short-term storage purposes, supposing--not unrealistically--that very fast methods for assembling and sequencing large pieces of DNA can be developed. The other two codes would be better suited to archiving data over long periods of time (hundreds to thousands of years).
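The comma-code idea, a single reserved base punctuating codewords to create an automatic reading frame, can be sketched as follows; the letter-to-codeword assignments are invented for illustration:

```python
# Hypothetical assignments: codewords over {a, g, c}, with 't' reserved
# as the comma that separates them and fixes the reading frame.
CODE = {'e': 'a', 'n': 'g', 'o': 'c', 's': 'ag'}
INV = {v: k for k, v in CODE.items()}

def to_dna(text):
    """Join the codewords with the reserved punctuation base."""
    return 't'.join(CODE[ch] for ch in text)

def from_dna(dna):
    """Splitting on the comma base recovers the reading frame for free."""
    return ''.join(INV[word] for word in dna.split('t'))

assert from_dna(to_dna('noses')) == 'noses'
```

Because 't' never appears inside a codeword, any sequencing read can be framed unambiguously, and the regular punctuation also makes the DNA look obviously artificial, as the abstract notes.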
Enhanced motion coding in MC-EZBC
Chen, Junhua; Zhang, Wenjun; Wang, Yingkun
2005-07-01
Since hierarchical variable-size block matching and bidirectional motion compensation are used in motion-compensated embedded zero block coding (MC-EZBC), the motion information consists of the motion vector quadtree map and the motion vectors. In the conventional motion coding scheme, the quadtree structure is coded directly, the motion vector modes are coded with Huffman codes, and the motion vector differences are coded by an m-ary arithmetic coder with zero-order models. In this paper we propose a new motion coding scheme which uses an extension of the CABAC algorithm and new context modeling for quadtree structure coding and mode coding. In addition, we use a new scalable motion coding method which scales the motion vector quadtrees according to the rate-distortion slope of the tree nodes. Experimental results show that the new coding scheme increases the efficiency of the motion coding by more than 25%. The performance of the system is improved accordingly, especially at low bit rates. Moreover, with the scalable motion coding, the subjective and objective coding performance is further enhanced in low bit rate scenarios.
Encoding of multi-alphabet sources by binary arithmetic coding
Guo, Muling; Oka, Takahumi; Kato, Shigeo; Kajiwara, Hiroshi; Kawamura, Naoto
1998-12-01
In the case of encoding a multi-alphabet source, the multi-alphabet symbol sequence can be encoded directly by a multi-alphabet arithmetic encoder, or the sequence can first be converted into several binary sequences and each binary sequence then encoded by a binary arithmetic encoder, such as the L-R arithmetic coder. Arithmetic coding, however, requires arithmetic operations for each symbol and is computationally heavy. In this paper, a binary representation method using a Huffman tree is introduced to reduce the number of arithmetic operations, and a new probability approximation for L-R arithmetic coding is further proposed to improve the coding efficiency when the probability of the LPS (Least Probable Symbol) is near 0.5. Simulation results show that our proposed scheme has high coding efficiency and can reduce the number of coding symbols.
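The binary-representation idea can be illustrated with a toy sketch, assuming a Huffman code table is already available (the function name and example table are hypothetical):

```python
def binarize(symbols, code):
    """Convert a multi-alphabet symbol sequence into one binary sequence by
    replacing each symbol with its Huffman bit path, so that a binary
    arithmetic coder can process it bit by bit (one context per tree node).
    The arithmetic coder itself is omitted from this sketch."""
    return "".join(code[s] for s in symbols)
```

With a skewed code table, frequent symbols cost one bit each, whereas a fixed-length binarization of a 4-symbol alphabet always costs two, so fewer binary coding operations are needed on typical data.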
Tree Codes Improve Convergence Rate of Consensus Over Erasure Channels
Sukhavasi, Ravi Teja
2012-01-01
We study the problem of achieving average consensus between a group of agents over a network with erasure links. In the context of consensus problems, the unreliability of communication links between nodes has been traditionally modeled by allowing the underlying graph to vary with time. In other words, depending on the realization of the link erasures, the underlying graph at each time instant is assumed to be a subgraph of the original graph. Implicit in this model is the assumption that the erasures are symmetric: if at time t the packet from node i to node j is dropped, the same is true for the packet transmitted from node j to node i. However, in practical wireless communication systems this assumption is unreasonable and, due to the lack of symmetry, standard averaging protocols cannot guarantee that the network will reach consensus to the true average. In this paper we explore the use of channel coding to improve the performance of consensus algorithms. For symmetric erasures, we show that, for certain...
An implementation of a tree code on a SIMD, parallel computer
Olson, Kevin M.; Dorband, John E.
1994-01-01
We describe a fast tree algorithm for gravitational N-body simulation on SIMD parallel computers. The tree construction uses fast, parallel sorts. The sorted lists are recursively divided along their x, y and z coordinates. This data structure is a completely balanced tree (i.e., each particle is paired with exactly one other particle) and maintains good spatial locality. An implementation of this tree-building algorithm on a 16k-processor MasPar MP-1 performs well and constitutes only a small fraction (approximately 15%) of the entire cycle of finding the accelerations. Each node in the tree is treated as a monopole. The tree search and the summation of accelerations also perform well. During the tree search, node data that is needed from another processor is simply fetched. Roughly 55% of the tree search time is spent in communications between processors. We apply the code to two problems of astrophysical interest. The first is a simulation of the close passage of two gravitationally interacting disk galaxies using 65,636 particles. We also simulate the formation of structure in an expanding model universe using 1,048,576 particles. Our code attains speeds comparable to one head of a Cray Y-MP, so single instruction, multiple data (SIMD) type computers can be used for these simulations. The cost/performance ratio of SIMD machines like the MasPar MP-1 makes them an extremely attractive alternative to either vector processors or large multiple instruction, multiple data (MIMD) type parallel computers. With further optimizations (e.g., more careful load balancing), speeds in excess of today's vector processing computers should be possible.
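The balanced tree build described above can be sketched serially in Python (the paper uses parallel sorts on the MasPar; this recursive form only illustrates the alternating x, y, z splits, and the names are illustrative):

```python
def build_tree(points, axis=0):
    """Recursively halve a nonempty list of 3-D points along x, y, z in
    turn, producing a completely balanced binary tree with one point per
    leaf.  Leaves are points (lists); internal nodes are (left, right)
    tuples."""
    if len(points) == 1:
        return points[0]
    pts = sorted(points, key=lambda p: p[axis])  # serial stand-in for a parallel sort
    mid = len(pts) // 2
    nxt = (axis + 1) % 3  # cycle through the three coordinates
    return (build_tree(pts[:mid], nxt), build_tree(pts[mid:], nxt))
```

Because each split is an exact halving of the sorted list, a power-of-two particle count yields a perfectly balanced tree, which is what makes the structure convenient for SIMD work distribution.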
A New Class of TAST Codes With A Simplified Tree Structure
Damen, Mohamed Oussama; Badr, Ahmed A
2010-01-01
We consider in this paper the design of full diversity and high rate space-time codes with moderate decoding complexity for arbitrary number of transmit and receive antennas and arbitrary input alphabets. We focus our attention to codes from the threaded algebraic space-time (TAST) framework since the latter includes most known full diversity space-time codes. We propose a new construction of the component single-input single-output (SISO) encoders such that the equivalent code matrix has an upper triangular form. We accomplish this task by designing each SISO encoder to create an ISI-channel in each thread. This, in turn, greatly simplifies the QR-decomposition of the composite channel and code matrix, which is essential for optimal or near-optimal tree search algorithms, such as the sequential decoder.
A novel technique for image steganography based on Block-DCT and Huffman Encoding
Nag, A; Sarkar, D; Sarkar, P P; 10.5121/ijcsit.2010.2308
2010-01-01
Image steganography is the art of hiding information in a cover image. This paper presents a novel technique for image steganography based on Block-DCT, where the DCT is used to transform original (cover) image blocks from the spatial domain to the frequency domain. First, a gray-level image of size M x N is divided into disjoint 8 x 8 blocks and a two-dimensional Discrete Cosine Transform (2-D DCT) is performed on each of the P = MN / 64 blocks. Huffman encoding is then performed on the secret message/image before embedding, and each bit of the Huffman code of the secret message/image is embedded in the frequency domain by altering the least significant bit of each of the DCT coefficients of the cover image blocks. The experimental results show that the algorithm has a high capacity and good invisibility. Moreover, the PSNR of the cover image with the stego-image shows better results in comparison with other existing steganography approaches. Furthermore, satisfactory security is maintained since the secret message/image ca...
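The embedding step can be sketched as follows, assuming the Block-DCT and Huffman stages have already produced a list of integer-quantized coefficients and a bit string (function and variable names are illustrative):

```python
def embed_bits(coeffs, bits):
    """Embed message bits into the least-significant bits of integer
    coefficients, one bit per coefficient (assumes len(bits) <= len(coeffs)).
    Extraction reads back c & 1 from each used coefficient."""
    out = list(coeffs)
    for i, b in enumerate(bits):
        out[i] = (out[i] & ~1) | int(b)  # clear the LSB, then set it to b
    return out
```

Each coefficient changes by at most 1, which is why the distortion to the cover image stays small.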
Bit-Based Joint Source-Channel Decoding of Huffman Encoded Markov Multiple Sources
Directory of Open Access Journals (Sweden)
Weiwei Xiang
2010-04-01
Full Text Available Multimedia transmission over time-varying channels such as wireless channels has recently motivated research on joint source-channel techniques. In this paper, we present a method for joint source-channel soft-decision decoding of Huffman encoded multiple sources. By exploiting the a priori bit probabilities in multiple sources, the decoding performance is greatly improved. Compared with the single-source decoding scheme addressed by Marion Jeanne, the proposed technique is more practical in wideband wireless communications. Simulation results show our new method obtains substantial improvements with a minor increase in complexity. For two sources, the gain in SNR is around 1.5 dB using convolutional codes when the symbol-error rate (SER) reaches 10^-2, and around 2 dB using Turbo codes.
A Biblock Wavelet Zero Tree Coding for Hyperspectral Imagery Data Compression
Institute of Scientific and Technical Information of China (English)
YAN Jingwen; SHEN Guiming; HU Xiaoyi; XU Fang
2001-01-01
In this paper, a biblock zero-tree compression coding (BBZTC) method, based on wavelet zero-tree compression coding (ZTC), is used to exploit redundancy in hyperspectral imagery data. Because ZTC scans every wavelet zero-tree coefficient with low efficiency, the BBZTC method outperforms ZTC in terms of compression ratio, simplicity of real-time implementation, coding/decoding speed, and convenience of real-time transmission. The experimental results show that this method can obtain a compression ratio of 17-40 on the 224 spectral bands, using KLT to remove the spectral correlation, and that the total compression performance of the KLT+BBZTC method is better than that of KLT combined with other one-dimensional transforms (such as the DCT). Compared with the total compression ratios of the KLT+JPEG and SFCVQ methods, this method reaches 180 at a PSNR of 33.6 dB. In compression ratio this method is superior to the other present methods.
Ultraspectral sounder data compression using the non-exhaustive Tunstall coding
Wei, Shih-Chieh; Huang, Bormin
2008-08-01
With its bulky volume, ultraspectral sounder data might still suffer a few bits of error after channel coding. Therefore it is beneficial to incorporate some mechanism in source coding for error containment. The Tunstall code is a variable-to-fixed length code which can reduce the error propagation encountered in fixed-to-variable length codes like Huffman and arithmetic codes. The original Tunstall code uses an exhaustive parse tree where internal nodes branch on every symbol. This might result in the assignment of precious codewords to less probable parse strings. Based on an infinitely extended parse tree, a modified Tunstall code is proposed which grows an optimal non-exhaustive parse tree by assigning the complete codewords only to the top-probability nodes in the infinite tree. Comparison is made among the original exhaustive Tunstall code, our modified non-exhaustive Tunstall code, the CCSDS Rice code, and JPEG-2000 in terms of compression ratio and percent error rate using ultraspectral sounder data.
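The original exhaustive Tunstall construction that the abstract contrasts with its non-exhaustive variant can be sketched as follows (a simplified illustration; the names are not from the paper):

```python
def tunstall(probs, nbits):
    """Exhaustive Tunstall construction: starting from the single-symbol
    parse strings, repeatedly expand the most probable string with every
    alphabet symbol while the dictionary still fits in 2**nbits
    fixed-length codewords.  Returns the sorted parse strings; the codeword
    of a string is simply its index in the list."""
    leaves = dict(probs)  # parse string -> probability
    # each expansion removes one leaf and adds len(probs) new ones
    while len(leaves) + len(probs) - 1 <= 2 ** nbits:
        best = max(leaves, key=leaves.get)
        p = leaves.pop(best)
        for sym, ps in probs.items():
            leaves[best + sym] = p * ps
    return sorted(leaves)
```

Because every codeword has the same fixed length, a single bit error corrupts at most one parse string instead of propagating, which is the error-containment property the paper relies on.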
A Parallel Implementation of Improved Fractal Image Coding Based on Tree Topology
Institute of Scientific and Technical Information of China (English)
SUN Yunda; ZHAO Yao; YUAN Baozong
2003-01-01
One of the main drawbacks of fractal image coding (FIC) is its time-consuming encoding process. So how to speed up the encoding process is a challenging issue in FIC research. As both sequential solutions and parallel ones have their advantages and disadvantages, we combine them to further speed up the encoding phase. In this paper a derivative tree topology is first proposed to provide support for complex parallelism. Then a dual-classification technique is designed to speed up fractal image coding with Same-Sized Block Mapping, which improves the decoded image quality. Finally, some experimental results with good performance are presented.
Linear Network Coding on Multi-Mesh of Trees using All to All Broadcast
Directory of Open Access Journals (Sweden)
Nitin Rakesh
2011-05-01
Full Text Available We introduce linear network coding on a parallel architecture for a multi-source finite acyclic network. In this problem, different messages in diverse time periods are broadcast, and every non-source node in the network decodes and encodes the message based on further communication. We wish to minimize the communication steps and time complexity involved in the transfer of data from node to node during parallel communication. We have used the Multi-Mesh of Trees (MMT) topology for implementing network coding. To envisage our result, we use all-to-all broadcast as the communication algorithm.
GOTHIC: Gravitational oct-tree code accelerated by hierarchical time step controlling
Miki, Yohei; Umemura, Masayuki
2017-04-01
The tree method is a widely implemented algorithm for collisionless N-body simulations in astrophysics well suited for GPU(s). Adopting hierarchical time stepping can accelerate N-body simulations; however, it is infrequently implemented and its potential remains untested in GPU implementations. We have developed a Gravitational Oct-Tree code accelerated by HIerarchical time step Controlling named GOTHIC, which adopts both the tree method and the hierarchical time step. The code adopts some adaptive optimizations by monitoring the execution time of each function on-the-fly and minimizes the time-to-solution by balancing the measured time of multiple functions. Results of performance measurements with realistic particle distribution performed on NVIDIA Tesla M2090, K20X, and GeForce GTX TITAN X, which are representative GPUs of the Fermi, Kepler, and Maxwell generation of GPUs, show that the hierarchical time step achieves a speedup by a factor of around 3-5 times compared to the shared time step. The measured elapsed time per step of GOTHIC is 0.30 s or 0.44 s on GTX TITAN X when the particle distribution represents the Andromeda galaxy or the NFW sphere, respectively, with 2^24 = 16,777,216 particles. The averaged performance of the code corresponds to 10-30% of the theoretical single-precision peak performance of the GPU.
A GPU accelerated Barnes-Hut tree code for FLASH4
Lukat, Gunther; Banerjee, Robi
2016-05-01
We present a GPU accelerated CUDA-C implementation of the Barnes Hut (BH) tree code for calculating the gravitational potential on octree adaptive meshes. The tree code algorithm is implemented within the FLASH4 adaptive mesh refinement (AMR) code framework and therefore fully MPI parallel. We describe the algorithm and present test results that demonstrate its accuracy and performance in comparison to the algorithms available in the current FLASH4 version. We use a MacLaurin spheroid to test the accuracy of our new implementation and use spherical, collapsing cloud cores with effective AMR to carry out performance tests also in comparison with previous gravity solvers. Depending on the setup and the GPU/CPU ratio, we find a speedup for the gravity unit of at least a factor of 3 and up to 60 in comparison to the gravity solvers implemented in the FLASH4 code. We find an overall speedup factor for full simulations of at least factor 1.6 up to a factor of 10.
Perceptual Zero-Tree Coding with Efficient Optimization for Embedded Platforms
Directory of Open Access Journals (Sweden)
B. F. Wu
2013-08-01
Full Text Available This study proposes a block-edge-based perceptual zero-tree coding (PZTC) method, which is implemented with efficient optimization on the embedded platform. PZTC combines two novel compression concepts for coding efficiency and quality: block-edge detection (BED) and the low-complexity and low-memory entropy coder (LLEC). The proposed PZTC was implemented as a fixed-point version and optimized on the DSP-based platform based on both the presented platform-independent and platform-dependent optimization technologies. For platform-dependent optimization, this study examines the fixed-point PZTC and analyzes its complexity to optimize PZTC toward achieving optimal coding efficiency. Furthermore, hardware-based platform-dependent optimizations are presented to reduce the memory size. The performance, such as compression quality and efficiency, is validated by experimental results.
Does an Arithmetic Coding Followed by Run-length Coding Enhance the Compression Ratio?
Directory of Open Access Journals (Sweden)
Mohammed Otair
2015-07-01
Full Text Available Compression is a technique to minimize the size of an image without excessively decreasing its quality; transmission of the compressed image is then much more efficient and rapid than that of the original image. Arithmetic and Huffman coding are the most used techniques in entropy coding. This study tries to prove that RLC may be added after arithmetic coding as an extra processing step, which may therefore be coded efficiently without any further degradation of the image quality. So the main purpose of this study is to answer the following question: "Which entropy coding, arithmetic with RLC or Huffman with RLC, is more suitable from the compression ratio perspective?" Finally, experimental results show that arithmetic coding followed by RLC yields better compression performance than Huffman with RLC coding.
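The extra run-length step the study appends after entropy coding can be sketched as follows (a minimal illustration; real coders pack the counts more compactly):

```python
def rle(bits):
    """Run-length encode a bit string into (bit, count) pairs, the extra
    post-processing step applied after the entropy coder."""
    runs = []
    for b in bits:
        if runs and runs[-1][0] == b:
            runs[-1][1] += 1  # extend the current run
        else:
            runs.append([b, 1])  # start a new run
    return [(b, n) for b, n in runs]
```

RLC pays off only when the entropy coder's output still contains long runs; on near-random bit streams the pair list can be larger than the input, which is exactly the trade-off the study measures.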
Implementation of a tree algorithm in MCNP code for nuclear well logging applications
Energy Technology Data Exchange (ETDEWEB)
Li Fusheng, E-mail: fusheng.li@bakerhughes.com [Baker Hughes Incorporated, 2001 Rankin Rd. Houston, TX 77073-5101 (United States); Han Xiaogang [Baker Hughes Incorporated, 2001 Rankin Rd. Houston, TX 77073-5101 (United States)
2012-07-15
The goal of this paper is to develop some modeling capabilities that are missing in the current MCNP code. Those missing capabilities can greatly help with certain nuclear tool designs, such as a nuclear lithology/mineralogy spectroscopy tool. The new capabilities to be developed in this paper include the following: zone tally, neutron interaction tally, gamma ray index tally and enhanced pulse-height tally. The patched MCNP code can also be used to compute neutron slowing-down length and thermal neutron diffusion length. Highlights: Tree structure programming is suitable for Monte-Carlo based particle tracking. Enhanced pulse-height tally is developed for oil-well logging tool simulation. Neutron interaction tally and gamma ray index tally are developed for geochemical logging.
A method for detecting code security vulnerability based on variables tracking with validated-tree
Institute of Scientific and Technical Information of China (English)
2008-01-01
SQL injection poses a major threat to the application-level security of databases, and there is no systematic solution to these attacks. Different from traditional run-time security strategies such as IDS and firewall, this paper focuses on the solution at the outset; it presents a method to find vulnerabilities by analyzing the source code. The concept of a validated tree is developed to track variables referenced by database operations in scripts. By checking whether these variables are influenced by outside inputs, the database operations are proved to be secure or not. This method has the advantages of high accuracy and efficiency as well as low cost, and it is universal to any type of web application platform. It is implemented by the software code vulnerabilities of SQL injection detector (CVSID). The validity and efficiency are demonstrated with an example.
Galaxy Formation and Evolution; 1, The Padua TreeSPH code (PD-SPH)
Carraro, Giovanni; Lia, Cesario; Chiosi, Cesare
1997-01-01
In this paper we report on PD-SPH, the new tree-SPH code developed in Padua. The main features of the code are described and the results of a new and independent series of 1-D and 3-D tests are shown. The paper is mainly dedicated to the presentation of the code and to the critical discussion of its performance. In particular, great attention is devoted to the convergence analysis. The code is highly adaptive in space and time by means of individual smoothing lengths and individual time steps. At present it contains both dark and baryonic matter, the latter in the form of gas and stars, cooling, thermal conduction, star formation, and feedback from Type I and II supernovae, stellar winds, and ultraviolet flux from massive stars, and finally chemical enrichment. New cooling rates that depend on the metal abundance of the interstellar medium are employed, and the differences with respect to the standard ones are outlined. Finally, we show the simulation of the dynamical and chemical evolution of a disk-like galaxy...
Energy Technology Data Exchange (ETDEWEB)
Queral, C.; Montero-Mayorga, J.; Gonzalez-Cadelo, J.
2013-07-01
The AP1000 PRA thermal-hydraulic simulations were performed with the MAAP code, which allows simulating sequences with low computational effort. On the other hand, the use of best-estimate codes allows verifying PRA results as well as obtaining greater knowledge of the phenomenology of such sequences. The initiating event with the greatest contribution to core damage is the Direct Vessel Injection Line Break (DVILB). This paper presents a review of DVILB sequences of the AP1000 with the TRACE code, verifying sequences previously analyzed by Westinghouse with the MAAP code. The sequences which configure the DVILB event tree in the short term have been simulated. The results obtained confirm those obtained in the AP1000 PRA.
Visually Improved Image Compression by using Embedded Zero-tree Wavelet Coding
Directory of Open Access Journals (Sweden)
Janaki R
2011-03-01
Full Text Available Image compression is very important for efficient transmission and storage of images. The Embedded Zero-tree Wavelet (EZW) algorithm is a simple yet powerful algorithm with the property that the bits in the stream are generated in the order of their importance. Image compression can improve the performance of digital systems by reducing the time and cost of image storage and transmission without significant reduction of image quality. For image compression it is desirable that the selection of transform should reduce the size of the resultant data set as compared to the source data set. EZW is computationally very fast and among the best image compression algorithms known today. This paper proposes a technique for image compression which uses wavelet-based image coding. A large number of experimental results show that this method saves a lot of bits in transmission and further enhances the compression performance. This paper aims to determine the best threshold to compress the still image at a particular decomposition level by using the Embedded Zero-tree Wavelet encoder. The Compression Ratio (CR) and Peak Signal-to-Noise Ratio (PSNR) are determined for different threshold values ranging from 6 to 60 for decomposition level 8.
An Empirical Evaluation of Coding Methods for Multi-Symbol Alphabets.
Moffat, Alistair; And Others
1994-01-01
Evaluates the performance of different methods of data compression coding in several situations. Huffman's code, arithmetic coding, fixed codes, fast approximations to arithmetic coding, and splay coding are discussed in terms of their speed, memory requirements, and proximity to optimal performance. Recommendations for the best methods of…
Canbay, Ferhat; Levent, Vecdi Emre; Serbes, Gorkem; Ugurdag, H Fatih; Goren, Sezer; Aydin, Nizamettin
2016-09-01
The authors aimed to develop an application for producing different architectures to implement the dual-tree complex wavelet transform (DTCWT), which has a near shift-invariance property. To obtain a low-cost and portable solution for implementing the DTCWT in multi-channel real-time applications, various embedded-system approaches are realised. For comparison, the DTCWT was implemented in C language on a personal computer and on a PIC microcontroller. However, in the former approach portability and in the latter the desired speed performance cannot be achieved. Hence, implementation of the DTCWT on a reconfigurable platform such as a field programmable gate array, which provides portable, low-cost, low-power, and high-performance computing, is considered the most feasible solution. At first, they used the System Generator DSP design tool of Xilinx for algorithm design. However, a design implemented using such tools is not optimised in terms of area and power. To overcome these drawbacks, they implemented the DTCWT algorithm using Verilog Hardware Description Language, which has its own difficulties. To overcome these difficulties and simplify the usage of the proposed algorithms and the adaptation procedures, a code generator program that can produce different architectures is proposed.
Conditional entropy coding of DCT coefficients for video compression
Sipitca, Mihai; Gillman, David W.
2000-04-01
We introduce conditional Huffman encoding of DCT run-length events to improve the coding efficiency of low- and medium-bit rate video compression algorithms. We condition the Huffman code for each run-length event on a classification of the current block. We classify blocks according to coding mode and signal type, which are known to the decoder, and according to energy, which the decoder must receive as side information. Our classification schemes improve coding efficiency with little or no increased running time and some increased memory use.
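The conditioning idea, one Huffman table per block class, can be sketched as follows (the table contents here are invented for illustration; they are not the paper's codes):

```python
def encode_events(events, tables):
    """Code each DCT run-length event with the Huffman table selected by
    the current block's class.  `events` is a list of (block_class, event)
    pairs; `tables` maps each class to its own {event: bit string} code."""
    return "".join(tables[cls][ev] for cls, ev in events)
```

Because the class is either known to the decoder (coding mode, signal type) or sent as side information (energy), the decoder can select the same table and invert the mapping.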
Al-Khaja, Nawal
2007-01-01
This is a thematic lesson plan for young learners about palm trees and the importance of taking care of them. The two part lesson teaches listening, reading and speaking skills. The lesson includes parts of a tree; the modal auxiliary, can; dialogues and a role play activity.
Energy Technology Data Exchange (ETDEWEB)
Jankovsky, Zachary Kyle [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Denman, Matthew R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2017-05-01
It is difficult to assess the consequences of a transient in a sodium-cooled fast reactor (SFR) using traditional probabilistic risk assessment (PRA) methods, as numerous safety-related systems have passive characteristics. Often there is significant dependence on the value of continuous stochastic parameters rather than binary success/failure determinations. One form of dynamic PRA uses a system simulator to represent the progression of a transient, tracking events through time in a discrete dynamic event tree (DDET). In order to function in a DDET environment, a simulator must have characteristics that make it amenable to changing physical parameters midway through the analysis. The SAS4A SFR system analysis code did not have these characteristics as received. This report describes the code modifications made to allow dynamic operation as well as the linking to a Sandia DDET driver code. A test case is briefly described to demonstrate the utility of the changes.
Directory of Open Access Journals (Sweden)
Eric Psota
2010-01-01
Full Text Available The error mechanisms of iterative message-passing decoders for low-density parity-check codes are studied. A tutorial review is given of the various graphical structures, including trapping sets, stopping sets, and absorbing sets that are frequently used to characterize the errors observed in simulations of iterative decoding of low-density parity-check codes. The connections between trapping sets and deviations on computation trees are explored in depth using the notion of problematic trapping sets in order to bridge the experimental and analytic approaches to these error mechanisms. A new iterative algorithm for finding low-weight problematic trapping sets is presented and shown to be capable of identifying many trapping sets that are frequently observed during iterative decoding of low-density parity-check codes on the additive white Gaussian noise channel. Finally, a new method is given for characterizing the weight of deviations that result from problematic trapping sets.
Huffman, Jeffrey
2016-01-01
In his critique of the Huffman (2014) article, McLean (2016) undertakes an important reflective exercise that is too often missing in the field of second language acquisition and in the social sciences in general: questioning whether the claims made by researchers are warranted by their results. In this article, Jeffrey Huffman says that McLean…
Directory of Open Access Journals (Sweden)
Asral Bahari Jambek
2014-01-01
Full Text Available Wireless Sensor Networks (WSNs) are becoming important in today's technology in helping to monitor our surrounding environment. However, wireless sensor nodes are powered by a limited energy supply. To extend the lifetime of the device, energy consumption must be reduced. Data transmission is known to consume the largest amount of energy in a sensor node. Thus, one method to reduce the energy used is to compress the data before transmitting it. This study analyses the performance of the Huffman and Lempel-Ziv-Welch (LZW) algorithms when compressing data that are commonly used in WSNs. From the experimental results, the Huffman algorithm gives better performance than the LZW algorithm for this type of data. The Huffman algorithm is able to reduce the data size by 43% on average, which is four times faster than the LZW algorithm.
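A minimal LZW encoder, the second algorithm in the comparison, can be sketched as follows (a byte-oriented sketch, not the study's implementation):

```python
def lzw_encode(data):
    """Minimal LZW over a byte string: the dictionary starts with all 256
    single bytes and grows as longer matches are seen; the output is a list
    of dictionary indices."""
    dictionary = {bytes([i]): i for i in range(256)}
    w, out = b"", []
    for byte in data:
        wc = w + bytes([byte])
        if wc in dictionary:
            w = wc  # keep extending the current match
        else:
            out.append(dictionary[w])
            dictionary[wc] = len(dictionary)  # register the new string
            w = bytes([byte])
    if w:
        out.append(dictionary[w])
    return out
```

On repetitive sensor readings the dictionary quickly accumulates long matches, but the per-symbol dictionary lookups are what make LZW slower than a precomputed Huffman table on a constrained node.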
Design and implementation of static Huffman encoding hardware using a parallel shifting algorithm
Tae Yeon Lee
2004-01-01
This paper discusses the implementation of static Huffman encoding hardware for real-time lossless compression for the electromagnetic calorimeter in the CMS experiment. The construction of the Huffman encoding hardware illustrates the implementation for optimizing the logic size. The number of logic gates in the parallel shift operation required for the hardware was examined. The experiment with a simulated environment and an FPGA shows that the real-time constraint has been fulfilled and the design of the buffer length is appropriate. (16 refs).
Henri Epstein
2016-01-01
An algebraic formalism, developed with V. Glaser and R. Stora for the study of the generalized retarded functions of quantum field theory, is used to prove a factorization theorem which provides a complete description of the generalized retarded functions associated with any tree graph. Integrating over the variables associated to internal vertices to obtain the perturbative generalized retarded functions for interacting fields arising from such graphs is shown to be possible for a large category of space-times.
Epstein, Henri
2016-01-01
An algebraic formalism, developped with V. Glaser and R. Stora for the study of the generalized retarded functions of quantum field theory, is used to prove a factorization theorem which provides a complete description of the generalized retarded functions associated with any tree graph. Integrating over the variables associated to internal vertices to obtain the perturbative generalized retarded functions for interacting fields arising from such graphs is shown to be possible for a large cat...
Epstein, Henri
2016-01-01
An algebraic formalism, developed with V. Glaser and R. Stora for the study of the generalized retarded functions of quantum field theory, is used to prove a factorization theorem which provides a complete description of the generalized retarded functions associated with any tree graph. Integrating over the variables associated to internal vertices to obtain the perturbative generalized retarded functions for interacting fields arising from such graphs is shown to be possible for a large category of space-times.
A modified parallel tree code for N-body simulation of the Large Scale Structure of the Universe
Becciani, U
2000-01-01
N-body codes to perform simulations of the origin and evolution of the Large Scale Structure of the Universe have improved significantly over the past decade both in terms of the resolution achieved and of reduction of the CPU time. However, state-of-the-art N-body codes hardly allow one to deal with particle numbers larger than a few 10^7, even on the largest parallel systems. In order to allow simulations with larger resolution, we have first re-considered the grouping strategy as described in Barnes (1990) (hereafter B90) and applied it with some modifications to our WDSH-PT (Work and Data SHaring - Parallel Tree) code. In the first part of this paper we will give a short description of the code adopting the Barnes and Hut algorithm (Barnes & Hut 1986, hereafter BH), and in particular of the memory and work distribution strategy applied to describe the data distribution on a CC-NUMA machine like the CRAY-T3E system. In the second part of the paper we describe the modification to the Barnes grouping strate...
Implementation of a tree algorithm in MCNP code for nuclear well logging applications.
Li, Fusheng; Han, Xiaogang
2012-07-01
The goal of this paper is to develop some modeling capabilities that are missing in the current MCNP code. These missing capabilities can greatly help with certain nuclear tool designs, such as a nuclear lithology/mineralogy spectroscopy tool. The new capabilities developed in this paper include the following: zone tally, neutron interaction tally, gamma ray index tally and enhanced pulse-height tally. The patched MCNP code can also be used to compute the neutron slowing-down length and the thermal neutron diffusion length.
Shi, Fei; Wang, Beibei; Selesnick, Ivan W.; Wang, Yao
2006-01-01
This paper introduces an anisotropic decomposition structure of a recently introduced 3-D dual-tree discrete wavelet transform (DDWT), and explores the applications for video denoising and coding. The 3-D DDWT is an attractive video representation because it isolates motion along different directions in separate subbands, and thus leads to sparse video decompositions. Our previous investigation shows that the 3-D DDWT, compared to the standard discrete wavelet transform (DWT), agrees better with statistical models based on sparsity assumptions, and gives better visual and numerical results when used for statistical denoising algorithms. Our research on video compression also shows that even with 4:1 redundancy, the 3-D DDWT needs fewer coefficients to achieve the same coding quality (in PSNR) by applying the iterative projection-based noise shaping scheme proposed by Kingsbury. The proposed anisotropic DDWT extends the superiority of isotropic DDWT with more directional subbands without adding to the redundancy. Unlike the original 3-D DDWT, which applies dyadic decomposition along all three directions and produces isotropic frequency spacing, it has a non-uniform tiling of the frequency space. By applying this structure, we can improve the denoising results, and the number of significant coefficients can be reduced further, which is beneficial for video coding.
Optimal source codes for geometrically distributed integer alphabets
Gallager, R. G.; Van Voorhis, D. C.
1975-01-01
An approach is shown for using the Huffman algorithm indirectly to prove the optimality of a code for an infinite alphabet if an estimate concerning the nature of the code can be made. Attention is given to nonnegative integers with a geometric probability assignment. The particular distribution considered arises in run-length coding and in encoding protocol information in data networks. Questions of redundancy of the optimal code are also investigated.
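The optimal codes Gallager and Van Voorhis identify for geometric distributions are the Golomb codes: quotient in unary, remainder in truncated binary (Rice codes are the special case m = 2^k). A minimal encoder sketch; the choice of m for a given geometric parameter and the optimality proof are in the paper:

```python
def to_bits(v: int, width: int) -> str:
    """v as a binary string of exactly `width` digits (empty when width is 0)."""
    return format(v, "b").zfill(width) if width > 0 else ""

def golomb_encode(n: int, m: int) -> str:
    """Golomb code of nonnegative n with parameter m: the quotient n // m
    in unary (q ones and a zero), then the remainder in truncated binary."""
    q, r = divmod(n, m)
    b = (m - 1).bit_length()        # ceil(log2(m))
    k = (1 << b) - m                # the k smallest remainders get one bit fewer
    tail = to_bits(r, b - 1) if r < k else to_bits(r + k, b)
    return "1" * q + "0" + tail
```

For example, with m = 3 the codewords for 0, 1, 2, 3 are 00, 010, 011, 100, matching the run-length coding use case mentioned in the abstract.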
24.77 Pflops on a Gravitational Tree-Code to Simulate the Milky Way Galaxy with 18600 GPUs
Bédorf, Jeroen; Fujii, Michiko S; Nitadori, Keigo; Ishiyama, Tomoaki; Zwart, Simon Portegies
2014-01-01
We have simulated, for the first time, the long term evolution of the Milky Way Galaxy using 51 billion particles on the Swiss Piz Daint supercomputer with our $N$-body gravitational tree-code Bonsai. Herein, we describe the scientific motivation and numerical algorithms. The Milky Way model was simulated for 6 billion years, during which the bar structure and spiral arms were fully formed. This improves upon previous simulations by using 1000 times more particles, and provides a wealth of new data that can be directly compared with observations. We also report the scalability on both the Swiss Piz Daint and the US ORNL Titan. On Piz Daint the parallel efficiency of Bonsai was above 95%. The highest performance was achieved with a 242 billion particle Milky Way model using 18600 GPUs on Titan, thereby reaching a sustained GPU and application performance of 33.49 Pflops and 24.77 Pflops respectively.
Design and implementation for static Huffman encoding hardware with parallel shifting algorithm
Tae Yeon Lee
2004-01-01
This paper presents an implementation of static Huffman encoding hardware for real-time lossless compression in the ECAL of the CMS detector. The construction of the Huffman encoding hardware shows how its logic size is optimized. The number of logic gates in the parallel shift operation is analyzed, and two implementation methods for the parallel shift operation are compared with respect to logic size. An experiment with the hardware on a simulated ECAL environment covering 99.9999% of the original distribution agrees with simulation: the compression rate was 4.0039 and the maximum length of the stored data in the input buffer was 44. (14 refs).
Baer, Michael B
2008-01-01
Huffman coding finds an optimal prefix code for a given probability mass function. Consider situations in which one wishes to find an optimal code with the restriction that all codewords have lengths that lie in a user-specified set of lengths (or, equivalently, no codewords have lengths that lie in a complementary set). This paper introduces a polynomial-time dynamic programming algorithm that finds optimal codes for this reserved-length prefix coding problem. This has applications to quickly encoding and decoding lossless codes. In addition, one modification of the approach solves any quasiarithmetic prefix coding problem, while another finds optimal codes restricted to the set of codes with g codeword lengths for user-specified g (e.g., g=2).
Analysis of Huffman and S-DES Mixed Encryption Algorithm
Institute of Scientific and Technical Information of China (English)
郑静; 王腾
2014-01-01
After comparing existing encryption software with the common algorithms of classical cryptography, and considering the current state and trends of text encryption, this paper combines dynamic Huffman coding with the S-DES algorithm, compensating for the shortcomings of each to achieve the best encryption and decryption of text information.
Secure LZW coding algorithm and its application in GIF image encryption
Institute of Scientific and Technical Information of China (English)
向涛; 王安
2012-01-01
This paper proposes a Secure LZW (SLZW) coding algorithm, in which encryption is embedded into an improved LZW coding process so that compression and encryption are fulfilled in a single step. In the SLZW algorithm, a dynamic Huffman tree is used to code the LZW dictionary, and the initialization and updating of the Huffman tree are controlled by a keystream generated by a Coupled Map Lattice (CML). The code words are further XORed with the keystream to generate the ciphertext. SLZW is applied to GIF image encryption. The experimental results and their analyses indicate that the proposed algorithm not only has good security but also improves the compression ratio by about 10%, so it can find wide application in practice.
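The SLZW idea of coupling compression with a keystream can be illustrated with a deliberately simplified sketch: plain LZW output codes XORed with a keystream. The paper's actual scheme uses a dynamic Huffman dictionary coder and a coupled-map-lattice generator; the linear congruential generator below is a toy stand-in for illustration only.

```python
def lzw_encode(text: str) -> list:
    """Plain LZW: emit the dictionary index of each longest known prefix."""
    table = {chr(i): i for i in range(256)}
    w, out = "", []
    for ch in text:
        if w + ch in table:
            w += ch
        else:
            out.append(table[w])
            table[w + ch] = len(table)      # grow the dictionary
            w = ch
    if w:
        out.append(table[w])
    return out

def keystream(seed: int, n: int) -> list:
    """Toy keystream (an LCG). The paper uses a chaotic coupled-map-lattice
    generator; this stand-in only shows where the keystream plugs in."""
    out, x = [], seed
    for _ in range(n):
        x = (1103515245 * x + 12345) % (1 << 31)
        out.append(x & 0xFFFF)
    return out

codes = lzw_encode("TOBEORNOTTOBEORTOBEORNOT")
cipher = [c ^ k for c, k in zip(codes, keystream(42, len(codes)))]
# the receiver XORs with the same keystream, then runs ordinary LZW decoding
plain = [c ^ k for c, k in zip(cipher, keystream(42, len(cipher)))]
```

Since XOR is its own inverse, decompression proceeds exactly as in standard LZW once the keystream is removed.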
Modified Huffman Code and Its Applications
Institute of Scientific and Technical Information of China (English)
武善玉; 晏振鸣
2009-01-01
This paper discusses JPEG compression, focusing on the problem that the "shape" of the optimal binary tree in Huffman coding is not unique, and proposes a new method based on a "simplicity principle". With this improvement, the Huffman code of each value or character in JPEG is unique. Compared with the traditional Huffman algorithm and the improved algorithms proposed in the recent literature, the coding steps and related operations of this method are more concise, making the program easier to implement and port. A final example demonstrates the practicality of the method.
A Note on Huffman Coding
Institute of Scientific and Technical Information of China (English)
林嘉宇; 刘荧
2003-01-01
Huffman coding is an important lossless compression method, widely applied in data compression, audio coding, and image coding. Besides compression efficiency, there are other criteria for judging the quality of a variable-length code such as a Huffman code, for example the code variance and the resilience to bit errors. This paper discusses the probabilities with which the 0 and 1 code symbols (in the binary case) appear in the bit stream produced by Huffman coding. The results show that classical Huffman coding maximizes the difference between the probabilities of 0 and 1, and thus performs worst under a balanced-probability criterion. The paper develops a rigorous mathematical model and presents an algorithm that makes the probabilities of 0 and 1 in the coded stream (tend to) be equal; moreover, the algorithm can be integrated into ordinary Huffman coding with very little extra computation. Experimental verification is provided at the end.
An Alternative Algorithm for Huffman Coding
Institute of Scientific and Technical Information of China (English)
王敏; 刘洋
2006-01-01
Starting from the "original" construction of the Huffman tree and its coding algorithm, this paper analyzes the factors that affect the algorithm's performance and introduces canonical Huffman coding. From the standpoint of improving performance, the "original" algorithm is refined using canonical Huffman coding rules, and a new algorithm is presented together with an example.
Implementation of Huffman Codes in Perl
Institute of Scientific and Technical Information of China (English)
刘学军
2006-01-01
Perl is a powerful programming language, and Huffman coding is a common algorithm for file compression. This paper generates Huffman codes with a Perl program, and explains the basic idea behind the program and techniques for using Perl's data types. Based on the program's output, it briefly discusses and analyzes how the compression ratio achieved by the Huffman algorithm varies with the number of distinct characters and their frequencies.
Implementation of Huffman Coding in Matlab
Institute of Scientific and Technical Information of China (English)
吴记群; 李双科
2006-01-01
C-style linked lists are simulated in Matlab, using complex-number arithmetic to associate each character with its probability. At each step the indices of the two characters with the smallest probabilities are found and recorded in turn, and finally the Huffman codes are obtained from the parity of the recorded codes. The algorithm is novel and is easy to understand and to program.
Energy Technology Data Exchange (ETDEWEB)
Berberich, Benjamin
2012-03-15
Processes in the plasma edge layer of magnetic fusion devices occur on widely disparate length- and time-scales. Also recently developed features in this particular region, such as stochastic magnetic fields, underline the necessity for three dimensional, full-kinetic simulation tools. Contemporary programs often deploy ad hoc assumptions and approximations for microscopic phenomena for which self-consistent ab initio models in principle exist, but are still computationally too expensive or complex to implement. Recently, mesh-free methods have matured into a new class of tools for such first-principles computations which, thanks to their geometric flexibility, are highly promising for tackling complicated TOKAMAK regions. In this work we have developed the massively parallel Tree-Code PEPC-B (Pretty Efficient Parallel Coulomb solver) into a new tool for plasma material interaction studies. After a brief overview of the working principles of Tree-Codes, two main topic groups are addressed. First, the leap-frog Boris integration scheme is discussed and its numerical limitations are pointed out. To overcome these limitations, the method is extended to a guiding-center integrator. As a proof of principle, numerical experiments are conducted reproducing the anticipated drift kinetic aspects of particle orbits. It turns out that this new technique is much less sensitive to large time steps than the original concept was. One major drawback of mesh-free methods which hinders their direct use for plasma-edge simulations is the difficulty in representing solid structures and associated boundary conditions. Therefore, an alternative concept is proposed using charge-carrying Wall-Particles, which fits naturally in the mesh-free doctrine. These developments constitute the second main topic group of this report. To prove the physical correctness of this new idea, a quasi one dimensional plasma-wall interface scenario is chosen. By studying the system with great detail, good agreement
Institute of Scientific and Technical Information of China (English)
居美艳; 葛欣; 李岳衡; 谭国平
2013-01-01
For MIMO channels with space and time correlation, a novel joint space-time Huffman limited-feedback precoding scheme is proposed which improves system performance and reduces the amount of feedback. Based on the space correlation, the precoding structure under the zero-forcing (ZF) criterion is derived and a rotating quantization codebook is designed, which reduces the effect of space correlation on system performance. In addition, in view of the time correlation of the channels, the scheme reduces the feedback of channel state information (CSI) in slow-fading channels by using neighborhood-based limited feedback. Since the codewords in the neighborhood are selected with different probabilities, Huffman coding is adopted to further reduce the amount of feedback.
Huang, Xiao-Yan; Li, Ming-Li; Xu, Juan; Gao, Yue-Dong; Wang, Wen-Guang; Yin, An-Guo; Li, Xiao-Fei; Sun, Xiao-Mei; Xia, Xue-Shan; Dai, Jie-Jie
2013-04-01
Although the tree shrew (Tupaia belangeri chinensis) is an excellent animal model for studying the mechanisms of human diseases, few studies have examined interleukin-2 (IL-2), an important immune factor in disease model evaluation. In this study, a 465 bp full-length IL-2 cDNA encoding sequence was cloned from the RNA of tree shrew spleen lymphocytes, which were cultivated and stimulated with ConA (concanavalin A). Clustal W 2.0 was used to compare and analyze the sequence and molecular characteristics, and to establish the similarity of the overall structure of IL-2 between tree shrews and other mammals. The homology of the IL-2 nucleotide sequence between tree shrews and humans was 93%, and the amino acid homology was 80%. The phylogenetic tree, derived through the Neighbour-Joining method using MEGA5.0, indicated a close genetic relationship between tree shrews, Homo sapiens, and Macaca mulatta. The three-dimensional structure analysis showed that the surface charges in most regions of tree shrew IL-2 were similar between tree shrews and humans; however, the N-glycosylation sites and local structures were different, which may affect antibody binding. These results provide a fundamental basis for the future study of IL-2 monoclonal antibodies in tree shrews, thereby improving their utility as a model.
Rate-adaptive Constellation Shaping for Near-capacity Achieving Turbo Coded BICM
DEFF Research Database (Denmark)
Yankov, Metodi Plamenov; Forchhammer, Søren; Larsen, Knud J.
2014-01-01
In this paper the problem of constellation shaping is considered. Mapping functions are designed for a many-to-one signal shaping strategy, combined with turbo coded Bit-interleaved Coded Modulation (BICM), based on symmetric Huffman codes with binary reflected Gray-like properties. An algorithm is derived for finding the Huffman code with such properties for a variety of alphabet sizes, and near-capacity performance is achieved for a wide SNR region by dynamically choosing the optimal code rate, constellation size and mapping function based on the operating SNR point, assuming perfect channel quality estimation. Gains of more than 1 dB are observed at high SNR compared to conventional turbo coded BICM, and it is shown that the mapping functions designed here significantly outperform current state-of-the-art Turbo-Trellis Coded Modulation and other existing constellation shaping methods.
Karl, Simon J.; Aarseth, Sverre J.; Naab, Thorsten; Haehnelt, Martin G.; Spurzem, Rainer
2015-09-01
We present a hybrid code combining the OpenMP-parallel tree code VINE with an algorithmic chain regularization scheme. The new code, called `rVINE', aims to significantly improve the accuracy of close encounters of massive bodies with supermassive black holes (SMBHs) in galaxy-scale numerical simulations. We demonstrate the capabilities of the code by studying two test problems, the sinking of a single massive black hole to the centre of a gas-free galaxy due to dynamical friction and the hardening of an SMBH binary due to close stellar encounters. We show that results obtained with rVINE compare well with NBODY7 for problems with particle numbers that can be simulated with NBODY7. In particular, in both NBODY7 and rVINE we find a clear N-dependence of the binary hardening rate, a low binary eccentricity and moderate eccentricity evolution, as well as the conversion of the galaxy's inner density profile from a cusp to a core via the ejection of stars at high velocity. The much larger number of particles that can be handled by rVINE will open up exciting opportunities to model stellar dynamics close to SMBHs much more accurately in a realistic galactic context. This will help to remedy the inherent limitations of commonly used tree solvers to follow the correct dynamical evolution of black holes in galaxy-scale simulations.
Rodionov, Anatoly
2007-01-01
A new incremental algorithm for data compression is presented. For a sequence of input symbols, the algorithm incrementally constructs a p-adic integer as its output. Decoding starts from the less significant part of the p-adic integer and incrementally reconstructs the sequence of input symbols. The algorithm is based on certain features of p-adic numbers and the p-adic norm. p-adic coding may be considered a generalization of a popular compression technique, arithmetic coding. It is shown that for p = 2 the algorithm works as an integer variant of arithmetic coding; for a special class of models it gives exactly the same codes as Huffman's algorithm, and for another special model and a specific alphabet it gives Golomb-Rice codes.
Modified symmetrical reversible variable length code and its theoretical bounds
Tsai, Chien-Wu; Wu, Ja-Ling; Liu, Shu-Wei
2000-04-01
The reversible variable length codes (RVLCs) have been adopted in the emerging video coding standards -- H.263+ and MPEG-4 -- to enhance their error-resilience capability, which is important and essential in error-prone environments. The most appealing advantage of symmetrical RVLCs over asymmetrical RVLCs is that only one code table is required for both forward and backward decoding, whereas asymmetrical RVLCs require two code tables. In this paper, we propose a simple and efficient algorithm that can produce a symmetrical RVLC from a given Huffman code, and we also discuss theoretical bounds of the proposed symmetrical RVLCs.
The Permutation Groups and the Equivalence of Cyclic and Quasi-Cyclic Codes
Guenda, Kenza
2010-01-01
We give the class of finite groups which arise as the permutation groups of cyclic codes over finite fields. Furthermore, we extend the results of Brand and Huffman et al. and we find the properties of the set of permutations by which two cyclic codes of length p^r can be equivalent. We also find the set of permutations by which two quasi-cyclic codes can be equivalent.
The Optimal Fix-Free Code for Anti-Uniform Sources
Directory of Open Access Journals (Sweden)
Ali Zaghian
2015-03-01
Full Text Available An $n$-symbol source which has a Huffman code with codeword-length vector $L_{n}=(1,2,3,\cdots,n-2,n-1,n-1)$ is called an anti-uniform source. In this paper, it is shown that for this class of sources, the optimal fix-free code and symmetric fix-free code is $C_{n}^{*}=(0,11,101,1001,\cdots,1\overbrace{0\cdots0}^{n-2}1)$.
On the optimality of code options for a universal noiseless coder
Yeh, Pen-Shu; Rice, Robert F.; Miller, Warner
1991-01-01
A universal noiseless coding structure was developed that provides efficient performance over an extremely broad range of source entropies. This is accomplished by adaptively selecting the best of several easily implemented variable-length coding algorithms. Custom VLSI coder and decoder modules capable of processing over 20 million samples per second are currently under development. The first of the code options used in this module development is shown to be equivalent to a class of Huffman codes under the Humblet condition; the other options are shown to be equivalent to the Huffman codes of a modified Laplacian symbol set at specified symbol entropy values. Simulation results obtained on actual aerial imagery confirm the optimality of the scheme. On sources having Gaussian or Poisson distributions, coder performance is also projected through analysis and simulation.
Coding Long Contour Shapes of Binary Objects
Sánchez-Cruz, Hermilo; Rodríguez-Díaz, Mario A.
This is an extension of the paper that appeared in [15]. This time, we compare four methods: arithmetic coding applied to the 3OT chain code (Arith-3OT), arithmetic coding applied to DFCCE (Arith-DFCCE), Huffman coding applied to the DFCCE chain code (Huff-DFCCE), and, to measure the efficiency of the chain codes, we compare the methods with JBIG, which constitutes an international standard. In the search for a suitable and better representation of contour shapes, our experiments suggest that a sound method to represent contour shapes is 3OT, because arithmetic coding applied to it gives the best results relative to JBIG, independently of the perimeter of the contour shapes.
Improving the efficiency of the genetic code by varying the codon length--the perfect genetic code.
Doig, A J
1997-10-07
The function of DNA is to specify protein sequences. The four-base "alphabet" used in nucleic acids is translated to the 20 base alphabet of proteins (plus a stop signal) via the genetic code. The code is neither overlapping nor punctuated, but has mRNA sequences read in successive triplet codons until reaching a stop codon. The true genetic code uses three bases for every amino acid. The efficiency of the genetic code can be significantly increased if the requirement for a fixed codon length is dropped so that the more common amino acids have shorter codon lengths and rare amino acids have longer codon lengths. More efficient codes can be derived using the Shannon-Fano and Huffman coding algorithms. The compression achieved using a Huffman code cannot be improved upon. I have used these algorithms to derive efficient codes for representing protein sequences using both two and four bases. The length of DNA required to specify the complete set of protein sequences could be significantly shorter if transcription used a variable codon length. The restriction to a fixed codon length of three bases means that it takes 42% more DNA than the minimum necessary, and the genetic code is 70% efficient. One can think of many reasons why this maximally efficient code has not evolved: there is very little redundancy so almost any mutation causes an amino acid change. Many mutations will be potentially lethal frame-shift mutations, if the mutation leads to a change in codon length. It would be more difficult for the machinery of transcription to cope with a variable codon length. Nevertheless, in the strict and narrow sense of coding for protein sequences using the minimum length of DNA possible, the Huffman code derived here is perfect.
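The efficiency argument above can be checked numerically: a binary Huffman code built over skewed amino-acid frequencies averages well under the 6 binary digits that a fixed three-base codon represents (log2 of 4^3). The frequencies below are illustrative, not measured proteome statistics, and the code sketch is not from the paper:

```python
import heapq

def huffman_lengths(freqs: dict) -> dict:
    """Codeword lengths of a binary Huffman code (lengths are all we need)."""
    # heap entries: (weight, tie-breaker, tuple of symbols in this subtree)
    heap = [(w, i, (s,)) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    depth = dict.fromkeys(freqs, 0)
    tick = len(heap)
    while len(heap) > 1:
        w1, _, g1 = heapq.heappop(heap)
        w2, _, g2 = heapq.heappop(heap)
        for s in g1 + g2:       # every symbol in the merged subtree sinks one level
            depth[s] += 1
        tick += 1
        heapq.heappush(heap, (w1 + w2, tick, g1 + g2))
    return depth

# Illustrative frequencies for the 20 amino acids plus stop (sum to 1);
# a real analysis would use observed usage statistics as in the paper.
freqs = {"Leu": 0.096, "Ala": 0.083, "Gly": 0.072, "Val": 0.069,
         "Glu": 0.062, "Ser": 0.060, "Ile": 0.059, "Lys": 0.058,
         "Arg": 0.055, "Asp": 0.055, "Thr": 0.054, "Pro": 0.047,
         "Asn": 0.041, "Phe": 0.039, "Gln": 0.038, "Tyr": 0.029,
         "Met": 0.024, "His": 0.023, "Cys": 0.014, "Trp": 0.011,
         "Stop": 0.011}
L = huffman_lengths(freqs)
avg_bits = sum(freqs[s] * L[s] for s in freqs)
# avg_bits is well below the 6 bits a fixed three-base codon carries.
```

The gap between avg_bits and 6 is the kind of redundancy the abstract quantifies as the genetic code being roughly 70% efficient.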
A Generic Top-Down Dynamic-Programming Approach to Prefix-Free Coding
Golin, Mordecai; Yu, Jiajin
2008-01-01
Given a probability distribution over a set of n words to be transmitted, the Huffman Coding problem is to find a minimal-cost prefix free code for transmitting those words. The basic Huffman coding problem can be solved in O(n log n) time but variations are more difficult. One of the standard techniques for solving these variations utilizes a top-down dynamic programming approach. In this paper we show that this approach is amenable to dynamic programming speedup techniques, permitting a speedup of an order of magnitude for many algorithms in the literature for such variations as mixed radix, reserved length and one-ended coding. These speedups are immediate implications of a general structural property that permits batching together the calculation of many DP entries.
Finding maximum JPEG image block code size
Lakhani, Gopal
2012-07-01
We present a study of JPEG baseline coding. It aims to determine the minimum storage needed to buffer the JPEG Huffman code bits of 8-bit image blocks. Since DC is coded separately, and the encoder represents each AC coefficient by a pair of run-length/AC coefficient level, the net problem is to perform an efficient search for the optimal run-level pair sequence. We formulate it as a two-dimensional, nonlinear, integer programming problem and solve it using a branch-and-bound based search method. We derive two types of constraints to prune the search space. The first one is given as an upper bound for the sum of squares of the AC coefficients of a block, and it is used to discard sequences that cannot represent valid DCT blocks. The second type of constraints is based on some interesting properties of the Huffman code table, and these are used to prune sequences that cannot be part of optimal solutions. Our main result is that if the default JPEG compression setting is used, a minimum of 346 bits and a maximum of 433 bits of space is sufficient to buffer the AC code bits of 8-bit image blocks. Our implementation also pruned the search space extremely well; the first constraint reduced the initial search space of 4 nodes down to less than 2 nodes, and the second set of constraints reduced it further by 97.8%.
Walter, Pierre
2012-01-01
This study examines how cultural codes in environmental adult education can be used to "frame" collective identity, develop counterhegemonic ideologies, and catalyse "educative-activism" within social movements. Three diverse examples are discussed, spanning environmental movements in urban Victoria, British Columbia, Canada,…
Martin-Fernandez, Marcos; Alberola-Lopez, Carlos; Guerrero-Rodriguez, David; Ruiz-Alzola, Juan
2000-12-01
In this paper we propose a novel lossless coding scheme for medical images that allows the final user to switch between a lossy and a lossless mode. This is done by means of a progressive reconstruction philosophy (which can be interrupted at will), so we believe that our scheme gives a way to trade off between the accuracy needed for medical diagnosis and the information reduction needed for storage and transmission. We combine vector quantization, run-length bit plane and entropy coding. Specifically, the first step is a vector quantization procedure; the centroid codes are Huffman-coded making use of a set of probabilities that are calculated in the learning phase. The image is reconstructed at the coder in order to obtain the error image; this second image is divided in bit planes, which are then run-length and Huffman coded. A second statistical analysis is performed during the learning phase to obtain the parameters needed in this final stage. Our coder is currently trained for hand radiographs and fetal echographies. We compare our results for these two types of images to classical results on bit plane coding and the JPEG standard. Our coder turns out to outperform both of them.
Wavelet based hierarchical coding scheme for radar image compression
Sheng, Wen; Jiao, Xiaoli; He, Jifeng
2007-12-01
This paper presents a wavelet based hierarchical coding scheme for radar image compression. The radar signal is first quantized to a digital signal and reorganized as a raster-scanned image according to the radar's pulse repetition frequency. After reorganization, the reformed image is decomposed into image blocks of different frequency bands by a 2-D wavelet transformation, and each block is quantized and coded by the Huffman coding scheme. A demonstration system is developed, showing that under the requirement of real-time processing, the compression ratio can be very high, with no significant loss of target signal in the restored radar image.
Valdivia, Valeska; Hennebelle, Patrick
2014-11-01
Context. Ultraviolet radiation plays a crucial role in molecular clouds. Radiation and matter are tightly coupled and their interplay influences the physical and chemical properties of gas. In particular, modeling the radiation propagation requires calculating column densities, which can be numerically expensive in high-resolution multidimensional simulations. Aims: Developing fast methods for estimating column densities is mandatory if we are interested in the dynamical influence of the radiative transfer. In particular, we focus on the effect of the UV screening on the dynamics and on the statistical properties of molecular clouds. Methods: We have developed a tree-based method for a fast estimate of column densities, implemented in the adaptive mesh refinement code RAMSES. We performed numerical simulations using this method in order to analyze the influence of the screening on the clump formation. Results: We find that the accuracy for the extinction of the tree-based method is better than 10%, while the relative error for the column density can be much more. We describe the implementation of a method based on precalculating the geometrical terms that noticeably reduces the calculation time. To study the influence of the screening on the statistical properties of molecular clouds we present the probability distribution function of gas and the associated temperature per density bin and the mass spectra for different density thresholds. Conclusions: The tree-based method is fast and accurate enough to be used during numerical simulations since no communication is needed between CPUs when using a fully threaded tree. It is then suitable to parallel computing. We show that the screening for far UV radiation mainly affects the dense gas, thereby favoring low temperatures and affecting the fragmentation. We show that when we include the screening, more structures are formed with higher densities in comparison to the case that does not include this effect. We
Second Generation Wavelet Applied to Lossless Compression Coding of Images
Institute of Scientific and Technical Information of China (English)
(no author listed)
2000-01-01
In this paper, the second generation wavelet transform is applied to lossless image coding, exploiting its property of being a reversible integer-to-integer transform, in contrast to the first generation wavelet transform. The second generation wavelet transform can provide a higher compression ratio than Huffman coding while still reconstructing the image without loss. The experimental results show that the second generation wavelet transform achieves excellent performance in medical image compression coding.
Minimum Redundancy Coding for Uncertain Sources
Baer, Michael B; Charalambous, Charalambos D
2011-01-01
Consider the set of source distributions within a fixed maximum relative entropy with respect to a given nominal distribution. Lossless source coding over this relative entropy ball can be approached in more than one way. A problem previously considered is finding a minimax average length source code. The minimizing players are the codeword lengths --- real numbers for arithmetic codes, integers for prefix codes --- while the maximizing players are the uncertain source distributions. Another traditional minimizing objective is the first one considered here, maximum (average) redundancy. This problem reduces to an extension of an exponential Huffman objective treated in the literature but heretofore without direct practical application. In addition to these, this paper examines the related problem of maximal minimax pointwise redundancy and the problem considered by Gawrychowski and Gagie, which, for a sufficiently small relative entropy ball, is equivalent to minimax redundancy. One can consider both Shannon-...
Improved EZW Image Coding Algorithm in Air Traffic Control Systems
Institute of Scientific and Technical Information of China (English)
胡波; 杨红雨
2011-01-01
In order to efficiently transmit picture data in the air traffic control system, an improved image coding algorithm based on the Embedded Zerotree Wavelet (EZW) is presented. First, the lowest-frequency sub-band is coded losslessly. Then, based on the human visual system, different sub-bands at the same level are merged according to different perceptual weights. Finally, Huffman coding is applied to the EZW output as a second coding stage. The experimental results show that this method outperforms the original EZW algorithm.
Design of Huffman Coding Based on MATLAB
Institute of Scientific and Technical Information of China (English)
林寿光
2010-01-01
Using the principle and method of Huffman compression coding, compression programs were designed in MATLAB for two images, obtaining the compression information and the Huffman code table and analyzing the compressed image pixel data and the compression ratio. The results show that Huffman coding is a lossless compression code.
Research and Realization of Huffman Coding for Voice PCM
Institute of Scientific and Technical Information of China (English)
邓翔宇
2010-01-01
Traditional analog voice PCM uses fixed-length folded binary coding, which has a high bit rate and requires considerable system resources for transmission and processing. Starting from the probability distribution of the voice-signal sample values, and building on the non-uniform quantization of PCM coding, this paper applies variable-length coding to the 13-segment A-law companding characteristic, reducing the entropy redundancy of the source and achieving compression coding while keeping the voice MOS value unchanged. In addition, EDA techniques are used for a CPLD-based hardware design of the compression circuit.
New IP Traceback Scheme Based on Huffman Codes
Institute of Scientific and Technical Information of China (English)
罗莉莉; 谢冬青; 占勇军; 周再红
2007-01-01
Facing DDoS attacks, researchers have proposed various IP traceback techniques to find the true source IP addresses of attack packets, but current schemes suffer from problems such as the limited marking space, the accuracy of the traced source, and the number of packets required for traceback. This paper proposes a new traceback scheme based on Huffman codes, which saves a large amount of storage space and improves space efficiency; when a DoS or DDoS attack occurs, it can respond quickly, reconstructing the attack path and locating the exact attack source from a single received attack packet, thereby minimizing the damage and loss caused by the attack.
Research on Huffman Coding of Radar Video
Institute of Scientific and Technical Information of China (English)
韩菲
2004-01-01
This paper discusses the data compression algorithms used in radar video transmission and describes using Huffman codes to encode and decode radar data, in order to handle large-volume radar data transmission and meet the requirements of real-time, high-speed, lossless transmission of radar video image data.
An Improved Algorithm for Finding Huffman Codes
Institute of Scientific and Technical Information of China (English)
徐凤生; 钱爱增; 李海军; 李天志
2007-01-01
The optimal binary tree is a very important data structure with wide applications in communications, engineering, software development, and other fields. Building on a discussion of optimal binary trees, this paper improves the storage structures for the optimal binary tree and for Huffman codes, and proposes an algorithm for computing Huffman codes. A corresponding C program was written to verify the effectiveness of the algorithm.
Ultraspectral sounder data compression using the Tunstall coding
Wei, Shih-Chieh; Huang, Bormin; Gu, Lingjia
2007-09-01
In an error-prone environment the compression of ultraspectral sounder data is vulnerable to error propagation. Tunstall coding is a variable-to-fixed-length code which compresses data by mapping variable-length strings of source symbols to fixed-length codewords. It avoids the resynchronization difficulty encountered in fixed-to-variable-length codes such as Huffman coding and arithmetic coding. This paper explores the use of Tunstall coding in reducing the error propagation for ultraspectral sounder data compression. The results show that our Tunstall approach has a favorable compression ratio compared with JPEG-2000, 3D SPIHT, JPEG-LS, CALIC and CCSDS IDC 5/3. It also has less error propagation compared with JPEG-2000.
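As a hedged sketch of the variable-to-fixed idea described above (an illustration, not the paper's implementation), a Tunstall dictionary for a memoryless source can be built by repeatedly expanding the most probable parse string until the fixed codeword space is full:

```python
import heapq

def tunstall_dictionary(probs, codeword_bits):
    """Map variable-length source strings to fixed-length codewords."""
    max_entries = 2 ** codeword_bits
    # Max-heap over parse strings, keyed by negated probability.
    heap = [(-p, s) for s, p in probs.items()]
    heapq.heapify(heap)
    # Each expansion replaces one leaf with len(probs) children,
    # so it grows the dictionary by len(probs) - 1 entries.
    while len(heap) + len(probs) - 1 <= max_entries:
        neg_p, s = heapq.heappop(heap)
        for sym, p in probs.items():
            heapq.heappush(heap, (neg_p * p, s + sym))
    strings = sorted(s for _, s in heap)
    # Fixed-length codewords: consecutive binary integers.
    return {s: format(i, "0%db" % codeword_bits) for i, s in enumerate(strings)}
```

For `probs = {"a": 0.7, "b": 0.3}` and 2-bit codewords this yields the parse set {"aaa", "aab", "ab", "b"}; because the parse strings form the leaves of a complete tree, no string is a prefix of another and any input can be parsed greedily.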
Lossless quantum coding in many-letter spaces
Boström, K J
2000-01-01
Based on the concept of many-letter theory a general characterization of quantum codes using the Kraus representation is given. Compression codes are defined by their property of decreasing the average information content of a given a priori message ensemble. Lossless quantum codes, in contrast to lossy codes, provide the retrieval of the original input states with perfect fidelity. A general lossless coding scheme is given that translates between two quantum alphabets. It is shown that this scheme is never compressive. Furthermore, a lossless quantum coding scheme, analogous to the classical Huffman scheme but different from the Braunstein scheme, is implemented, which provides optimal compression. Motivated by the concept of lossless quantum compression, an observable is defined that measures the amount of compressible quantum information contained in a particular message with respect to a given a priori message ensemble. The average of this observable yields the von Neumann entropy, which is finally es...
Joint source channel coding using arithmetic codes
Bi, Dongsheng
2009-01-01
Based on the encoding process, arithmetic codes can be viewed as tree codes and current proposals for decoding arithmetic codes with forbidden symbols belong to sequential decoding algorithms and their variants. In this monograph, we propose a new way of looking at arithmetic codes with forbidden symbols. If a limit is imposed on the maximum value of a key parameter in the encoder, this modified arithmetic encoder can also be modeled as a finite state machine and the code generated can be treated as a variable-length trellis code. The number of states used can be reduced and techniques used fo
A dynamical systems proof of Kraft-McMillan inequality and its converse for prefix-free codes
Nagaraj, Nithin
2009-03-01
Uniquely decodable codes are central to lossless data compression in both classical and quantum communication systems. The Kraft-McMillan inequality is a basic result in information theory which gives a necessary and sufficient condition for a code to be uniquely decodable and also has a quantum analogue. In this letter, we provide a novel dynamical systems proof of this inequality and its converse for prefix-free codes (no codeword is a prefix of another—the popular Huffman codes are an example). For constrained sources, the problem is still open.
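The inequality itself is straightforward to check numerically; a minimal sketch (exact rational arithmetic via `fractions` is an implementation choice for illustration, not from the letter):

```python
from fractions import Fraction

def kraft_sum(lengths, r=2):
    # Kraft-McMillan sum: a uniquely decodable r-ary code with these
    # codeword lengths exists iff this sum is at most 1.
    return sum(Fraction(1, r ** l) for l in lengths)

# Huffman codes are complete prefix codes, so their lengths meet the
# bound with equality, e.g. codewords 0, 10, 110, 111:
assert kraft_sum([1, 2, 3, 3]) == 1

# No uniquely decodable binary code can have these lengths:
assert kraft_sum([1, 1, 2]) > 1
```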
Institute of Scientific and Technical Information of China (English)
钟晨峰; 李斌桥; 徐江涛
2012-01-01
To reliably realize data storage during the dark-current elimination of an image sensor, a data-compression dark-current elimination system based on a DPCM-Huffman compression algorithm is presented and implemented in hardware. Before the system operates, the image sensor's dark-current data are compressed with the combined DPCM and Huffman algorithm, and the compressed data are stored in Flash memory. While the image sensor is working, the stored data are read back and decoded with Huffman and DPCM decoding, finally eliminating the dark current in the image sensor. Experiments show that, taking a CMOS image sensor with a resolution of 256×256 as an example, the system achieves a compression ratio of 3.12, reduces the data volume to 32% of the original, and triples the working speed. The proposed system therefore improves the data compression ratio, preserves data accuracy, and increases the working speed of the image sensor, making it a compression system well suited to dark-current elimination in CMOS image sensors.
An extensive Markov system for ECG exact coding.
Tai, S C
1995-02-01
In this paper, an extensive Markov process, which considers both the coding redundancy and the intersample redundancy, is presented to measure the entropy value of an ECG signal more accurately. It utilizes the intersample correlations by predicting the incoming n samples based on the previous m samples which constitute an extensive Markov process state. Theories of the extensive Markov process and conventional n repeated applications of m-th order Markov process are studied first in this paper. After that, they are realized for ECG exact coding. Results show that a better performance can be achieved by our system. The average code length for the extensive Markov system on the second difference signals was 2.512 b/sample, while the average Huffman code length for the second difference signals was 3.326 b/sample.
On Real-Time and Causal Secure Source Coding
Kaspi, Yonatan
2012-01-01
We investigate two source coding problems with secrecy constraints. In the first problem we consider real-time fully secure transmission of a memoryless source. We show that although classical variable-rate coding is not an option since the lengths of the codewords leak information on the source, the key rate can be as low as the average Huffman codeword length of the source. In the second problem we consider causal source coding with a fidelity criterion and side information at the decoder and the eavesdropper. We show that when the eavesdropper has degraded side information, it is optimal to first use a causal rate distortion code and then encrypt its output with a key.
Categorizing Ideas about Trees: A Tree of Trees
Fisler, Marie; Lecointre, Guillaume
2013-01-01
The aim of this study is to explore whether matrices and MP trees used to produce systematic categories of organisms could be useful to produce categories of ideas in history of science. We study the history of the use of trees in systematics to represent the diversity of life from 1766 to 1991. We apply to those ideas a method inspired from coding homologous parts of organisms. We discretize conceptual parts of ideas, writings and drawings about trees contained in 41 main writings; we detect shared parts among authors and code them into a 91-characters matrix and use a tree representation to show who shares what with whom. In other words, we propose a hierarchical representation of the shared ideas about trees among authors: this produces a “tree of trees.” Then, we categorize schools of tree-representations. Classical schools like “cladists” and “pheneticists” are recovered but others are not: “gradists” are separated into two blocks, one of them being called here “grade theoreticians.” We propose new interesting categories like the “buffonian school,” the “metaphoricians,” and those using “strictly genealogical classifications.” We consider that networks are not useful to represent shared ideas at the present step of the study. A cladogram is made for showing who is sharing what with whom, but also heterobathmy and homoplasy of characters. The present cladogram is not modelling processes of transmission of ideas about trees, and here it is mostly used to test for proximity of ideas of the same age and for categorization. PMID:23950877
Transference & Retrieval of Pulse-code modulation Audio over Short Messaging Service
Khan, Muhammad Fahad
2012-01-01
The paper presents a method for transferring PCM (Pulse-Code Modulation) based audio messages through SMS (Short Message Service) over a GSM (Global System for Mobile Communications) network. As SMS is a text-based service, it cannot carry voice directly. Our method enables voice transfer through SMS by converting PCM audio into characters. Huffman coding is then applied to reduce the number of characters, which are subsequently set as the payload text of the SMS. To test this method, we developed an application using the J2ME platform.
A Study on Ways of Lossless Image Compression and Coding and Relevant Comparisons
Institute of Scientific and Technical Information of China (English)
冉晓娟
2014-01-01
This essay studies the principles of three ways of lossless image compression, run-length coding, LZW coding and Huffman coding, and makes comparative analyses of them, which helps in choosing a suitable compression coding method for different types of images.
Institute of Scientific and Technical Information of China (English)
刘楠; 韩丽芳; 夏坤峰; 曲通
2014-01-01
Among the source-code plagiarism detection methods used in software engineering, comparison based on the abstract syntax tree (AST) can effectively detect plagiarism such as whole-file copying, renaming variables, and reordering code, and is widely used in plagiarism detection tools. However, AST-based comparison cannot detect plagiarism that modifies variable types or adds meaningless variables. To address this, an improved AST-based approach is proposed that prunes the leaf nodes of the syntax tree that would affect the judgment, so that plagiarism such as modified variable types and added meaningless variables can be effectively detected.
Huffman decoding module based on the hardware and software co-design%基于软硬件协同设计的Huffman解码模块
Institute of Scientific and Technical Information of China (English)
刘华; 刘卫东; 邢文峰
2011-01-01
With the rapid development of multimedia technology, digital audio technology has also developed rapidly. MP3 is a lossy audio compression format with a high compression rate; at present it is widely used in many fields and has good market prospects. This paper implements the Huffman decoding module of MP3 based on a hardware/software co-design approach. The proposed solution can not only efficiently realize the Huffman decoding module of MP3 but can also be applied to the Huffman decoding modules of WMA, AAC, and other audio formats, ensuring efficiency while preserving the module's versatility.
Advanced GF(32) nonbinary LDPC coded modulation with non-uniform 9-QAM outperforming star 8-QAM.
Liu, Tao; Lin, Changyu; Djordjevic, Ivan B
2016-06-27
In this paper, we first describe a 9-symbol non-uniform signaling scheme based on Huffman coding, in which different symbols are transmitted with different probabilities. By using the Huffman procedure, a prefix code is designed to approach the optimal performance. Then, we introduce an algorithm to determine the optimal signal constellation sets for our proposed non-uniform scheme with the criterion of maximizing the constellation figure of merit (CFM). The proposed non-uniform polarization-multiplexed 9-QAM signaling scheme has the same spectral efficiency as conventional 8-QAM. Additionally, we propose a specially designed GF(32) nonbinary quasi-cyclic LDPC code for the coded modulation system based on the 9-QAM non-uniform scheme. Further, we study the efficiency of our proposed non-uniform 9-QAM combined with nonbinary LDPC coding, and demonstrate by Monte Carlo simulation that the proposed GF(32) nonbinary LDPC coded 9-QAM scheme outperforms nonbinary LDPC coded uniform 8-QAM by at least 0.8 dB.
Institute of Scientific and Technical Information of China (English)
王斐
2015-01-01
The coding system for House-Tree-Person drawings contains 53 items. Twenty-one subjects, including college students and schizophrenic patients, took part in this research. The results indicate that the system has good inter-scorer agreement, and that 20 items and 5 second-level dimensions show statistically significant differences between college students and schizophrenic patients.
Context-based lossless image compression with optimal codes for discretized Laplacian distributions
Giurcaneanu, Ciprian Doru; Tabus, Ioan; Stanciu, Cosmin
2003-05-01
Lossless image compression has become an important research topic, especially in relation with the JPEG-LS standard. Recently, the techniques known for designing optimal codes for sources with infinite alphabets have been applied for the quantized Laplacian sources which have probability mass functions with two geometrically decaying tails. Due to the simple parametric model of the source distribution the Huffman iterations are possible to be carried out analytically, using the concept of reduced source, and the final codes are obtained as a sequence of very simple arithmetic operations, avoiding the need to store coding tables. We propose the use of these (optimal) codes in conjunction with context-based prediction, for noiseless compression of images. To reduce further the average code length, we design Escape sequences to be employed when the estimation of the distribution parameter is unreliable. Results on standard test files show improvements in compression ratio when comparing with JPEG-LS.
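Codes of this family are closely related to Golomb-Rice codes for geometrically decaying tails; the sketch below is a generic Rice coder for illustration (the function names and the power-of-two parameter are assumptions, not the construction used in the paper):

```python
def rice_encode(n, k):
    # Rice code with parameter k (Golomb code with divisor 2**k):
    # unary-coded quotient, a 0 terminator, then a k-bit remainder.
    q, r = n >> k, n & ((1 << k) - 1)
    return "1" * q + "0" + format(r, "0%db" % k)

def rice_decode(bits, k):
    # Inverse: count leading 1s for the quotient, then read k remainder bits.
    q = 0
    while bits[q] == "1":
        q += 1
    r = int(bits[q + 1:q + 1 + k], 2)
    return (q << k) + r
```

Because codeword length grows linearly with the symbol value, such codes are optimal for geometric distributions without any stored coding table, which is the practical appeal noted in the abstract.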
Research on compression and improvement of vertex chain code
Yu, Guofang; Zhang, Yujie
2009-10-01
Combined with Huffman coding theory, the code 2, which has the highest occurrence probability and continuation frequency, is represented by the single binary digit 0; combinations of 1 and 3, which have the next-highest occurrence probability and continuation frequency, are represented by the two binary digits 10, with a corresponding frequency code attached to these two kinds of codes (the length of the frequency code can be assigned beforehand or adapted automatically); and the codes 1 and 3, with the lowest occurrence probability and continuation frequency, are represented by the binary numbers 110 and 111, respectively. Relative encoding efficiency and decoding efficiency are added to the current performance-evaluation system for chain codes. The new chain code is compared with a current chain code in a test system programmed in VC++; the results show that the basic performance of the new chain code is significantly improved, and its advantages increase with the size of the graphics.
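One reading of the codeword table above, simplified to omit the run-length (frequency-code) refinement, can be sketched as a plain prefix-code mapping (a hypothetical illustration, not the authors' full scheme):

```python
# Simplified prefix table inspired by the abstract: vertex code "2" is
# the most frequent, so it gets the shortest codeword.  The paper's
# run-length refinement ("10" + frequency code) is omitted here.
VCC_CODE = {"2": "0", "1": "110", "3": "111"}
VCC_DECODE = {v: k for k, v in VCC_CODE.items()}

def encode_vcc(chain):
    return "".join(VCC_CODE[c] for c in chain)

def decode_vcc(bits):
    out, buf = [], ""
    for b in bits:
        buf += b
        if buf in VCC_DECODE:      # the table is prefix-free
            out.append(VCC_DECODE[buf])
            buf = ""
    return "".join(out)
```

Since no codeword is a prefix of another, decoding is a single left-to-right scan, which is what makes the Huffman-style table attractive for chain codes.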
PoInTree: A Polar and Interactive Phylogenetic Tree
Institute of Scientific and Technical Information of China (English)
Carreras Marco; Gianti Eleonora; Sartori Luca; Plyte Simon Edward; Isacchi Antonella; Bosotti Roberta
2005-01-01
PoInTree (Polar and Interactive Tree) is an application that allows users to build, visualize, and customize phylogenetic trees in a polar, interactive, and highly flexible view. It takes as input a FASTA file or multiple alignment formats. Phylogenetic tree calculation is based on a sequence distance method and utilizes the Neighbor Joining (NJ) algorithm. It also allows displaying precalculated trees of the major protein families based on Pfam classification. In PoInTree, nodes can be dynamically opened and closed, and distances between genes are graphically represented. The tree root can be centered on a selected leaf. A text search mechanism, color-coding and labeling display are integrated. The visualizer can be connected to an Oracle database containing information on sequences and other biological data, helping to guide their interpretation within a given protein family across multiple species. The application is written in Borland Delphi and based on the VCL TeeChart Pro 6 graphical component (Steema Software).
Source-channel optimized trellis codes for bitonal image transmission over AWGN channels.
Kroll, J M; Phamdo, N
1999-01-01
We consider the design of trellis codes for transmission of binary images over additive white Gaussian noise (AWGN) channels. We first model the image as a binary asymmetric Markov source (BAMS) and then design source-channel optimized (SCO) trellis codes for the BAMS and AWGN channel. The SCO codes are shown to be superior to Ungerboeck's codes by approximately 1.1 dB (64-state code, 10^-5 bit error probability). We also show that a simple "mapping conversion" method can be used to improve the performance of Ungerboeck's codes by approximately 0.4 dB (also 64-state code and 10^-5 bit error probability). We compare the proposed SCO system with a traditional tandem system consisting of a Huffman code, a convolutional code, an interleaver, and an Ungerboeck trellis code. The SCO system significantly outperforms the tandem system. Finally, using a facsimile image, we compare the image quality of an SCO code, an Ungerboeck code, and the tandem code. The SCO code yields the best reconstructed image quality at 4-5 dB channel SNR.
Kucharczyk, Robert A
2012-01-01
In this note we discuss trees similar to the Calkin-Wilf tree, a binary tree that enumerates all positive rational numbers in a simple way. The original construction of Calkin and Wilf is reformulated in a more algebraic language, and an elementary application of methods from analytic number theory gives restrictions on possible analogues.
Martensen, Anna L.; Butler, Ricky W.
1987-01-01
The Fault Tree Compiler Program is a new reliability tool used to predict the top-event probability for a fault tree. Five different gate types are allowed in the fault tree: AND, OR, EXCLUSIVE OR, INVERT, and M OF N gates. The high-level input language is easy to understand and use when describing the system tree. In addition, use of the hierarchical fault tree capability can simplify the tree description and decrease program execution time. The current solution technique provides an answer precise to five digits (within the limits of double-precision floating-point arithmetic). The user may vary one failure rate or failure probability over a range of values and plot the results for sensitivity analyses. The solution technique is implemented in FORTRAN; the remaining program code is implemented in Pascal. The program is written to run on a Digital Corporation VAX with the VMS operating system.
Tree compression with top trees
DEFF Research Database (Denmark)
Bille, Philip; Gørtz, Inge Li; Landau, Gad M.;
2015-01-01
We introduce a new compression scheme for labeled trees based on top trees. Our compression scheme is the first to simultaneously take advantage of internal repeats in the tree (as opposed to the classical DAG compression that only exploits rooted subtree repeats) while also supporting fast...
Tree compression with top trees
DEFF Research Database (Denmark)
Bille, Philip; Gørtz, Inge Li; Landau, Gad M.
2013-01-01
We introduce a new compression scheme for labeled trees based on top trees [3]. Our compression scheme is the first to simultaneously take advantage of internal repeats in the tree (as opposed to the classical DAG compression that only exploits rooted subtree repeats) while also supporting fast...
Research of an IPv6 Path Reconstruction Algorithm Based on Huffman Codes
Institute of Scientific and Technical Information of China (English)
胡清钟; 张斌
2013-01-01
Packet marking is a commonly used IP traceback technique: path information is marked into the marking field of the IP header, and the attack path can be reconstructed from the marking information in the marked packets, tracing back to the source of the attack. Because the marking space is limited, the marking information is limited as well, and multiple marked packets are usually needed to reconstruct a single attack path; path reconstruction algorithms therefore have high complexity and low efficiency and accuracy. To solve this problem, a path traceback algorithm based on Huffman coding is proposed, in which the link information related to the previous-hop router is written into the marking field as a Huffman code, without storing marking information at intermediate nodes. The algorithm is suited to IPv6 networks and can accurately reconstruct the attack path from a single marked packet. Experimental results show that the proposed algorithm reconstructs paths quickly with high efficiency and accuracy.
Compression Technology Based on Huffman Coding in Java
Institute of Scientific and Technical Information of China (English)
陈旭辉; 范肖南; 巩天宁
2008-01-01
At present, two kinds of lossless compression technology are in wide use: phrase-based (dictionary) compression and coding-based compression. This paper describes an implementation of coding-based compression: a file-compression function written in the Java programming language using the Huffman algorithm.
Algorithm of Text Information Hiding Based on Huffman Coding
Institute of Scientific and Technical Information of China (English)
戴祖旭; 洪帆; 董洁
2007-01-01
Natural-language sentences can be transformed into part-of-speech tag strings or sentence patterns. This paper proposes an information-hiding algorithm based on Huffman coding of sentence patterns: a Huffman code is constructed according to the distribution of sentence patterns, and the secret information is decoded into sentence patterns. The positions of the sentence patterns in the cover text serve as the key, and the secret information is extracted by Huffman-compressing the sentence patterns. A formula for the hiding capacity is given. The algorithm does not require modifying the cover text.
Design of an Experiment Teaching Platform for Huffman Coding Based on MATLAB
Institute of Scientific and Technical Information of China (English)
李荣
2015-01-01
To address the computational problems in Huffman-coding laboratory teaching, a simple and practical experiment teaching platform was designed and developed using the MATLAB graphical user interface. The platform combines theory with experiment and provides an effective tool for the experimental teaching of Huffman coding.
Comparing Two Ways of Programming Huffman Coding with the STL
Institute of Scientific and Technical Information of China (English)
孙宏; 章小莉; 赵越
2010-01-01
Huffman coding, as a lossless compression method, is widely used in modern communications, multimedia, and other fields, so implementing the Huffman coding algorithm with the C++ Standard Template Library (STL) is of practical interest. This paper discusses implementing the Huffman coding algorithm with the STL vector container and with the STL heap operations, compares the performance of the two implementations, and points out issues to note when using STL resources.
DSP Lossless Image Compression System Based on Huffman Coding
Institute of Scientific and Technical Information of China (English)
邹文辉
2014-01-01
Today's society is in an era of big data, with enormous amounts of information: images and video are everywhere from the moment we open our eyes. People depend on images more and more and have increasingly high requirements for them, demanding both high fidelity and small storage, which places higher demands on image compression. This system is built on the TMS320DM6437 platform and uses Huffman coding to achieve lossless image compression, reaching a compression ratio of 1.77.
Efficient Huffman-Codes-Based Symmetric-Key Cryptography
Institute of Scientific and Technical Information of China (English)
魏茜; 龙冬阳
2010-01-01
The demand for storing and transmitting large-scale data in today's networks has drawn increasing attention to research combining data compression with encryption. Although a code sequence produced by Huffman-compressing data is extremely hard to break as long as the source symbols' probability mass function (PMF) is kept secret, the PMF used as the key is insecure and hard to store and transmit, so the method is rarely applied in practice. To solve this problem, this paper proposes a highly secure one-time-pad symmetric cryptosystem based on Huffman coding. The scheme generates keys using a polynomial-time Huffman-tree reconstruction algorithm together with finite-field interpolation, guaranteeing very short keys while keeping the cryptosystem hard to break even when part of the key is compromised. The paper also proves the effectiveness and security of the scheme and gives an application example.
New Data Compression Algorithm Based on Huffman Coding
Institute of Scientific and Technical Information of China (English)
何昭青
2008-01-01
This paper explores a new approach to file compression: the file is viewed as a binary stream of 0s and 1s, and a fixed number of binary digits is defined as a "word", so that the file becomes a stream of words. The occurrence probabilities of the distinct words are counted, and the Huffman algorithm is then used for coding and compression. The compression of various file types under different word sizes is discussed, and experimental results are given for each case.
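The re-blocking idea, treating the file as a bit stream and Huffman-coding fixed k-bit "words", can be sketched as follows (a hypothetical illustration, not the paper's program):

```python
import heapq
from collections import Counter
from itertools import count

def huffman_lengths(freqs):
    # Standard Huffman: repeatedly merge the two least frequent nodes;
    # each merge adds one bit to every symbol in the merged subtree.
    tiebreak = count()
    heap = [(f, next(tiebreak), {s: 0}) for s, f in freqs.items()]
    heapq.heapify(heap)
    if len(heap) == 1:                       # degenerate one-symbol source
        return {s: 1 for s in freqs}
    while len(heap) > 1:
        f1, _, d1 = heapq.heappop(heap)
        f2, _, d2 = heapq.heappop(heap)
        merged = {s: l + 1 for s, l in {**d1, **d2}.items()}
        heapq.heappush(heap, (f1 + f2, next(tiebreak), merged))
    return heap[0][2]

def bit_words(data, k):
    # Reinterpret the byte stream as a stream of k-bit "words".
    bits = "".join(format(b, "08b") for b in data)
    return [bits[i:i + k] for i in range(0, len(bits) - len(bits) % k, k)]

def compressed_bits(data, k):
    # Total Huffman-coded size of the file under a k-bit word alphabet.
    freqs = Counter(bit_words(data, k))
    lengths = huffman_lengths(freqs)
    return sum(freqs[w] * lengths[w] for w in freqs)
```

Trying several values of k and keeping the smallest `compressed_bits` result mirrors the per-file-type comparison the abstract describes.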
LOB Data Exchange Based on Huffman Coding and XML
Institute of Scientific and Technical Information of China (English)
贾长云; 朱跃龙; 朱敏
2006-01-01
XML, as a standard format for heterogeneous data exchange, is widely used in data-exchange platforms. Because of their huge size, multimedia data are usually stored in databases as large-object (LOB) data, so heterogeneous data exchange inevitably involves exchanging large-object data. This paper discusses the principle of Huffman coding, proposes a method for exchanging large-object data based on XML using Huffman coding, and designs a corresponding implementation model, which provides a useful reference for implementing large-object data exchange between heterogeneous databases.
The MP3 Steganography Algorithm Based on Huffman Coding
Institute of Scientific and Technical Information of China (English)
高海英
2007-01-01
Considering the coding characteristics of MP3 audio, an audio steganography algorithm based on Huffman codeword substitution is proposed. Compared with previous MP3 steganography algorithms, this algorithm embeds the hidden information directly in the Huffman codewords of the MP3 frame data stream without partial decoding, offering high transparency, large embedding capacity, and low computational cost. Experiments analyze the algorithm's transparency, embedding capacity, and the statistical properties of the codewords.
Research on Image Compression and Decompression Based on Huffman Coding
Institute of Scientific and Technical Information of China (English)
饶兴
2011-01-01
According to the characteristics of BMP images, a compression method based on Huffman coding is proposed. Compression and decompression programs are designed in two ways, coding the RGB channels jointly and coding them separately; several images are then compressed and decompressed experimentally, and the results are analyzed.
The Demo Animation Design of the Huffman Coding Process Based on Flash
Institute of Scientific and Technical Information of China (English)
魏三强
2013-01-01
Huffman coding is an important topic in data compression, and it is well worth teaching with the best modern instructional tools. A demonstration animation courseware produced with Flash and its ActionScript programming builds a new visual culture and achieves a fairly faithful demonstration of the Huffman coding process. Being intuitive, vivid, and easy to learn, it helps improve the efficiency of both teaching and learning Huffman coding.
A Method of File Compression Using Huffman Coding
Institute of Scientific and Technical Information of China (English)
潘玮华
2010-01-01
This paper introduces the idea and method of file compression using Huffman coding, describes in detail the design of the classes used and the concrete design of the compression and decompression procedures, and gives a complete program written in C++.
Fast minimum-redundancy prefix coding for real-time space data compression
Huang, Bormin
2007-09-01
The minimum-redundancy prefix-free code problem is to determine an array l = {l_1, ..., l_n} of n integer codeword lengths, given an array f = {f_1, ..., f_n} of n symbol occurrence frequencies, such that the Kraft-McMillan inequality Σ_i 2^(-l_i) ≤ 1 holds and the total number of coded bits Σ_i f_i·l_i is minimized. Previous minimum-redundancy prefix-free coding based on Huffman's greedy algorithm solves this problem in O(n) time if the input array f is sorted, but in O(n log n) time if f is unsorted. In this paper a fast algorithm is proposed to solve this problem in linear time if f is unsorted. It is suitable for real-time applications in satellite communication and consumer electronics. We also develop its VLSI architecture, which consists of four modules, namely, the frequency table builder, the codeword length table builder, the codeword table builder, and the input-to-codeword mapper.
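For comparison, the O(n) behavior on sorted input comes from the classical two-queue Huffman construction; the sketch below is that textbook method (not the paper's new algorithm or its VLSI architecture) and returns the minimized total Σ_i f_i·l_i:

```python
from collections import deque

def huffman_cost_sorted(freqs):
    # freqs must be sorted ascending.  Two-queue trick: leaves wait in
    # q1, merged nodes are appended to q2; both queues stay sorted, so
    # each "extract min" is an O(1) comparison of the two fronts.
    q1 = deque(freqs)
    q2 = deque()
    total = 0

    def pop_min():
        if q2 and (not q1 or q2[0] <= q1[0]):
            return q2.popleft()
        return q1.popleft()

    while len(q1) + len(q2) > 1:
        a, b = pop_min(), pop_min()
        total += a + b          # each merge adds one bit to every leaf below
        q2.append(a + b)
    return total
```

Merged weights are produced in nondecreasing order, which is why q2 never needs sorting; the whole loop performs n - 1 merges in O(n) total time.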
Jonge, de, H.J.
2002-01-01
Dividing software systems in components improves software reusability as well as software maintainability. Components live at several levels, we concentrate on the implementation level where components are formed by source files, divided over directory structures. Such source code components are usually strongly coupled in the directory structure of a software system. Their compilation is usually controlled by a single global build process. This entangling of source trees and build processes ...
Ganzinger, Harald; Nieuwenhuis, Robert; Nivela, Pilar
2001-01-01
Indexing data structures are well known to be crucial for the efficiency of the current state-of-the-art theorem provers. Examples are discrimination trees, which are like tries where terms are seen as strings and common prefixes are shared, and substitution trees, where terms keep their tree structure and all common contexts can be shared. Here we describe a new indexing data structure, context trees, where, by means of a limited kind of conte...
Cochrane, John. H.; Longstaff, Francis A.; Santa-Clara, Pedro
2004-01-01
We solve a model with two "Lucas trees." Each tree has i.i.d. dividend growth. The investor has log utility and consumes the sum of the two trees' dividends. This model produces interesting asset-pricing dynamics, despite its simple ingredients. Investors want to rebalance their portfolios after any change in value. Since the size of the trees is fixed, however, prices must adjust to offset this desire. As a result, expected returns, excess returns, and return volatility all vary throug...
Tolman, Marvin
2005-01-01
Students love outdoor activities and will love them even more when they build confidence in their tree identification and measurement skills. Through these activities, students will learn to identify the major characteristics of trees and discover how the pace--a nonstandard measuring unit--can be used to estimate not only distances but also the…
Baños, Hector; Bushek, Nathaniel; Davidson, Ruth; Gross, Elizabeth; Harris, Pamela E.; Krone, Robert; Long, Colby; Stewart, Allen; Walker, Robert
2016-01-01
We introduce the package PhylogeneticTrees for Macaulay2 which allows users to compute phylogenetic invariants for group-based tree models. We provide some background information on phylogenetic algebraic geometry and show how the package PhylogeneticTrees can be used to calculate a generating set for a phylogenetic ideal as well as a lower bound for its dimension. Finally, we show how methods within the package can be used to compute a generating set for the join of any two ideals.
Game tree algorithms and solution trees
W.H.L.M. Pijls (Wim); A. de Bruin (Arie)
1998-01-01
In this paper, a theory of game tree algorithms is presented, entirely based upon the concept of a solution tree. Two types of solution trees are distinguished: max and min trees. Every game tree algorithm tries to prune as many nodes as possible from the game tree. A cut-off criterion in
DEFF Research Database (Denmark)
Appelt, Ane L; Rønde, Heidi S
2013-01-01
The photo shows a close-up of a Lichtenberg figure – popularly called an “electron tree” – produced in a cylinder of polymethyl methacrylate (PMMA). Electron trees are created by irradiating a suitable insulating material, in this case PMMA, with an intense high energy electron beam. Upon discharge......, during dielectric breakdown in the material, the electrons generate branching chains of fractures on leaving the PMMA, producing the tree pattern seen. To be able to create electron trees with a clinical linear accelerator, one needs to access the primary electron beam used for photon treatments. We...... appropriated a linac that was being decommissioned in our department and dismantled the head to circumvent the target and ion chambers. This is one of 24 electron trees produced before we had to stop the fun and allow the rest of the accelerator to be disassembled....
Compressing industrial computed tomography images by means of contour coding
Jiang, Haina; Zeng, Li
2013-10-01
An improved method for compressing industrial computed tomography (CT) images is presented. To achieve higher resolution and precision, the amount of industrial CT data has grown larger and larger. Considering that industrial CT images are approximately piece-wise constant, we develop a compression method based on contour coding. The traditional contour-based method for compressing gray images usually needs two steps, contour extraction and then compression, which hurts compression efficiency. We therefore merge the Freeman encoding idea into an improved method for two-dimensional contour extraction (2-D-IMCE) to improve the compression efficiency. By exploiting continuity and logical linking, preliminary contour codes are obtained directly during contour extraction, so the two steps of the traditional contour-based compression method collapse into one. Finally, Huffman coding is employed to further losslessly compress the preliminary contour codes. Experimental results show that this method obtains a good compression ratio while maintaining satisfactory quality of the compressed images.
Embedded foveation image coding.
Wang, Z; Bovik, A C
2001-01-01
The human visual system (HVS) is highly space-variant in sampling, coding, processing, and understanding. The spatial resolution of the HVS is highest around the point of fixation (foveation point) and decreases rapidly with increasing eccentricity. By taking advantage of this fact, it is possible to remove considerable high-frequency information redundancy from the peripheral regions and still reconstruct a perceptually good quality image. Great success has been obtained previously by a class of embedded wavelet image coding algorithms, such as the embedded zerotree wavelet (EZW) and the set partitioning in hierarchical trees (SPIHT) algorithms. Embedded wavelet coding not only provides very good compression performance, but also has the property that the bitstream can be truncated at any point and still be decoded to recreate a reasonably good quality image. In this paper, we propose an embedded foveation image coding (EFIC) algorithm, which orders the encoded bitstream to optimize foveated visual quality at arbitrary bit-rates. A foveation-based image quality metric, namely, foveated wavelet image quality index (FWQI), plays an important role in the EFIC system. We also developed a modified SPIHT algorithm to improve the coding efficiency. Experiments show that EFIC integrates foveation filtering with foveated image coding and demonstrates very good coding performance and scalability in terms of foveated image quality measurement.
Interpreting Tree Ensembles with inTrees
Deng, Houtao
2014-01-01
Tree ensembles such as random forests and boosted trees are accurate but difficult to understand, debug and deploy. In this work, we provide the inTrees (interpretable trees) framework that extracts, measures, prunes and selects rules from a tree ensemble, and calculates frequent variable interactions. A rule-based learner, referred to as the simplified tree ensemble learner (STEL), can also be formed and used for future prediction. The inTrees framework can be applied to both classification an...
Springer, Mark S; Gatesy, John
2016-01-01
Higher-level relationships among placental mammals are mostly resolved, but several polytomies remain contentious. Song et al. (2012) claimed to have resolved three of these using shortcut coalescence methods (MP-EST, STAR) and further concluded that these methods, which assume no within-locus recombination, are required to unravel deep-level phylogenetic problems that have stymied concatenation. Here, we reanalyze Song et al.'s (2012) data and leverage these re-analyses to explore key issues in systematics including the recombination ratchet, gene tree stoichiometry, the proportion of gene tree incongruence that results from deep coalescence versus other factors, and simulations that compare the performance of coalescence and concatenation methods in species tree estimation. Song et al. (2012) reported an average locus length of 3.1 kb for the 447 protein-coding genes in their phylogenomic dataset, but the true mean length of these loci (start codon to stop codon) is 139.6 kb. Empirical estimates of recombination breakpoints in primates, coupled with consideration of the recombination ratchet, suggest that individual coalescence genes (c-genes) approach ∼12 bp or less for Song et al.'s (2012) dataset, three to four orders of magnitude shorter than the c-genes reported by these authors. This result has general implications for the application of coalescence methods in species tree estimation. We contend that it is illogical to apply coalescence methods to complete protein-coding sequences. Such analyses amalgamate c-genes with different evolutionary histories (i.e., exons separated by >100,000 bp), distort true gene tree stoichiometry that is required for accurate species tree inference, and contradict the central rationale for applying coalescence methods to difficult phylogenetic problems. In addition, Song et al.'s (2012) dataset of 447 genes includes 21 loci with switched taxonomic names, eight duplicated loci, 26 loci with non-homologous sequences that are
Application of grammar-based codes for lossless compression of digital mammograms
Li, Xiaoli; Krishnan, Srithar; Ma, Ngok-Wah
2006-01-01
A newly developed grammar-based lossless source coding theory and its implementation were proposed in 1999 and 2000, respectively, by Yang and Kieffer. The code first transforms the original data sequence into an irreducible context-free grammar, which is then compressed using arithmetic coding. In the study of grammar-based coding for mammography applications, we encountered two issues: processing time and the limited number of single-character grammar G variables. For the first issue, we discovered a feature that simplifies the matching-subsequence search in the irreducible grammar transform process. Using this discovery, an extended grammar code technique is proposed and the processing time of the grammar code can be significantly reduced. For the second issue, we propose to use double-character symbols to increase the number of grammar variables. Under the condition that all the G variables have the same probability of being used, our analysis shows that the double- and single-character approaches have the same compression rates. Using the proposed methods, we show that the grammar code can outperform three other schemes (Lempel-Ziv-Welch (LZW), arithmetic, and Huffman) on compression ratio, and has error tolerance capabilities similar to LZW coding under similar circumstances.
Ultraspectral sounder data compression using error-detecting reversible variable-length coding
Huang, Bormin; Ahuja, Alok; Huang, Hung-Lung; Schmit, Timothy J.; Heymann, Roger W.
2005-08-01
Non-reversible variable-length codes (e.g., Huffman coding, Golomb-Rice coding, and arithmetic coding) have been used in source coding to achieve efficient compression. However, a single bit error during noisy transmission can cause many codewords to be misinterpreted by the decoder. In recent years, increasing attention has been given to the design of reversible variable-length codes (RVLCs) for better data transmission in error-prone environments. RVLCs allow instantaneous decoding in both directions, which affords better detection of bit errors due to synchronization losses over a noisy channel. RVLCs have been adopted in the emerging video coding standards H.263+ and MPEG-4 to enhance their error-resilience capabilities. Given the large volume of three-dimensional data that will be generated by future space-borne ultraspectral sounders (e.g., IASI, CrIS, and HES), error-robust data compression techniques will benefit satellite data transmission. In this paper, we investigate a reversible variable-length code for ultraspectral sounder data compression, and present numerical experiments on error propagation for the ultraspectral sounder data. The results show that the RVLC provides significantly better error containment than JPEG2000 Part 2.
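The defining property of an RVLC, instantaneous decodability in both directions, amounts to the codebook being both prefix-free and suffix-free, which can be checked mechanically. A minimal sketch with illustrative example codes, not the codes used in the paper:

```python
# An RVLC must be prefix-free (forward decodable) AND suffix-free
# (backward decodable). Example codewords below are illustrative only.

def is_prefix_free(codewords):
    """True if no codeword is a proper prefix of another."""
    return not any(a != b and b.startswith(a)
                   for a in codewords for b in codewords)

def is_rvlc(codewords):
    """Reversible VLC: prefix-free, and suffix-free (prefix-free reversed)."""
    return is_prefix_free(codewords) and is_prefix_free([c[::-1] for c in codewords])

# A classic symmetric construction: every codeword is a palindrome,
# so suffix-freeness follows from prefix-freeness for free.
symmetric = ["0", "11", "101", "1001"]
assert is_rvlc(symmetric)

# An ordinary Huffman-style prefix code need not be suffix-free:
huffman_like = ["0", "10", "110", "111"]
assert is_prefix_free(huffman_like) and not is_rvlc(huffman_like)
```

The second example shows why plain Huffman codes lose synchronization under bit errors: the reversed stream is ambiguous, so the decoder cannot localize an error by decoding backwards from the end.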
Error Correcting Codes for Distributed Control
Sukhavasi, Ravi Teja
2011-01-01
The problem of stabilizing an unstable plant over a noisy communication link is an increasingly important one that arises in applications of networked control systems. Although the work of Schulman and Sahai over the past two decades, and their development of the notions of "tree codes" and "anytime capacity", provides the theoretical framework for studying such problems, there has been scant practical progress in this area because explicit constructions of tree codes with efficient encoding and decoding did not exist. To stabilize an unstable plant driven by bounded noise over a noisy channel one needs real-time encoding and real-time decoding and a reliability which increases exponentially with decoding delay, which is what tree codes guarantee. We prove that linear tree codes occur with high probability and, for erasure channels, give an explicit construction with an expected decoding complexity that is constant per time instant. We give novel sufficient conditions on the rate and reliability req...
Directory of Open Access Journals (Sweden)
Fabio Burderi
2007-05-01
Full Text Available Motivated by the study of decipherability conditions for codes weaker than Unique Decipherability (UD), we introduce the notion of a coding partition. Such a notion generalizes that of a UD code and, for codes that are not UD, allows one to recover "unique decipherability" at the level of the classes of the partition. By taking into account the natural order between partitions, we define the characteristic partition of a code X as the finest coding partition of X. This leads us to introduce the canonical decomposition of a code into at most one unambiguous component and other (if any) totally ambiguous components. In the case where the code is finite, we give an algorithm for computing its canonical partition. This, in particular, allows one to decide whether a given partition of a finite code X is a coding partition. This last problem is then approached in the case where the code is a rational set. We prove its decidability under the hypothesis that the partition contains a finite number of classes and each class is a rational set. Moreover, we conjecture that the canonical partition satisfies this hypothesis. Finally, we also consider some relationships between coding partitions and varieties of codes.
National Audubon Society, New York, NY.
Included are an illustrated student reader, "The Story of Trees," a leaders' guide, and a large tree chart with 37 colored pictures. The student reader reviews several aspects of trees: a definition of a tree; where and how trees grow; flowers, pollination and seed production; how trees make their food; how to recognize trees; seasonal changes;…
TreePM Method for Two-Dimensional Cosmological Simulations
Indian Academy of Sciences (India)
Suryadeep Ray
2004-09-01
We describe the two-dimensional TreePM method in this paper. The 2d TreePM code is an accurate and efficient technique to carry out large two-dimensional N-body simulations in cosmology. This hybrid code combines the 2d Barnes and Hut Tree method and the 2d Particle–Mesh method. We describe the splitting of force between the PM and the Tree parts. We also estimate error in force for a realistic configuration. Finally, we discuss some tests of the code.
An interactive programme for weighted Steiner trees
Zanchetta do Nascimento, Marcelo; Ramos Batista, Valério; Raffa Coimbra, Wendhel
2015-01-01
We introduce a fully programmed code, with a supervised method, for generating weighted Steiner trees. Our choice of programming language, and the use of well-known theorems from Geometry and Complex Analysis, allowed this method to be implemented in only 764 lines of effective source code. This eases the understanding and handling of this beta version for future developments.
JND measurements and wavelet-based image coding
Shen, Day-Fann; Yan, Loon-Shan
1998-06-01
Two major issues in image coding are the effective incorporation of human visual system (HVS) properties and an effective objective quality measure (OQM) for evaluating image quality. In this paper, we treat the two issues in an integrated fashion. We build a JND model based on measurements of the just-noticeable-difference (JND) property of the HVS. We found that JND depends not only on the background intensity but is also a function of both spatial frequency and pattern direction. The wavelet transform, due to its excellent simultaneous time (space)/frequency resolution, is the best choice for applying the JND model. We mathematically derive an OQM called JND_PSNR that is based on the JND property and wavelet-decomposed subbands. JND_PSNR is more consistent with human perception and is recommended as an alternative to the PSNR or SNR. With JND_PSNR in mind, we proceed to propose a wavelet- and JND-based codec called JZW. JZW quantizes coefficients in each subband with a proper step size according to the subband's importance to human perception. Many characteristics of JZW are discussed, and its performance is evaluated and compared with well-known algorithms such as EZW, SPIHT and TCCVQ. Our algorithm has a 1-1.5 dB gain over SPIHT even when we use simple Huffman coding rather than the more efficient adaptive arithmetic coding.
MAP decoding of variable length codes over noisy channels
Yao, Lei; Cao, Lei; Chen, Chang Wen
2005-10-01
In this paper, we discuss the maximum a-posteriori probability (MAP) decoding of variable-length codes (VLCs) and propose a novel decoding scheme for Huffman VLC coded data in the presence of noise. First, we provide some simulation results of VLC MAP decoding and highlight some features that have not yet been discussed in existing work. We show that the improvement of MAP decoding over conventional VLC decoding comes mostly from the memory information in the source, and give some observations regarding the advantage of soft over hard VLC MAP decoding when an AWGN channel is considered. Second, recognizing that the difficulty in VLC MAP decoding is the lack of synchronization between the symbol sequence and the coded bit sequence, which makes parsing from the latter to the former extremely complex, we propose a new MAP decoding algorithm that integrates the information of self-synchronization strings (SSSs), one important feature of the codeword structure, into conventional MAP decoding. A consistent performance improvement and decoding-complexity reduction over conventional VLC MAP decoding can be achieved with the new scheme.
Computationally efficient sub-band coding of ECG signals.
Husøy, J H; Gjerde, T
1996-03-01
A data compression technique is presented for the compression of discrete-time electrocardiogram (ECG) signals. The compression system is based on sub-band coding, a technique traditionally used for compressing speech and images. The sub-band coder employs quadrature mirror filter banks (QMF) with up to 32 critically sampled sub-bands. Both finite impulse response (FIR) and the more computationally efficient infinite impulse response (IIR) filter banks are considered as candidates in a complete ECG coding system. The sub-bands are thresholded, quantized using uniform quantizers, and run-length coded. The output of the run-length coder is further compressed by a Huffman coder. Extensive simulations indicate that 16 sub-bands are a suitable choice for this application. Furthermore, IIR filter banks are preferable due to their superior computational efficiency. We conclude that the present scheme, which is suitable for real-time implementation on a PC, can provide compression ratios between 5 and 15 without loss of clinical information.
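The tail of such a coder, run-length coding of the thresholded sub-band samples followed by Huffman coding, can be sketched generically. Helper names and the zero-run representation are illustrative, not the paper's exact format:

```python
# Sketch of a run-length + Huffman back end: thresholded samples are
# collapsed into (zero_run, value) pairs, then Huffman codeword lengths
# are assigned per pair via the classic min-heap construction.
import heapq
from collections import Counter
from itertools import count

def run_length(samples):
    """Collapse runs of zeros: emit (preceding_zero_run, nonzero_value)."""
    pairs, run = [], 0
    for s in samples:
        if s == 0:
            run += 1
        else:
            pairs.append((run, s))
            run = 0
    if run:
        pairs.append((run, 0))          # trailing zeros, marked with value 0
    return pairs

def huffman_lengths(symbols):
    """Huffman codeword length per distinct symbol."""
    freq = Counter(symbols)
    tiebreak = count()                  # makes heap entries comparable
    heap = [(w, next(tiebreak), [s]) for s, w in freq.items()]
    heapq.heapify(heap)
    length = {s: 0 for s in freq}
    while len(heap) > 1:
        w1, _, s1 = heapq.heappop(heap)
        w2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:
            length[s] += 1              # every merge deepens these leaves
        heapq.heappush(heap, (w1 + w2, next(tiebreak), s1 + s2))
    return length

pairs = run_length([0, 0, 3, 0, 1, 1, 0, 0, 0])
assert pairs == [(2, 3), (1, 1), (0, 1), (3, 0)]
lengths = huffman_lengths(pairs)
```

With four equally frequent pairs, every codeword gets length 2; skewed frequencies, as in real ECG sub-bands, yield shorter codes for the common pairs.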
Canfield, Elaine
2002-01-01
Describes a fifth-grade art activity that offers a new approach to creating pictures of Aspen trees. Explains that the students learned about art concepts, such as line and balance, in this lesson. Discusses the process in detail for creating the pictures. (CMK)
Soft and Joint Source-Channel Decoding of Quasi-Arithmetic Codes
Guionnet, Thomas; Guillemot, Christine
2004-12-01
The issue of robust and joint source-channel decoding of quasi-arithmetic codes is addressed. Quasi-arithmetic coding is a reduced-precision and reduced-complexity implementation of arithmetic coding. This amounts to approximating the distribution of the source. The approximation of the source distribution leads to the introduction of redundancy that can be exploited for robust decoding in the presence of transmission errors. Hence, this approximation controls both the trade-off between compression efficiency and complexity and, at the same time, the redundancy (excess rate) introduced by this suboptimality. This paper first provides a state model of a quasi-arithmetic coder and decoder for binary and M-ary sources. The design of an error-resilient soft decoding algorithm follows quite naturally. The compression efficiency of quasi-arithmetic codes allows extra redundancy to be added in the form of markers designed specifically to prevent desynchronization. The algorithm is directly amenable to iterative source-channel decoding in the spirit of serial turbo codes. The coding and decoding algorithms have been tested for a wide range of channel signal-to-noise ratios (SNRs). Experimental results reveal improved symbol error rate (SER) and SNR performance against Huffman and optimal arithmetic codes.
Unimodular trees versus Einstein trees
Energy Technology Data Exchange (ETDEWEB)
Alvarez, Enrique; Gonzalez-Martin, Sergio [Universidad Autonoma, Instituto de Fisica Teorica, IFT-UAM/CSIC, Madrid (Spain); Universidad Autonoma de Madrid, Departamento de Fisica Teorica, Madrid (Spain); Martin, Carmelo P. [Universidad Complutense de Madrid (UCM), Departamento de Fisica Teorica I Facultad de Ciencias Fisicas, Madrid (Spain)
2016-10-15
The maximally helicity violating tree-level scattering amplitudes involving three, four or five gravitons are worked out in Unimodular Gravity. They are found to coincide with the corresponding amplitudes in General Relativity. This is a remarkable result, insofar as both the propagators and the vertices are quite different in the two theories. (orig.)
Unimodular Trees versus Einstein Trees
Alvarez, Enrique; Martin, Carmelo P
2016-01-01
The maximally helicity violating (MHV) tree-level scattering amplitudes involving three, four or five gravitons are worked out in Unimodular Gravity. They are found to coincide with the corresponding amplitudes in General Relativity. This is a remarkable result, insofar as both the propagators and the vertices are quite different in both theories.
Unimodular trees versus Einstein trees
Álvarez, Enrique; González-Martín, Sergio; Martín, Carmelo P.
2016-10-01
The maximally helicity violating tree-level scattering amplitudes involving three, four or five gravitons are worked out in Unimodular Gravity. They are found to coincide with the corresponding amplitudes in General Relativity. This is a remarkable result, insofar as both the propagators and the vertices are quite different in the two theories.
Latorre, Jose I
2015-01-01
There exists a remarkable four-qutrit state that carries absolute maximal entanglement in all its partitions. Employing this state, we construct a tensor network that delivers a holographic many-body state, the H-code, where the physical properties of the boundary determine those of the bulk. This H-code is made of an even superposition of states whose relative Hamming distances are exponentially large in the size of the boundary. This property makes H-codes natural states for a quantum memory. H-codes exist on tori of definite sizes and are classified into three different sectors characterized by the sum of their qutrits on cycles wrapped through the boundaries of the system. We construct a parent Hamiltonian for the H-code, which is highly non-local, and finally we compute the topological entanglement entropy of the H-code.
Kubilius, Jonas
2014-01-01
Sharing code is becoming increasingly important in the wake of Open Science. In this review I describe and compare two popular code-sharing utilities, GitHub and Open Science Framework (OSF). GitHub is a mature, industry-standard tool but lacks focus towards researchers. In comparison, OSF offers a one-stop solution for researchers but a lot of functionality is still under development. I conclude by listing alternative lesser-known tools for code and materials sharing.
Kuipers, J; Vermaseren, J A M
2013-01-01
We describe the implementation of output code optimization in the open source computer algebra system FORM. This implementation is based on recently discovered techniques of Monte Carlo tree search to find efficient multivariate Horner schemes, in combination with other optimization algorithms, such as common subexpression elimination. For systems for which no specific knowledge is provided it performs significantly better than other methods we could compare with. Because the method has a number of free parameters, we also show some methods by which to tune them to different types of problems.
Institute of Scientific and Technical Information of China (English)
韩雪梅; 彭虎; 杜宏伟; 陈强; 冯焕清
2005-01-01
A beamformer based on oversampling Sigma-delta ADCs performs high-quality focused delay-and-sum directly on the phase information of the 1-bit streams produced by the oversampling Sigma-delta ADCs. However, the rate of these 1-bit streams is extremely high, so in general they cannot be sent directly over a USB interface to a computer for beamforming and other subsequent processing; they must first be losslessly compressed, i.e., the stream rate must be reduced while preserving the phase information needed for beamforming. Huffman coding is used here to compress the high-speed 1-bit streams. The results show that Huffman coding compresses the data by more than half, making it possible to transfer the 1-bit streams over a USB interface.
Institute of Scientific and Technical Information of China (English)
刘惠敏; 刘繁明; 张琳琳
2008-01-01
Weather facsimile charts carry a very large amount of information. Compressing them not only allows more images to be stored in a limited space, but also effectively reduces transmission time, which helps ships at sea obtain weather information promptly and reduce weather-related risk. Here, one-dimensional modified Huffman coding is used to compress weather facsimile charts, and a table-lookup method is used to decompress the images. Experiments show that this method meets the compression-ratio and compression-speed requirements for weather facsimile charts and is feasible.
DEFF Research Database (Denmark)
Cox, Geoff
Speaking Code begins by invoking the “Hello World” convention used by programmers when learning a new language, helping to establish the interplay of text and code that runs through the book. Interweaving the voice of critical writing from the humanities with the tradition of computing and software...
2014-12-01
[Front-matter and table-of-contents fragments from a technical report on polar codes: QPSK Gaussian channels; capacity of the BSC; capacity of the AWGN channel; an introduction noting that polar codes were introduced by E. Arikan in [1]; and an executive summary of the project "More reliable wireless..." prepared under the authority of C. A. Wilgenbusch, Head, ISR Division.]
Finite Sholander Trees, Trees, and their Betweenness
Chvátal, Vašek; Schäfer, Philipp Matthias
2011-01-01
We provide a proof of Sholander's claim (Trees, lattices, order, and betweenness, Proc. Amer. Math. Soc. 3, 369-381 (1952)) concerning the representability of collections of so-called segments by trees, which yields a characterization of the interval function of a tree. Furthermore, we streamline Burigana's characterization (Tree representations of betweenness relations defined by intersection and inclusion, Mathematics and Social Sciences 185, 5-36 (2009)) of tree betweenness and provide a relatively short proof.
N-Square Approach for the Erection of Redundancy Codes
Directory of Open Access Journals (Sweden)
G. Srinivas
2010-04-01
Full Text Available This paper addresses data compression, an application of image processing. Several lossy and lossless coding techniques have been developed over the last two decades. Although very high compression can be achieved with lossy compression techniques, they cannot recover the original image, while lossless compression recovers the image exactly. Applications related to medical imaging require lossless techniques, as the loss of information is unacceptable. The objective of image compression is to represent an image with as few bits as possible while preserving the quality required for the given application. In this paper we introduce a new lossless compression technique that further reduces the entropy, and thereby the average number of bits, by using non-binary Huffman coding through an N-Square approach. Our extensive experimental results demonstrate that the proposed scheme is very competitive, and it addresses the limitations of the D value in the existing system by proposing a pattern called the N-Square approach. The newly proposed algorithm provides a good means for lossless image compression.
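As a generic illustration of the non-binary Huffman coding such schemes build on (not the paper's N-Square approach itself), t-ary Huffman codeword lengths can be computed by padding with zero-weight dummy symbols so that every merge consumes exactly t nodes:

```python
# Sketch of t-ary Huffman codeword lengths. For a t-ary code the symbol
# count must satisfy n ≡ 1 (mod t-1); we pad with zero-weight dummies,
# then repeatedly merge the t lightest nodes.
import heapq
from itertools import count

def tary_huffman_lengths(weights, t=3):
    tiebreak = count()                  # makes heap entries comparable
    heap = [(w, next(tiebreak), [i]) for i, w in enumerate(weights)]
    # pad so that (number of leaves - 1) is divisible by (t - 1)
    while (len(heap) - 1) % (t - 1) != 0:
        heap.append((0, next(tiebreak), []))   # dummy leaf, no real symbol
    heapq.heapify(heap)
    depth = [0] * len(weights)
    while len(heap) > 1:
        merged, total = [], 0
        for _ in range(t):
            w, _, syms = heapq.heappop(heap)
            total += w
            merged += syms
        for s in merged:
            depth[s] += 1               # every merge deepens these leaves
        heapq.heappush(heap, (total, next(tiebreak), merged))
    return depth

# Five symbols, ternary code: two merges suffice.
print(tary_huffman_lengths([10, 7, 5, 2, 1], t=3))  # prints [1, 1, 2, 2, 2]
```

The resulting lengths satisfy the t-ary Kraft equality here (2·3⁻¹ + 3·3⁻² = 1), and with t=2 the function reduces to ordinary binary Huffman lengths.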
Institute of Scientific and Technical Information of China (English)
田巍威; 高跃东; 郭彦; 黄京飞; 肖昌; 李作生; 张华堂
2012-01-01
The tree shrew, an animal model receiving extensive attention in human disease research, demands essential research tools, in particular cellular markers and monoclonal antibodies for immunological studies. In this paper, a 1365 bp full-length CD4 cDNA coding sequence was cloned from total RNA in the peripheral blood of tree shrews; the sequence closes two unknown gaps in the predicted tree shrew CD4 cDNA in the GenBank database, and its molecular characteristics were analyzed and compared with other mammals using biology software such as Clustal W2.0. The results showed that the extracellular and intracellular domains of the tree shrew CD4 amino acid sequence are conserved. The tree shrew CD4 amino acid sequence shows a close genetic relationship with Homo sapiens and Macaca mulatta. Most regions of the tree shrew CD4 molecular surface carry positive charges, as in humans. However, compared with the extracellular domain D1 of human CD4, the D1 surface of tree shrew CD4 carries more negative charges and has two additional N-glycosylation sites, which may affect antibody binding. This study provides a theoretical basis for the preparation and functional study of CD4 monoclonal antibodies.
DEFF Research Database (Denmark)
Bahr, Patrick
2012-01-01
Tree automata are traditionally used to study properties of tree languages and tree transformations. In this paper, we consider tree automata as the basis for modular and extensible recursion schemes. We show, using well-known techniques, how to derive from standard tree automata highly modular r...
David J. Nowak; Jeffrey T. Walton; James Baldwin; Jerry. Bond
2015-01-01
Information on street trees is critical for management of this important resource. Sampling of street tree populations provides an efficient means to obtain street tree population information. Long-term repeat measures of street tree samples supply additional information on street tree changes and can be used to report damages from catastrophic events. Analyses of...
CodedStream: live media streaming with overlay coded multicast
Guo, Jiang; Zhu, Ying; Li, Baochun
2003-12-01
Multicasting is a natural paradigm for streaming live multimedia to multiple end receivers. Since IP multicast is not widely deployed, many application-layer multicast protocols have been proposed. However, all of these schemes focus on the construction of multicast trees, where a relatively small number of links carry the multicast streaming load, while the capacity of most of the other links in the overlay network remain unused. In this paper, we propose CodedStream, a high-bandwidth live media distribution system based on end-system overlay multicast. In CodedStream, we construct a k-redundant multicast graph (a directed acyclic graph) as the multicast topology, on which network coding is applied to work around bottlenecks. Simulation results have shown that the combination of k-redundant multicast graph and network coding may indeed bring significant benefits with respect to improving the quality of live media at the end receivers.
Jun, Xie Cheng; Su, Yan; Wei, Zhang
2006-08-01
In this paper, a modified algorithm is introduced to improve the Rice coding algorithm, and image compression with the CDF (2,2) wavelet lifting scheme is investigated. Our experiments show that its lossless image compression performance is much better than Huffman, Zip, lossless JPEG, and RAR, and slightly better than (or equal to) the well-known SPIHT. The lossless compression rate is improved by about 60.4%, 45%, 26.2%, 16.7%, and 0.4% on average, respectively. The encoder is about 11.8 times faster than SPIHT's, improving its time efficiency by 162%; the decoder is about 12.3 times faster than SPIHT's, improving its time efficiency by about 148%. Rather than requiring the largest number of wavelet transform levels, this algorithm has high coding efficiency whenever the number of wavelet transform levels is larger than 3. For source models with distributions similar to the Laplacian, it can improve coding efficiency and realize progressive transmission coding and decoding.
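Baseline Rice coding, the scheme the modified algorithm builds on, can be sketched in a few lines. This is generic Rice (Golomb power-of-two) coding, not the paper's modification: a value n with parameter k is coded as a unary quotient n >> k followed by k binary remainder bits.

```python
# Minimal sketch of Rice (Golomb power-of-two) coding over bit strings.

def rice_encode(n, k):
    """Unary quotient ('1'*q then '0'), then k-bit binary remainder."""
    q, r = n >> k, n & ((1 << k) - 1)
    rem = format(r, "b").zfill(k) if k else ""
    return "1" * q + "0" + rem

def rice_decode(bits, k):
    """Count leading '1's for the quotient, read k remainder bits."""
    q = 0
    while bits[q] == "1":
        q += 1
    r = int(bits[q + 1:q + 1 + k], 2) if k else 0
    return (q << k) | r

assert rice_encode(5, 2) == "1001"          # q=1 -> "10", r=1 -> "01"
for n in range(20):
    for k in range(4):
        assert rice_decode(rice_encode(n, k), k) == n
```

Rice codes are near-optimal for geometrically (Laplacian-like) distributed residuals, which is why the abstract's remark about Laplacian-like sources matters: the choice of k matches the code to the residual statistics.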
On Identifying which Intermediate Nodes Should Code in Multicast Networks
DEFF Research Database (Denmark)
Pinto, Tiago; Roetter, Daniel Enrique Lucani; Médard, Muriel
2013-01-01
the data packets. Previous work has shown that in lossless wireline networks, the performance of tree-packing mechanisms is comparable to network coding, albeit with added complexity at the time of computing the trees. This means that most nodes in the network need not code. Thus, mechanisms that identify...... intermediate nodes that do require coding are instrumental for the efficient operation of coded networks and can have a significant impact on overall energy consumption. We present a distributed, low-complexity algorithm that allows every node to identify if it should code and, if so, through what output link...
Energy Technology Data Exchange (ETDEWEB)
Ravishankar, C., Hughes Network Systems, Germantown, MD
1998-05-08
Speech is the predominant means of communication between human beings, and since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage of the speech signal getting corrupted by noise, cross-talk and distortion. Long-haul transmissions, which use repeaters to compensate for the loss in signal strength on transmission links, also increase the associated noise and distortion. Digital transmission, on the other hand, is relatively immune to noise, cross-talk and distortion, primarily because of the capability to faithfully regenerate the digital signal at each repeater purely based on a binary decision; hence the end-to-end performance of the digital link becomes essentially independent of the length and operating frequency bands of the link. From a transmission point of view, digital transmission has therefore been the preferred approach due to its higher immunity to noise. The need to carry digital speech also became extremely important from a service-provision point of view. Modern requirements have introduced the need for robust, flexible and secure services that can carry a multitude of signal types (such as voice, data and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term Speech Coding often refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters obtained by analyzing the speech signal. In either case, the codes are transmitted to the distant end, where speech is reconstructed or synthesized using the received set of codes. A more generic term that is often used interchangeably with speech coding is voice coding. This term is more generic in the sense that the
Stable feature selection for clinical prediction: exploiting ICD tree structure using Tree-Lasso.
Kamkar, Iman; Gupta, Sunil Kumar; Phung, Dinh; Venkatesh, Svetha
2015-02-01
Modern healthcare is being reshaped by growing Electronic Medical Records (EMR). Recently, these records have been shown to be of great value for building clinical prediction models. In EMR data, patients' diseases and hospital interventions are captured through a set of diagnosis and procedure codes. These codes are usually represented in a tree form (e.g. the ICD-10 tree), and the codes within a tree branch may be highly correlated. These codes can be used as features to build a prediction model, and an appropriate feature selection can inform a clinician about important risk factors for a disease. Traditional feature selection methods (e.g. Information Gain, T-test, etc.) consider each variable independently and usually end up with a long feature list. Recently, Lasso and related l1-penalty-based feature selection methods have become popular due to their joint feature selection property. However, Lasso is known to select one of many correlated features at random, which hinders clinicians from arriving at a stable feature set, crucial for the clinical decision-making process. In this paper, we address this problem by using the recently proposed Tree-Lasso model. Since the stability behavior of Tree-Lasso is not well understood, we study it and compare it with other feature selection methods. Using a synthetic and two real-world datasets (Cancer and Acute Myocardial Infarction), we show that Tree-Lasso-based feature selection is significantly more stable than Lasso and comparable to other methods, e.g. Information Gain, ReliefF and T-test. We further show that, using different types of classifiers such as logistic regression, naive Bayes, support vector machines, decision trees and Random Forest, the classification performance of Tree-Lasso is comparable to Lasso and better than other methods. Our result has implications for identifying stable risk factors for many healthcare problems and therefore can
DEFF Research Database (Denmark)
Cox, Geoff
; alternatives to mainstream development, from performances of the live-coding scene to the organizational forms of commons-based peer production; the democratic promise of social media and their paradoxical role in suppressing political expression; and the market’s emptying out of possibilities for free...... development, Speaking Code unfolds an argument to undermine the distinctions between criticism and practice, and to emphasize the aesthetic and political aspects of software studies. Not reducible to its functional aspects, program code mirrors the instability inherent in the relationship of speech...... expression in the public realm. The book’s line of argument defends language against its invasion by economics, arguing that speech continues to underscore the human condition, however paradoxical this may seem in an era of pervasive computing....
Carrasco Kind, Matias; Brunner, Robert
2013-04-01
TPZ, a parallel code written in Python, produces robust and accurate photometric redshift PDFs using prediction trees and random forests. The code also produces ancillary information about the sample used, such as unbiased prior error estimates (giving an indication of performance) and a ranking of variable importance, as well as a map of performance indicating where extra training data are needed to improve overall performance. It is designed to be easy to use, and a tutorial is available.
Energy Technology Data Exchange (ETDEWEB)
Delbecq, J.M
1999-07-01
The Aster code is a 2D/3D finite-element calculation code for structures developed by the R&D division of Electricité de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (materials behaviour, large deformations, specific loads, unloading and loss-of-load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures); specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results, etc.); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and the metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)
Optimal codes as Tanner codes with cyclic component codes
DEFF Research Database (Denmark)
Høholdt, Tom; Pinero, Fernando; Zeng, Peng
2014-01-01
In this article we study a class of graph codes with cyclic code component codes as affine variety codes. Within this class of Tanner codes we find some optimal binary codes. We use a particular subgraph of the point-line incidence plane of A(2,q) as the Tanner graph, and we are able to describe...... the codes succinctly using Gröbner bases....
Saadi, Slami; Touiza, Maamar; Kharfi, Fayçal; Guessoum, Abderrezak
2013-12-01
In this work, we present a mixed software/hardware implementation of 2-D signals encoder/decoder using dyadic discrete wavelet transform (DWT) based on quadrature mirror filters (QMF); using fast wavelet Mallat's algorithm. This work is designed and compiled on the embedded development kit EDK6.3i, and the synthesis software, ISE6.3i, which is available with Xilinx Virtex-IIV2MB1000 FPGA. Huffman coding scheme is used to encode the wavelet coefficients so that they can be transmitted progressively through an Ethernet TCP/IP based connection. The possible reconfiguration can be exploited to attain higher performance. The design will be integrated with the neutron radiography system that is used with the Es-Salem research reactor.
Visualizing Mixed Variable-Type Multidimensional Data Using Tree Distances
2015-09-01
Master's thesis by Yoav Shaham, September 2015; Thesis Advisor: Lyn R. Whitaker. Approved for public release; distribution is unlimited. Abstract (maximum 200 words): This research explores the use of the tree
Institute of Scientific and Technical Information of China (English)
Anonymous
2000-01-01
Healthy trees are important to us all. Trees provide shade, beauty, and homes for wildlife. Trees give us products like paper and wood. Trees can give us all this only if they are healthy. They must be well cared for to remain healthy.
Bit-coded regular expression parsing
DEFF Research Database (Denmark)
Nielsen, Lasse; Henglein, Fritz
2011-01-01
Regular expression parsing is the problem of producing a parse tree of a string for a given regular expression. We show that a compact bit representation of a parse tree can be produced efficiently, in time linear in the product of input string size and regular expression size, by simplifying...... the DFA-based parsing algorithm due to Dubé and Feeley to emit the bits of the bit representation without explicitly materializing the parse tree itself. We furthermore show that Frisch and Cardelli's greedy regular expression parsing algorithm can be straightforwardly modified to produce bit codings...
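The idea of emitting bits for parse decisions instead of materializing the tree can be illustrated with a toy sketch. This is not the Dubé-Feeley or Nielsen-Henglein construction itself: the regex (a|b)* and the bit conventions below are assumptions chosen purely for illustration.

```python
# Bit-coding a parse of the regex (a|b)*: each star iteration contributes a
# "continue" bit (0) followed by the choice bit for the alternative
# (0 = left branch 'a', 1 = right branch 'b'); a final 1 bit ends the star.

def encode_star_of_alt(s):
    """Encode a parse of s against (a|b)* as a list of bits."""
    bits = []
    for ch in s:
        bits.append(0)                        # one more star iteration
        bits.append(0 if ch == 'a' else 1)    # choice inside (a|b)
    bits.append(1)                            # end of star
    return bits

def decode_star_of_alt(bits):
    """Recover the string from its bit coding, never building a tree."""
    out, i = [], 0
    while bits[i] == 0:
        out.append('a' if bits[i + 1] == 0 else 'b')
        i += 2
    return ''.join(out)
```

The bit string is much smaller than an explicit parse tree, yet the parse is fully recoverable from it, which is the point of the compact representation.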
Classification and regression trees
Breiman, Leo; Olshen, Richard A; Stone, Charles J
1984-01-01
The methodology used to construct tree structured rules is the focus of this monograph. Unlike many other statistical procedures, which moved from pencil and paper to calculators, this text's use of trees was unthinkable before computers. Both the practical and theoretical sides have been developed in the authors' study of tree methods. Classification and Regression Trees reflects these two sides, covering the use of trees as a data analysis method, and in a more mathematical framework, proving some of their fundamental properties.
Merlin, Emiliano; Buonomo, Umberto; Grassi, Tommaso; Piovan, Lorenzo; Chiosi, Cesare
2009-01-01
We present EvoL, the new release of the Padova N-body code for cosmological simulations of galaxy formation and evolution. In this paper, the basic Tree + SPH code is presented and analysed, together with an overview on the software architectures. EvoL is a flexible parallel Fortran95 code, specifically designed for simulations of cosmological structure formation on cluster, galactic and sub-galactic scales. EvoL is a fully Lagrangian self-adaptive code, based on the classical Oct-tree and on...
NOVEL BIPHASE CODE -INTEGRATED SIDELOBE SUPPRESSION CODE
Institute of Scientific and Technical Information of China (English)
Wang Feixue; Ou Gang; Zhuang Zhaowen
2004-01-01
A novel kind of binary phase code, named the sidelobe suppression code, is proposed in this paper. It is defined as the code whose corresponding optimal sidelobe suppression filter outputs the minimum sidelobes. It is shown that there do exist sidelobe suppression codes better than the conventional optimal codes, the Barker codes. For example, the sidelobe suppression code of length 11 with a filter of length 39 achieves a sidelobe level up to 17 dB better than that of the Barker code with the same code length and filter length.
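The Barker baseline referred to above is easy to reproduce: the length-11 Barker code has an aperiodic autocorrelation peak of 11 and a maximum sidelobe magnitude of 1 (a peak-to-sidelobe ratio of about 20.8 dB before any mismatched filtering). A minimal sketch:

```python
def autocorr(code):
    """Aperiodic autocorrelation of a +/-1 sequence, for lags 0..n-1."""
    n = len(code)
    return [sum(code[i] * code[i + k] for i in range(n - k)) for k in range(n)]

barker11 = [1, 1, 1, -1, -1, -1, 1, -1, -1, 1, -1]
r = autocorr(barker11)
peak = r[0]                              # 11
max_sidelobe = max(abs(x) for x in r[1:])  # 1, the defining Barker property
```

A sidelobe suppression filter (mismatched filter) trades a longer filter, here length 39, for sidelobes below what the matched filter above can achieve.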
Sussing Merger Trees: Stability and Convergence
Wang, Yang; Knebe, Alexander; Schneider, Aurel; Srisawat, Chaichalit; Tweed, Dylan; Jung, Intae; Han, Jiaxin; Helly, John; Onions, Julian; Elahi, Pascal J; Thomas, Peter A; Behroozi, Peter; Yi, Sukyoung K; Rodriguez-Gomez, Vicente; Mao, Yao-Yuan; Jing, Yipeng; Lin, Weipeng
2016-01-01
Merger trees are routinely used to follow the growth and merging history of dark matter haloes and subhaloes in simulations of cosmic structure formation. Srisawat et al. (2013) compared a wide range of merger-tree-building codes. Here we test the influence of output strategies and mass resolution on tree-building. We find that, somewhat surprisingly, building the tree from more snapshots does not generally produce more complete trees; instead, it tends to shorten them. Significant improvements are seen for patching schemes which attempt to bridge over occasional dropouts in the underlying halo catalogues or schemes which combine the halo-finding and tree-building steps seamlessly. The adopted output strategy does not affect the average number of branches (bushiness) of the resultant merger trees. However, mass resolution has an influence on both main branch length and the bushiness. As the resolution increases, a halo with the same mass can be traced back further in time and will encounter more small pro...
Butler, Ricky W.; Boerschlein, David P.
1993-01-01
The Fault-Tree Compiler (FTC) program is a software tool used to calculate the probability of the top event in a fault tree. Gates of five different types are allowed in the fault tree: AND, OR, EXCLUSIVE OR, INVERT, and M OF N. The high-level input language is easy to understand and use. In addition, the program supports a hierarchical fault-tree definition feature, which simplifies the tree-description process and reduces execution time. A set of programs has been created forming the basis for a reliability-analysis workstation: SURE, ASSIST, PAWS/STEM, and the FTC fault-tree tool (LAR-14586). Written in PASCAL, ANSI-compliant C language, and FORTRAN 77. Other versions available upon request.
From concatenated codes to graph codes
DEFF Research Database (Denmark)
Justesen, Jørn; Høholdt, Tom
2004-01-01
We consider codes based on simple bipartite expander graphs. These codes may be seen as the first step leading from product type concatenated codes to more complex graph codes. We emphasize constructions of specific codes of realistic lengths, and study the details of decoding by message passing...
Properties and Construction of Polar Codes
Mori, Ryuhei
2010-01-01
Recently, Arıkan introduced the method of channel polarization, on which one can construct efficient capacity-achieving codes, called polar codes, for any binary discrete memoryless channel. In the thesis, we show that the decoding algorithm of polar codes, called successive cancellation decoding, can be regarded as belief propagation decoding, which has been used for decoding of low-density parity-check codes, on a tree graph. On the basis of this observation, we show an efficient construction method for polar codes using density evolution, which has been used for evaluating the error probability of belief propagation decoding on a tree graph. We further show that the channel polarization phenomenon and polar codes can be generalized to non-binary discrete memoryless channels. The asymptotic performance of non-binary polar codes that use non-binary matrices called Reed-Solomon matrices is better than that of the best explicitly known binary polar code. We also find that the Reed-Solomon ma...
Directory of Open Access Journals (Sweden)
Yen Hung Chen
2012-01-01
minimum cost spanning tree T in G such that the total weight in T is at most a given bound B. In this paper, we present two polynomial time approximation schemes (PTASs) for the constrained minimum spanning tree problem.
Design of an Experiment System for Source Coding Based on Matlab
Institute of Scientific and Technical Information of China (English)
宋丽丽; 秦艳
2012-01-01
Source coding is an important part of the Information Theory and Coding course. A source coding experimental system is designed using the graphical user interface (GUI) of Matlab, implementing several common source coding methods: Shannon coding, Fano coding, Huffman coding, uniform coding and non-uniform coding. Practice shows that the system is easy to operate and highly interactive, offering an effective auxiliary tool for experimental teaching.
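Of the methods listed, Huffman coding can be sketched in a few lines. This is the generic textbook construction, not the Matlab GUI system described above; the symbol frequencies are made up for illustration.

```python
import heapq

def huffman_code(freqs):
    """Build a binary Huffman code (symbol -> bit string) from frequencies."""
    # Heap entries carry a unique counter so dicts are never compared on ties.
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)   # two least-frequent subtrees
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1
    return heap[0][2]

codes = huffman_code({"a": 5, "b": 2, "c": 1, "d": 1})
```

With these frequencies the most frequent symbol gets a 1-bit code and the two rarest get 3-bit codes, and the resulting code is prefix-free, so decoding is unambiguous.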
Good Codes From Generalised Algebraic Geometry Codes
Jibril, Mubarak; Ahmed, Mohammed Zaki; Tjhai, Cen
2010-01-01
Algebraic geometry codes or Goppa codes are defined with places of degree one. In constructing generalised algebraic geometry codes places of higher degree are used. In this paper we present 41 new codes over GF(16) which improve on the best known codes of the same length and rate. The construction method uses places of small degree with a technique originally published over 10 years ago for the construction of generalised algebraic geometry codes.
Making Tree Ensembles Interpretable
Hara, Satoshi; Hayashi, Kohei
2016-01-01
Tree ensembles, such as random forests and boosted trees, are renowned for their high prediction performance, whereas their interpretability is critically limited. In this paper, we propose a post-processing method that improves the model interpretability of tree ensembles. After learning a complex tree ensemble in the standard way, we approximate it by a simpler model that is interpretable for humans. To obtain the simpler model, we derive the EM algorithm minimizing the KL divergence from the ...
Mitchell, William
1992-01-01
This paper, dating from May 1991, contains preliminary (and unpublishable) notes on investigations about iteration trees. They will be of interest only to the specialist. In the first two sections I define notions of support and embeddings for tree iterations, proving for example that every tree iteration is a direct limit of finite tree iterations. This is a generalization to models with extenders of basic ideas of iterated ultrapowers using only ultrapowers. In the final section (which is m...
Scalable still image coding based on wavelet
Yan, Yang; Zhang, Zhengbing
2005-02-01
Scalable image coding is an important objective of future image coding technologies. In this paper, we present a scalable image coding scheme based on the wavelet transform. The method uses the well-known EZW (Embedded Zerotree Wavelet) algorithm: we give a high-quality encoding to the ROI (region of interest) of the original image and a rough encoding to the rest. The method works well under limited memory conditions, as we encode the background region according to the memory capacity. In this way, we can easily store the encoded image in limited memory space without losing its main information. Simulation results show it is effective.
DEFF Research Database (Denmark)
Baumbach, Jan; Guo, Jian-Ying; Ibragimov, Rashid
2013-01-01
We study the tree edit distance problem with edge deletions and edge insertions as edit operations. We reformulate a special case of this problem as Covering Tree with Stars (CTS): given a tree T and a set of stars, can we connect the stars in by adding edges between them such that the resulting ...
DEFF Research Database (Denmark)
Baumbach, Jan; Guo, Jiong; Ibragimov, Rashid
2015-01-01
We study the tree edit distance problem with edge deletions and edge insertions as edit operations. We reformulate a special case of this problem as Covering Tree with Stars (CTS): given a tree T and a set of stars, can we connect the stars in by adding edges between them such that the resulting ...
Engelfriet, Joost; Vogler, Heiko
1985-01-01
Macro tree transducers are a combination of top-down tree transducers and macro grammars. They serve as a model for syntax-directed semantics in which context information can be handled. In this paper the formal model of macro tree transducers is studied by investigating typical automata theoretical
Sweeney, Debra; Rounds, Judy
2011-01-01
Trees are great inspiration for artists. Many art teachers find themselves inspired and maybe somewhat obsessed with the natural beauty and elegance of the lofty tree, and how it changes through the seasons. One such tree that grows in several regions and always looks magnificent, regardless of the time of year, is the birch. In this article, the…
DEFF Research Database (Denmark)
Finbow, Arthur; Frendrup, Allan; Vestergaard, Preben D.
cardinality then G is a total well dominated graph. In this paper we study composition and decomposition of total well dominated trees. By a reversible process we prove that any total well dominated tree can both be reduced to and constructed from a family of three small trees....
Brooks, Sarah DeWitt
2010-01-01
This article describes the author's experience in implementing a Wish Tree project in her school in an effort to bring the school community together with a positive art-making experience during a potentially stressful time. The concept of a wish tree is simple: plant a tree; provide tags and pencils for writing wishes; and encourage everyone to…
JPEG Coding and Decoding Design Based on FPGA
Institute of Scientific and Technical Information of China (English)
张绪珩; 王淑仙
2014-01-01
JPEG (Joint Photographic Experts Group), as a basic image compression method, has been widely used. Because of the parallel computing capability of FPGAs, more and more devices use an FPGA to encode and decode JPEG files. This paper gives an overall description of the basic JPEG encoding and decoding algorithms, with emphasis on the methods used in the DCT/IDCT and Huffman modules. In the DCT/IDCT module, the parallel processing of the FPGA is fully exploited to improve processing speed. In the Huffman decoding module, a lookup-table method with an additional code-width field is adopted, and resource usage is reduced by letting the synthesis tool map the lookup table into on-chip memory.
Institute of Scientific and Technical Information of China (English)
Degyi
2008-01-01
Trees are flourishing in Lhasa wherever the history exists. There is such a man. He has already been through customs after his annual trek to Lhasa, which he has been doing for over twenty years in succession to visit his tree. Although he has been making this journey for so long, it is neither to visit friends or family, nor is it his hometown. It is a tree that is tied so profoundly to his heart. When the wind blows fiercely on the bare tree and winter snow falls, he stands before the tree with tears of jo...
Energy Technology Data Exchange (ETDEWEB)
Morozov, Dmitriy; Weber, Gunther H.
2014-03-31
Topological techniques provide robust tools for data analysis. They are used, for example, for feature extraction, for data de-noising, and for comparison of data sets. This chapter concerns contour trees, a topological descriptor that records the connectivity of the isosurfaces of scalar functions. These trees are fundamental to analysis and visualization of physical phenomena modeled by real-valued measurements. We study the parallel analysis of contour trees. After describing a particular representation of a contour tree, called the local-global representation, we illustrate how different problems that rely on contour trees can be solved in parallel with minimal communication.
Rollinson, Susan Wells
2012-01-01
The growth of a pine tree is examined by preparing "tree cookies" (cross-sectional disks) between whorls of branches. The use of Christmas trees allows the tree cookies to be obtained with inexpensive, commonly available tools. Students use the tree cookies to investigate the annual growth of the tree and how it corresponds to the number of whorls…
The Fault Tree Compiler (FTC): Program and mathematics
Butler, Ricky W.; Martensen, Anna L.
1989-01-01
The Fault Tree Compiler Program is a new reliability tool used to predict the top-event probability for a fault tree. Five different gate types are allowed in the fault tree: AND, OR, EXCLUSIVE OR, INVERT, and m OF n gates. The high-level input language is easy to understand and use when describing the system tree. In addition, the hierarchical fault tree capability can simplify the tree description and decrease program execution time. The current solution technique provides an answer precise (within the limits of double-precision floating-point arithmetic) to a user-specified number of digits. The user may vary one failure rate or failure probability over a range of values and plot the results for sensitivity analyses. The solution technique is implemented in FORTRAN; the remaining program code is implemented in Pascal. The program is written to run on a Digital Equipment Corporation (DEC) VAX computer with the VMS operating system.
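Under the usual assumption of independent basic events, the gate types listed above reduce to simple probability formulas. The sketch below is a generic illustration of those formulas, not the FTC's actual solution technique (which also controls numerical precision); the failure probabilities are hypothetical.

```python
from itertools import combinations
from math import prod

def p_and(ps): return prod(ps)                      # all inputs occur
def p_or(ps):  return 1 - prod(1 - p for p in ps)   # at least one occurs
def p_xor(p1, p2): return p1 * (1 - p2) + p2 * (1 - p1)  # exactly one of two
def p_not(p):  return 1 - p                         # INVERT gate

def p_m_of_n(ps, m):
    """Probability that at least m of the n independent inputs occur
    (assuming 'm OF n' means at least m; exact-m would drop the outer loop)."""
    n = len(ps)
    return sum(
        prod(ps[i] if i in idx else 1 - ps[i] for i in range(n))
        for k in range(m, n + 1)
        for idx in combinations(range(n), k)
    )

# Hypothetical tree: top = OR(AND(a, b), c) with a=0.1, b=0.2, c=0.05.
top = p_or([p_and([0.1, 0.2]), 0.05])
```

Evaluating gates bottom-up like this gives the exact top-event probability for a tree in which no basic event feeds more than one gate; repeated events require the more careful treatment the compiler provides.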
Programming macro tree transducers
DEFF Research Database (Denmark)
Bahr, Patrick; Day, Laurence E.
2013-01-01
A tree transducer is a set of mutually recursive functions transforming an input tree into an output tree. Macro tree transducers extend this recursion scheme by allowing each function to be defined in terms of an arbitrary number of accumulation parameters. In this paper, we show how macro tree...... transducers can be concisely represented in Haskell, and demonstrate the benefits of utilising such an approach with a number of examples. In particular, tree transducers afford a modular programming style as they can be easily composed and manipulated. Our Haskell representation generalises the original...... definition of (macro) tree transducers, abolishing a restriction on finite state spaces. However, as we demonstrate, this generalisation does not affect compositionality....
Space Time Codes from Permutation Codes
Henkel, Oliver
2006-01-01
A new class of space time codes with high performance is presented. The code design utilizes tailor-made permutation codes, which are known to have large minimal distances as spherical codes. A geometric connection between spherical and space time codes has been used to translate them into the final space time codes. Simulations demonstrate that the performance increases with the block length, a result that had already been conjectured in previous work. Further, the connection to permutation codes allows for moderately complex encoding/decoding algorithms.
Fundamentals of convolutional coding
Johannesson, Rolf
2015-01-01
Fundamentals of Convolutional Coding, Second Edition, regarded as a bible of convolutional coding, brings you a clear and comprehensive discussion of the basic principles of this field * Two new chapters on low-density parity-check (LDPC) convolutional codes and iterative coding * Viterbi, BCJR, BEAST, list, and sequential decoding of convolutional codes * Distance properties of convolutional codes * Includes a downloadable solutions manual
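As a concrete instance of the basic principles such a text covers, here is a minimal rate-1/2 convolutional encoder with the standard (7,5)-octal generators and constraint length 3. This is a generic textbook example (no tail bits for termination), not code from the book.

```python
def conv_encode(bits, gens=((1, 1, 1), (1, 0, 1))):
    """Rate-1/2 convolutional encoder, constraint length 3.
    gens are the generator taps, (7,5) in octal: 111 and 101."""
    state = [0, 0]                 # two-bit shift register, initially zero
    out = []
    for b in bits:
        reg = [b] + state          # current input plus register contents
        for g in gens:             # one output bit per generator polynomial
            out.append(sum(x & y for x, y in zip(g, reg)) % 2)
        state = reg[:-1]           # shift: drop the oldest bit
    return out
```

For the input 1011 this encoder emits 11 10 00 01, the sequence worked through in most introductions to Viterbi decoding of the (7,5) code.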
An SPH code for galaxy formation problems; Presentation of the code
Hultman, John; Kaellander, Daniel
1997-01-01
We present and test a code for two-fluid simulations of galaxy formation, one of the fluids being collisionless. The hydrodynamical evolution is solved with the SPH method, while gravitational forces are calculated using a tree method. The code is Lagrangian and fully adaptive both in space and time. A significant fraction of the gas in simulations of hierarchical galaxy formation ends up in tight clumps where it is, in terms of computational effort, very expensive to integrate the SPH equations...
Pattern Avoidance in Ternary Trees
Gabriel, Nathan; Pudwell, Lara; Tay, Samuel
2011-01-01
This paper considers the enumeration of ternary trees (i.e. rooted ordered trees in which each vertex has 0 or 3 children) avoiding a contiguous ternary tree pattern. We begin by finding recurrence relations for several simple tree patterns; then, for more complex trees, we compute generating functions by extending a known algorithm for pattern-avoiding binary trees. Next, we present an alternate one-dimensional notation for trees which we use to find bijections that explain why certain pairs of tree patterns yield the same avoidance generating function. Finally, we compare our bijections to known "replacement rules" for binary trees and generalize these bijections to a larger class of trees.
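The objects being enumerated above satisfy a simple recurrence: a ternary tree is either empty or a root with three ternary subtrees, so t(0) = 1 and t(n) = Σ_{i+j+k=n-1} t(i)t(j)t(k), which matches the closed form C(3n, n)/(2n+1). A sketch of the unrestricted count (pattern avoidance, the paper's actual subject, refines this):

```python
from math import comb

def ternary_count(n):
    """Number of ternary trees with n internal nodes, via the recurrence
    t(m) = sum over i+j+k = m-1 of t(i)*t(j)*t(k)."""
    t = [1] + [0] * n
    for m in range(1, n + 1):
        t[m] = sum(t[i] * t[j] * t[m - 1 - i - j]
                   for i in range(m) for j in range(m - i))
    return t[n]
```

The first few values are 1, 1, 3, 12, 55, the 3-ary analogue of the Catalan numbers.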
Alpha coding of arbitrarily shaped objects for low-bit-rate MPEG-4
Hadar, Ofer; Folkman, Hagai
2001-11-01
This paper presents a new scheme for compact shape coding which can reduce the needed bandwidth for low-bit-rate MPEG-4 applications. Our scheme is based on a coarse representation of the alpha plane with a block-size resolution of 8x8 pixels. This arrangement saves bandwidth and reduces algorithmic complexity (number of computations) compared to the Content-based Arithmetic Encoding (CAE) algorithm. In our algorithm, we encode the alpha plane of a macroblock with only 4 bits, and the number of encoding bits can be further reduced by using a Huffman code. Only contour macroblocks are encoded: transparent macroblocks are treated as background macroblocks, while opaque macroblocks are treated as object macroblocks. We show that the bandwidth saving in representing the alpha plane can reach a factor of 9.5. Such a scheme is appropriate for mobile applications, where both bandwidth and processing power are scarce. We also expect our scheme to be compatible with the MPEG-4 standard.
Strong Trinucleotide Circular Codes
Directory of Open Access Journals (Sweden)
Christian J. Michel
2011-01-01
Full Text Available Recently, we identified a hierarchy relation between trinucleotide comma-free codes and trinucleotide circular codes (see our previous works). Here, we extend our hierarchy with two new classes of codes, called DLD and LDL codes, which are stronger than the comma-free codes. We also prove that no circular code with 20 trinucleotides is a DLD code and that a circular code with 20 trinucleotides is comma-free if and only if it is an LDL code. Finally, we point out the possible role of the symmetric group Σ4 in the mathematical study of trinucleotide circular codes.
Analysis and Disposal of Data Overflow Based on Self-Adaptive Huffman Coding
Institute of Scientific and Technical Information of China (English)
蒋刚; 靳蕃; 肖建
2004-01-01
Building on an analysis of the adaptive Huffman coding algorithm and current trends in computer development, this paper identifies the potential counter-overflow and stack-overflow problems that arise when the adaptive Huffman coding algorithm is used to compress large data sets, and proposes corresponding solutions from both software and hardware perspectives. Analysis shows that these methods are highly practical and can effectively resolve both types of data overflow.
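The counter-overflow issue raised in this abstract is commonly handled by periodically rescaling all frequency counters; a minimal sketch, with an illustrative cap value and function names that are assumptions rather than the authors' implementation:

```python
# Sketch: keep symbol counters inside a fixed-width range by halving them
# whenever any counter reaches a cap, as adaptive Huffman coders often do.
# COUNTER_CAP and update_count are illustrative names, not from the paper.

COUNTER_CAP = 2**16 - 1  # e.g. counters stored in 16 bits

def update_count(counts, symbol):
    counts[symbol] = counts.get(symbol, 0) + 1
    if counts[symbol] >= COUNTER_CAP:
        # Rescale: halve every counter, keeping each at least 1 so that
        # previously seen symbols stay in the model.
        for s in counts:
            counts[s] = max(1, counts[s] // 2)
    return counts

counts = {}
for ch in "aabbbc" * 5:
    update_count(counts, ch)
# counts now reflects raw frequencies: a=10, b=15, c=5 (below the cap)
```

Halving preserves the relative ordering of frequencies, so the shape of the Huffman tree changes little after a rescale while the counters can never exceed their fixed width.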
Institute of Scientific and Technical Information of China (English)
梁忠伟; 张春良; 叶邦彦; 江帆; 胡晓
2009-01-01
Remote online monitoring of IC chip manufacturing places ever-increasing demands on the storage, processing, and transmission of chip image information. This paper proposes an IC image compression technique based on the energy-entropy distribution gradient and Huffman coding. By constructing the energy-entropy distribution gradient, feature planes reflecting chip image details can be extracted; combined with Huffman coding, the images are compressed in a way that preserves detail features even at high compression ratios. Programming implementation and image decompression experiments show that the method yields stable compression results and clear decompressed images, providing a basis for online remote monitoring of chip manufacturing.
On the Optimality of Huffman Codes When the Code Symbols 0 and 1 Have Different Time Units
Institute of Scientific and Technical Information of China (English)
马沂; 赵东风
2004-01-01
To compare the coding efficiency of Morse code and Huffman code, code symbols are designed following Morse code and then encoded with the Huffman method. However, because the basic symbols of Morse code (dot and dash) occupy different time units, if 0 and 1 are used to represent dot and dash in Huffman coding, the basic Huffman symbols 0 and 1 also have different time units; this is no longer conventional Huffman coding. The optimality of Huffman codes under these conditions is explored.
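When 0 and 1 have unequal durations, the quantity to minimize is average transmission time rather than average codeword length, so two codes with identical length profiles can differ in cost. A small illustration with made-up durations and codebooks (not taken from the paper):

```python
# Average transmission time of a prefix code when bit '0' takes t0 time
# units and bit '1' takes t1, weighted by symbol probabilities.
# Durations, probabilities and codebooks below are illustrative only.
t0, t1 = 1.0, 3.0  # dot-like and dash-like durations

def avg_time(codebook, probs):
    return sum(p * sum(t0 if b == '0' else t1 for b in codebook[s])
               for s, p in probs.items())

probs = {'e': 0.5, 't': 0.3, 'a': 0.2}
# Both codes are prefix-free with the same codeword lengths (1, 2, 2) ...
code_a = {'e': '0', 't': '10', 'a': '11'}
code_b = {'e': '0', 't': '11', 'a': '10'}
# ... yet code_a is cheaper in time, because the likelier symbol 't'
# gets the shorter-duration pattern '10' instead of '11'.
```

This is exactly why length-optimal Huffman trees need not be time-optimal when the symbol durations differ, which is the question the abstract investigates.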
A Preliminary Study of Potential Problems of the Adaptive Huffman Algorithm with Large Data Sets
Institute of Scientific and Technical Information of China (English)
蒋刚; 肖建
2005-01-01
This paper introduces the principle of the adaptive Huffman coding algorithm. In view of current trends in computer development, it proposes solutions, from both software and hardware perspectives, to two types of potential overflow problems that arise when the adaptive Huffman coding algorithm is used to compress large data sets. Experiments programmed in Visual C++ confirm the theoretical analysis and show that the proposed solutions are feasible and practical.
An Algorithm for Adaptive Huffman Coding in C: Analysis and Implementation
Institute of Scientific and Technical Information of China (English)
文国知
2011-01-01
A C program that dynamically tracks source-symbol probabilities and incrementally builds the Huffman coding tree implements adaptive Huffman coding, solving the main problem that a static coding tree cannot adapt to local changes in the source symbols. The results show that the adaptive Huffman coding algorithm achieves a high compression ratio and can further improve the efficiency of data transmission.
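For contrast with the adaptive variant described above, the static (two-pass) Huffman coder it improves on can be sketched with the standard heap-based construction; this is a generic illustration, not the paper's C implementation:

```python
import heapq
from collections import Counter

def huffman_code(data):
    """Two-pass Huffman: first pass counts frequencies, then the two
    least-frequent subtrees are repeatedly merged to build the codebook."""
    freq = Counter(data)
    if len(freq) == 1:  # degenerate single-symbol source
        return {next(iter(freq)): '0'}
    # Heap entries: (frequency, unique tiebreak, {symbol: code-so-far});
    # the tiebreak keeps tuple comparison from ever reaching the dict.
    heap = [(f, i, {s: ''}) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: '0' + c for s, c in c1.items()}
        merged.update({s: '1' + c for s, c in c2.items()})
        count += 1
        heapq.heappush(heap, (f1 + f2, count, merged))
    return heap[0][2]

book = huffman_code("abracadabra")
encoded = ''.join(book[ch] for ch in "abracadabra")
```

An adaptive coder replaces the explicit first pass with incremental counter updates and tree rebalancing after every symbol, which is what makes it sensitive to local changes in the source statistics.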
Cosmological simulations with TreeSPH
Katz, N; Hernquist, L E; Katz, Neal; Weinberg, David H; Hernquist, Lars
1995-01-01
We describe numerical methods for incorporating gas dynamics into cosmological simulations and present illustrative applications to the cold dark matter (CDM) scenario. Our evolution code, a version of TreeSPH (Hernquist & Katz 1989) generalized to handle comoving coordinates and periodic boundary conditions, combines smoothed-particle hydrodynamics (SPH) with the hierarchical tree method for computing gravitational forces. The Lagrangian hydrodynamics approach and individual time steps for gas particles give the algorithm a large dynamic range, which is essential for studies of galaxy formation in a cosmological context. The code incorporates radiative cooling for an optically thin, primordial composition gas in ionization equilibrium with a user-specified ultraviolet background. We adopt a phenomenological prescription for star formation that gradually turns cold, dense, Jeans-unstable gas into collisionless stars, returning supernova feedback energy to the surrounding medium. In CDM simulations, some...
Blundell, Charles; Heller, Katherine A
2012-01-01
Hierarchical structure is ubiquitous in data across many domains. There are many hierarchical clustering methods, frequently used by domain experts, which strive to discover this structure. However, most of these methods limit discoverable hierarchies to those with binary branching structure. This limitation, while computationally convenient, is often undesirable. In this paper we explore a Bayesian hierarchical clustering algorithm that can produce trees with arbitrary branching structure at each node, known as rose trees. We interpret these trees as mixtures over partitions of a data set, and use a computationally efficient, greedy agglomerative algorithm to find the rose trees which have high marginal likelihood given the data. Lastly, we perform experiments which demonstrate that rose trees are better models of data than the typical binary trees returned by other hierarchical clustering algorithms.
Automatic generation of tree level helicity amplitudes
Stelzer, T
1994-01-01
The program MadGraph is presented which automatically generates postscript Feynman diagrams and Fortran code to calculate arbitrary tree level helicity amplitudes by calling HELAS[1] subroutines. The program is written in Fortran and is available in Unix and VMS versions. MadGraph currently includes standard model interactions of QCD and QFD, but is easily modified to include additional models such as supersymmetry.
Favre, Charles
2004-01-01
This volume is devoted to a beautiful object, called the valuative tree and designed as a powerful tool for the study of singularities in two complex dimensions. Its intricate yet manageable structure can be analyzed by both algebraic and geometric means. Many types of singularities, including those of curves, ideals, and plurisubharmonic functions, can be encoded in terms of positive measures on the valuative tree. The construction of these measures uses a natural tree Laplace operator of independent interest.
Cardona, Gabriel; Llabrés, Mercè; Rosselló, Francesc; Valiente, Gabriel
2011-01-01
Galled trees, directed acyclic graphs that model evolutionary histories with isolated hybridization events, have become very popular due to both their biological significance and the existence of polynomial-time algorithms for their reconstruction. In this paper, we establish to which extent several distance measures for the comparison of evolutionary networks are metrics for galled trees, and hence, when they can be safely used to evaluate galled tree reconstruction methods.
A theory of game trees, based on solution trees
W.H.L.M. Pijls (Wim); A. de Bruin (Arie); A. Plaat (Aske)
1996-01-01
textabstractIn this paper a complete theory of game tree algorithms is presented, entirely based upon the notion of a solution tree. Two types of solution trees are distinguished: max and min solution trees respectively. We show that most game tree algorithms construct a superposition of a max and a
DEFF Research Database (Denmark)
Brodal, Gerth Stølting; Sioutas, Spyros; Pantazos, Kostas;
2015-01-01
We present a new overlay, called the Deterministic Decentralized tree (D2-tree). The D2-tree compares favorably to other overlays for the following reasons: (a) it provides matching and better complexities, which are deterministic for the supported operations; (b) the management of nodes (peers) ... The load-balancing scheme of elements into nodes is deterministic and general enough to be applied to other hierarchical tree-based overlays. This load-balancing mechanism is based on an innovative lazy weight-balancing mechanism, which is interesting in its own right...
Sexton, Alan P
2010-01-01
The M-tree is a paged, dynamically balanced metric access method that responds gracefully to the insertion of new objects. To date, no algorithm has been published for the corresponding Delete operation. We believe this to be non-trivial because of the design of the M-tree's Insert algorithm. We propose a modification to Insert that overcomes this problem and give the corresponding Delete algorithm. The performance of the tree is comparable to the M-tree and offers additional benefits in terms of supported operations, which we briefly discuss.
DEFF Research Database (Denmark)
Sitchinava, Nodar; Zeh, Norbert
2012-01-01
We present the parallel buffer tree, a parallel external memory (PEM) data structure for batched search problems. This data structure is a non-trivial extension of Arge's sequential buffer tree to a private-cache multiprocessor environment and reduces the number of I/O operations by the number of available processor cores compared to its sequential counterpart, thereby taking full advantage of multicore parallelism. The parallel buffer tree is a search tree data structure that supports the batched parallel processing of a sequence of N insertions, deletions, membership queries, and range queries...
Fuzzy Uncertainty Evaluation for Fault Tree Analysis
Energy Technology Data Exchange (ETDEWEB)
Kim, Ki Beom; Shim, Hyung Jin [Seoul National University, Seoul (Korea, Republic of); Jae, Moo Sung [Hanyang University, Seoul (Korea, Republic of)
2015-05-15
This traditional probabilistic approach can calculate relatively accurate results. However, it requires a long time because of the repetitive computation demanded by the MC method. In addition, when informative data for statistical analysis are insufficient or some events are mainly caused by human error, the probabilistic approach may not be possible because the uncertainties of these events are difficult to express as probabilistic distributions. In order to reduce the computation time and quantify the uncertainties of top events when basic events whose uncertainties are difficult to express as probabilistic distributions exist, fuzzy uncertainty propagation based on fuzzy set theory can be applied. In this paper, we develop a fuzzy uncertainty propagation code and apply it to the fault tree of the core damage accident after the large loss of coolant accident (LLOCA). The fuzzy uncertainty propagation code is implemented and tested on the fault tree of the radiation release accident. We apply this code to the fault tree of the core damage accident after the LLOCA in three cases and compare the results with those computed by probabilistic uncertainty propagation using the MC method. The results obtained by the fuzzy uncertainty propagation can be calculated in relatively short time, covering the results obtained by the probabilistic uncertainty propagation.
New Mexico Univ., Albuquerque. American Indian Law Center.
The Model Children's Code was developed to provide a legally correct model code that American Indian tribes can use to enact children's codes that fulfill their legal, cultural and economic needs. Code sections cover the court system, jurisdiction, juvenile offender procedures, minor-in-need-of-care, and termination. Almost every Code section is…
Binary Coded Web Access Pattern Tree in Education Domain
Gomathi, C.; Moorthi, M.; Duraiswamy, K.
2008-01-01
Web Access Pattern (WAP), which is the sequence of accesses pursued by users frequently, is a kind of interesting and useful knowledge in practice. Sequential Pattern mining is the process of applying data mining techniques to a sequential database for the purposes of discovering the correlation relationships that exist among an ordered list of…
Information Theoretic Secret Key Generation: Structured Codes and Tree Packing
Nitinawarat, Sirin
2010-01-01
This dissertation deals with a multiterminal source model for secret key generation by multiple network terminals with prior and privileged access to a set of correlated signals complemented by public discussion among themselves. Emphasis is placed on a characterization of secret key capacity, i.e., the largest rate of an achievable secret key,…
J.R. Simpson; E.G. McPherson
2011-01-01
Urban trees can produce a number of benefits, among them improved air quality. Biogenic volatile organic compounds (BVOCs) emitted by some species are ozone precursors. Modifying future tree planting to favor lower-emitting species can reduce these emissions and aid air management districts in meeting federally mandated emissions reductions for these compounds. Changes...
Matching Subsequences in Trees
DEFF Research Database (Denmark)
Bille, Philip; Gørtz, Inge Li
2009-01-01
Given two rooted, labeled trees P and T the tree path subsequence problem is to determine which paths in P are subsequences of which paths in T. Here a path begins at the root and ends at a leaf. In this paper we propose this problem as a useful query primitive for XML data, and provide new...
Structural Equation Model Trees
Brandmaier, Andreas M.; von Oertzen, Timo; McArdle, John J.; Lindenberger, Ulman
2013-01-01
In the behavioral and social sciences, structural equation models (SEMs) have become widely accepted as a modeling tool for the relation between latent and observed variables. SEMs can be seen as a unification of several multivariate analysis techniques. SEM Trees combine the strengths of SEMs and the decision tree paradigm by building tree…
Tree biology and dendrochemistry
Kevin T. Smith; Walter C. Shortle
1996-01-01
Dendrochemistry, the interpretation of elemental analysis of dated tree rings, can provide a temporal record of environmental change. Using the dendrochemical record requires an understanding of tree biology. In this review, we pose four questions concerning assumptions that underlie recent dendrochemical research: 1) Does the chemical composition of the wood directly...
The major tree nuts include almonds, Brazil nuts, cashew nuts, hazelnuts, macadamia nuts, pecans, pine nuts, pistachio nuts, and walnuts. Tree nut oils are appreciated in food applications because of their flavors and are generally more expensive than other gourmet oils. Research during the last de...
Parallel TreeSPH: A Tool for Galaxy Formation
Lia, C; Lia, Cesario; Carraro, Giovanni
1999-01-01
We describe a new implementation of a parallel Tree-SPH code with the aim to simulate Galaxy Formation and Evolution. The code has been parallelized using SHMEM, a Cray proprietary library to handle communications between the 256 processors of the Silicon Graphics T3E massively parallel supercomputer hosted by the Cineca Super-computing Center (Bologna, Italy). The code combines the Smoothed Particle Hydrodynamics (SPH) method to solve hydro-dynamical equations with the popular Barnes and Hut (1986) tree-code to perform gravity calculation with an $N \log N$ scaling, and it is based on the scalar Tree-SPH code developed by Carraro et al (1998) [MNRAS 297, 1021]. Parallelization is achieved by distributing particles among processors according to a work-load criterion. Benchmarks of the code, in terms of load-balance and scalability, are analysed and critically discussed against the adiabatic collapse of an isothermal gas sphere test using $2 \times 10^4$ particles on 8 processors. The code results balanced a...
DEFF Research Database (Denmark)
Sørensen, Jesper Hemming; Koike-Akino, Toshiaki; Orlik, Philip
2012-01-01
This paper proposes a concept called rateless feedback coding. We redesign the existing LT and Raptor codes, by introducing new degree distributions for the case when a few feedback opportunities are available. We show that incorporating feedback to LT codes can significantly decrease both the coding overhead and the encoding/decoding complexity. Moreover, we show that, at the price of a slight increase in the coding overhead, linear complexity is achieved with Raptor feedback coding...
Abraham, Nikhil
2015-01-01
Hands-on exercises help you learn to code like a pro. No coding experience is required for Coding For Dummies, your one-stop guide to building a foundation of knowledge in writing computer code for web, application, and software development. It doesn't matter if you've dabbled in coding or never written a line of code, this book guides you through the basics. Using foundational web development languages like HTML, CSS, and JavaScript, it explains in plain English how coding works and why it's needed. Online exercises developed by Codecademy, a leading online code training site, help hone coding skill
Gao, Wen
2015-01-01
This comprehensive and accessible text/reference presents an overview of the state of the art in video coding technology. Specifically, the book introduces the tools of the AVS2 standard, describing how AVS2 can help to achieve a significant improvement in coding efficiency for future video networks and applications by incorporating smarter coding tools such as scene video coding. Topics and features: introduces the basic concepts in video coding, and presents a short history of video coding technology and standards; reviews the coding framework, main coding tools, and syntax structure of AV
Code-excited linear predictive coding of multispectral MR images
Hu, Jian-Hong; Wang, Yao; Cahill, Patrick
1996-02-01
This paper reports a multispectral code excited linear predictive coding method for the compression of well-registered multispectral MR images. Different linear prediction models and adaptation schemes have been compared. The method which uses a forward adaptive autoregressive (AR) model has proven to achieve a good compromise between performance, complexity and robustness. This approach is referred to as the MFCELP method. Given a set of multispectral images, the linear predictive coefficients are updated over non-overlapping square macroblocks. Each macroblock is further divided into several microblocks and the best excitation signals for each microblock are determined through an analysis-by-synthesis procedure. To satisfy the high quality requirement for medical images, the error between the original images and the synthesized ones is further coded using a vector quantizer. The MFCELP method has been applied to 26 sets of clinical MR neuro images (20 slices/set, 3 spectral bands/slice, 256 by 256 pixels/image, 12 bits/pixel). It provides a significant improvement over the discrete cosine transform (DCT) based JPEG method, a wavelet transform based embedded zero-tree wavelet (EZW) coding method, as well as the MSARMA method we developed before.
A Network Coding Approach to Loss Tomography
DEFF Research Database (Denmark)
Sattari, Pegah; Markopoulou, Athina; Fragouli, Christina
2013-01-01
network coding capabilities. We design a framework for estimating link loss rates, which leverages network coding capabilities, and we show that it improves several aspects of tomography, including the identifiability of links, the tradeoff between estimation accuracy and bandwidth efficiency, and the complexity of probe path selection. We discuss the cases of inferring the loss rates of links in a tree topology or in a general topology. In the latter case, the benefits of our approach are even more pronounced compared to standard techniques but we also face novel challenges, such as dealing with cycles...
Wavelet-Based Mixed-Resolution Coding Approach Incorporating with SPT for the Stereo Image
Institute of Scientific and Technical Information of China (English)
无
2001-01-01
With the advances of display technology, three-dimensional (3-D) imaging systems are becoming increasingly popular. One way of stimulating 3-D perception is to use stereo pairs, a pair of images of the same scene acquired from different perspectives. Since there is an inherent redundancy between the images of a stereo pair, data compression algorithms should be employed to represent stereo pairs efficiently. The proposed techniques generally use block-based disparity compensation. In order to get a higher compression ratio, this paper employs wavelet-based mixed-resolution coding together with SPT-based disparity compensation to compress the stereo image data. Mixed-resolution coding is a perceptually justified technique that is achieved by presenting one eye with a low-resolution image and the other with a high-resolution image. Psychophysical experiments show that stereo image pairs with one high-resolution image and one low-resolution image provide almost the same stereo depth as a stereo pair with two high-resolution images. By combining the mixed-resolution coding and SPT-based disparity-compensation techniques, one reference (left) high-resolution image can be compressed by a hierarchical wavelet transform followed by vector quantization and a Huffman encoder. After two-level wavelet decompositions, for the low-resolution right image and low-resolution left image, a subspace projection technique using fixed-block-size disparity compensation estimation is used. At the decoder, the low-resolution right subimage is estimated using the disparity from the low-resolution left subimage. A full-size reconstruction is obtained by upsampling by a factor of 4 and reconstructing with the synthesis low-pass filter. Finally, experimental results are presented, which show that our scheme achieves a PSNR gain (about 0.92 dB) as compared to current block-based disparity compensation coding techniques.
Merlin, Emiliano; Grassi, Tommaso; Piovan, Lorenzo; Chiosi, Cesare
2009-01-01
We present EvoL, the new release of the Padova N-body code for cosmological simulations of galaxy formation and evolution. In this paper, the basic Tree + SPH code is presented and analysed, together with an overview on the software architectures. EvoL is a flexible parallel Fortran95 code, specifically designed for simulations of cosmological structure formation on cluster, galactic and sub-galactic scales. EvoL is a fully Lagrangian self-adaptive code, based on the classical Oct-tree and on the Smoothed Particle Hydrodynamics algorithm. It includes special features such as adaptive softening lengths with correcting extra-terms, and modern formulations of SPH and artificial viscosity. It is designed to be run in parallel on multiple CPUs to optimize the performance and save computational time. We describe the code in detail, and present the results of a number of standard hydrodynamical tests.
Institute of Scientific and Technical Information of China (English)
李灵芝; 江晶; 刘志高; 马晓岩
2005-01-01
To handle high-volume radar data transmission and meet the requirement for real-time lossless compression of raw radar video signals, a compression scheme combining DPCM (Differential Pulse Code Modulation) with adaptive Huffman coding is presented, based on the characteristics of raw radar video signals. The effectiveness of the algorithm and its overflow problem are analyzed. Experiments show that, compared with conventional adaptive Huffman coding, the method improves real-time performance and the compression ratio.
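The DPCM half of such a pipeline is simple to illustrate: transmit the first sample, then only the differences between consecutive samples, which concentrates the value distribution around zero so that a subsequent Huffman stage compresses well. A toy sketch under assumed names, not the radar implementation:

```python
# DPCM residual computation: send the first sample, then differences.
# Slowly varying signals yield small residuals clustered near zero,
# which an (adaptive) Huffman coder can then compress effectively.
# Function names and the toy signal are illustrative, not from the paper.

def dpcm_encode(samples):
    residuals = [samples[0]]
    for prev, cur in zip(samples, samples[1:]):
        residuals.append(cur - prev)
    return residuals

def dpcm_decode(residuals):
    samples = [residuals[0]]
    for r in residuals[1:]:
        samples.append(samples[-1] + r)
    return samples

signal = [100, 101, 103, 103, 102, 104]
res = dpcm_encode(signal)           # [100, 1, 2, 0, -1, 2]
assert dpcm_decode(res) == signal   # lossless round trip
```

Because decoding exactly inverts encoding, the combined DPCM-plus-Huffman chain stays lossless, which is the requirement the abstract states for raw radar video.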
The cosmological simulation code GADGET-2
Springel, V
2005-01-01
We discuss the cosmological simulation code GADGET-2, a new massively parallel TreeSPH code, capable of following a collisionless fluid with the N-body method, and an ideal gas by means of smoothed particle hydrodynamics (SPH). Our implementation of SPH manifestly conserves energy and entropy in regions free of dissipation, while allowing for fully adaptive smoothing lengths. Gravitational forces are computed with a hierarchical multipole expansion, which can optionally be applied in the form of a TreePM algorithm, where only short-range forces are computed with the 'tree' method while long-range forces are determined with Fourier techniques. Time integration is based on a quasi-symplectic scheme where long-range and short-range forces can be integrated with different timesteps. Individual and adaptive short-range timesteps may also be employed. The domain decomposition used in the parallelisation algorithm is based on a space-filling curve, resulting in high flexibility and tree force errors that do not depe...
On Decoding Irregular Tanner Codes
Even, Guy
2011-01-01
We present a new combinatorial characterization for local-optimality of a codeword in irregular Tanner codes. This characterization is a generalization of [Arora, Daskalakis, Steurer; 2009] and [Vontobel; 2010]. The main novelty in this characterization is that it is based on a conical combination of subtrees in the computation trees. These subtrees may have any degree in the local-code nodes and may have any height (even greater than the girth). We prove that local-optimality in this new characterization implies Maximum-Likelihood (ML) optimality and LP-optimality. We also show that it is possible to compute efficiently a certificate for the local-optimality of a codeword given the channel output. We apply this characterization to regular Tanner codes. We prove a lower bound on the noise threshold in channels such as BSC and AWGNC. When the noise is below this lower bound, the probability that LP decoding fails diminishes doubly exponentially in the girth of the Tanner graph. We use local optimality also to ...
Phylogenetic trees in bioinformatics
Energy Technology Data Exchange (ETDEWEB)
Burr, Tom L [Los Alamos National Laboratory
2008-01-01
Genetic data is often used to infer evolutionary relationships among a collection of viruses, bacteria, animal or plant species, or other operational taxonomic units (OTU). A phylogenetic tree depicts such relationships and provides a visual representation of the estimated branching order of the OTUs. Tree estimation is unique for several reasons, including: the types of data used to represent each OTU; the use of probabilistic nucleotide substitution models; the inference goals involving both tree topology and branch length; and the huge number of possible trees for a given sample of a very modest number of OTUs, which implies that finding the best tree(s) to describe the genetic data for each OTU is computationally demanding. Bioinformatics is too large a field to review here. We focus on that aspect of bioinformatics that includes study of similarities in genetic data from multiple OTUs. Although research questions are diverse, a common underlying challenge is to estimate the evolutionary history of the OTUs. Therefore, this paper reviews the role of phylogenetic tree estimation in bioinformatics, available methods and software, and identifies areas for additional research and development.
DEFF Research Database (Denmark)
Brodal, Gerth Stølting; Moruz, Gabriel
2006-01-01
It is well-known that to minimize the number of comparisons a binary search tree should be perfectly balanced. Previous work has shown that a dominating factor over the running time for a search is the number of cache faults performed, and that an appropriate memory layout of a binary search tree can reduce the number of cache faults by several hundred percent. Motivated by the fact that during a search branching to the left or right at a node does not necessarily have the same cost, e.g. because of branch prediction schemes, we in this paper study the class of skewed binary search trees. For all nodes in a skewed binary search tree the ratio between the size of the left subtree and the size of the tree is a fixed constant (a ratio of 1/2 gives perfectly balanced trees). In this paper we present an experimental study of various memory layouts of static skewed binary search trees, where each...
Böcker, Sebastian; Dührkop, Kai
2016-01-01
Untargeted metabolomics commonly uses liquid chromatography mass spectrometry to measure abundances of metabolites; subsequent tandem mass spectrometry is used to derive information about individual compounds. One of the bottlenecks in this experimental setup is the interpretation of fragmentation spectra to accurately and efficiently identify compounds. Fragmentation trees have become a powerful tool for the interpretation of tandem mass spectrometry data of small molecules. These trees are determined from the data using combinatorial optimization, and aim at explaining the experimental data via fragmentation cascades. Fragmentation tree computation does not require spectral or structural databases. To obtain biochemically meaningful trees, one needs an elaborate optimization function (scoring). We present a new scoring for computing fragmentation trees, transforming the combinatorial optimization into a Maximum A Posteriori estimator. We demonstrate the superiority of the new scoring for two tasks: both for the de novo identification of molecular formulas of unknown compounds, and for searching a database for structurally similar compounds, our method, SIRIUS 3, performs significantly better than the previous version of our method, as well as other methods for these tasks. SIRIUS 3 can be a part of an untargeted metabolomics workflow, allowing researchers to investigate unknowns using automated computational methods. Graphical abstract: We present a new scoring for computing fragmentation trees from tandem mass spectrometry data based on Bayesian statistics. The best scoring fragmentation tree most likely explains the molecular formula of the measured parent ion.
Institute of Scientific and Technical Information of China (English)
Fan Aihua
2004-01-01
The vertices of an infinite locally finite tree T are labelled by a collection of i.i.d. real random variables {X_σ}_{σ∈T}, which defines a tree-indexed walk S_σ = ∑_{θ < r ≤ σ} X_r. We introduce and study the oscillations of the walk; the exact Hausdorff dimension of the set of such ξ's is calculated. An application is given to study the local variation of Brownian motion. A general limsup deviation problem on trees is also studied.
Locally Orderless Registration Code
DEFF Research Database (Denmark)
2012-01-01
This is code for the TPAMI paper "Locally Orderless Registration". The code requires Intel Threading Building Blocks installed and is provided for 64 bit on Mac, Linux and Windows.
Lossy/lossless coding of bi-level images
DEFF Research Database (Denmark)
Martins, Bo; Forchhammer, Søren
1997-01-01
Summary form only given. We present improvements to a general type of lossless, lossy, and refinement coding of bi-level images (Martins and Forchhammer, 1996). Loss is introduced by flipping pixels. The pixels are coded using arithmetic coding of conditional probabilities obtained using a template, as is known from JBIG and proposed in JBIG-2 (Martins and Forchhammer). Our new state-of-the-art results are obtained using the more general free tree instead of a template. Also we introduce multiple refinement template coding. The lossy algorithm is analogous to the greedy 'rate...
Tree-growth analyses to estimate tree species' drought tolerance
Eilmann, B.; Rigling, A.
2012-01-01
Climate change is challenging forestry management and practices. Among other things, tree species with the ability to cope with more extreme climate conditions have to be identified. However, while environmental factors may severely limit tree growth or even cause tree death, assessing a tree specie
Generalising tree traversals and tree transformations to DAGs
DEFF Research Database (Denmark)
Bahr, Patrick; Axelsson, Emil
2017-01-01
We present a recursion scheme based on attribute grammars that can be transparently applied to trees and acyclic graphs. Our recursion scheme allows the programmer to implement a tree traversal or a tree transformation and then apply it to compact graph representations of trees instead...
Research on Differential Coding Method for Satellite Remote Sensing Data Compression
Lin, Z. J.; Yao, N.; Deng, B.; Wang, C. Z.; Wang, J. H.
2012-07-01
Data compression, in the process of satellite Earth data transmission, is of great concern for improving the efficiency of data transmission. The information amounts inherent to remote sensing images provide a foundation for data compression in terms of information theory. In particular, distinct degrees of uncertainty inherent to distinct land covers result in different information amounts. This paper first proposes a lossless differential encoding method to improve compression rates. Then a district forecast differential encoding method is proposed to further improve the compression rates. Considering that stereo measurements in modern photogrammetry are basically accomplished by means of automatic stereo image matching, an edge protection operator is finally utilized to appropriately filter out high frequency noises, which helps magnify the signals and further improve the compression rates. The three steps were applied to a Landsat TM multispectral image and a set of SPOT-5 panchromatic images of four typical land cover types (i.e., urban areas, farm lands, mountain areas and water bodies). Results revealed that the average code lengths obtained by the differential encoding method, compared with Huffman encoding, were closer to the information amounts inherent to remote sensing images, and the compression rates were improved to some extent. Furthermore, the compression rates of the four land cover images obtained by the district forecast differential encoding method were nearly doubled. As for the images with the edge features preserved, the compression rates are on average four times as large as those of the original images.
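The comparison between average code length and the information amount of an image is the standard entropy bound: for any prefix code over a source X, H(X) ≤ L̄ < H(X) + 1 bits per symbol. A quick illustration with a made-up distribution (dyadic probabilities, where Huffman coding meets the entropy exactly):

```python
import math

def entropy(probs):
    """Shannon entropy in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A skewed symbol distribution, like pixel residuals after differencing;
# the probabilities and code lengths here are illustrative only.
probs = [0.5, 0.25, 0.125, 0.125]
lengths = [1, 2, 3, 3]  # Huffman codeword lengths for this distribution

H = entropy(probs)                             # information amount
L = sum(p * l for p, l in zip(probs, lengths))  # average code length
assert abs(H - L) < 1e-12 and abs(H - 1.75) < 1e-12
```

Differencing does not change the pixels' information content losslessly recoverable from the data, but it reshapes the distribution toward such skewed, low-entropy forms, which is why the average code lengths in the study move closer to the information amounts.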
Crompton, Helen; LaFrance, Jason; van 't Hooft, Mark
2012-01-01
A QR (quick-response) code is a two-dimensional scannable code, similar in function to a traditional bar code that one might find on a product at the supermarket. The main difference between the two is that, while a traditional bar code can hold a maximum of only 20 digits, a QR code can hold up to 7,089 characters, so it can contain much more…
Institute of Scientific and Technical Information of China (English)
2008-01-01
Quantum error correcting codes are indispensable for quantum information processing and quantum computation. In 1995 and 1996, Shor and Steane gave the first several examples of quantum codes derived from classical error correcting codes. The construction of efficient quantum codes is now an active multi-disciplinary research field. In this paper we review several known constructions of quantum codes and present some examples.
Larson, David; Jacob, Sharon E
2012-01-01
Tea tree oil is an increasingly popular ingredient in a variety of household and cosmetic products, including shampoos, massage oils, skin and nail creams, and laundry detergents. Known for its potential antiseptic properties, it has been shown to be active against a variety of bacteria, fungi, viruses, and mites. The oil is extracted from the leaves of the tea tree via steam distillation. This essential oil possesses a sharp camphoraceous odor followed by a menthol-like cooling sensation. Most commonly an ingredient in topical products, it is used at a concentration of 5% to 10%. Even at this concentration, it has been reported to induce contact sensitization and allergic contact dermatitis reactions. In 1999, tea tree oil was added to the North American Contact Dermatitis Group screening panel. The latest prevalence rates suggest that 1.4% of patients referred for patch testing had a positive reaction to tea tree oil.
Aval, Jean-Christophe; Nadeau, Philippe
2011-01-01
In this work we introduce and study tree-like tableaux, which are certain fillings of Ferrers diagrams in simple bijection with permutation tableaux and alternative tableaux. We exhibit an elementary insertion procedure on our tableaux which gives a clear proof that tableaux of size n are counted by n!, and which moreover respects most of the well-known statistics studied originally on alternative and permutation tableaux. Our insertion procedure allows us, in particular, to define two simple new bijections between tree-like tableaux and permutations: the first one is conceived specifically to respect the generalized pattern 2-31, while the second one respects the underlying tree of a tree-like tableau.
Minnesota Department of Natural Resources — The National Land Cover Database 2001 tree canopy layer for Minnesota (mapping zones 39-42, 50-51) was produced through a cooperative project conducted by the...
Energy Technology Data Exchange (ETDEWEB)
Kelly, K.; White, K.
1981-03-01
An important harvesting alternative in North America is the Full Tree Method, in which trees are felled and transported to roadside, intermediate or primary landings with limbs and branches intact. The acceptance of Full Tree Systems is due to many factors including labour productivity and increased demands on the forest for "new products". These conditions are shaping the future look of forest Harvesting Systems, but must not be the sole determinants. All harvesting implications, such as those affecting productivity and silviculture, should be thoroughly understood. This paper does not try to discuss every implication, nor any particular one in depth; its purpose is to highlight those areas requiring consideration and to review several current North American Full Tree Systems. (Refs. 5).
1981-01-01
[Garbled extraction of a fault-tree analysis report; recoverable fragments follow.] A table of categories to be evaluated: manufacturer, location, seismic susceptibility, flood susceptibility, temperature, humidity, radiation, wear-out susceptibility, test. For the category "Seismic Susceptibility," several sensitivity levels might be defined, ranging from no sensitivity to extreme sensitivity. Cited references include Hanford Company, Richland, Washington, ARH-ST-112, July 1975, and W.E. Vesely, "Analysis of Fault Trees by Kinetic Tree Theory," Idaho Nuclear.
DEFF Research Database (Denmark)
Jaeger, Manfred
2006-01-01
We introduce type extension trees as a formal representation language for complex combinatorial features of relational data. Based on a very simple syntax this language provides a unified framework for expressing features as diverse as embedded subgraphs on the one hand, and marginal counts...... of attribute values on the other. We show by various examples how many existing relational data mining techniques can be expressed as the problem of constructing a type extension tree and a discriminant function....
DEFF Research Database (Denmark)
Somchaipeng, Kerawit; Sporring, Jon; Johansen, Peter
2007-01-01
We propose MultiScale Singularity Trees (MSSTs) as a structure to represent images, and we propose an algorithm for image comparison based on comparing MSSTs. The algorithm is tested on 3 public image databases and compared to 2 state-of-the-art methods. We conclude that the computational complexity...... of our algorithm only allows for the comparison of small trees, and that the results of our method are comparable with state-of-the-art using much fewer parameters for image representation....
DEFF Research Database (Denmark)
Schmidt, Lars Holger
Forest tree improvement encompasses a number of scientific and technical areas like floral, reproductive and micro-biology, genetics, breeding methods and strategies, propagation, gene conservation, data analysis and statistics, each area with a comprehensive terminology. The terms selected...... for definition here are those most frequently used in tree improvement literature. Clonal propagation is included in view of the great expansion of that field as a means of mass multiplication of improved material....
Manwani, Naresh
2010-01-01
In this paper we present a new algorithm for learning oblique decision trees. Most of the current decision tree algorithms rely on impurity measures to assess the goodness of hyperplanes at each node while learning a decision tree in a top-down fashion. These impurity measures do not properly capture the geometric structures in the data. Motivated by this, our algorithm uses a strategy to assess the hyperplanes in such a way that the geometric structure in the data is taken into account. At each node of the decision tree, we find the clustering hyperplanes for both the classes and use their angle bisectors as the split rule at that node. We show through empirical studies that this idea leads to small decision trees and better performance. We also present some analysis to show that the angle bisectors of clustering hyperplanes that we use as the split rules at each node, are solutions of an interesting optimization problem and hence argue that this is a principled method of learning a decision tree.
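The bisector step above can be made concrete. The clustering hyperplanes themselves come from the paper's optimization problem; the sketch below only shows how, given two such hyperplanes, their two angle bisectors are formed (the helper name `angle_bisectors` is ours, not the paper's):

```python
import math

def angle_bisectors(w1, b1, w2, b2):
    """Given two hyperplanes w.x + b = 0 (w a coefficient list), return
    the two angle bisectors as (w, b) pairs; points on a bisector are
    equidistant from both hyperplanes."""
    n1 = math.sqrt(sum(c * c for c in w1))
    n2 = math.sqrt(sum(c * c for c in w2))
    w1h = [c / n1 for c in w1]          # unit normals, so that
    w2h = [c / n2 for c in w2]          # w.x + b gives signed distance
    plus = ([a + b_ for a, b_ in zip(w1h, w2h)], b1 / n1 + b2 / n2)
    minus = ([a - b_ for a, b_ in zip(w1h, w2h)], b1 / n1 - b2 / n2)
    return plus, minus

# Clustering hyperplanes x = 0 and y = 0: the bisectors are the
# diagonals x + y = 0 and x - y = 0, used as the oblique split rule.
(wa, ba), (wb, bb) = angle_bisectors([1.0, 0.0], 0.0, [0.0, 1.0], 0.0)
assert wa == [1.0, 1.0] and ba == 0.0
assert wb == [1.0, -1.0] and bb == 0.0
```

Of the two bisectors, a decision-tree learner would pick the one that yields the better split of the training data at that node.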
2014-01-01
With a view to creating new landscapes and making its population of trees safer and healthier, this winter CERN will complete the tree-felling campaign started in 2010. Tree felling will take place between 15 and 22 November on the Swiss part of the Meyrin site. This work is being carried out above all for safety reasons. The trees to be cut down are at risk of falling as they are too old and too tall to withstand the wind. In addition, the roots of poplar trees are very powerful and spread widely, potentially damaging underground networks, pavements and roadways. Compensatory tree planting campaigns will take place in the future, subject to the availability of funding, with the aim of creating coherent landscapes while also respecting the functional constraints of the site. These matters are being considered in close collaboration with the Geneva nature and countryside directorate (Direction générale de la nature et du paysage, DGNP). GS-SE Group
Attack Trees with Sequential Conjunction
Jhawar, Ravi; Kordy, Barbara; Mauw, Sjouke; Radomirović, Sasa; Trujillo-Rasua, Rolando
2015-01-01
We provide the first formal foundation of SAND attack trees, which are a popular extension of the well-known attack trees. The SAND attack tree formalism increases the expressivity of attack trees by introducing the sequential conjunctive operator SAND. This operator enables the modeling of
Turbo Codes Extended with Outer BCH Code
DEFF Research Database (Denmark)
Andersen, Jakob Dahl
1996-01-01
The "error floor" observed in several simulations with the turbo codes is verified by calculation of an upper bound to the bit error rate for the ensemble of all interleavers. Also an easy way to calculate the weight enumerator used in this bound is presented. An extended coding scheme is proposed...
Hybrid Noncoherent Network Coding
Skachek, Vitaly; Nedic, Angelia
2011-01-01
We describe a novel extension of subspace codes for noncoherent networks, suitable for use when the network is viewed as a communication system that introduces both dimension and symbol errors. We show that when symbol erasures occur in a significantly large number of different basis vectors transmitted through the network and when the min-cut of the network is much smaller than the length of the transmitted codewords, the new family of codes outperforms their subspace code counterparts. For the proposed coding scheme, termed hybrid network coding, we derive two upper bounds on the size of the codes. These bounds represent a variation of the Singleton and of the sphere-packing bound. We show that a simple concatenated scheme that represents a combination of subspace codes and Reed-Solomon codes is asymptotically optimal with respect to the Singleton bound. Finally, we describe two efficient decoding algorithms for concatenated subspace codes that in certain cases have smaller complexity than subspace decoder...
Image coding with geometric wavelets.
Alani, Dror; Averbuch, Amir; Dekel, Shai
2007-01-01
This paper describes a new and efficient method for low bit-rate image coding which is based on recent development in the theory of multivariate nonlinear piecewise polynomial approximation. It combines a binary space partition scheme with geometric wavelet (GW) tree approximation so as to efficiently capture curve singularities and provide a sparse representation of the image. The GW method successfully competes with state-of-the-art wavelet methods such as the EZW, SPIHT, and EBCOT algorithms. We report a gain of about 0.4 dB over the SPIHT and EBCOT algorithms at the bit-rate of 0.0625 bits per pixel (bpp). It also outperforms other recent methods that are based on "sparse geometric representation." For example, we report a gain of 0.27 dB over the Bandelets algorithm at 0.1 bpp. Although the algorithm is computationally intensive, its time complexity can be significantly reduced by collecting a "global" GW n-term approximation to the image from a collection of GW trees, each constructed separately over tiles of the image.
On the neighbourhoods of trees
Humphries, Peter J
2012-01-01
Tree rearrangement operations typically induce a metric on the space of phylogenetic trees. One important property of these metrics is the size of the neighbourhood, that is, the number of trees exactly one operation from a given tree. We present an expression for the size of the TBR (tree bisection and reconnection) neighbourhood, thus answering a question first posed in [Annals of Combinatorics, 5 (2001), 1-15].
Network coding for computing: Linear codes
Appuswamy, Rathinakumar; Karamchandani, Nikhil; Zeger, Kenneth
2011-01-01
In network coding it is known that linear codes are sufficient to achieve the coding capacity in multicast networks and that they are not sufficient in general to achieve the coding capacity in non-multicast networks. In network computing, Rai, Dey, and Shenvi have recently shown that linear codes are not sufficient in general for solvability of multi-receiver networks with scalar linear target functions. We study single receiver networks where the receiver node demands a target function of the source messages. We show that linear codes may provide a computing capacity advantage over routing only when the receiver demands a `linearly-reducible' target function. Many known target functions including the arithmetic sum, minimum, and maximum are not linearly-reducible. Thus, the use of non-linear codes is essential in order to obtain a computing capacity advantage over routing if the receiver demands a target function that is not linearly-reducible. We also show that if a target function is linearly-reducible,...
The inference of gene trees with species trees.
Szöllősi, Gergely J; Tannier, Eric; Daubin, Vincent; Boussau, Bastien
2015-01-01
This article reviews the various models that have been used to describe the relationships between gene trees and species trees. Molecular phylogeny has focused mainly on improving models for the reconstruction of gene trees based on sequence alignments. Yet, most phylogeneticists seek to reveal the history of species. Although the histories of genes and species are tightly linked, they are seldom identical, because genes duplicate, are lost or horizontally transferred, and because alleles can coexist in populations for periods that may span several speciation events. Building models describing the relationship between gene and species trees can thus improve the reconstruction of gene trees when a species tree is known, and vice versa. Several approaches have been proposed to solve the problem in one direction or the other, but in general neither gene trees nor species trees are known. Only a few studies have attempted to jointly infer gene trees and species trees. These models account for gene duplication and loss, transfer or incomplete lineage sorting. Some of them consider several types of events together, but none exists currently that considers the full repertoire of processes that generate gene trees along the species tree. Simulations as well as empirical studies on genomic data show that combining gene tree-species tree models with models of sequence evolution improves gene tree reconstruction. In turn, these better gene trees provide a more reliable basis for studying genome evolution or reconstructing ancestral chromosomes and ancestral gene sequences. We predict that gene tree-species tree methods that can deal with genomic data sets will be instrumental to advancing our understanding of genomic evolution.
Du, Ding-Zhu
2001-01-01
This book is a collection of articles studying various Steiner tree problems with applications in industries, such as the design of electronic circuits, computer networking, telecommunication, and perfect phylogeny. The Steiner tree problem was initiated in the Euclidean plane. Given a set of points in the Euclidean plane, the shortest network interconnecting the points in the set is called the Steiner minimum tree. The Steiner minimum tree may contain some vertices which are not the given points. Those vertices are called Steiner points while the given points are called terminals. The shortest network for three terminals was first studied by Fermat (1601-1665). Fermat proposed the problem of finding a point to minimize the total distance from it to three terminals in the Euclidean plane. The direct generalization is to find a point to minimize the total distance from it to n terminals, which is still called the Fermat problem today. The Steiner minimum tree problem is an indirect generalization. Sch...
Bose, Prosenjit; Douieb, Karim; Dujmovic, Vida; King, James; Morin, Pat
2010-01-01
Let P : R^d -> A be a query problem over R^d for which there exists a data structure S that can compute P(q) in O(log n) time for any query point q in R^d. Let D be a probability measure over R^d representing a distribution of queries. We describe a data structure called the odds-on tree, of size O(n^ε), that can be used as a filter that quickly computes P(q) for some query values q in R^d and relies on S for the remaining queries. With an odds-on tree, the expected query time for a point drawn according to D is O(H*+1), where H* is a lower bound on the expected cost of any linear decision tree that solves P. Odds-on trees have a number of applications, including distribution-sensitive data structures for point location in 2-d, point-in-polytope testing in d dimensions, ray shooting in simple polygons, ray shooting in polytopes, nearest-neighbour queries in R^d, point-location in arrangements of hyperplanes in R^d, and many other geometric searching problems that can be solved in the linear-decision tree mo...
Zhu, Ruoqing; Zeng, Donglin; Kosorok, Michael R
In this paper, we introduce a new type of tree-based method, reinforcement learning trees (RLT), which exhibits significantly improved performance over traditional methods such as random forests (Breiman, 2001) under high-dimensional settings. The innovations are three-fold. First, the new method implements reinforcement learning at each selection of a splitting variable during the tree construction processes. By splitting on the variable that brings the greatest future improvement in later splits, rather than choosing the one with largest marginal effect from the immediate split, the constructed tree utilizes the available samples in a more efficient way. Moreover, such an approach enables linear combination cuts at little extra computational cost. Second, we propose a variable muting procedure that progressively eliminates noise variables during the construction of each individual tree. The muting procedure also takes advantage of reinforcement learning and prevents noise variables from being considered in the search for splitting rules, so that towards terminal nodes, where the sample size is small, the splitting rules are still constructed from only strong variables. Last, we investigate asymptotic properties of the proposed method under basic assumptions and discuss rationale in general settings.
Berestovskii, V N
2007-01-01
We show that every inner metric space X is the metric quotient of a complete R-tree via a free isometric action, which we call the covering R-tree of X. The quotient mapping is a weak submetry (hence, open) and light. In the case of a compact 1-dimensional geodesic space X, the free isometric action is via a subgroup of the fundamental group of X. In particular, the Sierpiński gasket and carpet, and the Menger sponge all have the same covering R-tree, which is complete and has at each point valency equal to the continuum. This latter R-tree is of particular interest because it is "universal" in at least two senses: First, every R-tree of valency at most the continuum can be isometrically embedded in it. Second, every Peano continuum is the image of it via an open light mapping. We provide a sketch of our previous construction of the uniform universal cover in the special case of inner metric spaces, the properties of which are used in the proof.
Practices in Code Discoverability
Teuben, Peter; Nemiroff, Robert J; Shamir, Lior
2012-01-01
Much of scientific progress now hinges on the reliability, falsifiability and reproducibility of computer source codes. Astrophysics in particular is a discipline that today leads other sciences in making useful scientific components freely available online, including data, abstracts, preprints, and fully published papers, yet even today many astrophysics source codes remain hidden from public view. We review the importance and history of source codes in astrophysics and previous efforts to develop ways in which information about astrophysics codes can be shared. We also discuss why some scientist coders resist sharing or publishing their codes, the reasons for and importance of overcoming this resistance, and alert the community to a reworking of one of the first attempts for sharing codes, the Astrophysics Source Code Library (ASCL). We discuss the implementation of the ASCL in an accompanying poster paper. We suggest that code could be given a similar level of referencing as data gets in repositories such ...
Djordjevic, Ivan; Vasic, Bane
2010-01-01
This unique book provides a coherent and comprehensive introduction to the fundamentals of optical communications, signal processing and coding for optical channels. It is the first to integrate the fundamentals of coding theory and optical communication.
Blind and readable image watermarking using wavelet tree quantization
Institute of Scientific and Technical Information of China (English)
HU Yuping; YU Shengsheng; ZHOU JingLi; SHI Lei
2004-01-01
A blind and readable image watermarking scheme using wavelet tree quantization is proposed. In order to increase the algorithm robustness and ensure the watermark integrity, error correction coding techniques are used to encode the embedded watermark. In the watermark embedding process, the wavelet coefficients of the host image are grouped into wavelet trees and each watermark bit is embedded by using two trees. The trees are so quantized that they exhibit a large enough statistical difference, which will later be used for watermark extraction. The experimental results show that the proposed algorithm is effective and robust to common image processing operations and some geometric operations such as JPEG compression, JPEG2000 compression, filtering, Gaussian noise attack, and row-column removal. It is demonstrated that the proposed technique is practical.
Zhang, Linfan; Zheng, Shuang
2015-01-01
Quick Response code opens the possibility to convey data in a unique way, yet insufficient prevention and protection might lead to QR codes being exploited on behalf of attackers. This thesis starts by presenting a general introduction of the background and stating two problems regarding QR code security, followed by comprehensive research on both the QR code itself and related issues. From the research, a solution taking advantage of the cloud and cryptography together with an implementation come af...
Visualization of Uncertain Contour Trees
DEFF Research Database (Denmark)
Kraus, Martin
2010-01-01
Contour trees can represent the topology of large volume data sets in a relatively compact, discrete data structure. However, the resulting trees often contain many thousands of nodes; thus, many graph drawing techniques fail to produce satisfactory results. Therefore, several visualization methods...... were proposed recently for the visualization of contour trees. Unfortunately, none of these techniques is able to handle uncertain contour trees although any uncertainty of the volume data inevitably results in partially uncertain contour trees. In this work, we visualize uncertain contour trees...... by combining the contour trees of two morphologically filtered versions of a volume data set, which represent the range of uncertainty. These two contour trees are combined and visualized within a single image such that a range of potential contour trees is represented by the resulting visualization. Thus...
OCTGRAV: Sparse Octree Gravitational N-body Code on Graphics Processing Units
Gaburov, Evghenii; Bédorf, Jeroen; Portegies Zwart, Simon
2010-10-01
Octgrav is a new very fast tree-code which runs on massively parallel Graphical Processing Units (GPU) with NVIDIA CUDA architecture. The algorithms are based on parallel-scan and sort methods. The tree construction and calculation of multipole moments is carried out on the host CPU, while the force calculation, which consists of tree walks and evaluation of interaction lists, is carried out on the GPU. In this way, a sustained performance of about 100 GFLOP/s and data transfer rates of about 50 GB/s are achieved. It takes about a second to compute forces on a million particles with an opening angle of θ ≈ 0.5. To test the performance and feasibility, we implemented the algorithms in CUDA in the form of a gravitational tree-code which completely runs on the GPU. The tree construction and traverse algorithms are portable to many-core devices which have support for CUDA or OpenCL programming languages. The gravitational tree-code outperforms tuned CPU code during the tree-construction and shows a performance improvement of more than a factor 20 overall, resulting in a processing rate of more than 2.8 million particles per second. The code has a convenient user interface and is freely available for use.
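The essence of a gravitational tree-code is the multipole acceptance criterion: a distant cell is treated as a single mass when size/distance < θ. The following is a minimal 1D CPU sketch of that idea (our own illustration, not Octgrav's CUDA implementation; particle positions are assumed distinct):

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    size: float                              # cell width
    mass: float                              # total mass in the cell
    com: float                               # centre of mass
    children: list = field(default_factory=list)

def build(particles, lo, hi):
    """Build a 1D binary tree over (position, mass) pairs."""
    mass = sum(m for _, m in particles)
    com = sum(p * m for p, m in particles) / mass
    node = Node(size=hi - lo, mass=mass, com=com)
    if len(particles) > 1:
        mid = (lo + hi) / 2
        left = [pm for pm in particles if pm[0] < mid]
        right = [pm for pm in particles if pm[0] >= mid]
        node.children = [build(ps, a, b) for ps, a, b in
                         [(left, lo, mid), (right, mid, hi)] if ps]
    return node

def force(node, x, theta=0.5, eps=1e-3):
    """Softened 1/r^2 attraction at x. A cell is used whole when
    size/distance < theta (the multipole acceptance criterion);
    otherwise the walk descends into its children."""
    d = abs(node.com - x)
    if d == 0.0:
        return 0.0
    if not node.children or node.size / d < theta:
        return node.mass * (node.com - x) / (d ** 3 + eps)
    return sum(force(c, x, theta, eps) for c in node.children)

parts = [(1.0, 1.0), (2.0, 1.0), (9.0, 1.0)]
root = build(parts, 0.0, 10.0)
# With theta = 0 no cell is ever accepted, so the walk is exact.
exact = sum(m * (p - 0.0) / (abs(p) ** 3 + 1e-3) for p, m in parts)
assert abs(force(root, 0.0, theta=0.0) - exact) < 1e-12
```

Larger θ accepts more cells and trades accuracy for speed; the GPU version parallelizes exactly this walk over all particles.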
Forrow, Aden; Dunkel, Jörn
2016-01-01
Coherent, large scale dynamics in many nonequilibrium physical, biological, or information transport networks are driven by small-scale local energy input. We introduce and explore a generic model for compressible active flows on tree networks. In contrast to thermally-driven systems, active friction selects discrete states with only a small number of oscillation modes activated at distinct fixed amplitudes. This state selection interacts with graph topology to produce different localized dynamical time scales in separate regions of large networks. Using perturbation theory, we systematically predict the stationary states of noisy networks and find good agreement with a Bayesian state estimation based on a hidden Markov model applied to simulated time series data on binary trees. While the number of stable states per tree scales exponentially with the number of edges, the mean number of activated modes in each state averages ~1/4 the number of edges. More broadly, these results suggest that the macrosco...
DEFF Research Database (Denmark)
Durhuus, Bergfinnur Jøgvan; Napolitano, George Maria
2012-01-01
The Ising model on a class of infinite random trees is defined as a thermodynamic limit of finite systems. A detailed description of the corresponding distribution of infinite spin configurations is given. As an application, we study the magnetization properties of such systems and prove that they exhibit no spontaneous magnetization. Furthermore, the values of the Hausdorff and spectral dimensions of the underlying trees are calculated and found to be, respectively, d̄_h = 2 and d̄_s = 4/3.
Roux, Kenneth H; Teuber, Suzanne S; Sathe, Shridhar K
2003-08-01
Allergic reactions to tree nuts can be serious and life threatening. Considerable research has been conducted in recent years in an attempt to characterize those allergens that are most responsible for allergy sensitization and triggering. Both native and recombinant nut allergens have been identified and characterized and, for some, the IgE-reactive epitopes described. Some allergens, such as lipid transfer proteins, profilins, and members of the Bet v 1-related family, represent minor constituents in tree nuts. These allergens are frequently cross-reactive with other food and pollen homologues, and are considered panallergens. Others, such as legumins, vicilins, and 2S albumins, represent major seed storage protein constituents of the nuts. The allergenic tree nuts discussed in this review include those most commonly responsible for allergic reactions such as hazelnut, walnut, cashew, and almond as well as those less frequently associated with allergies including pecan, chestnut, Brazil nut, pine nut, macadamia nut, pistachio, coconut, Nangai nut, and acorn.
A. van Deursen (Arie); L.M.F. Moonen (Leon); A. van den Bergh; G. Kok
2001-01-01
textabstractTwo key aspects of extreme programming (XP) are unit testing and merciless refactoring. Given the fact that the ideal test code / production code ratio approaches 1:1, it is not surprising that unit tests are being refactored. We found that refactoring test code is different from
Bergstra, Jan A
2010-01-01
General definitions as well as rules of reasoning regarding control code production, distribution, deployment, and usage are described. The role of testing, trust, confidence and risk analysis is considered. A rationale for control code testing is sought and found for the case of safety critical embedded control code.
DEFF Research Database (Denmark)
Bombin Palomo, Hector
2015-01-01
Color codes are topological stabilizer codes with unusual transversality properties. Here I show that their group of transversal gates is optimal and only depends on the spatial dimension, not the local geometry. I also introduce a generalized, subsystem version of color codes. In 3D they allow...
Deursen, A. van; Moonen, L.M.F.; Bergh, A. van den; Kok, G.
2001-01-01
Two key aspects of extreme programming (XP) are unit testing and merciless refactoring. Given the fact that the ideal test code / production code ratio approaches 1:1, it is not surprising that unit tests are being refactored. We found that refactoring test code is different from refactoring product
Efficient image compression scheme based on differential coding
Zhu, Li; Wang, Guoyou; Liu, Ying
2007-11-01
Embedded zerotree wavelet (EZW) coding and Set Partitioning in Hierarchical Trees (SPIHT) coding, introduced by J. M. Shapiro and Amir Said, are very effective and widely used in many fields. In this study, a brief explanation of the principles of SPIHT is first provided, and then some improvements to the SPIHT algorithm, motivated by experiments, are introduced. 1) To address redundancy among the coefficients in the wavelet domain, we propose a differential method to reduce it during coding. 2) Meanwhile, based on the characteristic distribution of the coefficients in each subband, we adjust the sorting pass and optimize the differential coding, in order to reduce redundant coding in each subband. 3) The image coding results, calculated at a certain threshold, show that through differential coding the compression rate is higher and the quality of the reconstructed image is raised greatly: at 0.5 bpp (bits per pixel), the PSNR (peak signal-to-noise ratio) of the reconstructed image exceeds that of standard SPIHT by 0.2-0.4 dB.
ARC Code TI: CODE Software Framework
National Aeronautics and Space Administration — CODE is a software framework for control and observation in distributed environments. The basic functionality of the framework allows a user to observe a distributed...
ARC Code TI: ROC Curve Code Augmentation
National Aeronautics and Space Administration — ROC (Receiver Operating Characteristic) curve Code Augmentation was written by Rodney Martin and John Stutz at NASA Ames Research Center and is a modification of ROC...
Fountain Codes: LT And Raptor Codes Implementation
Directory of Open Access Journals (Sweden)
Ali Bazzi, Hiba Harb
2017-01-01
Full Text Available Digital fountain codes are a new class of random error correcting codes designed for efficient and reliable data delivery over erasure channels such as the internet. These codes were developed to provide robustness against erasures in a way that resembles a fountain of water. A digital fountain is rateless in the sense that the sender can send a limitless number of encoded packets, and the receiver does not care which packets are received or lost as long as it collects enough packets to recover the original data. In this paper, the design of fountain codes is explored through an implementation of the encoding and decoding algorithms, and their performance is studied in terms of encoded/decoded symbols, reception overhead, data length, and failure probability.
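The mechanics can be sketched with a toy LT code: the encoder XORs a random subset of source symbols into each packet, and the decoder "peels" packets that have exactly one unresolved symbol. The uniform degree choice below is a stand-in for the robust soliton distribution used in practice:

```python
import random

def lt_encode(source, rng):
    """One LT-encoded packet: draw a degree, pick that many distinct
    source symbols uniformly at random, XOR them together."""
    degree = rng.choice([1, 2, 3, 4])        # toy degree distribution
    idx = set(rng.sample(range(len(source)), degree))
    value = 0
    for i in idx:
        value ^= source[i]
    return idx, value

def lt_decode(packets, k):
    """Peeling decoder: repeatedly find a packet with exactly one
    unresolved symbol, recover it, and substitute it everywhere."""
    recovered = {}
    progress = True
    while progress and len(recovered) < k:
        progress = False
        for idx, value in packets:
            unknown = idx - set(recovered)
            if len(unknown) == 1:
                (i,) = unknown
                for j in idx - unknown:      # XOR out already-known symbols
                    value ^= recovered[j]
                recovered[i] = value
                progress = True
    return [recovered.get(i) for i in range(k)]

source = [5, 17, 42, 99]
# Any received subset the peeling chain can work through suffices;
# which particular packets arrive does not matter.
packets = [({0}, 5), ({0, 1}, 5 ^ 17), ({1, 2}, 17 ^ 42), ({2, 3}, 42 ^ 99)]
assert lt_decode(packets, k=4) == source

rng = random.Random(7)
idx, value = lt_encode(source, rng)
check = 0
for i in idx:
    check ^= source[i]
assert value == check                        # packet is XOR of its neighbours
```

Decoding fails only when no degree-1 packet remains, which is why real deployments send a small reception overhead beyond k packets.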
DEFF Research Database (Denmark)
Lavstsen, Thomas; Salanti, Ali; Jensen, Anja T R;
2003-01-01
and organization of the 3D7 PfEMP1 repertoire was investigated on the basis of the complete genome sequence. METHODS: Using two tree-building methods we analysed the coding and non-coding sequences of 3D7 var and rif genes as well as var genes of other parasite strains. RESULTS: var genes can be sub...
Difference and dynamic binarization of binary arithmetic coding
Institute of Scientific and Technical Information of China (English)
吴江铭
2013-01-01
This paper first gives an overview of CABAC, the high-efficiency entropy-coding method in the HEVC draft published by JCT-VC, which carries over the CABAC of H.264 and improves its binarization process. It then optimizes the binarization step of binary arithmetic coding by using dynamic Huffman coding, and applies a differencing step to the data before binarization. Finally, experiments on compressing Java files show that the proposed difference-and-dynamic-binarization arithmetic coding achieves a clearly higher compression ratio than both PAQ and CABAC.
Adaptive Context Tree Weighting
O'Neill, Alexander; Shao, Wen; Sunehag, Peter
2012-01-01
We describe an adaptive context tree weighting (ACTW) algorithm, as an extension to the standard context tree weighting (CTW) algorithm. Unlike the standard CTW algorithm, which weights all observations equally regardless of the depth, ACTW gives increasing weight to more recent observations, aiming to improve performance in cases where the input sequence is from a non-stationary distribution. Data compression results show ACTW variants improving over CTW on merged files from standard compression benchmark tests while never being significantly worse on any individual file.
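The per-context estimator inside CTW is the Krichevsky-Trofimov (KT) estimator; a simple way to illustrate the recency-weighting idea behind ACTW is to exponentially discount old counts (note: this single-context sketch is an assumption for illustration — the actual ACTW weighting scheme in the paper differs):

```python
def kt_sequence_probability(bits, gamma=1.0):
    """Sequential KT estimate of a binary sequence. With gamma = 1 this is the
    standard KT estimator; gamma < 1 discounts old counts so recent
    observations weigh more, as in non-stationary settings."""
    zeros = ones = 0.0
    prob = 1.0
    for bit in bits:
        p_one = (ones + 0.5) / (zeros + ones + 1.0)  # KT predictive probability
        prob *= p_one if bit else 1.0 - p_one
        zeros *= gamma   # discount before counting the new symbol
        ones *= gamma
        if bit:
            ones += 1.0
        else:
            zeros += 1.0
    return prob
```

A homogeneous sequence receives much higher probability (hence shorter code length, about -log2 of the probability) than an alternating one of the same length.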
Universal Rateless Codes From Coupled LT Codes
Aref, Vahid
2011-01-01
It was recently shown that spatial coupling of individual low-density parity-check codes improves the belief-propagation threshold of the coupled ensemble essentially to the maximum a posteriori threshold of the underlying ensemble. We study the performance of spatially coupled low-density generator-matrix ensembles when used for transmission over binary-input memoryless output-symmetric channels. We show by means of density evolution that the threshold saturation phenomenon also takes place in this setting. Our motivation for studying low-density generator-matrix codes is that they can easily be converted into rateless codes. Although there are already several classes of excellent rateless codes known to date, rateless codes constructed via spatial coupling might offer some additional advantages. In particular, by the very nature of the threshold phenomenon one expects that codes constructed on this principle can be made to be universal, i.e., a single construction can uniformly approach capacity over the cl...
Software Certification - Coding, Code, and Coders
Havelund, Klaus; Holzmann, Gerard J.
2011-01-01
We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.
Oflazer, K
1996-01-01
This paper presents an efficient algorithm for retrieving, from a database of trees, all trees that match a given query tree approximately, that is, within a certain error tolerance. It has natural language processing applications in searching for matches in example-based translation systems, and in retrieval from lexical databases containing entries of complex feature structures. The algorithm has been implemented on SparcStations, and for large randomly generated synthetic tree databases (some having tens of thousands of trees) it can associatively search for trees with a small error in a matter of tenths of a second to a few seconds.
Rice, R. F.; Lee, J. J.
1986-01-01
Scheme for coding facsimile messages promises to reduce data transmission requirements to one-tenth current level. Coding scheme paves way for true electronic mail in which handwritten, typed, or printed messages or diagrams sent virtually instantaneously - between buildings or between continents. Scheme, called Universal System for Efficient Electronic Mail (USEEM), uses unsupervised character recognition and adaptive noiseless coding of text. Image quality of resulting delivered messages improved over messages transmitted by conventional coding. Coding scheme compatible with direct-entry electronic mail as well as facsimile reproduction. Text transmitted in this scheme automatically translated to word-processor form.
Lim, Sung Hoon; Gamal, Abbas El; Chung, Sae-Young
2010-01-01
A noisy network coding scheme for sending multiple sources over a general noisy network is presented. For multi-source multicast networks, the scheme naturally extends both network coding over noiseless networks by Ahlswede, Cai, Li, and Yeung, and compress-forward coding for the relay channel by Cover and El Gamal to general discrete memoryless and Gaussian networks. The scheme also recovers as special cases the results on coding for wireless relay networks and deterministic networks by Avestimehr, Diggavi, and Tse, and coding for wireless erasure networks by Dana, Gowaikar, Palanki, Hassibi, and Effros. The scheme involves message repetition coding, relay signal compression, and simultaneous decoding. Unlike previous compress--forward schemes, where independent messages are sent over multiple blocks, the same message is sent multiple times using independent codebooks as in the network coding scheme for cyclic networks. Furthermore, the relays do not use Wyner--Ziv binning as in previous compress-forward sch...
Testing algebraic geometric codes
Institute of Scientific and Technical Information of China (English)
CHEN Hao
2009-01-01
Property testing was initially studied in the 1990s from various motivations. A code C ⊆ GF(r)^n is locally testable if there is a randomized algorithm which can distinguish, with high probability, the codewords from a vector essentially far from the code by accessing only a very small (typically constant) number of the vector's coordinates. The problem of testing codes was first studied by Blum, Luby and Rubinfeld and is closely related to probabilistically checkable proofs (PCPs). How to characterize locally testable codes is a complex and challenging problem. Local tests have been studied for Reed-Solomon (RS), Reed-Muller (RM), cyclic, dual-of-BCH, and the trace subcodes of algebraic geometric codes. In this paper we give testers for algebraic geometric codes with linear parameters (as functions of the dimension). We also give a moderate condition under which a family of algebraic geometric codes cannot be locally testable.
Institute of Scientific and Technical Information of China (English)
ZHANG Aili; LIU Xiufeng
2006-01-01
Chinese remainder codes are constructed by applying weak block designs and the Chinese remainder theorem of ring theory. The new type of linear codes takes the congruence class in the congruence class ring R/I1 ∩ I2 ∩ … ∩ In for the information bit, embeds R/Ji into R/I1 ∩ I2 ∩ … ∩ In, and assigns the cosets of R/Ji as the subring of R/I1 ∩ I2 ∩ … ∩ In and the cosets of R/Ji in R/I1 ∩ I2 ∩ … ∩ In as check lines. Many code classes with high code rates exist among the Chinese remainder codes. Chinese remainder codes are an essential generalization of Sun Zi codes.
Institute of Scientific and Technical Information of China (English)
张爱丽; 刘秀峰; 靳蕃
2004-01-01
Chinese Remainder Codes are constructed by applying weak block designs and Chinese Remainder Theorem of ring theory. The new type of linear codes take the congruence class in the congruence class ring R/I1∩I2∩…∩In for the information bit, embed R/Ji into R/I1∩I2∩…∩In, and assign the cosets of R/Ji as the subring of R/I1∩I2∩…∩In and the cosets of R/Ji in R/I1∩I2∩…∩In as check lines. There exist many code classes in Chinese Remainder Codes, which have high code rates. Chinese Remainder Codes are the essential generalization of Sun Zi Codes.
DEFF Research Database (Denmark)
Adelstein, Jennifer; Clegg, Stewart
2016-01-01
Ethical codes have been hailed as an explicit vehicle for achieving more sustainable and defensible organizational practice. Nonetheless, when legal compliance and corporate governance codes are conflated, codes can be used to define organizational interests ostentatiously by stipulating norms...... for employee ethics. Such codes have a largely cosmetic and insurance function, acting subtly and strategically to control organizational risk management and protection. In this paper, we conduct a genealogical discourse analysis of a representative code of ethics from an international corporation...... to understand how management frames expectations of compliance. Our contribution is to articulate the problems inherent in codes of ethics, and we make some recommendations to address these to benefit both an organization and its employees. In this way, we show how a code of ethics can provide a foundation...
Defeating the coding monsters.
Colt, Ross
2007-02-01
Accuracy in coding is rapidly becoming a required skill for military health care providers. Clinic staffing, equipment purchase decisions, and even reimbursement will soon be based on the coding data that we provide. Learning the complicated myriad of rules to code accurately can seem overwhelming. However, the majority of clinic visits in a typical outpatient clinic generally fall into two major evaluation and management codes, 99213 and 99214. If health care providers can learn the rules required to code a 99214 visit, then this will provide a 90% solution that can enable them to accurately code the majority of their clinic visits. This article demonstrates a step-by-step method to code a 99214 visit, by viewing each of the three requirements as a monster to be defeated.
Serially Concatenated IRA Codes
Cheng, Taikun; Belzer, Benjamin J
2007-01-01
We address the error floor problem of low-density parity check (LDPC) codes on the binary-input additive white Gaussian noise (AWGN) channel, by constructing a serially concatenated code consisting of two systematic irregular repeat accumulate (IRA) component codes connected by an interleaver. The interleaver is designed to prevent stopping-set error events in one of the IRA codes from propagating into stopping set events of the other code. Simulations with two 128-bit rate 0.707 IRA component codes show that the proposed architecture achieves a much lower error floor at higher SNRs, compared to a 16384-bit rate 1/2 IRA code, but incurs an SNR penalty of about 2 dB at low to medium SNRs. Experiments indicate that the SNR penalty can be reduced at larger blocklengths.
Enforcing the use of API functions in Linux code
DEFF Research Database (Denmark)
Lawall, Julia; Muller, Gilles; Palix, Nicolas Jean-Michel
2009-01-01
In the Linux kernel source tree, header files typically define many small functions that have a simple behavior but are critical to ensure readability, correctness, and maintainability. We have observed, however, that some Linux code does not use these functions systematically. In this paper, we...... in the header file include/linux/usb.h....
A Suffix Tree Or Not a Suffix Tree?
DEFF Research Database (Denmark)
Starikovskaya, Tatiana; Vildhøj, Hjalte Wedel
2015-01-01
, in particular we do not require that S ends with a unique symbol. This corresponds to considering the more general definition of implicit or extended suffix trees. Such general suffix trees have many applications and are for example needed to allow efficient updates when suffix trees are built online. We prove...
Tree Modeling with Real Tree-Parts Examples.
Xie, Ke; Yan, Feilong; Sharf, Andrei; Deussen, Oliver; Huang, Hui; Chen, Baoquan
2016-12-01
We introduce a 3D tree modeling technique that utilizes examples of real trees to enhance tree creation with realistic structures and fine-level details. In contrast to previous works that use smooth generalized cylinders to represent tree branches, our method generates realistic looking tree models with complex branching geometry by employing an exemplar database consisting of real-life trees reconstructed from scanned data. These trees are sliced into representative parts (denoted as tree-cuts), representing trunk logs and branching structures. In the modeling process, tree-cuts are positioned in space in an intuitive manner, serving as efficient proxies that guide the creation of the complete tree. Allometry rules are taken into account to ensure reasonable relations between adjacent branches. Realism is further enhanced by automatically transferring geometric textures from our database onto tree branches as well as by guided growing of foliage. Our results demonstrate the complexity and variety of trees that can be generated with our method within a few minutes. We carried out a user study to test the effectiveness of our modeling technique.
Balgooy, van M.M.J.
1998-01-01
With the publication of the second volume of the series ‘Malesian Seed Plants’, entitled ‘Portraits of Tree Families’, I would like to refer to the Introduction of the first volume, ‘Spot-characters’ for a historical background and an explanation of the aims of this series. The present book treats 1
Certified Kruskal's Tree Theorem
Directory of Open Access Journals (Sweden)
Christian Sternagel
2014-07-01
Full Text Available This article presents the first formalization of Kruskal's tree theorem in a proof assistant. The Isabelle/HOL development follows the lines of Nash-Williams' original minimal bad sequence argument for proving the tree theorem. Along the way, proofs of Dickson's lemma and Higman's lemma, as well as some technical details of the formalization, are discussed.
Institute of Scientific and Technical Information of China (English)
2009-01-01
West of Tiananmen Square in Beijing, in Zhongshan Park, there stand several ancient cypress trees, each more than 1,000 years old. Their leafy crowns all rise more than 20 meters high, while four have trunks that are 6 meters in circumference. The most unique of these
DEFF Research Database (Denmark)
Assent, Ira; Krieger, Ralph; Afschari, Farzad;
2008-01-01
Continuous growth in sensor data and other temporal data increases the importance of retrieval and similarity search in time series data. Efficient time series query processing is crucial for interactive applications. Existing multidimensional indexes like the R-tree provide efficient querying fo...
Tree Transduction Tools for Cdec
Directory of Open Access Journals (Sweden)
Austin Matthews
2014-09-01
Full Text Available We describe a collection of open source tools for learning tree-to-string and tree-to-tree transducers and the extensions to the cdec decoder that enable translation with these. Our modular, easy-to-extend tools extract rules from trees or forests aligned to strings and trees subject to different structural constraints. A fast, multithreaded implementation of the Cohn and Blunsom (2009) model for extracting compact tree-to-string rules is also included. The implementation of the tree composition algorithm used by cdec is described, and translation quality and decoding time results are presented. Our experimental results add to the body of evidence suggesting that tree transducers are a compelling option for translation, particularly when decoding speed and translation model size are important.
Tree Formation Using Coordinate Method
Directory of Open Access Journals (Sweden)
Monika Choudhary
2015-06-01
Full Text Available In this paper we introduce a new method of tree formation: a coordinate-based method by which tree structures can be stored and accessed. Parsing is one of the most important modules in NLP, and its output is generally parse trees. Currently, TAG (Tree Adjoining Grammar) is widely used owing to its linguistic and formal properties; it is simply a tree-generating system whose unit structures are trees. We use our new method to store trees, working on English-to-Hindi translation. We experimented with different sentences from English to Hindi; our method offers an easy way to manipulate trees. It has been implemented on a small corpus with a finite number of structures and can be extended in the future.
Hybrid Parallel Contour Trees, Version 1.0
Energy Technology Data Exchange (ETDEWEB)
2017-01-03
A common operation in scientific visualization is to compute and render a contour of a data set. Given a function of the form f : R^d -> R, a level set is defined as an inverse image f^-1(h) for an isovalue h, and a contour is a single connected component of a level set. The Reeb graph can then be defined to be the result of contracting each contour to a single point, and is well defined for Euclidean spaces or for general manifolds. For simple domains, the graph is guaranteed to be a tree, and is called the contour tree. Analysis can then be performed on the contour tree in order to identify isovalues of particular interest, based on various metrics, and render the corresponding contours, without having to know such isovalues a priori. This code is intended to be the first data-parallel algorithm for computing contour trees. Our implementation will use the portable data-parallel primitives provided by Nvidia’s Thrust library, allowing us to compile our same code for both GPUs and multi-core CPUs. Native OpenMP and purely serial versions of the code will likely also be included. It will also be extended to provide a hybrid data-parallel / distributed algorithm, allowing scaling beyond a single GPU or CPU.
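One ingredient of the contour tree, the join tree of the scalar field, can be computed serially with a union-find sweep from high to low values; the following sketch illustrates the idea (the code the abstract describes is data-parallel over Thrust — this sequential toy is only for intuition):

```python
def join_tree(values, edges):
    """Join tree of a scalar field on a graph: sweep vertices from high to
    low value, unioning components; each merge emits a join-tree arc."""
    n = len(values)
    adj = {v: [] for v in range(n)}
    for a, b in edges:
        adj[a].append(b)
        adj[b].append(a)
    parent = {}   # union-find over already-swept vertices
    lowest = {}   # component root -> most recently swept (lowest) vertex
    tree = []
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    for v in sorted(range(n), key=lambda u: values[u], reverse=True):
        parent[v] = v
        lowest[v] = v
        for u in adj[v]:
            if u in parent:                # neighbor already swept
                ru, rv = find(u), find(v)
                if ru != rv:
                    tree.append((lowest[ru], v))  # that component joins at v
                    parent[ru] = rv
                    lowest[rv] = v
    return tree
```

Running the same sweep from low to high gives the split tree; merging the two yields the contour tree for simply connected domains.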
DEFF Research Database (Denmark)
Andersen, Esben Sloth
2002-01-01
The purpose of this paper is to bring forth an interaction between evolutionary economics and industrial systematics. The suggested solution is to reconstruct the "family tree" of the industries. Such a tree is based on similarities, but it may also reflect the evolutionary history in industries...... finding of optimal industrial trees. The results are presented as taxonomic trees that can easily be compared with the hierarchical structure of existing systems of industrial classification....
GRAVITATIONALLY CONSISTENT HALO CATALOGS AND MERGER TREES FOR PRECISION COSMOLOGY
Energy Technology Data Exchange (ETDEWEB)
Behroozi, Peter S.; Wechsler, Risa H.; Wu, Hao-Yi [Kavli Institute for Particle Astrophysics and Cosmology, Department of Physics, Stanford University, Stanford, CA 94305 (United States); Busha, Michael T. [Institute for Theoretical Physics, University of Zurich, CH-8006 Zurich (Switzerland); Klypin, Anatoly A. [Astronomy Department, New Mexico State University, Las Cruces, NM 88003 (United States); Primack, Joel R., E-mail: behroozi@stanford.edu, E-mail: rwechsler@stanford.edu [Department of Physics, University of California at Santa Cruz, Santa Cruz, CA 95064 (United States)
2013-01-20
We present a new algorithm for generating merger trees and halo catalogs which explicitly ensures consistency of halo properties (mass, position, and velocity) across time steps. Our algorithm has demonstrated the ability to improve both the completeness (through detecting and inserting otherwise missing halos) and purity (through detecting and removing spurious objects) of both merger trees and halo catalogs. In addition, our method is able to robustly measure the self-consistency of halo finders; it is the first to directly measure the uncertainties in halo positions, halo velocities, and the halo mass function for a given halo finder based on consistency between snapshots in cosmological simulations. We use this algorithm to generate merger trees for two large simulations (Bolshoi and Consuelo) and evaluate two halo finders (ROCKSTAR and BDM). We find that both the ROCKSTAR and BDM halo finders track halos extremely well; in both, the number of halos which do not have physically consistent progenitors is at the 1%-2% level across all halo masses. Our code is publicly available at http://code.google.com/p/consistent-trees. Our trees and catalogs are publicly available at http://hipacc.ucsc.edu/Bolshoi/.
Bounds on Average Time Complexity of Decision Trees
Chikalov, Igor
2011-01-01
In this chapter, bounds on the average depth and the average weighted depth of decision trees are considered. Similar problems are studied in search theory [1], coding theory [77], and the design and analysis of algorithms (e.g., sorting) [38]. For any diagnostic problem, the minimum average depth of a decision tree is bounded from below by the entropy of the probability distribution (with a multiplier 1/log2 k for a problem over a k-valued information system). Among diagnostic problems, the problems with a complete set of attributes have the lowest minimum average depth of decision trees (e.g., the problem of building an optimal prefix code [1] and a blood test study under the assumption that exactly one patient is ill [23]). For such problems, the minimum average depth of a decision tree exceeds the lower bound by at most one. The minimum average depth reaches its maximum on problems in which each attribute is "indispensable" [44] (e.g., a diagnostic problem with n attributes and k^n pairwise different rows in the decision table, and the problem of implementing the modulo 2 summation function). These problems have minimum average depth of decision tree equal to the number of attributes in the problem description. © Springer-Verlag Berlin Heidelberg 2011.
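The entropy lower bound quoted above is straightforward to compute; a minimal sketch:

```python
import math

def entropy_lower_bound(probs, k=2):
    """Lower bound on the minimum average depth of a decision tree for a
    diagnostic problem over a k-valued information system: H(p) / log2(k)."""
    entropy = -sum(p * math.log2(p) for p in probs if p > 0)
    return entropy / math.log2(k)
```

For a uniform distribution over 8 outcomes with binary attributes the bound is 3 queries; with ternary attributes over 9 outcomes it is 2, matching the intuition that each k-valued test yields at most log2(k) bits.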
Reddy, Sushma; Kimball, Rebecca T; Pandey, Akanksha; Hosner, Peter A; Braun, Michael J; Hackett, Shannon J; Han, Kin-Lan; Harshman, John; Huddleston, Christopher J; Kingston, Sarah; Marks, Ben D; Miglia, Kathleen J; Moore, William S; Sheldon, Frederick H; Witt, Christopher C; Yuri, Tamaki; Braun, Edward L
2017-09-01
Phylogenomics, the use of large-scale data matrices in phylogenetic analyses, has been viewed as the ultimate solution to the problem of resolving difficult nodes in the tree of life. However, it has become clear that analyses of these large genomic data sets can also result in conflicting estimates of phylogeny. Here, we use the early divergences in Neoaves, the largest clade of extant birds, as a "model system" to understand the basis for incongruence among phylogenomic trees. We were motivated by the observation that trees from two recent avian phylogenomic studies exhibit conflicts. Those studies used different strategies: 1) collecting many characters [~42 megabase pairs (Mbp) of sequence data] from 48 birds, sometimes including only one taxon for each major clade; and 2) collecting fewer characters (~0.4 Mbp) from 198 birds, selected to subdivide long branches. However, the studies also used different data types: the taxon-poor data matrix comprised 68% non-coding sequences whereas coding exons dominated the taxon-rich data matrix. This difference raises the question of whether the primary reason for incongruence is the number of sites, the number of taxa, or the data type. To test among these alternative hypotheses we assembled a novel, large-scale data matrix comprising 90% non-coding sequences from 235 bird species. Although increased taxon sampling appeared to have a positive impact on phylogenetic analyses, the most important variable was data type. Indeed, by analyzing different subsets of the taxa in our data matrix we found that increased taxon sampling actually resulted in increased congruence with the tree from the previous taxon-poor study (which had a majority of non-coding data) instead of the taxon-rich study (which largely used coding data). We suggest that the observed differences in the estimates of topology for these studies reflect data-type effects due to violations of the models used in phylogenetic analyses, some of which
Protecting Trees Means Protecting Ourselves
Institute of Scientific and Technical Information of China (English)
刘国虹; 张超
2016-01-01
As everyone knows, spring is a planting season. Every year people all over China go out to plant trees. Trees can make our environment more beautiful. Trees can stop wind from blowing the earth and sand away. They can also prevent soil from being washed away by wa-
Junction trees of general graphs
Institute of Scientific and Technical Information of China (English)
Xiaofei WANG; Jianhua GUO
2008-01-01
In this paper, we study the maximal prime subgraphs and their corresponding structure for any undirected graph. We introduce the notion of junction trees and investigate their structural characteristics, including junction properties, induced-subtree properties, running-intersection properties and maximum-weight spanning tree properties. Furthermore, the characteristics of leaves and edges of junction trees are discussed.
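The maximum-weight spanning tree property mentioned above can be made concrete: by a classical result, for a decomposable graph a maximum-weight spanning tree of the clique graph, with edge weights equal to separator sizes, is a junction tree satisfying the running-intersection property. A Kruskal-style sketch (an illustration of that classical construction, not the paper's own algorithm):

```python
from itertools import combinations

def junction_tree(cliques):
    """Maximum-weight spanning tree over cliques (given as sets of vertices),
    with weights = separator sizes, built greedily with union-find."""
    edges = sorted(
        ((len(a & b), i, j)
         for (i, a), (j, b) in combinations(enumerate(cliques), 2)),
        reverse=True,
    )
    parent = list(range(len(cliques)))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    tree = []
    for w, i, j in edges:
        if w == 0:
            break  # empty separators cannot satisfy running intersection
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            tree.append((i, j, w))  # (clique i, clique j, separator size)
    return tree
```

For the chain graph a-b-c-d with maximal cliques {a,b}, {b,c}, {c,d}, the result is the chain of cliques with singleton separators {b} and {c}.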
Tree decompositions with small cost
Bodlaender, H.L.; Fomin, F.V.
2002-01-01
The f-cost of a tree decomposition ({X_i | i ∈ I}, T = (I, F)) for a function f : N → R+ is defined as Σ_{i ∈ I} f(|X_i|). This measure is associated with the running time or memory use of some algorithms that use the tree decomposition. In this paper we investigate the problem of finding tree decompositions
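The definition is easy to evaluate directly; a minimal sketch showing why f-cost differs from width (the example graph and cost function are illustrative assumptions):

```python
def f_cost(bags, f):
    """f-cost of a tree decomposition: sum of f(|X_i|) over its bags X_i."""
    return sum(f(len(bag)) for bag in bags)

# With f(s) = 2^s (the per-bag table size of a typical dynamic program over
# the decomposition), three size-2 bags for the path 1-2-3-4 cost 3 * 4 = 12,
# whereas the trivial one-bag decomposition {1,2,3,4} costs 2^4 = 16, even
# though both are valid decompositions of the same graph.
table_size = lambda s: 2 ** s
```

Minimizing f-cost thus captures the total work of the dynamic program rather than only its worst bag.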
Distance labeling schemes for trees
DEFF Research Database (Denmark)
Alstrup, Stephen; Gørtz, Inge Li; Bistrup Halvorsen, Esben;
2016-01-01
We consider distance labeling schemes for trees: given a tree with n nodes, label the nodes with binary strings such that, given the labels of any two nodes, one can determine, by looking only at the labels, the distance in the tree between the two nodes. A lower bound by Gavoille et al. [Gavoill...
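A naive scheme makes the idea concrete: label each node with its root-to-node path, and recover the distance from the two labels alone via the lowest common ancestor. (This toy scheme uses O(n log n)-bit labels in the worst case; it is not the compact scheme of the paper, only an illustration of the decoding-from-labels requirement.)

```python
def make_labels(tree, root):
    """Label every node with its root-to-node path of node ids."""
    labels = {}
    stack = [(root, (root,))]
    while stack:
        u, path = stack.pop()
        labels[u] = path
        for v in tree.get(u, []):   # tree: node -> list of children
            stack.append((v, path + (v,)))
    return labels

def distance(lu, lv):
    """Tree distance from two labels: depths minus twice the LCA depth,
    where the LCA depth is the length of the common path prefix."""
    common = 0
    for a, b in zip(lu, lv):
        if a != b:
            break
        common += 1
    return (len(lu) - common) + (len(lv) - common)
```

Note that `distance` looks only at the two labels, never at the tree — exactly the contract a distance labeling scheme must satisfy.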
Institute of Scientific and Technical Information of China (English)
孟凡洪; 苏耕; 杨继
2000-01-01
The present paper shows the coordinates of a tree and its vertices, defines a kind of Trees with Odd-Number Radiant Type (TONRT), deals with the gracefulness of TONRT by using the edge-moving theorem, and uses graceful TONRT to construct another class of graceful trees.
Generalising tree traversals to DAGs
DEFF Research Database (Denmark)
Bahr, Patrick; Axelsson, Emil
2015-01-01
We present a recursion scheme based on attribute grammars that can be transparently applied to trees and acyclic graphs. Our recursion scheme allows the programmer to implement a tree traversal and then apply it to compact graph representations of trees instead. The resulting graph traversals avoid...
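The core benefit — running a tree traversal on a DAG so that shared subtrees are processed once — can be sketched with a memoized fold (a simplification: the paper's attribute-grammar scheme also supports inherited attributes, which are omitted here):

```python
def fold_dag(node, children, combine, memo=None):
    """Bottom-up 'tree traversal' applied to a DAG: memoising on node
    identity means each shared subterm is visited exactly once."""
    if memo is None:
        memo = {}
    if node in memo:
        return memo[node]
    results = [fold_dag(c, children, combine, memo) for c in children(node)]
    memo[node] = combine(node, results)
    return memo[node]

# A DAG in which node 'b' is shared by both child slots of 'a'.
dag = {'a': ['b', 'b'], 'b': ['c'], 'c': []}
visits = []

def tree_size(node, child_sizes):
    visits.append(node)            # record each actual evaluation
    return 1 + sum(child_sizes)

size = fold_dag('a', dag.__getitem__, tree_size)
```

The answer equals the size of the unfolded tree (5 nodes), yet each DAG node is evaluated only once, which is the point of applying tree traversals to compact graph representations.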
Spanning trees crossing few barriers
Asano, T.; Berg, M. de; Cheong, O.; Guibas, L.J.; Snoeyink, J.; Tamaki, H.
2002-01-01
We consider the problem of finding low-cost spanning trees for sets of n points in the plane, where the cost of a spanning tree is defined as the total number of intersections of tree edges with a given set of m barriers. We obtain the following results: (i) if the barriers are possibly intersecting
Selecting Landscape Plants: Flowering Trees
Relf, Diane; Appleton, Bonnie Lee, 1948-2012
2009-01-01
This publication helps the reader to select wisely among the many species and varieties of flowering trees available. The following considerations should be taken into account when choosing flowering trees for the home landscape: selection factors, environmental responses, availability and adaptability, and flowering tree descriptions.
Rectilinear Full Steiner Tree Generation
DEFF Research Database (Denmark)
Zachariasen, Martin
1999-01-01
The fastest exact algorithm (in practice) for the rectilinear Steiner tree problem in the plane uses a two-phase scheme: First, a small but sufficient set of full Steiner trees (FSTs) is generated and then a Steiner minimum tree is constructed from this set by using simple backtrack search, dynamic...
Nyhuis, Jane
Referring as often as possible to traditional Hopi practices and to materials readily available on the reservation, the illustrated booklet provides information on the care and maintenance of young fruit trees. An introduction to fruit trees explains the special characteristics of new trees, e.g., grafting, planting pits, and watering. The…
Mukai, Takahito; Lajoie, Marc J; Englert, Markus; Söll, Dieter
2017-09-08
The genetic code-the language used by cells to translate their genomes into proteins that perform many cellular functions-is highly conserved throughout natural life. Rewriting the genetic code could lead to new biological functions such as expanding protein chemistries with noncanonical amino acids (ncAAs) and genetically isolating synthetic organisms from natural organisms and viruses. It has long been possible to transiently produce proteins bearing ncAAs, but stabilizing an expanded genetic code for sustained function in vivo requires an integrated approach: creating recoded genomes and introducing new translation machinery that function together without compromising viability or clashing with endogenous pathways. In this review, we discuss design considerations and technologies for expanding the genetic code. The knowledge obtained by rewriting the genetic code will deepen our understanding of how genomes are designed and how the canonical genetic code evolved.
Yu, Jiun-Hung
2012-01-01
Polynomial remainder codes are a large class of codes derived from the Chinese remainder theorem that includes Reed-Solomon codes as a special case. In this paper, we revisit these codes and study them more carefully than in previous work. We explicitly allow the code symbols to be polynomials of different degrees, which leads to two different notions of weight and distance. Algebraic decoding is studied in detail. If the moduli are not irreducible, the notion of an error locator polynomial is replaced by an error factor polynomial. We then obtain a collection of gcd-based decoding algorithms, some of which are not quite standard even when specialized to Reed-Solomon codes.
A suffix tree or not a suffix tree?
DEFF Research Database (Denmark)
Starikovskaya, Tatiana; Vildhøj, Hjalte Wedel
2015-01-01
In this paper we study the structure of suffix trees. Given an unlabeled tree τ on n nodes and suffix links of its internal nodes, we ask the question “Is τ a suffix tree?”, i.e., is there a string S whose suffix tree has the same topological structure as τ? We place no restrictions on S; in particular, we do not require that S ends with a unique symbol. This corresponds to considering the more general definition of implicit or extended suffix trees. Such general suffix trees have many applications and are for example needed to allow efficient updates when suffix trees are built online. Deciding...
BEAST: Bayesian evolutionary analysis by sampling trees
Directory of Open Access Journals (Sweden)
Drummond Alexei J
2007-11-01
Full Text Available Abstract Background The evolutionary analysis of molecular sequence variation is a statistical enterprise. This is reflected in the increased use of probabilistic models for phylogenetic inference, multiple sequence alignment, and molecular population genetics. Here we present BEAST: a fast, flexible software architecture for Bayesian analysis of molecular sequences related by an evolutionary tree. A large number of popular stochastic models of sequence evolution are provided and tree-based models suitable for both within- and between-species sequence data are implemented. Results BEAST version 1.4.6 consists of 81000 lines of Java source code, 779 classes and 81 packages. It provides models for DNA and protein sequence evolution, highly parametric coalescent analysis, relaxed clock phylogenetics, non-contemporaneous sequence data, statistical alignment and a wide range of options for prior distributions. BEAST source code is object-oriented, modular in design and freely available at http://beast-mcmc.googlecode.com/ under the GNU LGPL license. Conclusion BEAST is a powerful and flexible evolutionary analysis package for molecular sequence variation. It also provides a resource for the further development of new models and statistical methods of evolutionary analysis.
Fringe trees, Crump-Mode-Jagers branching processes and $m$-ary search trees
Holmgren, Cecilia; Janson, Svante
2016-01-01
This survey studies asymptotics of random fringe trees and extended fringe trees in random trees that can be constructed as family trees of a Crump-Mode-Jagers branching process, stopped at a suitable time. This includes random recursive trees, preferential attachment trees, fragmentation trees, binary search trees and (more generally) $m$-ary search trees, as well as some other classes of random trees. We begin with general results, mainly due to Aldous (1991) and Jagers and Nerman (1984). T...
TREE SELECTING AND TREE RING MEASURING IN DENDROCHRONOLOGICAL INVESTIGATIONS
Directory of Open Access Journals (Sweden)
Sefa Akbulut
2004-04-01
Full Text Available Dendrochronology is a method of dating which makes use of the annual nature of tree growth. It may be divided into a number of subfields, each of which covers one or more aspects of the use of tree ring data: dendroclimatology, dendrogeomorphology, dendrohydrology, dendroecology, dendroarchaeology, and dendroglaciology. The basis of all of these is the analysis of tree rings. Wood and tree rings can aid in dating past events in climatology, ecology, geology, and hydrology. Dendrochronological studies are conducted either on increment cores or on discs. Abnormalities such as false rings, missing rings, and reaction wood may be observed in tree rings during measurement. For this reason, increment cores should be extracted from four different sides of each tree, and more than one tree should be studied.
tropiTree: an NGS-based EST-SSR resource for 24 tropical tree species.
Russell, Joanne R; Hedley, Peter E; Cardle, Linda; Dancey, Siobhan; Morris, Jenny; Booth, Allan; Odee, David; Mwaura, Lucy; Omondi, William; Angaine, Peter; Machua, Joseph; Muchugi, Alice; Milne, Iain; Kindt, Roeland; Jamnadass, Ramni; Dawson, Ian K
2014-01-01
The development of genetic tools for non-model organisms has been hampered by cost, but advances in next-generation sequencing (NGS) have created new opportunities. In ecological research, this raises the prospect for developing molecular markers to simultaneously study important genetic processes such as gene flow in multiple non-model plant species within complex natural and anthropogenic landscapes. Here, we report the use of bar-coded multiplexed paired-end Illumina NGS for the de novo development of expressed sequence tag-derived simple sequence repeat (EST-SSR) markers at low cost for a range of 24 tree species. Each chosen tree species is important in complex tropical agroforestry systems where little is currently known about many genetic processes. An average of more than 5,000 EST-SSRs was identified for each of the 24 sequenced species, whereas prior to analysis 20 of the species had fewer than 100 nucleotide sequence citations. To make results available to potential users in a suitable format, we have developed an open-access, interactive online database, tropiTree (http://bioinf.hutton.ac.uk/tropiTree), which has a range of visualisation and search facilities, and which is a model for the efficient presentation and application of NGS data.
Generating code adapted for interlinking legacy scalar code and extended vector code
Gschwind, Michael K
2013-06-04
Mechanisms for intermixing code are provided. Source code is received for compilation using an extended Application Binary Interface (ABI) that extends a legacy ABI and uses a different register configuration than the legacy ABI. First compiled code is generated based on the source code, the first compiled code comprising code for accommodating the difference in register configurations used by the extended ABI and the legacy ABI. The first compiled code and second compiled code are intermixed to generate intermixed code, the second compiled code being compiled code that uses the legacy ABI. The intermixed code comprises at least one call instruction that is one of a call from the first compiled code to the second compiled code or a call from the second compiled code to the first compiled code. The code for accommodating the difference in register configurations is associated with the at least one call instruction.
Energy Technology Data Exchange (ETDEWEB)
Visser, B. [Stork Product Eng., Amsterdam (Netherlands)
1996-09-01
To support the discussion on aeroelastic codes, a description of the code FLEXLAST was given and experiences within benchmarks and measurement programmes were summarized. The code FLEXLAST has been developed since 1982 at Stork Product Engineering (SPE). Since 1992 FLEXLAST has been used by Dutch industries for wind turbine and rotor design. Based on the comparison with measurements, it can be concluded that the main shortcomings of wind turbine modelling lie in the field of aerodynamics, wind field and wake modelling. (au)
DEFF Research Database (Denmark)
Steensig, Jakob; Heinemann, Trine
2015-01-01
We welcome Tanya Stivers’s discussion (Stivers, 2015/this issue) of coding social interaction and find that her descriptions of the processes of coding open up important avenues for discussion, among other things of the precise ad hoc considerations that researchers need to bear in mind, both when....... Instead we propose that the promise of coding-based research lies in its ability to open up new qualitative questions....
Shapiro, Wilbur
1996-01-01
This is an overview of new and updated industrial codes for seal design and testing. GCYLT (gas cylindrical seals -- turbulent), SPIRALI (spiral-groove seals -- incompressible), KTK (knife-to-knife) Labyrinth Seal Code, and DYSEAL (dynamic seal analysis) are covered. GCYLT uses G-factors for Poiseuille and Couette turbulence coefficients. SPIRALI is updated to include turbulence and inertia, but maintains the narrow-groove theory. The KTK labyrinth seal code handles straight or stepped seals, and DYSEAL provides dynamics for the seal geometry.
Two-Dimensional Tail-Biting Convolutional Codes
Alfandary, Liam
2011-01-01
The multidimensional convolutional codes are an extension of the notion of convolutional codes (CCs) to several dimensions of time. This paper explores the class of two-dimensional convolutional codes (2D CCs) and 2D tail-biting convolutional codes (2D TBCCs), in particular, from several aspects. First, we derive several basic algebraic properties of these codes, applying algebraic methods in order to find bijective encoders, create parity-check matrices, and invert encoders. Next, we discuss the minimum distance and weight distribution properties of these codes. Extending an existing tree-search algorithm to two dimensions, we apply it to find codes with high minimum distance. Word-error probability asymptotes for sample codes are given and compared with other codes. The results of this approach suggest that 2D TBCCs can perform better than comparable 1D TBCCs or other codes. We then present several novel iterative suboptimal algorithms for soft decoding 2D CCs, which are based on belief propagation. Two ...
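Tail-biting, as discussed in the entry above, is simplest to see in one dimension. The sketch below is an illustration only, not the paper's 2D construction: a rate-1/2 feedforward encoder with the common octal (7,5) generators, where the shift register is preloaded with the last message bits so that the trellis starts and ends in the same state and no flush bits are needed.

```python
# Assumed example code: rate-1/2, memory-2 (constraint length 3),
# generators g0 = 1+D+D^2 (octal 7) and g1 = 1+D^2 (octal 5).
G0 = 0b111  # taps on b_t, b_{t-1}, b_{t-2}
G1 = 0b101  # taps on b_t, b_{t-2}
M = 2       # encoder memory

def parity(x: int) -> int:
    return bin(x).count("1") & 1

def encode_tail_biting(bits):
    # Tail-biting: preload the register with the last M message bits,
    # so the start state equals the end state (message length >= M).
    state = 0  # bit 1 holds b_{t-1}, bit 0 holds b_{t-2}
    for b in bits[-M:]:
        state = ((b << 1) | (state >> 1)) & ((1 << M) - 1)
    out = []
    for b in bits:
        reg = (b << M) | state          # (b_t, b_{t-1}, b_{t-2})
        out.append(parity(reg & G0))    # first parity stream
        out.append(parity(reg & G1))    # second parity stream
        state = ((b << 1) | (state >> 1)) & ((1 << M) - 1)
    return out
```

For the message 1011 the register is preloaded with 11, and the encoder returns to that same state after the last input bit, which is exactly the tail-biting condition the paper generalizes to two dimensions.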
Rate of tree carbon accumulation increases continuously with tree size.
Stephenson, N L; Das, A J; Condit, R; Russo, S E; Baker, P J; Beckman, N G; Coomes, D A; Lines, E R; Morris, W K; Rüger, N; Alvarez, E; Blundo, C; Bunyavejchewin, S; Chuyong, G; Davies, S J; Duque, A; Ewango, C N; Flores, O; Franklin, J F; Grau, H R; Hao, Z; Harmon, M E; Hubbell, S P; Kenfack, D; Lin, Y; Makana, J-R; Malizia, A; Malizia, L R; Pabst, R J; Pongpattananurak, N; Su, S-H; Sun, I-F; Tan, S; Thomas, D; van Mantgem, P J; Wang, X; Wiser, S K; Zavala, M A
2014-03-06
Forests are major components of the global carbon cycle, providing substantial feedback to atmospheric greenhouse gas concentrations. Our ability to understand and predict changes in the forest carbon cycle--particularly net primary productivity and carbon storage--increasingly relies on models that represent biological processes across several scales of biological organization, from tree leaves to forest stands. Yet, despite advances in our understanding of productivity at the scales of leaves and stands, no consensus exists about the nature of productivity at the scale of the individual tree, in part because we lack a broad empirical assessment of whether rates of absolute tree mass growth (and thus carbon accumulation) decrease, remain constant, or increase as trees increase in size and age. Here we present a global analysis of 403 tropical and temperate tree species, showing that for most species mass growth rate increases continuously with tree size. Thus, large, old trees do not act simply as senescent carbon reservoirs but actively fix large amounts of carbon compared to smaller trees; at the extreme, a single big tree can add the same amount of carbon to the forest within a year as is contained in an entire mid-sized tree. The apparent paradoxes of individual tree growth increasing with tree size despite declining leaf-level and stand-level productivity can be explained, respectively, by increases in a tree's total leaf area that outpace declines in productivity per unit of leaf area and, among other factors, age-related reductions in population density. Our results resolve conflicting assumptions about the nature of tree growth, inform efforts to understand and model forest carbon dynamics, and have additional implications for theories of resource allocation and plant senescence.
National Aeronautics and Space Administration — ACCEPT consists of an overall software infrastructure framework and two main software components. The software infrastructure framework consists of code written to...
Waters, Joe
2012-01-01
Find out how to effectively create, use, and track QR codes QR (Quick Response) codes are popping up everywhere, and businesses are reaping the rewards. Get in on the action with the no-nonsense advice in this streamlined, portable guide. You'll find out how to get started, plan your strategy, and actually create the codes. Then you'll learn to link codes to mobile-friendly content, track your results, and develop ways to give your customers value that will keep them coming back. It's all presented in the straightforward style you've come to know and love, with a dash of humor thrown
Energy Technology Data Exchange (ETDEWEB)
Reid, R.L.; Barrett, R.J.; Brown, T.G.; Gorker, G.E.; Hooper, R.J.; Kalsi, S.S.; Metzler, D.H.; Peng, Y.K.M.; Roth, K.E.; Spampinato, P.T.
1985-03-01
The FEDC Tokamak Systems Code calculates tokamak performance, cost, and configuration as a function of plasma engineering parameters. This version of the code models experimental tokamaks. It does not currently consider tokamak configurations that generate electrical power or incorporate breeding blankets. The code has a modular (or subroutine) structure to allow independent modeling for each major tokamak component or system. A primary benefit of modularization is that a component module may be updated without disturbing the remainder of the systems code as long as the input to or output from the module remains unchanged.
Energy Technology Data Exchange (ETDEWEB)
Cramer, S.N.
1984-01-01
The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described.
Institute of Scientific and Technical Information of China (English)
罗俊; 张国平
2012-01-01
Aiming at encryption systems with lower security requirements, this paper puts forward a symmetric-key, automatically changing cryptography scheme based on simple hybrid selection coding, combining unilateral canonical Huffman coding with fixed-length coding. The statistical results of the plaintext are used as its own encryption key and coding basis, which makes the scheme easy to implement and computationally cheap. It is shown that when the keys are completely unknown, cracking the encryption system is very difficult.
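The canonical Huffman component of such a scheme can be illustrated independently of the encryption details, which the abstract does not specify. A minimal sketch, assuming ordinary Huffman code-length computation followed by the standard canonical code assignment:

```python
import heapq
from collections import Counter

def code_lengths(freqs):
    # Build a Huffman tree over (frequency, tiebreak, symbol-list)
    # entries; each symbol's final depth is its code length.
    heap = [(f, i, [s]) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    depth = {s: 0 for s in freqs}
    tick = len(heap)  # unique tiebreaker so lists are never compared
    while len(heap) > 1:
        f1, _, s1 = heapq.heappop(heap)
        f2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:           # every symbol in the merged
            depth[s] += 1           # subtree moves one level deeper
        heapq.heappush(heap, (f1 + f2, tick, s1 + s2))
        tick += 1
    return depth

def canonical_codes(lengths):
    # Canonical assignment: sort by (length, symbol); each code is the
    # previous code + 1, left-shifted whenever the length increases.
    code, prev_len, out = 0, 0, {}
    for sym, ln in sorted(lengths.items(), key=lambda kv: (kv[1], kv[0])):
        code <<= ln - prev_len
        out[sym] = format(code, f"0{ln}b")
        code += 1
        prev_len = ln
    return out
```

Because canonical codes are fully determined by the length of each symbol's code, only the lengths need to be stored or transmitted, which is what makes this family attractive for compact schemes like the one described above.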
Butler, Ricky W.; Martensen, Anna L.
1992-01-01
FTC, Fault-Tree Compiler program, is reliability-analysis software tool used to calculate probability of top event of fault tree. Five different types of gates allowed in fault tree: AND, OR, EXCLUSIVE OR, INVERT, and M OF N. High-level input language of FTC easy to understand and use. Program supports hierarchical fault-tree-definition feature simplifying process of description of tree and reduces execution time. Solution technique implemented in FORTRAN, and user interface in Pascal. Written to run on DEC VAX computer operating under VMS operating system.
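For a strict tree (no shared basic events) with independent basic events, the five gate types listed above admit a direct recursive probability evaluation. The sketch below is an illustration of that calculation, not the FTC implementation; the tuple representation of nodes is an assumption.

```python
from itertools import combinations

def prob(node, p):
    # node is ("basic", name), (gate, [children]) for AND/OR/XOR/INVERT,
    # or ("MOFN", m, [children]); p maps basic-event names to failure
    # probabilities. Events are assumed independent (tree, not DAG).
    kind = node[0]
    if kind == "basic":
        return p[node[1]]
    if kind == "AND":                 # all inputs occur
        q = 1.0
        for c in node[1]:
            q *= prob(c, p)
        return q
    if kind == "OR":                  # at least one input occurs
        q = 1.0
        for c in node[1]:
            q *= 1.0 - prob(c, p)
        return 1.0 - q
    if kind == "XOR":                 # exactly one of two inputs occurs
        a, b = (prob(c, p) for c in node[1])
        return a * (1.0 - b) + b * (1.0 - a)
    if kind == "INVERT":              # complement of the single input
        return 1.0 - prob(node[1][0], p)
    if kind == "MOFN":                # at least m of the n inputs occur
        m, kids = node[1], node[2]
        ps = [prob(c, p) for c in kids]
        total = 0.0
        for k in range(m, len(ps) + 1):
            for idx in combinations(range(len(ps)), k):
                t = 1.0
                for i, pi in enumerate(ps):
                    t *= pi if i in idx else 1.0 - pi
                total += t
        return total
    raise ValueError(f"unknown gate {kind!r}")
```

The M-of-N enumeration is exponential in the gate's fan-in; tools like FTC use more efficient solution techniques, but the semantics of the gates are the same.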
Zhu, Zhen; Puliga, Michelangelo; Cerina, Federica; Chessa, Alessandro; Riccaboni, Massimo
2015-01-01
The fragmentation of production across countries has become an important feature of globalization in recent decades and is often conceptualized by the term “global value chains” (GVCs). When empirically investigating the GVCs, previous studies have mainly been interested in knowing how global the GVCs are rather than what the GVCs look like. From a complex networks perspective, we use the World Input-Output Database (WIOD) to study the evolution of the global production system. We find that the industry-level GVCs are indeed not chain-like but are better characterized by a tree topology. Hence, we compute the global value trees (GVTs) for all the industries available in the WIOD. Moreover, we compute an industry importance measure based on the GVTs and compare it with other network centrality measures. Finally, we discuss some future applications of the GVTs. PMID:25978067
Pushdown machines for the macro tree transducer
Engelfriet, Joost; Vogler, Heiko
1986-01-01
The macro tree transducer can be considered as a system of recursive function procedures with parameters, where the recursion is on a tree (e.g., the syntax tree of a program). We investigate characterizations of the class of tree (tree-to-string) translations which is induced by macro tree transducers.
Tree Rings: Timekeepers of the Past.
Phipps, R. L.; McGowan, J.
One of a series of general interest publications on science issues, this booklet describes the uses of tree rings in historical and biological recordkeeping. Separate sections cover the following topics: dating of tree rings, dating with tree rings, tree ring formation, tree ring identification, sample collections, tree ring cross dating, tree…
Directory of Open Access Journals (Sweden)
Rob Garbutt
2013-10-01
Full Text Available Our paper focuses on the materiality, cultural history and cultural relations of selected artworks in the exhibition Wood for the trees (Lismore Regional Gallery, New South Wales, Australia, 10 June – 17 July 2011). The title of the exhibition, intentionally misreading the aphorism “Can’t see the wood for the trees”, by reading the wood for the resource rather than the collective wood[s], implies conservation, preservation, and the need for sustaining the originating resource. These ideas have particular resonance on the NSW far north coast, a region once rich in rainforest. While the Indigenous population had sustainable practices of forest and land management, the colonists deployed felling and harvesting in order to convert the value of the local, abundant rainforest trees into high-value timber. By the late twentieth century, however, a new wave of settlers launched a protest movement against the proposed logging of remnant rainforest at Terania Creek and elsewhere in the region. Wood for the trees, curated by Gallery Director Brett Adlington, plays on this dynamic relationship between wood, trees and people. We discuss the way selected artworks give expression to the themes or concepts of productive labour, nature and culture, conservation and sustainability, and memory. The artworks include Watjinbuy Marrawilil’s (1980) Carved ancestral figure ceremonial pole, Elizabeth Stops’ (2009/10) Explorations into colonisation, Hossein Valamanesh’s (2008) Memory stick, and AñA Wojak’s (2008) Unread book (in a forgotten language). Our art writing on the works, a practice informed by Bal (2002), Muecke (2008) and Papastergiadis (2004), becomes a conversation between the works and the themes or concepts. As a form of material excess of the most productive kind (Grosz, 2008, p. 7), art seeds a response to that which is in the air waiting to be said of the past, present and future.
Implementation of Huffman Decoder on Fpga
Safia Amir Dahri; Dr Abdul Fattah Chandio
2016-01-01
Lossless data compression algorithms are the most widely used algorithms in data transmission, reception and storage systems, in order to increase data rate and speed and to save space on storage devices. Nowadays, different algorithms are implemented in hardware to obtain the benefits of hardware realization. Hardware implementation of algorithms, digital signal processing algorithms and filter realization is done on programmable devices, i.e. FPGAs. In lossless data compression algorith...
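Independent of the FPGA realization, the core of a Huffman decoder is a bit-per-step walk down the code tree, which maps naturally onto a one-edge-per-clock hardware state machine. A minimal software model of that walk, assuming a prefix-free code table:

```python
def decode(bits, codes):
    # codes: symbol -> bit string (prefix-free). A hardware decoder
    # typically follows one tree edge per clock cycle; here the tree
    # is a nested dict built from the code table.
    root = {}
    for sym, code in codes.items():
        node = root
        for b in code[:-1]:
            node = node.setdefault(b, {})
        node[code[-1]] = sym            # leaf stores the symbol
    out, node = [], root
    for b in bits:
        node = node[b]                  # follow one edge per bit
        if not isinstance(node, dict):  # reached a leaf: emit, restart
            out.append(node)
            node = root
    return "".join(out)
```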
Grünewald, Stefan
2010-01-01
A classical problem in phylogenetic tree analysis is to decide whether there is a phylogenetic tree $T$ that contains all information of a given collection $\mathcal{P}$ of phylogenetic trees. If the answer is "yes" we say that $\mathcal{P}$ is compatible and $T$ displays $\mathcal{P}$. This decision problem is NP-complete even if all input trees are quartets, that is, binary trees with exactly four leaves. In this paper, we prove a sufficient condition for a set of binary phylogenetic trees to be compatible. That result is used to give a short and self-contained proof of the known characterization of quartet sets of minimal cardinality which are displayed by a unique phylogenetic tree.
Making CSB + -Trees Processor Conscious
DEFF Research Database (Denmark)
Samuel, Michael; Pedersen, Anders Uhl; Bonnet, Philippe
2005-01-01
Cache-conscious indexes, such as CSB+-tree, are sensitive to the underlying processor architecture. In this paper, we focus on how to adapt the CSB+-tree so that it performs well on a range of different processor architectures. Previous work has focused on the impact of node size on the performance of the CSB+-tree. We argue that it is necessary to consider a larger group of parameters in order to adapt CSB+-tree to processor architectures as different as Pentium and Itanium. We identify this group of parameters and study how it impacts the performance of CSB+-tree on Itanium 2. Finally, we propose a systematic method for adapting CSB+-tree to new platforms. This work is a first step towards integrating CSB+-tree in MySQL’s heap storage manager.
Forrow, Aden; Woodhouse, Francis G.; Dunkel, Jörn
2016-11-01
Coherent, large scale dynamics in many nonequilibrium physical, biological, or information transport networks are driven by small-scale local energy input. We introduce and explore a generic model for compressible active flows on tree networks. In contrast to thermally-driven systems, active friction selects discrete states with only a small number of oscillation modes activated at distinct fixed amplitudes. This state selection can interact with graph topology to produce different localized dynamical time scales in separate regions of large networks. Using perturbation theory, we systematically predict the stationary states of noisy networks. Our analytical predictions agree well with a Bayesian state estimation based on a hidden Markov model applied to simulated time series data on binary trees. While the number of stable states per tree scales exponentially with the number of edges, the mean number of activated modes in each state averages 1 / 4 the number of edges. More broadly, these results suggest that the macroscopic response of active networks, from actin-myosin networks in cells to flow networks in Physarum polycephalum, can be dominated by a few select modes.
Research on universal combinatorial coding.
Lu, Jun; Zhang, Zhuo; Mo, Juan
2014-01-01
The concept of universal combinatorial coding is proposed. Relations exist, more or less, between many coding methods, which means that a universal coding method is objectively existent and can serve as a bridge connecting many coding methods. Universal combinatorial coding is lossless and is based on combinatorics theory. Its combinational and exhaustive properties make it closely related to existing coding methods. Universal combinatorial coding does not depend on the probability statistics of the information source, and it has characteristics spanning all three coding branches. We analyze the relationship between universal combinatorial coding and a variety of coding methods and investigate many application technologies of this coding method. In addition, the efficiency of universal combinatorial coding is analyzed theoretically. The breadth of characteristics and applications of universal combinatorial coding is unique among existing coding methods. Universal combinatorial coding has theoretical research and practical application value.
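The abstract does not spell out the scheme itself, but one concrete instance of combinatorics-based (enumerative) coding, included here purely as background, is ranking a fixed-weight bit string among all strings of the same length and weight. A sketch under that assumption:

```python
from math import comb

def enc_rank(bits):
    # Enumerative coding: a length-n, weight-k bit string is encoded
    # as (k, rank), where rank is its lexicographic index among all
    # strings of that length and weight.
    n, k = len(bits), sum(bits)
    ones_left, rank = k, 0
    for i, b in enumerate(bits):
        if b:
            # every same-prefix string with a 0 here comes first
            rank += comb(n - i - 1, ones_left)
            ones_left -= 1
    return k, rank

def dec_rank(n, k, rank):
    # Inverse mapping: rebuild the bit string position by position.
    bits = []
    for i in range(n):
        c = comb(n - i - 1, k)  # strings that put a 0 at position i
        if rank < c:
            bits.append(0)
        else:
            bits.append(1)
            rank -= c
            k -= 1
    return bits
```

Since log2 C(n, k) bits suffice for the rank, this is optimal for a memoryless source once the weight is known, which is the flavor of combinatorial argument the paper builds on.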
Gene tree correction for reconciliation and species tree inference
Directory of Open Access Journals (Sweden)
Swenson Krister M
2012-11-01
Full Text Available Abstract Background Reconciliation is the commonly used method for inferring the evolutionary scenario for a gene family. It consists in “embedding” inferred gene trees into a known species tree, revealing the evolution of the gene family by duplications and losses. When a species tree is not known, a natural algorithmic problem is to infer a species tree from a set of gene trees, such that the corresponding reconciliation minimizes the number of duplications and/or losses. The main drawback of reconciliation is that the inferred evolutionary scenario is strongly dependent on the considered gene trees, as a few misplaced leaves may lead to a completely different history, with significantly more duplications and losses. Results In this paper, we take advantage of certain gene trees’ properties in order to preprocess them for reconciliation or species tree inference. We flag certain duplication vertices of a gene tree, the “non-apparent duplication” (NAD) vertices, as resulting from the misplacement of leaves. In the case of species tree inference, we develop a polynomial-time heuristic for removing the minimum number of species leading to a set of gene trees that exhibit no NAD vertices with respect to at least one species tree. In the case of reconciliation, we consider the optimization problem of removing the minimum number of leaves or species leading to a tree without any NAD vertex. We develop a polynomial-time algorithm that is exact for two special classes of gene trees, and show a good performance on simulated data sets in the general case.
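The standard LCA mapping underlying reconciliation can be sketched compactly. The code below is an illustration, not the paper's NAD-based algorithm: it maps each gene-tree node to the lowest common ancestor of its leaves' species (gene leaves are assumed, for simplicity, to be labeled directly with species names) and counts a duplication whenever a node maps to the same species-tree node as one of its children. Binary trees are represented as nested 2-tuples with string leaves.

```python
def leaf_paths(tree, path=()):
    # Map each species-tree leaf label to its root-to-leaf path of
    # child indices; the LCA of two leaves is their longest common
    # prefix.
    if isinstance(tree, str):
        return {tree: path}
    out = {}
    for i, child in enumerate(tree):
        out.update(leaf_paths(child, path + (i,)))
    return out

def common_prefix(a, b):
    n = 0
    while n < min(len(a), len(b)) and a[n] == b[n]:
        n += 1
    return a[:n]

def count_duplications(gene, paths):
    # Returns (LCA mapping of this gene subtree, duplication count).
    # A node is a duplication if its mapping coincides with a child's.
    if isinstance(gene, str):
        return paths[gene], 0
    (ml, dl), (mr, dr) = (count_duplications(c, paths) for c in gene)
    m = common_prefix(ml, mr)
    dup = 1 if (m == ml or m == mr) else 0
    return m, dl + dr + dup
```

With species tree ((A,B),C), the gene tree ((A,B),(A,C)) forces one duplication at its root, since species A appears on both sides; this is the kind of inference that a few misplaced leaves can distort, motivating the preprocessing above.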
Rate of tree carbon accumulation increases continuously with tree size
Stephenson, N.L.; Das, A.J.; Condit, R.; Russo, S.E.; Baker, P.J.; Beckman, N.G.; Coomes, D.A.; Lines, E.R.; Morris, W.K.; Rüger, N.; Álvarez, E.; Blundo, C.; Bunyavejchewin, S.; Chuyong, G.; Davies, S.J.; Duque, Á.; Ewango, C.N.; Flores, O.; Franklin, J.F.; Grau, H.R.; Hao, Z.; Harmon, M.E.; Hubbell, S.P.; Kenfack, D.; Lin, Y.; Makana, J.-R.; Malizia, A.; Malizia, L.R.; Pabst, R.J.; Pongpattananurak, N.; Su, S.-H.; Sun, I-F.; Tan, S.; Thomas, D.; van Mantgem, P.J.; Wang, X.; Wiser, S.K.; Zavala, M.A.
2014-01-01
Forests are major components of the global carbon cycle, providing substantial feedback to atmospheric greenhouse gas concentrations. Our ability to understand and predict changes in the forest carbon cycle—particularly net primary productivity and carbon storage - increasingly relies on models that represent biological processes across several scales of biological organization, from tree leaves to forest stands. Yet, despite advances in our understanding of productivity at the scales of leaves and stands, no consensus exists about the nature of productivity at the scale of the individual tree, in part because we lack a broad empirical assessment of whether rates of absolute tree mass growth (and thus carbon accumulation) decrease, remain constant, or increase as trees increase in size and age. Here we present a global analysis of 403 tropical and temperate tree species, showing that for most species mass growth rate increases continuously with tree size. Thus, large, old trees do not act simply as senescent carbon reservoirs but actively fix large amounts of carbon compared to smaller trees; at the extreme, a single big tree can add the same amount of carbon to the forest within a year as is contained in an entire mid-sized tree. The apparent paradoxes of individual tree growth increasing with tree size despite declining leaf-level and stand-level productivity can be explained, respectively, by increases in a tree’s total leaf area that outpace declines in productivity per unit of leaf area and, among other factors, age-related reductions in population density. Our results resolve conflicting assumptions about the nature of tree growth, inform efforts to understand and model forest carbon dynamics, and have additional implications for theories of resource allocation and plant senescence.
SC Secretariat
2005-01-01
Please note that the Safety Code A12 (Code A12) entitled "THE SAFETY COMMISSION (SC)" is available on the web at the following url: https://edms.cern.ch/document/479423/LAST_RELEASED Paper copies can also be obtained from the SC Unit Secretariat, e-mail: sc.secretariat@cern.ch SC Secretariat
Million, June
2004-01-01
In this article, the author discusses an e-mail survey of principals from across the country regarding whether or not their school had a formal staff dress code. The results indicate that most did not have a formal dress code, but agreed that professional dress for teachers was not only necessary, but showed respect for the school and had a…
Thieren, Michel; Mauron, Alex
2007-01-01
This month marks sixty years since the Nuremberg code – the basic text of modern medical ethics – was issued. The principles in this code were articulated in the context of the Nuremberg trials in 1947. We would like to use this anniversary to examine its ability to address the ethical challenges of our time.
Pseudonoise code tracking loop
Laflame, D. T. (Inventor)
1980-01-01
A delay-locked loop is presented for tracking a pseudonoise (PN) reference code in an incoming communication signal. The loop is less sensitive to gain imbalances, which can otherwise introduce timing errors in the PN reference code formed by the loop.
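The early-late discriminator at the heart of such a delay-locked loop can be modeled in a few lines. The sketch below uses assumed parameters (a length-7 m-sequence and one-chip early/late spacing), not the invention's circuit: because an m-sequence has constant off-peak circular autocorrelation, the error signal is exactly zero at alignment and its sign indicates the direction of the timing offset.

```python
def m_sequence():
    # Length-7 m-sequence from a 3-stage Fibonacci LFSR, mapped to
    # +/-1 chips; off-peak circular autocorrelation is -1.
    state = [1, 0, 0]
    chips = []
    for _ in range(7):
        chips.append(1 if state[-1] else -1)
        fb = state[-1] ^ state[0]        # feedback taps
        state = [fb] + state[:-1]
    return chips

def corr(a, b):
    return sum(x * y for x, y in zip(a, b))

def shift(seq, d):
    return seq[d:] + seq[:d]             # circular shift by d chips

def dll_error(received, replica, half=1):
    # Early-late discriminator: correlate against replicas advanced
    # and delayed by `half` chips; the sign of the difference steers
    # the local code phase toward alignment.
    early = corr(received, shift(replica, -half))
    late = corr(received, shift(replica, half))
    return late - early
```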
DEFF Research Database (Denmark)
Pries-Heje, Jan; Pries-Heje, Lene; Dahlgaard, Bente
2013-01-01
is required. In this paper we present the design of such a new approach, the Scrum Code Camp, which can be used to assess agile team capability in a transparent and consistent way. A design science research approach is used to analyze properties of two instances of the Scrum Code Camp where seven agile teams...
BIALEK, W; RIEKE, F; VANSTEVENINCK, RRD; WARLAND, D
1991-01-01
Traditional approaches to neural coding characterize the encoding of known stimuli in average neural responses. Organisms face nearly the opposite task - extracting information about an unknown time-dependent stimulus from short segments of a spike train. Here the neural code was characterized from
DEFF Research Database (Denmark)
Soon, Winnie
2014-01-01
, Twitter and Facebook). The focus is not to investigate the functionalities and efficiencies of the code, but to study and interpret the program level of code in order to trace the use of various technological methods such as third-party libraries and platforms’ interfaces. These are important...
Transformation invariant sparse coding
DEFF Research Database (Denmark)
Mørup, Morten; Schmidt, Mikkel Nørgaard
2011-01-01
Sparse coding is a well-established principle for unsupervised learning. Traditionally, sparse coding extracts features at specific locations; often, however, an invariant representation is preferable. This paper introduces a general transformation invariant sparse coding (TISC) model. The model decomposes images into features invariant to location and general transformation by a set of specified operators, as well as a sparse coding matrix indicating where, and to what degree, these features are present in the original image. The TISC model is in general overcomplete, and we therefore invoke sparse coding to estimate its parameters. We demonstrate how the model can correctly identify components of non-trivial artificial as well as real image data. Thus, the model is capable of reducing feature redundancies in terms of pre-specified transformations, improving component identification.
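As a concrete illustration of the sparse inference step such models rely on (a sketch only; the TISC model itself adds transformation operators not shown here), the standard ISTA iteration infers a sparse code `s` minimizing `0.5*||x - D s||^2 + lam*||s||_1`:

```python
import numpy as np

def ista(D, x, lam, steps=200):
    """ISTA: a gradient step on the reconstruction error followed by
    soft-thresholding, the classic solver for the sparse coding problem
    min_s 0.5*||x - D s||^2 + lam*||s||_1."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    s = np.zeros(D.shape[1])
    for _ in range(steps):
        s = s - (D.T @ (D @ s - x)) / L    # gradient step
        s = np.sign(s) * np.maximum(np.abs(s) - lam / L, 0.0)  # soft threshold
    return s

# Toy dictionary (identity); real dictionaries are learned and overcomplete.
D = np.eye(4)
x = np.array([2.0, 0.0, 0.0, 0.0])
s = ista(D, x, lam=0.5)                    # s is [1.5, 0, 0, 0]: shrunk and sparse
```

With the identity dictionary the solution is just the soft-thresholded input, which makes the shrinkage effect of the L1 penalty easy to see.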
1983-01-01
The specification of Software Implemented Fault Tolerance (SIFT) consists of two parts: the specifications of the SIFT models and the specifications of the SIFT PASCAL program which actually implements the SIFT system. The code specifications are the last of a hierarchy of models describing the operation of the SIFT system and are related to the SIFT models as well as to the PASCAL program. These specifications serve to link the SIFT models to the running program. The specifications are very large and detailed and closely follow the form and organization of the PASCAL code. In addition to describing each of the components of the SIFT code, the code specifications describe the assumptions of the upper SIFT models which are required to actually prove that the code will work as specified. These constraints are imposed primarily on the schedule tables.
DEFF Research Database (Denmark)
Andersen, Christian Ulrik
2007-01-01
Computer art is often associated with computer-generated expressions (digitally manipulated audio/images in music, video, stage design, media facades, etc.). In recent computer art, however, the code-text itself – not the generated output – has become the artwork (Perl Poetry, ASCII Art, obfuscated … avant-garde'. In line with Cramer, the artists Alex McLean and Adrian Ward (aka Slub) declare: "art-oriented programming needs to acknowledge the conditions of its own making – its poesis." By analysing the Live Coding performances of Slub (where they program computer music live), the presentation … discusses code as the artist's material and, further, formulates a critique of Cramer. The seductive magic in computer-generated art does not lie in the magical expression, but nor does it lie in the code/material/text itself. It lies in the nature of code to do something – as if it was magic …
Combustion chamber analysis code
Przekwas, A. J.; Lai, Y. G.; Krishnan, A.; Avva, R. K.; Giridharan, M. G.
1993-05-01
A three-dimensional, time dependent, Favre averaged, finite volume Navier-Stokes code has been developed to model compressible and incompressible flows (with and without chemical reactions) in liquid rocket engines. The code has a non-staggered formulation with generalized body-fitted-coordinates (BFC) capability. Higher order differencing methodologies such as MUSCL and Osher-Chakravarthy schemes are available. Turbulent flows can be modeled using any of the five turbulent models present in the code. A two-phase, two-liquid, Lagrangian spray model has been incorporated into the code. Chemical equilibrium and finite rate reaction models are available to model chemically reacting flows. The discrete ordinate method is used to model effects of thermal radiation. The code has been validated extensively against benchmark experimental data and has been applied to model flows in several propulsion system components of the SSME and the STME.
Astrophysics Source Code Library
Allen, Alice; Berriman, Bruce; Hanisch, Robert J; Mink, Jessica; Teuben, Peter J
2012-01-01
The Astrophysics Source Code Library (ASCL), founded in 1999, is a free on-line registry for source codes of interest to astronomers and astrophysicists. The library is housed on the discussion forum for Astronomy Picture of the Day (APOD) and can be accessed at http://ascl.net. The ASCL has a comprehensive listing that covers a significant number of the astrophysics source codes used to generate results published in or submitted to refereed journals and continues to grow. The ASCL currently has entries for over 500 codes; its records are citable and are indexed by ADS. The editors of the ASCL and members of its Advisory Committee were on hand at a demonstration table in the ADASS poster room to present the ASCL, accept code submissions, show how the ASCL is starting to be used by the astrophysics community, and take questions on and suggestions for improving the resource.
Institute of Scientific and Technical Information of China (English)
田巍威; 高跃东; 郭彦; 黄京飞; 肖昌; 李作生; 张华堂
2012-01-01
Objective: The tree shrew is a good model for studying many human diseases, but the surface markers, functions and disease roles of its immune cells have not been systematically characterised; cloning its immune cell markers is therefore necessary. This study aimed to obtain the tree shrew regulatory T cell (Treg)-related molecule CD127 and to analyse its molecular characteristics. Methods: The CD127 gene was amplified from total RNA of tree shrew peripheral blood by RT-nested PCR, and its biological characteristics were analysed with software such as Discovery Studio. Results: The full-length 1,392 bp coding sequence of tree shrew CD127 was cloned, defining a fragment previously missing from the database. The tree shrew CD127 amino acid sequence is highly conserved and shows a close genetic relationship with Homo sapiens and Pan troglodytes; its overall three-dimensional protein structure is similar to that of human CD127, although differences exist in the number of N-glycosylation sites and in charge distribution. Conclusions: The sequence obtained encodes a tree shrew CD127 expected to function like its primate counterpart. The full-length sequence lays a foundation for subsequent monoclonal antibody preparation and will aid the identification of tree shrew Treg cells and the study of related disease mechanisms.
GRFT – Genetic records family tree web applet
Directory of Open Access Journals (Sweden)
Samuel ePimentel
2011-03-01
Current software for storing and displaying records of genetic crosses does not provide an easy way to determine the lineage of an individual. The genetic records family tree (GRFT) applet processes records of genetic crosses and allows researchers to quickly visualize lineages using a family tree construct and to access other information from these records using any Internet browser. Users select from three display features: (1) a family tree view, which displays a color-coded family tree for an individual; (2) a sequential list of crosses; and (3) a list of crosses matching user-defined search criteria. Each feature contains options to specify the number of records shown, and the latter two contain an option to filter results by the owner of the cross. The family tree feature is interactive, displaying a popup box with genetic information when the user mouses over an individual and allowing the user to draw a new tree by clicking on any individual in the current tree. The applet is written in Javascript and reads genetic records from a tab-delimited text file on the server, so it is cross-platform, can be accessed by anyone with an Internet connection, and supports almost instantaneous generation of new trees and table lists. Researchers can use the tool with their own genetic cross records for any sexually-reproducing organism. No additional software is required, and with only minor modifications to the script researchers can add their own custom columns. GRFT's speed, versatility, and low overhead make it an effective and innovative visualization method for genetic records. A sample tool is available at http://stanford.edu/~walbot/grft-sample.html.
DEFF Research Database (Denmark)
2015-01-01
Fulcrum network codes, which are a network coding framework, achieve three objectives: (i) to reduce the overhead per coded packet to almost 1 bit per source packet; (ii) to operate the network using only low field size operations at intermediate nodes, dramatically reducing complexity in the network; and (iii) to deliver an end-to-end performance that is close to that of a high field size network coding system for high-end receivers while simultaneously catering to low-end ones that can only decode in a lower field size. Sources may encode using a high field size expansion to increase the number of dimensions seen by the network using a linear mapping. Receivers can trade off computational effort with network delay, decoding in the high field size, the low field size, or a combination thereof.
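The low-field-size operations mentioned above can be illustrated with a minimal random linear network code over GF(2), where encoding is XOR and decoding is Gaussian elimination. This is a sketch of the inner low-field code only, not the Fulcrum construction itself (which adds a high-field outer expansion):

```python
import random

def rand_coded_packet(packets, rng):
    """One coded packet: a random nonzero GF(2) coefficient vector plus the
    XOR of the selected source packets (low-field-size operations only)."""
    k = len(packets)
    coeffs = [rng.randint(0, 1) for _ in range(k)]
    while not any(coeffs):
        coeffs = [rng.randint(0, 1) for _ in range(k)]
    payload = 0
    for c, p in zip(coeffs, packets):
        if c:
            payload ^= p
    return coeffs, payload

def gf2_decode(coded, k):
    """Gauss-Jordan elimination over GF(2); returns the k source packets,
    or None if the received coded packets do not yet span the full space."""
    rows = [(c[:], p) for c, p in coded]
    for col in range(k):
        pivot = next((i for i in range(col, len(rows)) if rows[i][0][col]), None)
        if pivot is None:
            return None                     # rank deficient: wait for more packets
        rows[col], rows[pivot] = rows[pivot], rows[col]
        for i in range(len(rows)):
            if i != col and rows[i][0][col]:
                rows[i] = ([a ^ b for a, b in zip(rows[i][0], rows[col][0])],
                           rows[i][1] ^ rows[col][1])
    return [rows[i][1] for i in range(k)]

rng = random.Random(1)
source = [0xDEAD, 0xBEEF, 0xC0DE]          # three source packets as integers
buffered, recovered = [], None
while recovered is None:                    # collect packets until decodable
    buffered.append(rand_coded_packet(source, rng))
    recovered = gf2_decode(buffered, len(source))
print(recovered == source)                  # True
```

The receiver simply buffers coded packets until it has accumulated full rank, mirroring how a low-end receiver decodes entirely in the small field.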
Energy Technology Data Exchange (ETDEWEB)
Nelson, R.N. (ed.)
1985-05-01
This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.
Jonge, M. de
2002-01-01
Dividing software systems in components improves software reusability as well as software maintainability. Components live at several levels; we concentrate on the implementation level, where components are formed by source files, divided over directory structures. Such source code components are usually …
Application of RS Codes in Decoding QR Code
Institute of Scientific and Technical Information of China (English)
Zhu Suxia(朱素霞); Ji Zhenzhou; Cao Zhiyan
2003-01-01
The QR Code is a 2-dimensional matrix code with high error correction capability. It employs RS codes to generate error correction codewords in encoding, and to recover errors and damage in decoding. This paper presents several virtues of the QR Code, analyzes the RS decoding algorithm, and gives a software flow chart for decoding the QR Code with the RS decoding algorithm.
Evaluation Codes from an Affine Variety Code Perspective
DEFF Research Database (Denmark)
Geil, Hans Olav
2008-01-01
Evaluation codes (also called order domain codes) are traditionally introduced as generalized one-point geometric Goppa codes. In the present paper we will give a new point of view on evaluation codes by introducing them instead as particular nice examples of affine variety codes. Our study...
Genome sequence and genetic diversity of European ash trees
DEFF Research Database (Denmark)
Sollars, Elizabeth S A; Harper, Andrea L; Kelly, Laura J;
2016-01-01
Ash trees (genus Fraxinus, family Oleaceae) are widespread throughout the Northern Hemisphere, but are being devastated in Europe by the fungus Hymenoscyphus fraxineus, causing ash dieback, and in North America by the herbivorous beetle Agrilus planipennis. Here we sequence the genome of a low-heterozygosity Fraxinus excelsior tree from Gloucestershire, UK, annotating 38,852 protein-coding genes of which 25% appear ash specific when compared with the genomes of ten other plant species. Analyses of paralogous genes suggest a whole-genome duplication shared with olive (Olea europaea, Oleaceae). We also re-sequence 37 F. excelsior trees from Europe, finding evidence for apparent long-term decline in effective population size. Using our reference sequence, we re-analyse association transcriptomic data, yielding improved markers for reduced susceptibility to ash dieback. Surveys of these markers in British …
Severe accident analysis using dynamic accident progression event trees
Hakobyan, Aram P.
At present, the development and analysis of Accident Progression Event Trees (APETs) are performed in a manner that is computationally time-consuming, difficult to reproduce, and potentially phenomenologically inconsistent. One of the principal deficiencies lies in the static nature of conventional APETs. In conventional event tree techniques, the sequence of events is pre-determined in a fixed order based on expert judgment. The main objective of this PhD dissertation was to develop a software tool (ADAPT) for automated APET generation using the concept of dynamic event trees. As implied by the name, in dynamic event trees the order and timing of events are determined by the progression of the accident. The tool determines the branching times from a severe accident analysis code based on user-specified criteria for branching. It assigns user-specified probabilities to every branch, tracks the total branch probability, and truncates branches based on the given pruning/truncation rules to avoid an unmanageable number of scenarios. The functions of the dynamic APETs developed include prediction of the conditions, timing, and location of containment failure or bypass leading to the release of radioactive material, and calculation of the probabilities of those failures. Thus, scenarios that can potentially lead to early containment failure or bypass, such as through accident-induced failure of steam generator tubes, are of particular interest. The work also focuses on the treatment of uncertainties in severe accident phenomena such as creep rupture of major RCS components, hydrogen burn, containment failure, and the timing of power recovery. Although the ADAPT methodology (Analysis of Dynamic Accident Progression Trees) could be applied to any severe accident analysis code, in this dissertation the approach is demonstrated by applying it to the MELCOR code [1]. A case study is presented involving station blackout with the loss of auxiliary feedwater system for a
African Journals Online (AJOL)
Philippe
The purpose of this paper is to reveal that physical punishment is still recognized … reasonably believes that force is necessary to discipline the child to … the Rwandan society, which argues that "igiti kigororwa kikiri gito" (meaning that a tree …
Distributed multiple description coding
Bai, Huihui; Zhao, Yao
2011-01-01
This book examines distributed video coding (DVC) and multiple description coding (MDC), two novel techniques designed to address the problems of conventional image and video compression coding. Covering all fundamental concepts and core technologies, the chapters can also be read as independent and self-sufficient, describing each methodology in sufficient detail to enable readers to repeat the corresponding experiments easily. Topics and features: provides a broad overview of DVC and MDC, from the basic principles to the latest research; covers sub-sampling based MDC, quantization based MDC,
2014-01-01
While cracking a code might seem like something few of us would encounter in our daily lives, it is actually far more prevalent than we may realize. Anyone who has had personal information taken because of a hacked email account can understand the need for cryptography and the importance of encryption-essentially the need to code information to keep it safe. This detailed volume examines the logic and science behind various ciphers, their real world uses, how codes can be broken, and the use of technology in this oft-overlooked field.
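A minimal example of the kind of cipher logic and code-breaking the volume describes is the classic Caesar shift, whose tiny key space makes brute-force breaking trivial (illustrative only; the book covers far stronger ciphers):

```python
def caesar(text, shift):
    """Shift every letter by `shift` places, wrapping within the alphabet;
    non-letters pass through unchanged."""
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            out.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            out.append(ch)
    return ''.join(out)

secret = caesar("attack at dawn", 3)       # 'dwwdfn dw gdzq'
# Breaking the code: the key space has only 26 entries, so try them all.
candidates = [caesar(secret, -k) for k in range(26)]
assert "attack at dawn" in candidates
```

Modern encryption replaces this handful of keys with astronomically large key spaces, which is exactly why brute force stops working and real cryptanalysis begins.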
Li, Songze; Maddah-Ali, Mohammad Ali; Avestimehr, A. Salman
2015-01-01
MapReduce is a commonly used framework for executing data-intensive jobs on distributed server clusters. We introduce a variant implementation of MapReduce, namely "Coded MapReduce", to substantially reduce the inter-server communication load of the shuffling phase of MapReduce, thus accelerating its execution. The proposed Coded MapReduce exploits the repetitive mapping of data blocks at different servers to create coding opportunities in the shuffling phase to exchange (key,value) pairs…
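The core coding opportunity can be sketched in a few lines: when two servers each already hold the intermediate value the other needs, one XOR-coded broadcast replaces two unicast transfers (a toy sketch of the idea, not the full Coded MapReduce scheme):

```python
# Server 1 mapped value a (needed by server 2); server 2 mapped value b
# (needed by server 1). Each server cancels the value it already holds.
a, b = 0b10110101, 0b01101001
broadcast = a ^ b          # one coded transmission instead of two unicasts
assert broadcast ^ a == b  # server 1 recovers b
assert broadcast ^ b == a  # server 2 recovers a
```

Repeating map computations across servers manufactures exactly this kind of side information, which is what lets the shuffle load shrink.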
Brehm, Enrico M
2016-01-01
In this work, we introduce classical holographic codes. These can be understood as concatenated probabilistic codes and can be represented as networks uniformly covering hyperbolic space. In particular, classical holographic codes can be interpreted as maps from bulk degrees of freedom to boundary degrees of freedom. Interestingly, they are shown to exhibit features similar to those expected from the AdS/CFT correspondence. Among these are a version of the Ryu-Takayanagi formula and intriguing properties regarding bulk reconstruction and boundary representations of bulk operations. We discuss the relation of our findings with expectations from AdS/CFT and, in particular, with recent results from quantum error correction.
Casado Arroyo, Carlos
2016-01-01
"Tree plastic bark" consists of an artistic intervention in a specific natural environment, thereby generating a Site Specific(1). As Rosalind Krauss notes in her reflections "Sculpture in the Expanded Field"(2), its origin is clearly tied to the concept of monumentality. Sculpture is a monument: it is created to commemorate some relevant event or figure and is made for a specific location. The research starts from the id…
Williams, Kathryn R.
1999-10-01
Starting in September 1925, JCE reproduced pictures of famous chemists or chemistry-related works of art as frontispieces. Often, the Journal included a biography or other article about the picture. The August 1945 frontispiece featured the largest cork oak in the United States. An accompanying article described the goals of the Cork Project to plant cork trees in suitable locations in the U.S., to compensate for uncertain European and African sources during World War II. The final frontispiece appeared in December 1956. To view supplementary material, please refer to JCE Online's supplementary links.
SOLVING MINIMUM SPANNING TREE PROBLEM WITH DNA COMPUTING
Institute of Scientific and Technical Information of China (English)
Liu Xikui; Li Yan; Xu Jin
2005-01-01
Molecular programming is applied to the minimum spanning tree problem, whose solution requires encoding of real values in DNA strands. A new encoding scheme is proposed for real values that is biologically plausible and has a fixed code length. According to the characteristics of the problem, a DNA algorithm solving the minimum spanning tree problem is given. The effectiveness of the proposed method is verified by simulation. The advantages and disadvantages of this algorithm are discussed.
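For reference, the problem the DNA algorithm targets has a standard in-silico solution; a minimal Kruskal's algorithm with union-find (illustrative only, unrelated to the DNA encoding itself):

```python
def kruskal_mst(n, edges):
    """Kruskal's algorithm: scan edges in order of increasing weight and
    keep each edge that joins two previously disconnected components,
    tracked with a union-find structure."""
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    mst, total = [], 0
    for w, u, v in sorted(edges):           # edges given as (weight, u, v)
        ru, rv = find(u), find(v)
        if ru != rv:                        # no cycle: accept the edge
            parent[ru] = rv
            mst.append((u, v, w))
            total += w
    return mst, total

# Weighted graph on 4 vertices; the MST keeps the edges of weight 1, 2 and 4.
mst, total = kruskal_mst(4, [(1, 0, 1), (2, 1, 2), (3, 0, 2), (4, 2, 3)])
```

The weight-3 edge is rejected because vertices 0 and 2 are already connected, which is the cycle test the union-find structure performs in near-constant time.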
New Explorations for Decision Trees
Institute of Scientific and Technical Information of China (English)
Anonymous
2001-01-01
Traditionally, the decision tree method is defined and used for finding the optimal solution of a Bayesian decision problem, and it is difficult to use it to find sub-optimal solutions, let alone to rank alternatives. This paper discusses how to use the decision tree method for selecting and ranking alternatives. A practical case study is given to illustrate the applicability.
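The selecting-and-ranking idea can be sketched directly: roll expected values back through the tree, then sort a decision node's alternatives instead of keeping only the maximum. This is a minimal sketch with an assumed tuple-based tree encoding, not the paper's method:

```python
def expected_value(node):
    """Roll back a decision tree: leaves carry payoffs, chance nodes take
    probability-weighted averages, decision nodes take the best branch."""
    kind = node[0]
    if kind == 'leaf':
        return node[1]
    if kind == 'chance':
        return sum(p * expected_value(child) for p, child in node[1])
    return max(expected_value(child) for child in node[1].values())

def rank_alternatives(decision_node):
    """Rank every alternative of a decision node by expected value (best
    first), exposing sub-optimal solutions as well as the optimum."""
    return sorted(((expected_value(child), name)
                   for name, child in decision_node[1].items()), reverse=True)

# Choose between a sure payoff and a risky gamble.
tree = ('decision', {
    'safe':  ('leaf', 50),
    'risky': ('chance', [(0.5, ('leaf', 120)), (0.5, ('leaf', 0))]),
})
ranking = rank_alternatives(tree)   # [(60.0, 'risky'), (50, 'safe')]
```

Keeping the whole sorted list rather than only its head is exactly what turns the classical rollback into a ranking procedure.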
Hu, J H; Wang, Y; Cahill, P T
1997-01-01
This paper reports a multispectral code excited linear prediction (MCELP) method for the compression of multispectral images. Different linear prediction models and adaptation schemes have been compared. The method that uses a forward adaptive autoregressive (AR) model has been proven to achieve a good compromise between performance, complexity, and robustness. This approach is referred to as the MFCELP method. Given a set of multispectral images, the linear predictive coefficients are updated over nonoverlapping three-dimensional (3-D) macroblocks. Each macroblock is further divided into several 3-D microblocks, and the best excitation signal for each microblock is determined through an analysis-by-synthesis procedure. The MFCELP method has been applied to multispectral magnetic resonance (MR) images. To satisfy the high quality requirement for medical images, the error between the original image set and the synthesized one is further encoded using a vector quantizer. This method has been applied to images from 26 clinical MR neuro studies (20 slices/study, three spectral bands/slice, 256x256 pixels/band, 12 b/pixel). The MFCELP method provides a significant visual improvement over the discrete cosine transform (DCT) based Joint Photographic Experts Group (JPEG) method, the wavelet transform based embedded zero-tree wavelet (EZW) coding method, and the vector tree (VT) coding method, as well as the multispectral segmented autoregressive moving average (MSARMA) method we developed previously.
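The forward-adaptive AR step at the heart of such a coder can be sketched as follows: fit per-block AR coefficients by least squares, then form the prediction residual that a CELP-style analysis-by-synthesis stage would subsequently encode. This is a sketch on a 1-D block under assumed structure, not the MFCELP implementation:

```python
import numpy as np

def ar_fit(x, p):
    """Forward-adaptive AR fit: least-squares coefficients for
    x[n] ~ a[0]*x[n-1] + ... + a[p-1]*x[n-p], computed per block."""
    A = np.array([[x[n - 1 - i] for i in range(p)] for n in range(p, len(x))])
    a, *_ = np.linalg.lstsq(A, x[p:], rcond=None)
    return a

def ar_residual(x, a):
    """Prediction residual: the 'excitation' a CELP-style coder would
    approximate with the best codebook entry."""
    p = len(a)
    pred = [sum(a[i] * x[n - 1 - i] for i in range(p)) for n in range(p, len(x))]
    return x[p:] - np.array(pred)

# Synthetic block that follows an exact AR(2) recursion.
x = np.zeros(40)
x[0], x[1] = 1.0, 1.0
for n in range(2, 40):
    x[n] = 0.9 * x[n - 1] - 0.2 * x[n - 2]

a = ar_fit(x, 2)          # recovers approximately [0.9, -0.2]
```

Because the block satisfies the recursion exactly, the fitted coefficients match the generating ones and the residual is essentially zero; on real imagery the residual carries the information the excitation codebook must represent.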
Tree wavelet approximations with applications
Institute of Scientific and Technical Information of China (English)
XU Yuesheng; ZOU Qingsong
2005-01-01
We construct a tree wavelet approximation by using a constructive greedy scheme (CGS). We define a function class which contains the functions whose piecewise polynomial approximations generated by the CGS have a prescribed global convergence rate, and establish embedding properties of this class. We provide sufficient conditions on a tree index set and on bi-orthogonal wavelet bases which ensure optimal order of convergence for the wavelet approximations encoded on the tree index set using the bi-orthogonal wavelet bases. We then show that if we use the tree index set associated with the partition generated by the CGS to encode a wavelet approximation, it gives optimal order of convergence.
Phylogenetic trees and Euclidean embeddings.
Layer, Mark; Rhodes, John A
2017-01-01
It was recently observed by de Vienne et al. (Syst Biol 60(6):826-832, 2011) that a simple square root transformation of distances between taxa on a phylogenetic tree allowed for an embedding of the taxa into Euclidean space. While the justification for this was based on a diffusion model of continuous character evolution along the tree, here we give a direct and elementary explanation for it that provides substantial additional insight. We use this embedding to reinterpret the differences between the NJ and BIONJ tree building algorithms, providing one illustration of how this embedding reflects tree structures in data.
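The observation can be checked directly with classical multidimensional scaling: square-root-transformed tree distances pass the standard Euclidean-embeddability test (double-centred Gram matrix positive semi-definite). A small sketch with an illustrative four-taxon tree metric:

```python
import numpy as np

# Additive distances on a small four-taxon tree (leaves a,b attach to one
# internal node, c,d to another; illustrative edge lengths satisfying the
# four-point condition).
D_tree = np.array([[0, 3, 4, 5],
                   [3, 0, 5, 6],
                   [4, 5, 0, 3],
                   [5, 6, 3, 0]], dtype=float)

def is_euclidean(D):
    """Classical-MDS criterion: points realising the distances in D exist
    in some Euclidean space iff the double-centred Gram matrix of the
    squared distances is positive semi-definite."""
    n = len(D)
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J
    return np.linalg.eigvalsh(B).min() > -1e-9

# Square roots of additive tree distances embed isometrically in
# Euclidean space, as the paper above explains.
assert is_euclidean(np.sqrt(D_tree))
```

Eigen-decomposing the same centred Gram matrix would also yield the embedding coordinates themselves, which is the representation the NJ/BIONJ comparison in the paper builds on.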
Institute of Scientific and Technical Information of China (English)
王燕文
2007-01-01
Once there was a well-known hill. There were many lush trees, beautiful flowers and green grasses on it. One day, the hill said to the trees proudly, "Look, how beautiful I am! But you look so ugly on my back. It would be better if I could drive you away." One of the trees said, "You won't have beautiful green clothing without us trees. If you leave us, you will die away." The hill laughed and said again, "I feel very ashamed to be staying here together with you. Sooner or later I will drive you all…
Tree felling: a necessary evil
CERN Bulletin
2013-01-01
CERN started a campaign of tree felling in 2010 for safety reasons, and it will continue this year in various parts of the Meyrin site. As in previous years, the trees cut down in 2013 will be recycled and some will be replaced. Diseased tree that had to be cut down on the Meyrin site. In association with the Geneva nature and countryside directorate (Direction générale de la nature et du paysage, DGNP), CERN commissioned the Geneva school of landscaping, engineering and architecture (Haute école du paysage, d’ingénierie et d’architecture, HEPIA) to compile an inventory of the trees on the Meyrin site. In total, 1285 trees (excluding poplars) were recorded. 75.5% of these trees were declared to be in a good state of health (i.e. 971 trees), 21.5% in a moderate state of health (276 trees) and 3% in a poor state of health (38 trees). As for the poplars, the 236 specimens recorded on the Meyrin site were judged to be too old, to...
Energy Technology Data Exchange (ETDEWEB)
Freeman, L.N.; Wilson, R.E. [Oregon State Univ., Dept. of Mechanical Engineering, Corvallis, OR (United States)
1996-09-01
The FAST Code which is capable of determining structural loads on a flexible, teetering, horizontal axis wind turbine is described and comparisons of calculated loads with test data are given at two wind speeds for the ESI-80. The FAST Code models a two-bladed HAWT with degrees of freedom for blade bending, teeter, drive train flexibility, yaw, and windwise and crosswind tower motion. The code allows blade dimensions, stiffnesses, and weights to differ and models tower shadow, wind shear, and turbulence. Additionally, dynamic stall is included as are delta-3 and an underslung rotor. Load comparisons are made with ESI-80 test data in the form of power spectral density, rainflow counting, occurrence histograms, and azimuth averaged bin plots. It is concluded that agreement between the FAST Code and test results is good. (au)