Results 241 to 250 of about 2,863,027 (299)
Some of the following articles may not be open access.
1995
The entropy coder is the last stage in the encoding pipeline of JPEG, H.261, and MPEG. Unlike transform-based coding, where compression is lossy, entropy coding is lossless. The entropy coder consists of two main data compression/decompression units: a run-length coder (RLC) and a variable-length coder (VLC).
Vasudev Bhaskaran +1 more
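The excerpt above describes the two-stage structure of the entropy coder: a run-length coder followed by a variable-length coder. A minimal Python sketch of that idea is given below; the (run, value) pairing and the toy prefix-code table are illustrative assumptions, not the actual JPEG, H.261, or MPEG tables.

```python
# Minimal sketch of the two-stage entropy coder described above: a run-length
# coder (RLC) followed by a variable-length coder (VLC). The code table is a
# toy example, NOT the actual JPEG, H.261, or MPEG tables.

def run_length_encode(coeffs):
    """Turn quantized coefficients into (zero_run, value) pairs plus an end marker."""
    pairs, run = [], 0
    for c in coeffs:
        if c == 0:
            run += 1
        else:
            pairs.append((run, c))
            run = 0
    pairs.append((0, 0))                      # (0, 0) used here as end-of-block
    return pairs

# Toy prefix-free VLC table: shorter codewords for the pairs assumed more probable.
VLC_TABLE = {
    (0, 0): "00",    # end-of-block
    (0, 1): "01",
    (0, -1): "100",
    (1, 1): "101",
    (1, -1): "110",
}

def vlc_encode(pairs):
    """Map each (run, value) pair to bits; rare pairs use a fixed-length escape."""
    bits = []
    for run, val in pairs:
        code = VLC_TABLE.get((run, val))
        if code is None:
            # escape: "111" prefix, 6-bit run, 8-bit value (two's complement), toy layout
            code = "111" + format(run & 0x3F, "06b") + format(val & 0xFF, "08b")
        bits.append(code)
    return "".join(bits)

if __name__ == "__main__":
    block = [3, 0, 0, -1, 1, 0, 0, 0, 1]
    print(vlc_encode(run_length_encode(block)))
```

Only the shape of the RLC-then-VLC pipeline is the point here; the escape branch stands in for the escape mechanisms real standards define for pairs outside the main table.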
2002
A binary digit, or “bit,” b, takes one of the values b = 0 or b = 1. A single bit has the ability to convey a certain amount of information — the information corresponding to the outcome of a binary decision, or “event,” such as a coin toss. If we have N bits, then we can identify the outcomes of N binary decisions.
David S. Taubman, Michael W. Marcellin
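As a worked version of the counting argument in the excerpt above: N bits can take 2^N distinct patterns, and an equiprobable binary decision is worth exactly one bit.

```latex
% N bits distinguish 2^N outcomes; each equiprobable binary decision carries 1 bit.
\[
  \#\{(b_1,\dots,b_N) : b_i \in \{0,1\}\} = 2^N,
  \qquad
  I(\text{event of probability } p) = -\log_2 p \ \text{bits},
\]
\[
  I(\text{fair coin toss}) = -\log_2 \tfrac{1}{2} = 1 \ \text{bit},
  \qquad
  N \ \text{independent tosses} \;\Rightarrow\; N \ \text{bits}.
\]
```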
2010
Let us consider a 2-bit quantizer that represents quantized values using the following set of quantization indexes: {0, 1, 2, 3}. Each quantization index given above is called a source symbol, or simply a symbol, and the set is called a symbol set. When applied to quantize a sequence of input samples, the quantizer produces a sequence of quantization ...
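A minimal sketch of the 2-bit quantizer described above, assuming a uniform quantizer over the input range [-1, 1); the range and step size are illustrative choices, not taken from the source.

```python
# Minimal sketch of a 2-bit quantizer producing symbols from the set {0, 1, 2, 3}.
# The input range [-1.0, 1.0) and the uniform step size are illustrative assumptions.

def quantize(samples, lo=-1.0, hi=1.0, levels=4):
    """Map each input sample to a quantization index (source symbol) in {0..levels-1}."""
    step = (hi - lo) / levels
    indexes = []
    for x in samples:
        k = int((x - lo) // step)                    # uniform bin the sample falls into
        indexes.append(min(max(k, 0), levels - 1))   # clamp out-of-range samples
    return indexes

if __name__ == "__main__":
    sequence = [-0.9, -0.2, 0.1, 0.75, 1.3]
    print(quantize(sequence))   # [0, 1, 2, 3, 3] -> a sequence of source symbols
```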
Improved entropy coding for component-based image coding
2011 18th IEEE International Conference on Image Processing, 2011
In this paper, we improve on our previous work regarding component-based image coding, a hybrid transform-based/perceptual image coding scheme based on a decomposition of the image into structure and texture characterized by a Gaussian Markov random field.
Christian Feldmann, Jona Ballé
Outlier-Resilient Entropy Coding
2011
Many data compression systems rely on a final stage based on an entropy coder, generating short codes for the most probable symbols. Images, multispectroscopy or hyperspectroscopy are just some examples, but the space mission concept covers many other fields.
Jordi Portell +2 more
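The abstract above leans on the defining property of an entropy coder, that the most probable symbols receive the shortest codewords. A minimal Huffman-construction sketch of that property follows; the symbol probabilities are made up for illustration.

```python
# Minimal Huffman-code sketch: the most probable symbols get the shortest codewords.
# The probabilities below are illustrative only.
import heapq
import itertools

def huffman_code(probs):
    """Build a prefix code from {symbol: probability}; returns {symbol: bitstring}."""
    tiebreak = itertools.count()                      # avoid comparing dicts on ties
    heap = [(p, next(tiebreak), {s: ""}) for s, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)               # two least probable subtrees
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c0.items()}
        merged.update({s: "1" + c for s, c in c1.items()})
        heapq.heappush(heap, (p0 + p1, next(tiebreak), merged))
    return heap[0][2]

if __name__ == "__main__":
    code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.15, "d": 0.10})
    print(code)   # "a" receives the shortest codeword, the rare symbols the longest
```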
Fuzzy concept lattice reduction using Shannon entropy and Huffman coding
J. Appl. Non Class. Logics, 2015
In the last decade, formal concept analysis (FCA) in a fuzzy setting has received more attention for knowledge processing tasks in various fields. The hierarchical order visualisation of generated formal concepts is a major concern for the practical ...
Prem Kumar Singh, A. Gani
Concatenated error-correcting entropy codes and channel codes
IEEE International Conference on Communications, 2003. ICC '03., 2004
We propose a general class of concatenated error-correcting entropy codes and channel codes. In this way we extend and generalize the existing body of work on iterative decoding of entropy and channel codes. Using the structure and properties of serial concatenated codes, we employ error-correcting entropy codes as the outer code, and a convolutional ...
A. Hedayat, A. Nosratinia
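The abstract above is truncated, but the serial structure it names, an entropy code as the outer code followed by a convolutional inner code, can be sketched. Everything below is an illustrative assumption rather than the authors' construction: a toy prefix outer code and a rate-1/2, memory-2 convolutional encoder with generators (7, 5) in octal.

```python
# Sketch of serial concatenation: outer entropy (variable-length) code, then an
# inner convolutional code. The toy outer code and the rate-1/2, memory-2
# convolutional encoder (generators 7 and 5, octal) are illustrative assumptions.

OUTER_CODE = {"a": [0], "b": [1, 0], "c": [1, 1]}   # toy prefix code

def outer_encode(symbols):
    """Variable-length (entropy) outer code: symbols -> bits."""
    return [b for s in symbols for b in OUTER_CODE[s]]

def conv_encode(bits):
    """Rate-1/2 convolutional inner code, generators g0 = 1+D+D^2, g1 = 1+D^2."""
    s1 = s2 = 0                       # shift-register state
    out = []
    for u in bits + [0, 0]:           # two tail bits to terminate the trellis
        out.append(u ^ s1 ^ s2)       # parity from g0
        out.append(u ^ s2)            # parity from g1
        s1, s2 = u, s1
    return out

if __name__ == "__main__":
    print(conv_encode(outer_encode(["b", "a", "c"])))
```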
Combination coding: a new entropy coding technique
Proceedings of Data Compression Conference - DCC '96, 1996
Summary form only given. Entropy coding is defined to be the compression of a stream of symbols taken from a known symbol set where the probability of occurrence of any symbol from the set at any given point in the stream is constant and independent of any known occurrences of any other symbols.
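The definition quoted above, symbols drawn with fixed and independent probabilities, is exactly the setting of Shannon's source-coding bound, which sets the target any entropy coder is measured against:

```latex
% For i.i.d. symbols with fixed probabilities p_i, the expected codeword length L
% of any uniquely decodable code is bounded below by the source entropy H.
\[
  H = -\sum_i p_i \log_2 p_i
  \qquad\text{and}\qquad
  L \;=\; \sum_i p_i\,\ell_i \;\ge\; H ,
\]
\[
  \text{e.g. } p = \left(\tfrac12, \tfrac14, \tfrac18, \tfrac18\right)
  \;\Rightarrow\;
  H = \tfrac12\cdot 1 + \tfrac14\cdot 2 + 2\cdot\tfrac18\cdot 3 = 1.75 \ \text{bits/symbol}.
\]
```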
Multi-scale zerotree entropy coding
2000 IEEE International Symposium on Circuits and Systems. Emerging Technologies for the 21st Century. Proceedings (IEEE Cat No.00CH36353), 2002
This paper describes the visual texture compression scheme adopted for the MPEG-4 international standard. The scheme is based on the multiscale zerotree wavelet entropy coding (MZTE) technique, which provides different levels of scalability in terms of either spatial resolution or picture quality.
I. Sodagar +3 more

