Results 231 to 240 of about 2,863,027
Some of the following articles may not be open access.
Conditional Entropy Coding for Efficient Video Compression
European Conference on Computer Vision, 2020
We propose a very simple and efficient video compression framework that only focuses on modeling the conditional entropy between frames. Unlike prior learning-based approaches, we reduce complexity by not performing any form of explicit transformations ...
Jerry Liu +6 more
semanticscholar +1 more source
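The quantity this codec models, the conditional entropy between consecutive frames, can be illustrated directly. Below is a minimal sketch (not the paper's learned model) that estimates the empirical H(x_t | x_{t-1}) in bits per pixel from the joint histogram of co-located pixel values in two grayscale frames; all names and the synthetic frames are illustrative.

```python
import numpy as np

def conditional_entropy(prev_frame: np.ndarray, cur_frame: np.ndarray, levels: int = 256) -> float:
    """Empirical H(cur | prev) in bits/pixel from co-located pixel pairs."""
    # Joint histogram of (previous, current) pixel values.
    joint, _, _ = np.histogram2d(prev_frame.ravel(), cur_frame.ravel(),
                                 bins=levels, range=[[0, levels], [0, levels]])
    p_joint = joint / joint.sum()
    p_prev = p_joint.sum(axis=1, keepdims=True)      # marginal P(prev)
    nz = p_joint > 0
    # H(cur | prev) = -sum p(prev, cur) * log2 p(cur | prev)
    cond = p_joint / np.where(p_prev > 0, p_prev, 1)
    return float(-(p_joint[nz] * np.log2(cond[nz])).sum())

rng = np.random.default_rng(0)
prev = rng.integers(0, 256, size=(64, 64))
cur = np.clip(prev + rng.integers(-2, 3, size=(64, 64)), 0, 255)  # strongly correlated frames
print(f"H(cur | prev) = {conditional_entropy(prev, cur):.2f} bits/pixel")
```

Because the second frame is a small perturbation of the first, the conditional entropy comes out far below the 8 bits/pixel of the marginal distribution, which is exactly the redundancy such a codec exploits.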
Large-Alphabet Semi-Static Entropy Coding Via Asymmetric Numeral Systems
ACM Trans. Inf. Syst., 2020
An entropy coder takes as input a sequence of symbol identifiers over some specified alphabet and represents that sequence as a bitstring using as few bits as possible, typically assuming that the elements of the sequence are independent of each other ...
Alistair Moffat, M. Petri
semanticscholar +1 more source
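For reference, here is a toy range-ANS (rANS) coder for a tiny alphabet with a fixed frequency table. It is a textbook sketch of the ANS family this paper builds on, not the paper's large-alphabet semi-static machinery; the unbounded Python integer stands in for the renormalized state a real coder would use.

```python
# Toy range-ANS (rANS) encoder/decoder with a fixed frequency table.

FREQS = {"a": 3, "b": 1}             # symbol frequencies, sum to M
M = sum(FREQS.values())              # total frequency ("precision")
CUM = {}                             # cumulative start of each symbol's slot range
acc = 0
for s, f in FREQS.items():
    CUM[s] = acc
    acc += f

def encode(symbols):
    x = 1                            # rANS state
    for s in reversed(symbols):      # rANS encodes in reverse order
        f, c = FREQS[s], CUM[s]
        x = (x // f) * M + c + (x % f)
    return x

def decode(x, n):
    out = []
    for _ in range(n):
        r = x % M                    # slot within [0, M) identifies the symbol
        s = next(t for t, c in CUM.items() if c <= r < c + FREQS[t])
        f, c = FREQS[s], CUM[s]
        x = f * (x // M) + (r - c)   # invert the encoding step
        out.append(s)
    return out, x

msg = list("aabab")
state = encode(msg)
decoded, _ = decode(state, len(msg))
assert decoded == msg
print(f"state {state}: about {state.bit_length()} bits for {len(msg)} symbols")
```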
Entropy-constrained trellis coded quantization
[1991] Proceedings. Data Compression Conference, 1992
Summary: Trellis-coded quantization is generalized to allow noiseless coding of the trellis branch reproduction symbols. An entropy-constrained trellis-coded quantization (ECTCQ) design algorithm is presented, based on the generalized Lloyd algorithm for trellis code design and the entropy-constrained vector quantization design algorithm.
Fischer, Thomas R., Wang, Min
openaire +2 more sources
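The trellis search and generalized-Lloyd iteration are substantial, but the entropy-constrained decision at the heart of such designs is compact: pick the reproduction level minimizing distortion plus lambda times its code length, -log2 p(level). A sketch of that single step, with illustrative levels and probabilities:

```python
import math

# Entropy-constrained quantization decision (sketch): minimize
# distortion + lambda * rate, with rate = -log2 p(level).
levels = [-1.5, -0.5, 0.5, 1.5]          # reproduction levels
probs = [0.1, 0.4, 0.4, 0.1]             # their (model) probabilities

def ec_quantize(x: float, lam: float) -> float:
    def cost(q, p):
        return (x - q) ** 2 + lam * (-math.log2(p))
    return min(zip(levels, probs), key=lambda qp: cost(*qp))[0]

for lam in (0.0, 0.5, 2.0):
    # Larger lambda biases decisions toward cheap (high-probability) levels.
    print(lam, [ec_quantize(x, lam) for x in (-1.2, -0.6, 0.1, 1.4)])
```

At lambda = 0 this is a plain nearest-level quantizer; as lambda grows, outer levels with long codewords are chosen less often, trading distortion for rate.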
Block truncation coding with entropy coding
IEEE Transactions on Communications, 1995
Block truncation coding (BTC) is a simple and fast image compression algorithm which achieves a constant bit rate of 2.0 bits per pixel. The method is, however, suboptimal. We propose a modification of BTC in which the compression ratio is improved by coding the quantization data and the bit plane by arithmetic coding with an adaptive modelling scheme ...
P. Franti, O. Nevalainen
openaire +1 more source
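Classic moment-preserving BTC, the base method this paper improves on, fits in a few lines: each block keeps a one-bit plane plus two levels chosen to preserve the block mean and standard deviation. A sketch of one block; the paper's arithmetic coding of the quantization data and bit plane is omitted.

```python
import numpy as np

def btc_block(block: np.ndarray):
    mean, std = block.mean(), block.std()
    bitplane = block > mean                    # 1 bit per pixel
    q = bitplane.sum()                         # number of "high" pixels
    m = block.size
    if q in (0, m):                            # flat block: single level
        return bitplane, mean, mean
    # Two levels chosen to preserve the block mean and variance.
    low = mean - std * np.sqrt(q / (m - q))
    high = mean + std * np.sqrt((m - q) / q)
    return bitplane, low, high

def btc_reconstruct(bitplane, low, high):
    return np.where(bitplane, high, low)

rng = np.random.default_rng(1)
block = rng.integers(0, 256, size=(4, 4)).astype(float)
bp, lo_, hi_ = btc_block(block)
rec = btc_reconstruct(bp, lo_, hi_)
print(f"mean {block.mean():.1f} -> {rec.mean():.1f}, std {block.std():.1f} -> {rec.std():.1f}")
```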
Proceedings. Compression and Complexity of SEQUENCES 1997 (Cat. No.97TB100171), 2002
The paper addresses several issues involved in interleaving compressed output from multiple non-prefix codes or from a combination of prefix and non-prefix codes. The technique used throughout is decoder-synchronized encoding, in which the encoder manipulates the data stream to allow just-in-time decoding.
openaire +1 more source
Proceedings DCC '98 Data Compression Conference (Cat. No.98TB100225), 2002
Summary form only given. We present an algorithm for constructing entropy codes that allow progressive transmission. The algorithm constructs codes by forming an unbalanced tree in a similar fashion to Huffman coding. It differs, however, in that nodes are combined in a rate-distortion sense. Because nodes are formed with both rate and distortion in ...
T. Verma, T. Meng
openaire +1 more source
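As a point of reference for the construction the abstract describes, here is a plain Huffman builder; the paper's progressive codes share this greedy tree-merging structure but choose which two nodes to merge by a rate-distortion cost rather than by probability alone. The comment marks where that criterion would enter; this is not the paper's algorithm.

```python
import heapq

def huffman_codes(probs: dict[str, float]) -> dict[str, str]:
    # Heap entries: (merge_key, tiebreak, codes-so-far). In the paper's
    # scheme the merge_key would be a rate-distortion cost, not probability.
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)     # two lowest-key nodes
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c0.items()}
        merged.update({s: "1" + code for s, code in c1.items()})
        heapq.heappush(heap, (p0 + p1, count, merged))
        count += 1
    return heap[0][2]

print(huffman_codes({"a": 0.5, "b": 0.25, "c": 0.15, "d": 0.1}))
```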
Neural Computation, 1989
To determine whether a particular sensory event is a reliable predictor of reward or punishment it is necessary to know the prior probability of that event. If the variables of a sensory representation normally occur independently of each other, then it is possible to derive the prior probability of any logical function of the variables from the prior ...
H.B. Barlow +2 more
openaire +1 more source
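The claim in miniature: if the representation's variables are independent, the prior of any logical function of them follows from the marginals alone, by summing the product-form probabilities over the satisfying assignments. A tiny worked example with assumed marginal probabilities (the values are illustrative, not from the paper):

```python
from itertools import product

marginals = {"A": 0.2, "B": 0.5, "C": 0.7}     # P(var is True), assumed

def prior(logical_fn) -> float:
    total = 0.0
    for values in product([True, False], repeat=len(marginals)):
        assignment = dict(zip(marginals, values))
        p = 1.0
        for var, val in assignment.items():    # independence: priors multiply
            p *= marginals[var] if val else 1 - marginals[var]
        if logical_fn(**assignment):
            total += p
    return total

print(prior(lambda A, B, C: A and B))          # 0.2 * 0.5 = 0.10
print(prior(lambda A, B, C: A or (B and not C)))   # 0.2 + 0.8*0.5*0.3 = 0.32
```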
Entropy Constrained Fractal Image Coding
Fractals, 1997
In this paper we present an entropy constrained fractal coding scheme. In order to get high compression rates, previous fractal coders used hierarchical coding schemes with variable range block sizes. Our scheme uses constant range block sizes, but the complexity of the fractal transformations is adapted to the image contents.
openaire +2 more sources
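"Adapting the complexity of the transformations" comes down to an entropy-constrained selection per range block: choose the transformation minimizing distortion plus lambda times the bits needed to encode its parameters. A toy sketch of that selection step; the candidate numbers are made up, and the paper's domain-block search and affine-map fitting are omitted.

```python
# (name, distortion if used, bits to encode its parameters) -- illustrative
candidates = [
    ("mean-only",        140.0,  8),   # cheapest, crudest
    ("contrast+offset",   60.0, 18),
    ("full affine map",   35.0, 34),   # most complex transformation
]

def select(lam: float):
    # Lagrangian choice: minimize distortion + lambda * rate.
    return min(candidates, key=lambda c: c[1] + lam * c[2])

for lam in (0.5, 3.0, 10.0):
    name, d, bits = select(lam)
    print(f"lambda={lam:>4}: {name} (distortion {d}, {bits} bits)")
```

Sweeping lambda traces out the rate-distortion trade-off: cheap transformations win where the image is smooth, complex ones where the extra bits pay for themselves.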
Entropy Preservation Under Markov Coding
Journal of Statistical Physics, 2001
openaire +2 more sources
The Unreasonable Effectiveness of Entropy Minimization in LLM Reasoning
arXiv.org
Entropy minimization (EM) trains the model to concentrate even more probability mass on its most confident outputs. We show that this simple objective alone, without any labeled data, can substantially improve large language models' (LLMs) performance on ...
Shivam Agarwal +5 more
semanticscholar +1 more source
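The objective itself is just the expected entropy of the model's own output distributions, minimized with no labels. A minimal PyTorch sketch of that loss; the random logits are a placeholder for a real language model's outputs.

```python
import torch
import torch.nn.functional as F

def entropy_minimization_loss(logits: torch.Tensor) -> torch.Tensor:
    """logits: (batch, seq_len, vocab). Returns mean token entropy in nats."""
    log_probs = F.log_softmax(logits, dim=-1)
    probs = log_probs.exp()
    token_entropy = -(probs * log_probs).sum(dim=-1)   # H per position
    return token_entropy.mean()

logits = torch.randn(2, 5, 100, requires_grad=True)
loss = entropy_minimization_loss(logits)
loss.backward()   # a gradient step on this loss sharpens the distributions
print(f"mean entropy: {loss.item():.3f} nats")
```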