
Nonlinear Information Bottleneck [PDF]

open access: yesEntropy (Basel), 2019
Information bottleneck (IB) is a technique for extracting information in one random variable $X$ that is relevant for predicting another random variable $Y$.
Kolchinsky A, Tracey B, Wolpert D.
europepmc   +4 more sources
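Several of the entries below refer to the same underlying objective. As a sketch of the standard formulation (with $T$ the learned representation and $\beta \ge 0$ the trade-off parameter; the exact variant differs per paper):

```latex
% Information bottleneck Lagrangian, minimized over encoders p(t|x):
% I(X;T) is the compression cost, I(T;Y) the predictive relevance,
% and \beta \ge 0 balances the two.
\mathcal{L}_{\mathrm{IB}}\bigl[p(t \mid x)\bigr] = I(X;T) - \beta\, I(T;Y)
```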

Elastic Information Bottleneck

open access: yesMathematics, 2022
Information bottleneck is an information-theoretic principle of representation learning that aims to learn a maximally compressed representation that preserves as much information about labels as possible. Under this principle, two different methods have ...
Yuyan Ni, Yanyan Lan, Ao Liu, Zhiming Ma
doaj   +3 more sources

The Convex Information Bottleneck Lagrangian [PDF]

open access: yesEntropy, 2020
The information bottleneck (IB) problem tackles the issue of obtaining relevant compressed representations T of some random variable X for the task of predicting Y.
Borja Rodríguez Gálvez   +2 more
doaj   +5 more sources

Image-Based Ship Detection Using Deep Variational Information Bottleneck [PDF]

open access: yesSensors, 2023
Image-based ship detection is a critical function in maritime security. However, the lack of high-quality training datasets makes it challenging to train a robust supervised deep learning model.
Duc-Dat Ngo   +4 more
doaj   +2 more sources

Exact and Soft Successive Refinement of the Information Bottleneck [PDF]

open access: yesEntropy, 2023
The information bottleneck (IB) framework formalises the essential requirement for efficient information processing systems to achieve an optimal balance between the complexity of their representation and the amount of information extracted about ...
Hippolyte Charvin   +2 more
doaj   +2 more sources

Pareto-Optimal Clustering with the Primal Deterministic Information Bottleneck [PDF]

open access: yesEntropy, 2022
At the heart of both lossy compression and clustering is a trade-off between the fidelity and size of the learned representation. Our goal is to map out and study the Pareto frontier that quantifies this trade-off.
Andrew K. Tan   +2 more
doaj   +2 more sources
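The deterministic information bottleneck referenced in this entry restricts the encoder to a hard clustering; in Strouse and Schwab's formulation (assumed here, since the snippet does not state it), the compression term $I(X;T)$ is replaced by the representation entropy $H(T)$:

```latex
% Deterministic information bottleneck objective, minimized over
% hard assignments t = f(x); H(T) replaces I(X;T) as the
% compression cost, with the same trade-off parameter \beta.
\mathcal{L}_{\mathrm{DIB}}[f] = H(T) - \beta\, I(T;Y)
```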

Information Bottleneck Theory Based Exploration of Cascade Learning [PDF]

open access: yesEntropy, 2021
In solving challenging pattern recognition problems, deep neural networks have shown excellent performance by forming powerful mappings between inputs and targets, learning representations (features) and making subsequent predictions.
Xin Du   +2 more
doaj   +2 more sources

On Neural Networks Fitting, Compression, and Generalization Behavior via Information-Bottleneck-like Approaches [PDF]

open access: yesEntropy, 2023
It is well-known that a neural network learning process—along with its connections to fitting, compression, and generalization—is not yet well understood.
Zhaoyan Lyu   +2 more
doaj   +2 more sources

Theory and Application of the Information Bottleneck Method [PDF]

open access: yesEntropy
In 1999, Naftali Tishby et al. [...]
Jan Lewandowsky, Gerhard Bauch
doaj   +2 more sources

Gaussian information bottleneck and the non-perturbative renormalization group [PDF]

open access: yesNew Journal of Physics, 2022
The renormalization group (RG) is a class of theoretical techniques used to explain the collective physics of interacting, many-body systems. It has been suggested that the RG formalism may be useful in finding and interpreting emergent low-dimensional ...
Adam G Kline, Stephanie E Palmer
doaj   +2 more sources
