Results 51 to 60 of about 335,764
Intuitively, one way to make classifiers more robust is to have them depend less sensitively on their input. The Information Bottleneck (IB) tries to learn compressed representations of the input that remain predictive of the target.
Ian Fischer, Alexander A. Alemi
doaj +1 more source
The Mathematical Structure of Information Bottleneck Methods
Information Bottleneck-based methods use mutual information as a distortion function in order to extract relevant details about the structure of a complex system by compression. One of the approaches used to generate optimal compressed representations is …
Albert E. Parker +2 more
doaj +1 more source
Information-Optimum LDPC Decoders Based on the Information Bottleneck Method
The Information Bottleneck method is a powerful and generic tool from the field of machine learning. It compresses an observation to a quantized variable while attempting to preserve the mutual information shared with a relevant random variable.
Jan Lewandowsky, Gerhard Bauch
doaj +1 more source
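The trade-off the entry above describes, compressing an observation into a quantized variable while preserving mutual information with a relevance variable, can be sketched numerically. The following is a toy illustration only (the joint distribution and the grouping `f` are invented for this sketch, and it is not the decoder construction from the paper): a hard compression T = f(X) lowers I(X;T) while a well-chosen grouping keeps I(T;Y) intact.

```python
import math
from collections import defaultdict

def mutual_information(joint):
    """I(A;B) in bits from a dict {(a, b): probability}."""
    pa, pb = defaultdict(float), defaultdict(float)
    for (a, b), p in joint.items():
        pa[a] += p
        pb[b] += p
    return sum(p * math.log2(p / (pa[a] * pb[b]))
               for (a, b), p in joint.items() if p > 0)

# Invented toy joint distribution p(x, y) over 4 observations and 2 relevant labels.
p_xy = {(0, 'A'): 0.2, (1, 'A'): 0.2, (2, 'B'): 0.3, (3, 'B'): 0.3}

# Hard quantization t = f(x): merge observations 0,1 and 2,3.
f = {0: 't0', 1: 't0', 2: 't1', 3: 't1'}
p_ty = defaultdict(float)   # joint p(t, y)
p_xt = defaultdict(float)   # joint p(x, t)
for (x, y), p in p_xy.items():
    p_ty[(f[x], y)] += p
    p_xt[(x, f[x])] += p

# Relevant information is fully preserved: I(T;Y) = I(X;Y) ≈ 0.971 bits,
# while the representation cost drops: I(X;T) ≈ 0.971 bits vs H(X) ≈ 1.971 bits.
print(mutual_information(p_xy))        # I(X;Y)
print(mutual_information(dict(p_ty)))  # I(T;Y): relevant information kept
print(mutual_information(dict(p_xt)))  # I(X;T): compression cost
```

Because Y is deterministic given either X or T here, the quantizer loses no relevant information; an information-optimum LDPC decoder applies the same principle to channel messages at much larger alphabet sizes.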
Adversarial Information Bottleneck
10 pages, 7 figures, 2 …
Penglong Zhai, Shihua Zhang
openaire +3 more sources
Representation learning of graph-structured data is challenging because both graph structure and node features carry important information. Graph Neural Networks (GNNs) provide an expressive way to fuse information from network structure and node features. However, GNNs are prone to adversarial attacks.
Wu, Tailin +3 more
openaire +2 more sources
Information Bottleneck: Theory and Applications in Deep Learning
The information bottleneck (IB) framework, proposed in [...]
Bernhard C. Geiger, Gernot Kubin
doaj +1 more source
Supplier Bottleneck and Information Dissemination
This paper investigates the capacity decisions of complementary suppliers who produce different components of a final product. The suppliers solicit private forecast information from a buyer who has more precise information about the market than the suppliers do.
Meng Li, Yue Li, Yang Zhang
openaire +3 more sources
Convexity and Operational Interpretation of the Quantum Information Bottleneck Function
In classical information theory, the information bottleneck method (IBM) can be regarded as a method of lossy data compression which focuses on preserving meaningful (or relevant) information. As such it has recently gained a lot of attention, primarily …
Datta, Nilanjana +2 more
core +1 more source
The Conditional Entropy Bottleneck
Much of the field of Machine Learning exhibits a prominent set of failure modes, including vulnerability to adversarial examples, poor out-of-distribution (OoD) detection, miscalibration, and willingness to memorize random labelings of datasets.
Ian Fischer
doaj +1 more source
LDAcoop: Integrating non-linear population dynamics into the analysis of clonogenic growth in vitro
Limiting dilution assays (LDAs) quantify clonogenic growth by seeding serial dilutions of cells and scoring wells for colony formation. The fraction of negative wells is plotted against the number of cells seeded and analyzed with the non-linear model of LDAcoop.
Nikko Brix +13 more
wiley +1 more source

