Results 221 to 230 of about 335,764
Hierarchical Summary Statistics Encoding Across Primary Visual and Posterior Parietal Cortices
This study shows that mouse V1 simultaneously encodes the ensemble mean and variance of motion, providing a robust summary‐statistic representation that persists despite single‐neuron variability. These signals propagate to PPC, where they are transformed into abstract category representations during decision making.
Young‐Beom Lee +4 more
wiley +1 more source
A Survey on Information Bottleneck
IEEE Transactions on Pattern Analysis and Machine Intelligence
This survey is written in remembrance of one of the creators of the information bottleneck theory, Prof. Naftali Tishby, who passed away at the age of 68 in August 2021. Information bottleneck (IB), a novel information-theoretic approach for pattern analysis and representation learning, has gained widespread popularity since its birth in 1999.
Shizhe Hu +3 more
openaire +4 more sources
Information Bottleneck and Aggregated Learning
IEEE Transactions on Pattern Analysis and Machine Intelligence, 2023
We consider the problem of learning a neural network classifier. Under the information bottleneck (IB) principle, we associate with this classification problem a representation learning problem, which we call "IB learning". We show that IB learning is, in fact, equivalent to a special class of the quantization problem.
Soflaei, Masoumeh +4 more
openaire +2 more sources
A Spiking Neuron as Information Bottleneck
Neural Computation, 2010
Neurons receive thousands of presynaptic input spike trains while emitting a single output spike train. This drastic dimensionality reduction suggests considering a neuron as a bottleneck for information transmission. Extending recent results, we propose a simple learning rule for the weights of spiking neurons derived from the information bottleneck …
Buesing L., Maass W.
openaire +2 more sources
Information Bottleneck Problem Revisited
2019 57th Annual Allerton Conference on Communication, Control, and Computing (Allerton), 2019
In this paper, we revisit the information bottleneck problem, whose formulation and solution are of great importance in both information theory and statistical learning applications. We go into detail as to why the problem was first introduced and how the algorithm proposed to solve it using the Lagrangian method fell short of an exact solution.
Farhang Bayat, Shuangqing Wei
openaire +1 more source
Neural Information Bottleneck Decoding
2020 14th International Conference on Signal Processing and Communication Systems (ICSPCS), 2020
Receiver-sided channel decoding is a crucial, but computationally very demanding task. Recently, information-bottleneck-based decoding has received considerable attention in the literature, as it achieves very good performance with coarse quantization and low complexity.
Stark, Maximilian +2 more
openaire +1 more source
Distributed cooperative information bottleneck
2017 IEEE International Symposium on Information Theory (ISIT), 2017
This paper investigates a scenario where two distant nodes separately observe memoryless processes, namely X1 and X2, and can cooperate through multiple exchanges of messages with the goal of enabling a third node to learn “relevant information” (measured in terms of a multi-letter mutual information) about some hidden memoryless process Y, which is …
Vera, Matias +2 more
openaire +1 more source
The two-way cooperative Information Bottleneck
2015 IEEE International Symposium on Information Theory (ISIT), 2015
The two-way Information Bottleneck problem, where two nodes exchange information iteratively about two arbitrarily dependent memoryless sources, is considered. Based on the observations and the information exchange, each node is required to extract “relevant information”, measured in terms of the normalized mutual information, from two arbitrarily …
Vera, Matias +2 more
openaire +2 more sources
Contrastive Learning via Variational Information Bottleneck
IEEE Transactions on Pattern Analysis and Machine Intelligence
Recent advances in self-supervised learning have witnessed great achievements, especially with the introduction of contrastive learning, where the goal is to maximize the mutual information between different augmentations of the same image, i.e., positive pairs. However, such optimization does not necessarily correspond to an optimal representation due to …
Jin Li +7 more
openaire +2 more sources

