Results 21 to 30 of about 247,902
Estimating mutual information [PDF]
We present two classes of improved estimators for mutual information $M(X,Y)$, from samples of random points distributed according to some joint probability density $\mu(x,y)$. In contrast to conventional estimators based on binnings, they are based on entropy estimates from $k$-nearest neighbour distances. This means that they are data efficient (with ...
Kraskov, A. +2 more
openaire +4 more sources
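The k-nearest-neighbour approach described in this abstract can be sketched in a few lines. The following is a minimal, unoptimized illustration of a KSG-style estimator (digamma of the neighbour counts in the max norm); the function names and the brute-force distance loop are this sketch's own choices, not the authors' code.

```python
import numpy as np

def digamma_int(n):
    # psi(n) for a positive integer n: -gamma + sum_{j=1}^{n-1} 1/j
    return -0.5772156649015329 + sum(1.0 / j for j in range(1, n))

def ksg_mi(x, y, k=3):
    # k-NN mutual information estimate for 1-D samples x, y (nats),
    # using max-norm distances as in the KSG construction.
    n = len(x)
    pts = np.column_stack([x, y])
    mi = digamma_int(k) + digamma_int(n)
    for i in range(n):
        # max-norm distances from point i to all points (self included at 0)
        d = np.max(np.abs(pts - pts[i]), axis=1)
        eps = np.sort(d)[k]  # distance to the k-th nearest neighbour
        # neighbour counts strictly inside eps in each marginal, minus self
        nx = int(np.sum(np.abs(x - x[i]) < eps)) - 1
        ny = int(np.sum(np.abs(y - y[i]) < eps)) - 1
        mi -= (digamma_int(nx + 1) + digamma_int(ny + 1)) / n
    return mi
```

On strongly dependent data the estimate is large and positive; on independent data it fluctuates around zero, which is the data-efficiency property the abstract refers to.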
Mutual Information and Multi-Agent Systems
We consider the use of Shannon information theory, and its various entropic terms, to aid in reaching optimal decisions in a multi-agent/team scenario.
Ira S. Moskowitz +2 more
doaj +1 more source
Mutual information for fermionic systems
We study the behavior of the mutual information (MI) in various quadratic fermionic chains, with and without pairing terms and both with short- and long-range hoppings.
Luca Lepori +3 more
doaj +1 more source
The nature of dependence between random variables has been the subject of statistical study for over a century, yet there remains a great deal of research on this topic, especially focusing on the analysis of nonlinearity. Shannon mutual ...
Elif Tuna +4 more
doaj +1 more source
Information Bottleneck Analysis by a Conditional Mutual Information Bound
Task-nuisance decomposition explains why the information bottleneck loss I(z;x) − βI(z;y) is a suitable objective for supervised learning, in which the true category y is predicted for an input x through latent variables z.
Taro Tezuka, Shizuma Namekawa
doaj +1 more source
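For discrete variables, the information bottleneck loss quoted in this abstract can be evaluated directly from a joint table p(x, y) and an encoder p(z | x), using the Markov-chain assumption z ← x → y so that p(z, y) = Σₓ p(z|x) p(x, y). A minimal sketch (function names are illustrative, not from the paper):

```python
import numpy as np

def mi_from_joint(p):
    # Mutual information (nats) from a 2-D joint probability table.
    px = p.sum(axis=1, keepdims=True)
    py = p.sum(axis=0, keepdims=True)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / (px @ py)[mask])))

def ib_loss(p_xy, enc, beta):
    # enc[x, z] = p(z | x); returns I(z;x) - beta * I(z;y).
    px = p_xy.sum(axis=1)
    p_xz = enc * px[:, None]   # joint p(x, z)
    p_zy = enc.T @ p_xy        # joint p(z, y), via the chain z <- x -> y
    return mi_from_joint(p_xz) - beta * mi_from_joint(p_zy)
```

A deterministic identity encoder pays the full I(z;x) = H(x) compression cost, while a constant encoder drives both terms, and hence the loss, to zero.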
Achievable information rate optimization in C-band optical fiber communication system
Optical fiber communication networks play an important role in the global telecommunication network. However, nonlinear effects in the optical fiber and transceiver noise greatly limit the performance of fiber communication systems.
Zheng Liu +5 more
doaj +1 more source
An Axiomatic Characterization of Mutual Information
We characterize mutual information as the unique map on ordered pairs of discrete random variables satisfying a set of axioms similar to those of Faddeev’s characterization of the Shannon entropy.
James Fullwood
doaj +1 more source
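The object being axiomatized here is the ordinary mutual information of a pair of discrete random variables, which can be computed from entropies as I(X;Y) = H(X) + H(Y) − H(X,Y). A short numerical check (the helper name and example table are this sketch's own):

```python
import numpy as np

def entropy(p):
    # Shannon entropy in bits of a probability table (any shape).
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# X and Y are perfectly correlated fair bits.
p_xy = np.array([[0.5, 0.0],
                 [0.0, 0.5]])
I = entropy(p_xy.sum(axis=1)) + entropy(p_xy.sum(axis=0)) - entropy(p_xy)
```

Here each marginal has 1 bit of entropy and so does the joint, giving I = 1 bit, as expected for two identical fair coin flips.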
The kernel mutual information [PDF]
We introduce a new contrast function, the kernel mutual information (KMI), to measure the degree of independence of continuous random variables. This contrast function provides an approximate upper bound on the mutual information, as measured near independence, and is based on a kernel density estimate of the mutual information between a discretised ...
Gretton, A., Herbrich, R., Smola, A.
openaire +3 more sources
Hashing with Mutual Information [PDF]
Binary vector embeddings enable fast nearest neighbor retrieval in large databases of high-dimensional objects, and play an important role in many practical applications, such as image and video retrieval. We study the problem of learning binary vector embeddings under a supervised setting, also known as hashing.
Fatih Cakir +3 more
openaire +3 more sources
Mutual Information between Order Book Layers
The order book is a list of all current buy or sell orders for a given financial security. The rise of electronic stock exchanges introduced a debate about the relevance of the information it encapsulates about the activity of traders.
Daniel Libman +3 more
doaj +1 more source

