Results 11 to 20 of about 1,878,628 (322)

Estimating mutual information [PDF]

open access: yesPhysical Review E, 2004
We present two classes of improved estimators for mutual information $M(X,Y)$, from samples of random points distributed according to some joint probability density $\mu(x,y)$. In contrast to conventional estimators based on binnings, they are based on entropy estimates from $k$-nearest neighbour distances. This means that they are data efficient (with $k=1$ we resolve structures down to the smallest possible scales), adaptive (the resolution is higher where data are more numerous), and have minimal bias.
Kraskov, A.   +2 more
openaire   +7 more sources
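A minimal NumPy/SciPy sketch of the nearest-neighbour approach described in this entry (the formula known as KSG estimator I), under assumptions not stated in the entry itself: brute-force O(N²) max-norm distances and continuous data with no tied coordinates. This is an illustration of the published formula, not the authors' reference implementation.

```python
import numpy as np
from scipy.special import digamma

def ksg_mi(x, y, k=3):
    """Sketch of KSG estimator I for M(X,Y), in nats.

    Brute-force O(N^2) distances; assumes continuous data, no ties.
    """
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    y = np.asarray(y, dtype=float).reshape(len(y), -1)
    n = len(x)
    # Max-norm (Chebyshev) distance matrices in each marginal space.
    dx = np.abs(x[:, None, :] - x[None, :, :]).max(axis=-1)
    dy = np.abs(y[:, None, :] - y[None, :, :]).max(axis=-1)
    dz = np.maximum(dx, dy)              # joint-space max-norm distance
    np.fill_diagonal(dz, np.inf)         # exclude self-pairs
    eps = np.sort(dz, axis=1)[:, k - 1]  # distance to k-th neighbour
    # n_x, n_y: marginal neighbours strictly closer than eps_i.
    nx = (dx < eps[:, None]).sum(axis=1) - 1   # -1 removes the point itself
    ny = (dy < eps[:, None]).sum(axis=1) - 1
    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))

# Sanity check against the exact value for correlated Gaussians:
rng = np.random.default_rng(0)
rho = 0.8
xy = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=1000)
print(ksg_mi(xy[:, 0], xy[:, 1]))   # estimate
print(-0.5 * np.log(1 - rho**2))    # exact MI for a bivariate Gaussian
```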

Lower bounds on mutual information [PDF]

open access: yesPhysical Review E, 2011
We correct claims about lower bounds on mutual information (MI) between real-valued random variables made in A. Kraskov et al., Phys. Rev. E 69, 066138 (2004). We show that non-trivial lower bounds on MI in terms of linear correlations depend on the marginal (single variable) distributions.
Foster, D.V., Grassberger, P.
openaire   +6 more sources
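For orientation, the Gaussian reference case (a standard fact, not quoted from this paper): for a bivariate normal with correlation coefficient $\rho$,
\[
  I(X;Y) \;=\; -\tfrac{1}{2}\,\ln\!\bigl(1-\rho^{2}\bigr),
\]
so a bound of this form in terms of $\rho$ alone can only be hoped for once the marginals are pinned down, which is the point the paper makes precise.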

Mutual information rate and bounds for it. [PDF]

open access: yesPLoS ONE, 2012
The amount of information exchanged per unit of time between two nodes in a dynamical network or between two data sets is a powerful concept for analysing complex systems.
Murilo S Baptista   +5 more
doaj   +7 more sources
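The quantity in question is the mutual information rate; for stationary processes it is conventionally defined as (standard definition, included here for context):
\[
  \mathrm{MIR}(X,Y) \;=\; \lim_{n\to\infty} \frac{1}{n}\,
  I\bigl(X_{1},\dots,X_{n};\,Y_{1},\dots,Y_{n}\bigr).
\]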

Holographic mutual information of two disjoint spheres

open access: yesJournal of High Energy Physics, 2018
We study quantum corrections to holographic mutual information for two disjoint spheres at a large separation by using the operator product expansion of the twist field.
Bin Chen   +3 more
doaj   +3 more sources
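The mutual information of two disjoint regions is built from entanglement entropies in the standard way (general definition, for orientation):
\[
  I(A:B) \;=\; S_{A} + S_{B} - S_{A\cup B},
\]
with $S$ the von Neumann entanglement entropy; the paper computes quantum corrections to the holographic value of this quantity via the twist-field OPE.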

Mutual Information and Multi-Agent Systems

open access: yesEntropy, 2022
We consider the use of Shannon information theory, and its various entropic terms, to aid in reaching optimal decisions in a multi-agent/team scenario.
Ira S. Moskowitz   +2 more
doaj   +1 more source
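The "various entropic terms" are tied together by the standard Shannon identities (textbook facts, not specific to this paper):
\[
  I(X;Y) \;=\; H(X) - H(X\mid Y) \;=\; H(X) + H(Y) - H(X,Y).
\]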

Mutual information for fermionic systems

open access: yesPhysical Review Research, 2022
We study the behavior of the mutual information (MI) in various quadratic fermionic chains, with and without pairing terms and both with short- and long-range hoppings.
Luca Lepori   +3 more
doaj   +1 more source
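For quadratic chains without pairing, block entropies (and hence MI) follow from the two-point function alone, via the eigenvalues of the restricted correlation matrix (Peschel's method). A sketch on an assumed open tight-binding chain at half filling; the model, sizes, and blocks here are illustrative, not the paper's:

```python
import numpy as np

def correlation_matrix(L, filled):
    """Ground-state correlations C_ij = <c_i^dag c_j> of an open
    tight-binding chain (an assumed free-fermion example)."""
    h = -(np.eye(L, k=1) + np.eye(L, k=-1))   # hopping Hamiltonian
    _, vecs = np.linalg.eigh(h)
    occ = vecs[:, :filled]                    # fill the lowest modes
    return occ @ occ.conj().T

def entropy(C, sites):
    """Von Neumann entropy of a subregion from the eigenvalues of the
    correlation matrix restricted to `sites`."""
    nu = np.linalg.eigvalsh(C[np.ix_(sites, sites)])
    nu = nu.clip(1e-12, 1 - 1e-12)            # avoid log(0)
    return float(-(nu * np.log(nu) + (1 - nu) * np.log(1 - nu)).sum())

L = 40
C = correlation_matrix(L, L // 2)             # half filling
A = list(range(0, 8))                         # two disjoint blocks
B = list(range(20, 28))
mi = entropy(C, A) + entropy(C, B) - entropy(C, A + B)
print(f"I(A:B) = {mi:.4f}")
```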

Testing Nonlinearity with Rényi and Tsallis Mutual Information with an Application in the EKC Hypothesis

open access: yesEntropy, 2022
The nature of dependence between random variables has always been the subject of many statistical problems for over a century. Yet today, there is a great deal of research on this topic, especially focusing on the analysis of nonlinearity. Shannon mutual information …
Elif Tuna   +4 more
doaj   +1 more source
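The generalized entropies underlying these measures, for a distribution $p$ and order $q \neq 1$ (standard definitions; the paper builds its dependence measures from them):
\[
  H^{R}_{q}(p) \;=\; \frac{1}{1-q}\,\ln\!\sum_{i} p_{i}^{\,q},
  \qquad
  H^{T}_{q}(p) \;=\; \frac{1}{q-1}\Bigl(1-\sum_{i} p_{i}^{\,q}\Bigr),
\]
both of which recover the Shannon entropy in the limit $q \to 1$.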

Information Bottleneck Analysis by a Conditional Mutual Information Bound

open access: yesEntropy, 2021
Task-nuisance decomposition describes why the information bottleneck loss I(z;x)−βI(z;y) is a suitable objective for supervised learning. The true category y is predicted for input x using latent variables z.
Taro Tezuka, Shizuma Namekawa
doaj   +1 more source
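Written as an optimization with the usual Markov constraint, the bottleneck loss quoted above reads (standard information bottleneck formulation):
\[
  \min_{p(z\mid x)} \; I(Z;X) \;-\; \beta\, I(Z;Y),
  \qquad Y \to X \to Z,
\]
where $\beta > 0$ trades compression of the input $x$ against retention of information about the category $y$.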
