Results 11 to 20 of about 1,878,628
Estimating mutual information [PDF]
We present two classes of improved estimators for mutual information $M(X,Y)$, from samples of random points distributed according to some joint probability density $\mu(x,y)$. In contrast to conventional estimators based on binnings, they are based on entropy estimates from $k$-nearest neighbour distances. This means that they are data efficient …
Kraskov, A. +2 more
openaire +7 more sources
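The entry above describes estimating MI from $k$-nearest-neighbour distances rather than histograms. A minimal sketch of that idea (the KSG "algorithm 1" form, using SciPy; the function name and the choice of `k` are illustrative, not from the abstract):

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def ksg_mutual_information(x, y, k=3):
    """k-NN (KSG-style) MI estimate in nats:
    I(X,Y) ~ psi(k) + psi(N) - <psi(n_x + 1) + psi(n_y + 1)>,
    where n_x, n_y count marginal-space points strictly inside the
    max-norm distance to the k-th neighbour in the joint space."""
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    y = np.asarray(y, dtype=float).reshape(len(y), -1)
    n = len(x)
    xy = np.hstack([x, y])
    # distance to the k-th neighbour in the joint space (max-norm);
    # column 0 is the point itself, so query k+1 neighbours
    d, _ = cKDTree(xy).query(xy, k=k + 1, p=np.inf)
    eps = d[:, -1]
    # count strictly closer points in each marginal space (exclude self)
    nx = cKDTree(x).query_ball_point(x, eps - 1e-12, p=np.inf,
                                     return_length=True) - 1
    ny = cKDTree(y).query_ball_point(y, eps - 1e-12, p=np.inf,
                                     return_length=True) - 1
    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))
```

For a bivariate Gaussian with correlation 0.9 the estimate should land near the exact value $-\tfrac12\ln(1-0.81)\approx 0.83$ nats.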
Lower bounds on mutual information [PDF]
We correct claims about lower bounds on mutual information (MI) between real-valued random variables made in A. Kraskov et al., Phys. Rev. E 69, 066138 (2004). We show that non-trivial lower bounds on MI in terms of linear correlations depend on the marginal (single-variable) distributions.
Foster, D.V., Grassberger, P.
openaire +6 more sources
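The correlation-based bounds discussed above are phrased against the jointly Gaussian case, where MI is an exact function of the Pearson correlation. A minimal sketch of that reference formula (the function name is illustrative; per the entry, this is not a universal lower bound, since such bounds depend on the marginals):

```python
import numpy as np

def gaussian_mi(rho):
    """Exact mutual information (in nats) of a bivariate Gaussian
    with Pearson correlation coefficient rho, |rho| < 1:
    I = -1/2 * ln(1 - rho^2)."""
    return -0.5 * np.log(1.0 - rho ** 2)
```

The value vanishes at `rho = 0` and diverges as `|rho| -> 1`, e.g. `gaussian_mi(0.9)` is about 0.83 nats.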
Mutual information rate and bounds for it. [PDF]
The amount of information exchanged per unit of time between two nodes in a dynamical network or between two data sets is a powerful concept for analysing complex systems.
Murilo S Baptista +5 more
doaj +7 more sources
Holographic mutual information of two disjoint spheres
We study quantum corrections to holographic mutual information for two disjoint spheres at a large separation by using the operator product expansion of the twist field.
Bin Chen +3 more
doaj +3 more sources
Mutual Information and Multi-Agent Systems
We consider the use of Shannon information theory, and its various entropic terms to aid in reaching optimal decisions that should be made in a multi-agent/Team scenario.
Ira S. Moskowitz +2 more
doaj +1 more source
LBF-MI: Limited Boolean Functions and Mutual Information to Infer a Gene Regulatory Network from Time-Series Gene Expression Data. [PDF]
Barman S +6 more
europepmc +3 more sources
Mutual information for fermionic systems
We study the behavior of the mutual information (MI) in various quadratic fermionic chains, with and without pairing terms and both with short- and long-range hoppings.
Luca Lepori +3 more
doaj +1 more source
The nature of dependence between random variables has been the subject of statistical study for over a century. Even today there is a great deal of research on this topic, especially focused on the analysis of nonlinearity. Shannon mutual …
Elif Tuna +4 more
doaj +1 more source
Mutual Information Neural-Estimation-Driven Constellation Shaping Design and Performance Analysis. [PDF]
Ji X, Wang Q, Qian L, Kam PY.
europepmc +3 more sources
Information Bottleneck Analysis by a Conditional Mutual Information Bound
Task-nuisance decomposition describes why the information bottleneck loss I(z;x)−βI(z;y) is a suitable objective for supervised learning, in which the true category y is predicted for input x using latent variables z.
Taro Tezuka, Shizuma Namekawa
doaj +1 more source
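The bottleneck loss in the entry above can be evaluated directly for discrete variables once the joint tables are known. A minimal sketch under that assumption (function names and the toy tables are illustrative, not from the paper):

```python
import numpy as np

def discrete_mi(p):
    """Mutual information (nats) of a 2-D joint probability table p[a, b]."""
    pa = p.sum(axis=1, keepdims=True)   # marginal of the row variable
    pb = p.sum(axis=0, keepdims=True)   # marginal of the column variable
    mask = p > 0                        # skip zero cells: 0 * log(0) := 0
    return float(np.sum(p[mask] * np.log(p[mask] / (pa * pb)[mask])))

def ib_objective(p_zx, p_zy, beta):
    """Information bottleneck loss I(z;x) - beta * I(z;y),
    given joint tables for (z, x) and for (z, y)."""
    return discrete_mi(p_zx) - beta * discrete_mi(p_zy)
```

A perfectly correlated binary pair carries ln 2 nats, while an independent pair carries zero, so the objective rewards a representation z that is informative about y and compressive about x.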

