Results 1 to 10 of about 1,878,628

Generalized Mutual Information [PDF]

open access: yes · Stats, 2020
Mutual information is one of the essential building blocks of information theory. It is, however, only finitely defined for distributions in a subclass of the general class of all distributions on a joint alphabet.
Zhiyi Zhang
doaj   +3 more sources
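For readers skimming these results, the quantity they all study has a simple discrete form: the Kullback–Leibler divergence between the joint distribution and the product of its marginals. A minimal sketch (function name and the 2×2 alphabets are illustrative, not from the paper):

```python
import numpy as np

def mutual_information(joint):
    """I(X;Y) = sum_{x,y} p(x,y) * log(p(x,y) / (p(x) p(y))) over a finite
    joint probability table, with the convention 0 * log 0 = 0 (nats)."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)   # marginal of X (rows)
    py = joint.sum(axis=0, keepdims=True)   # marginal of Y (columns)
    mask = joint > 0                        # skip zero-probability cells
    return float(np.sum(joint[mask] * np.log(joint[mask] / (px * py)[mask])))

# Independent variables: I(X;Y) = 0.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
# Perfectly dependent variables: I(X;Y) = log 2 (about 0.6931 nats).
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))
```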

Pathway analysis through mutual information. [PDF]

open access: yes · Bioinformatics, 2022
Pathway analysis comes in many forms. Most seek to establish a connection between the activity of a certain biological pathway and a difference in phenotype, often relying on an upstream differential expression analysis to establish the difference between case and control.
Jeuken GS, Käll L.
europepmc   +3 more sources

An Axiomatic Characterization of Mutual Information [PDF]

open access: yes · Entropy, 2023
We characterize mutual information as the unique map on ordered pairs of discrete random variables satisfying a set of axioms similar to those of Faddeev’s characterization of the Shannon entropy.
James Fullwood
doaj   +2 more sources

Mutual information bounded by Fisher information

open access: yes · Physical Review Research
We derive a general upper bound to mutual information in terms of the Fisher information. The bound may be further used to derive a lower bound for the Bayesian quadratic cost. These two provide alternatives to other inequalities in the literature ...
Wojciech Górecki   +3 more
doaj   +3 more sources

Quadratic Mutual Information Feature Selection [PDF]

open access: yes · Entropy, 2017
We propose a novel feature selection method based on quadratic mutual information which has its roots in Cauchy–Schwarz divergence and Rényi entropy. The method uses the direct estimation of quadratic mutual information from data samples using Gaussian ...
Davor Sluga, Uroš Lotrič
doaj   +3 more sources
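The estimator idea behind this abstract can be sketched concretely. Below is a rough plug-in estimate of the Cauchy–Schwarz quadratic mutual information between a 1-D feature and a discrete label via Gaussian kernel density estimates; the function name, bandwidth, and synthetic data are illustrative assumptions, not the authors' exact method:

```python
import numpy as np

def qmi_cs(x, y, sigma=0.5):
    """Plug-in Cauchy-Schwarz quadratic mutual information between a 1-D
    feature x and a discrete label y: I_CS = log(V_J * V_M / V_C**2).
    The information potentials V_J, V_M, V_C are integrals of products of
    kernel density estimates; they have closed forms because the
    convolution of two Gaussians of width sigma is a Gaussian of width
    sigma * sqrt(2)."""
    x, y = np.asarray(x, float), np.asarray(y)
    n = len(x)
    s = sigma * np.sqrt(2.0)
    d = x[:, None] - x[None, :]
    K = np.exp(-d**2 / (2 * s**2)) / (s * np.sqrt(2 * np.pi))
    _, inv, counts = np.unique(y, return_inverse=True, return_counts=True)
    p = counts / n                                       # class priors p(c)
    v_joint = K[y[:, None] == y[None, :]].sum() / n**2   # int sum_c p(x,c)^2 dx
    v_marg = (p**2).sum() * K.sum() / n**2               # int p(x)^2 dx * sum_c p(c)^2
    v_cross = (p[inv] * K.sum(axis=1)).sum() / n**2      # int sum_c p(x,c) p(x) p(c) dx
    return float(np.log(v_joint * v_marg / v_cross**2))

# Rank two candidate features: one depends on the label, one does not.
rng = np.random.default_rng(0)
labels = rng.integers(0, 2, 400)
informative = 2.0 * labels + 0.5 * rng.normal(size=400)  # class-dependent feature
noise = rng.normal(size=400)                             # label-independent feature
print(qmi_cs(informative, labels) > qmi_cs(noise, labels))  # True
```

By the Cauchy–Schwarz inequality the estimate is non-negative, and a feature-selection step would simply keep the highest-scoring features.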

Mutual information maximizing quantum generative adversarial networks [PDF]

open access: yes · Scientific Reports
One of the most promising applications in the era of Noisy Intermediate-Scale Quantum (NISQ) computing is quantum generative adversarial networks (QGANs), which offer significant quantum advantages over classical machine learning in various domains ...
Mingyu Lee   +3 more
doaj   +2 more sources

Quantum mutual information in time

open access: yes · New Journal of Physics
While the quantum mutual information is a fundamental measure of quantum information, it is only defined for spacelike separated quantum systems. Such a limitation is not present in the theory of classical information, where the mutual information ...
Zhen Wu   +3 more
doaj   +4 more sources

Factorized mutual information maximization [PDF]

open access: yes · Kybernetika, 2020
We investigate the sets of joint probability distributions that maximize the average multi-information over a collection of margins. These functionals serve as proxies for maximizing the multi-information of a set of variables or the mutual information of two subsets of variables, at a lower computation and estimation complexity.
Thomas Merkh, Guido Montúfar
openaire   +4 more sources

Error Exponents and α-Mutual Information

open access: yes · Entropy, 2021
Over the last six decades, the representation of error exponent functions for data transmission through noisy channels at rates below capacity has seen three distinct approaches: (1) through Gallager’s E0 functions (with and without cost constraints); (2) ...
Sergio Verdú
doaj   +1 more source

Finite Sample Based Mutual Information

open access: yes · IEEE Access, 2021
Mutual information is a popular metric in machine learning. In the case of a discrete target variable and a continuous feature variable, the mutual information can be calculated as a sum-integral of the weighted log-likelihood ratio of joint and marginal density ...
Khairan Rajab, Firuz Kamalov
doaj   +1 more source
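The sum-integral this abstract refers to can be evaluated numerically. A sketch assuming Gaussian class-conditional densities and a fixed integration grid (names, distributions, and grid are illustrative, not the paper's estimator):

```python
import numpy as np

def gaussian_pdf(x, mu, sd):
    return np.exp(-(x - mu)**2 / (2 * sd**2)) / (sd * np.sqrt(2 * np.pi))

def mi_discrete_continuous(priors, means, sds, grid):
    """I(X;Y) = sum_y p(y) * integral p(x|y) * log(p(x|y) / p(x)) dx for a
    discrete target Y and a continuous feature X, computed by summing over
    classes and integrating on a grid (Riemann sum)."""
    priors = np.asarray(priors, float)
    cond = np.array([gaussian_pdf(grid, m, s) for m, s in zip(means, sds)])
    marginal = priors @ cond          # p(x) = sum_y p(y) p(x|y)
    dx = grid[1] - grid[0]
    mi = 0.0
    for p_y, p_x_y in zip(priors, cond):
        mi += p_y * np.sum(p_x_y * np.log(p_x_y / marginal)) * dx
    return mi

grid = np.linspace(-10.0, 10.0, 4001)
# Identical class-conditionals: the feature says nothing about Y, so I(X;Y) = 0.
print(mi_discrete_continuous([0.5, 0.5], [0.0, 0.0], [1.0, 1.0], grid))
# Well-separated classes: I(X;Y) approaches H(Y) = log 2 nats.
print(mi_discrete_continuous([0.5, 0.5], [-4.0, 4.0], [1.0, 1.0], grid))
```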
