Results 11 to 20 of about 25,145 (258)

$f$-divergence Inequalities

open access: yesIEEE Transactions on Information Theory, 2016
This paper develops systematic approaches to obtain $f$-divergence inequalities, dealing with pairs of probability measures defined on arbitrary alphabets.
Sason, Igal, Verdú, Sergio
core   +2 more sources

On surrogate loss functions and $f$-divergences

open access: yesThe Annals of Statistics, 2008
The goal of binary classification is to estimate a discriminant function $\gamma$ from observations of covariate vectors and corresponding binary labels.
Jordan, Michael I.   +2 more
core   +6 more sources

$f$-Divergence Inequalities via Functional Domination [PDF]

open access: yes2016 IEEE International Conference on the Science of Electrical Engineering (ICSEE), 2016
This paper considers derivation of $f$-divergence inequalities via the approach of functional domination. Bounds on an $f$-divergence based on one or several other $f$-divergences are introduced, dealing with pairs of probability measures defined on ...
Sason, Igal, Verdú, Sergio
core   +2 more sources

A new quantum version of f-divergence [PDF]

open access: yes, 2018
This paper proposes and studies a new quantum version of $f$-divergences, a class of convex functionals of a pair of probability distributions that includes the Kullback-Leibler divergence, Rényi-type relative entropy, and so on.
A. Ebadian   +7 more
core   +2 more sources

f-Divergence constrained policy improvement [PDF]

open access: yes, 2018
To ensure stability of learning, state-of-the-art generalized policy iteration algorithms augment the policy improvement step with a trust region constraint bounding the information loss.
Belousov, Boris, Peters, Jan
core   +2 more sources

Recoverability for optimized quantum f-divergences [PDF]

open access: yesJournal of Physics A: Mathematical and Theoretical, 2021
The optimized quantum $f$-divergences form a family of distinguishability measures that includes the quantum relative entropy and the sandwiched Rényi relative quasi-entropy as special cases. In this paper, we establish physically meaningful refinements of the data-processing inequality for the optimized $f$-divergence.
Li Gao, Mark M Wilde
openaire   +3 more sources

Generalized Csiszár's f-divergence for Lipschitzian functions [PDF]

open access: yesMathematical Inequalities & Applications, 2021
We start with a generalization of Csiszár's f-divergence. We state and prove a Jensen-type inequality for L-Lipschitzian functions. Results for commonly used examples of f-divergences, such as the Kullback-Leibler divergence, the Hellinger divergence, the Rényi divergence, and the χ²-distance, are derived.
Pečarić D., Pečarić J., Pokaz D.
openaire   +3 more sources

Quantum f-divergences via Nussbaum–Szkoła distributions and applications to f-divergence inequalities

open access: yesReviews in Mathematical Physics, 2023
The main result in this paper shows that the quantum $f$-divergence of two states is equal to the classical $f$-divergence of the corresponding Nussbaum–Szkoła distributions. This provides a general framework for studying certain properties of quantum entropic quantities using the corresponding classical entities.
Androulakis, George, John, Tiju Cherian
openaire   +2 more sources

Joint range of f-divergences [PDF]

open access: yes2010 IEEE International Symposium on Information Theory, 2010
We provide a general method for evaluation of the joint range of f-divergences for two different functions f. Via topological arguments we prove that the joint range for general distributions equals the convex hull of the joint range achieved by the distributions on a two-element set.
Harremoës, Peter, Vajda, Igor
openaire   +2 more sources

Sharp Inequalities for $f$-Divergences [PDF]

open access: yesIEEE Transactions on Information Theory, 2014
$f$-divergences are a general class of divergences between probability measures which include as special cases many commonly used divergences in probability, mathematical statistics, and information theory, such as the Kullback-Leibler divergence, chi-squared divergence, squared Hellinger distance, and total variation distance.
Guntuboyina, Adityanand   +2 more
openaire   +2 more sources
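The special cases named in the abstract above all arise from one template: $D_f(P\|Q) = \sum_x q(x)\, f\!\left(p(x)/q(x)\right)$ for a convex generator $f$ with $f(1)=0$. A minimal sketch of this template for finite distributions (function names and the example distributions are illustrative, not taken from any of the listed papers; the generator formulas themselves are standard):

```python
import math

def f_divergence(p, q, f):
    """Generic f-divergence D_f(P||Q) for finite distributions p, q with q > 0."""
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q))

# Standard convex generators for the divergences named above:
kl        = lambda t: t * math.log(t) if t > 0 else 0.0  # Kullback-Leibler
chi2      = lambda t: (t - 1) ** 2                       # chi-squared
hellinger = lambda t: (math.sqrt(t) - 1) ** 2            # squared Hellinger
tv        = lambda t: 0.5 * abs(t - 1)                   # total variation

# Illustrative example distributions:
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(f_divergence(p, q, kl))   # KL(p||q)
print(f_divergence(p, q, tv))   # 0.5 * sum |p_i - q_i|
```

Each choice of generator recovers the corresponding named divergence; for instance the total-variation generator reproduces half the L1 distance between p and q.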
