
Symplectic Bregman Divergences [PDF]

open access: yesEntropy
We present a generalization of Bregman divergences in finite-dimensional symplectic vector spaces that we term symplectic Bregman divergences. Symplectic Bregman divergences are derived from a symplectic generalization of the Fenchel–Young inequality ...
Frank Nielsen
doaj   +5 more sources

Equivalence of Informations Characterizes Bregman Divergences [PDF]

open access: yesEntropy
Bregman divergences form a class of distance-like comparison functions which plays fundamental roles in optimization, statistics, and information theory.
Philip S. Chodrow
doaj   +4 more sources
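
To make the class of functions in the two entries above concrete: a Bregman divergence is generated by a strictly convex, differentiable function F via D_F(x, y) = F(x) - F(y) - <grad F(y), x - y>, and choosing F(x) = ||x||^2 recovers the squared Euclidean distance. A minimal sketch in plain Python (the function names are illustrative, not drawn from the cited papers):

```python
def bregman_divergence(F, grad_F, x, y):
    """D_F(x, y) = F(x) - F(y) - <grad F(y), x - y>, for list-valued x, y."""
    gy = grad_F(y)
    inner = sum(g * (xi - yi) for g, xi, yi in zip(gy, x, y))
    return F(x) - F(y) - inner

# Generator F(x) = ||x||^2 gives back the squared Euclidean distance.
sq_norm = lambda v: sum(vi * vi for vi in v)
grad_sq_norm = lambda v: [2.0 * vi for vi in v]

x, y = [1.0, 2.0], [0.0, 0.5]
d = bregman_divergence(sq_norm, grad_sq_norm, x, y)  # = ||x - y||^2 = 3.25
```

Note that a Bregman divergence is generally asymmetric in its arguments, which is why it is called "distance-like" rather than a metric.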

Block-Active ADMM to Minimize NMF with Bregman Divergences [PDF]

open access: yesSensors, 2023
Over the last ten years, there has been a significant interest in employing nonnegative matrix factorization (NMF) to reduce dimensionality to enable a more efficient clustering analysis in machine learning.
Xinyao Li, Akhilesh Tyagi
doaj   +2 more sources

Statistical Divergences between Densities of Truncated Exponential Families with Nested Supports: Duo Bregman and Duo Jensen Divergences [PDF]

open access: yesEntropy, 2022
By calculating the Kullback–Leibler divergence between two probability measures belonging to different exponential families dominated by the same measure, we obtain a formula that generalizes the ordinary Fenchel–Young divergence.
Frank Nielsen
doaj   +2 more sources
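
The classical fact that the duo Bregman divergence above generalizes to nested supports: for two densities in the *same* exponential family, the Kullback–Leibler divergence reduces to a Bregman divergence on the cumulant function of the natural parameters. A quick numerical check for exponential distributions (rate lambda, natural parameter theta = -lambda, cumulant F(theta) = -log(-theta)); the helper names here are illustrative:

```python
from math import log

def kl_exponential(l1, l2):
    """Closed-form KL between Exp(l1) and Exp(l2): log(l1/l2) + l2/l1 - 1."""
    return log(l1 / l2) + l2 / l1 - 1.0

def bregman_cumulant(l1, l2):
    """Bregman divergence B_F(theta2, theta1) of the cumulant F(theta) = -log(-theta),
    with theta_i = -lambda_i; it matches the closed-form KL above."""
    t1, t2 = -l1, -l2
    F = lambda t: -log(-t)           # cumulant of the exponential family
    F_prime = lambda t: -1.0 / t     # its derivative
    return F(t2) - F(t1) - F_prime(t1) * (t2 - t1)

d1 = kl_exponential(2.0, 0.5)
d2 = bregman_cumulant(2.0, 0.5)  # agrees with d1 up to floating-point error
```

When the two densities instead belong to different families with nested supports, this identity no longer applies directly, which is the gap the duo Bregman and duo Jensen divergences of the entry above address.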

The Bregman chord divergence [PDF]

open access: yes, 2018
Distances are fundamental primitives whose choice significantly impacts the performance of algorithms in machine learning and signal processing. However, selecting the most appropriate distance for a given task is an endeavor.
A Banerjee   +25 more
core   +2 more sources

On Voronoi Diagrams on the Information-Geometric Cauchy Manifolds [PDF]

open access: yesEntropy, 2020
We study the Voronoi diagrams of a finite set of Cauchy distributions and their dual complexes from the viewpoint of information geometry by considering the Fisher-Rao distance, the Kullback-Leibler divergence, the chi square divergence, and a flat ...
Frank Nielsen
doaj   +2 more sources

Divergences Induced by the Cumulant and Partition Functions of Exponential Families and Their Deformations Induced by Comparative Convexity [PDF]

open access: yesEntropy
Exponential families are statistical models which are the workhorses in statistics, information theory, and machine learning, among others. An exponential family can either be normalized subtractively by its cumulant or free energy function, or ...
Frank Nielsen
doaj   +2 more sources

On a Generalization of the Jensen–Shannon Divergence and the Jensen–Shannon Centroid [PDF]

open access: yesEntropy, 2020
The Jensen–Shannon divergence is a renowned bounded symmetrization of the Kullback–Leibler divergence which does not require probability densities to have matching supports. In this paper, we introduce a vector-skew generalization of the scalar
Frank Nielsen
doaj   +2 more sources
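
The bounded symmetrization mentioned in the entry above is JS(p, q) = (1/2) KL(p || m) + (1/2) KL(q || m) with mixture m = (p + q)/2, which stays finite even when the supports of p and q differ. A plain-Python sketch for discrete distributions (illustrative names only):

```python
from math import log

def kl(p, q):
    """Kullback-Leibler divergence between two discrete distributions (in nats)."""
    return sum(pi * log(pi / qi) for pi, qi in zip(p, q) if pi > 0.0)

def jensen_shannon(p, q):
    """JS(p, q) = 0.5 KL(p || m) + 0.5 KL(q || m), with m = (p + q) / 2.

    Symmetric in p and q, and bounded above by log(2) nats."""
    m = [0.5 * (pi + qi) for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Supports differ (first and last atoms), yet the divergence is well-defined.
p, q = [0.5, 0.5, 0.0], [0.0, 0.5, 0.5]
js = jensen_shannon(p, q)
```

Because each KL term is taken against the mixture m, every atom where p or q is positive also has positive mass under m, which is exactly why matching supports are not required.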

Topological Data Analysis with Bregman Divergences [PDF]

open access: yes, 2016
Given a finite set in a metric space, the topological analysis generalizes hierarchical clustering using a 1-parameter family of homology groups to quantify connectivity in all dimensions.
Edelsbrunner, Herbert, Wagner, Hubert
core   +7 more sources

Adaptive Mixture Methods Based on Bregman Divergences [PDF]

open access: yesDigital Signal Processing, 2012
We investigate adaptive mixture methods that linearly combine outputs of $m$ constituent filters running in parallel to model a desired signal. We use "Bregman divergences" and obtain certain multiplicative updates to train the linear combination weights
Arenas-Garcia   +21 more
core   +6 more sources
