Results 11 to 20 of about 7,842

Statistical Divergences between Densities of Truncated Exponential Families with Nested Supports: Duo Bregman and Duo Jensen Divergences [PDF]

open access: yes (Entropy, 2022)
By calculating the Kullback–Leibler divergence between two probability measures belonging to different exponential families dominated by the same measure, we obtain a formula that generalizes the ordinary Fenchel–Young divergence.
Frank Nielsen
doaj   +2 more sources
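The classical identity this abstract generalizes can be sketched as follows (a standard result for a single exponential family with cumulant function F; the paper's duo divergences extend it to two different families with nested supports):

```latex
% KL divergence between two members of the same exponential family reduces to a
% Fenchel–Young divergence on the (natural, expectation) parameters, equivalently
% a Bregman divergence on the natural parameters:
D_{\mathrm{KL}}(p_{\theta_1} \,\|\, p_{\theta_2})
  = F(\theta_2) + F^{*}(\eta_1) - \langle \theta_2, \eta_1 \rangle
  = B_F(\theta_2 : \theta_1),
\qquad \eta_1 = \nabla F(\theta_1),
% where F^* is the Legendre–Fenchel conjugate of the cumulant F.
```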

Block-Active ADMM to Minimize NMF with Bregman Divergences [PDF]

open access: yes (Sensors, 2023)
Over the last ten years, there has been significant interest in employing nonnegative matrix factorization (NMF) for dimensionality reduction, enabling more efficient clustering analysis in machine learning.
Xinyao Li, Akhilesh Tyagi
doaj   +2 more sources

Update of Prior Probabilities by Minimal Divergence [PDF]

open access: yes (Entropy, 2021)
The present paper investigates the update of an empirical probability distribution with the results of a new set of observations. The update reproduces the new observations and interpolates using prior information.
Jan Naudts
doaj   +2 more sources

Divergence and Sufficiency for Convex Optimization

open access: yes (Entropy, 2017)
Logarithmic score and information divergence appear in information theory, statistics, statistical mechanics, and portfolio theory. We demonstrate that all these topics involve some kind of optimization that leads directly to regret functions and such ...
Peter Harremoës
doaj   +5 more sources

Anomaly Detection in High-Dimensional Time Series Data with Scaled Bregman Divergence [PDF]

open access: yes (Algorithms)
Anomaly detection aims to identify data points or patterns that deviate significantly from the expected or typical behavior of the majority of the data; it has a wide range of applications across domains.
Yunge Wang   +4 more
doaj   +2 more sources

Learning to Approximate a Bregman Divergence [PDF]

open access: yes (2020)
Bregman divergences generalize measures such as the squared Euclidean distance and the KL divergence, and arise throughout many areas of machine learning.
David Castanon   +4 more
core   +2 more sources
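The generalization mentioned in this abstract can be illustrated with a small sketch (standard definitions, not code from the paper): the Bregman divergence of a convex generator recovers the squared Euclidean distance and the KL divergence as special cases.

```python
import numpy as np

def bregman_divergence(phi, grad_phi, x, y):
    """D_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y>."""
    return phi(x) - phi(y) - np.dot(grad_phi(y), x - y)

# Generator phi(x) = ||x||^2 recovers the squared Euclidean distance.
sq = lambda x: np.dot(x, x)
grad_sq = lambda x: 2.0 * x
x = np.array([1.0, 2.0])
y = np.array([0.0, 0.0])
print(bregman_divergence(sq, grad_sq, x, y))  # 5.0 = ||x - y||^2

# Generator phi(p) = sum p log p (negative Shannon entropy) recovers the
# KL divergence on probability vectors.
negent = lambda p: np.sum(p * np.log(p))
grad_negent = lambda p: np.log(p) + 1.0
p = np.array([0.4, 0.6])
q = np.array([0.5, 0.5])
kl = np.sum(p * np.log(p / q))
print(np.isclose(bregman_divergence(negent, grad_negent, p, q), kl))  # True
```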

Understanding Higher-Order Interactions in Information Space [PDF]

open access: yes (Entropy)
Methods used in topological data analysis naturally capture higher-order interactions in point cloud data embedded in a metric space. This methodology was recently extended to data living in an information space, by which we mean a space measured with an
Herbert Edelsbrunner   +2 more
doaj   +2 more sources

Revisiting Chernoff Information with Likelihood Ratio Exponential Families [PDF]

open access: yes (Entropy, 2022)
The Chernoff information between two probability measures is a statistical divergence measuring their deviation, defined as their maximally skewed Bhattacharyya distance.
Frank Nielsen
doaj   +2 more sources
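The definition this abstract refers to can be written out as follows (the standard formula for Chernoff information, not the paper's likelihood-ratio reformulation):

```latex
% Chernoff information as the maximally skewed Bhattacharyya distance:
C(p, q) = \max_{\alpha \in (0,1)} \; -\log \int p(x)^{\alpha}\, q(x)^{1-\alpha}\, \mathrm{d}\mu(x),
% where, for fixed \alpha, the integral is the skewed Bhattacharyya coefficient.
```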

Divergences Induced by the Cumulant and Partition Functions of Exponential Families and Their Deformations Induced by Comparative Convexity [PDF]

open access: yes (Entropy)
Exponential families are statistical models that serve as workhorses in statistics, information theory, and machine learning, among other fields. An exponential family can be normalized either subtractively by its cumulant or free energy function, or ...
Frank Nielsen
doaj   +2 more sources
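The two normalizations named in this abstract can be sketched with the standard exponential-family definitions (not the paper's comparative-convexity deformations):

```latex
% Subtractive normalization by the cumulant (log-partition / free energy) F:
p_\theta(x) = \exp\big(\langle \theta, t(x) \rangle - F(\theta)\big),
\qquad
% equivalently, divisive normalization by the partition function Z:
p_\theta(x) = \frac{\exp \langle \theta, t(x) \rangle}{Z(\theta)},
\qquad F(\theta) = \log Z(\theta).
```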

Fast Proxy Centers for the Jeffreys Centroid: The Jeffreys–Fisher–Rao Center and the Gauss–Bregman Inductive Center [PDF]

open access: yes (Entropy)
The symmetric Kullback–Leibler centroid, also called the Jeffreys centroid, of a set of mutually absolutely continuous probability distributions on a measure space provides a notion of centrality which has proven useful in many tasks, including ...
Frank Nielsen
doaj   +2 more sources
