Results 21 to 30 of about 11,282

The AIC Criterion and Symmetrizing the Kullback–Leibler Divergence

open access: green, IEEE Transactions on Neural Networks, 2007
Abd‐Krim Seghouane   +1 more
openalex   +4 more sources

Statistical Divergences between Densities of Truncated Exponential Families with Nested Supports: Duo Bregman and Duo Jensen Divergences

open access: yes, Entropy, 2022
By calculating the Kullback–Leibler divergence between two probability measures belonging to different exponential families dominated by the same measure, we obtain a formula that generalizes the ordinary Fenchel–Young divergence.
Frank Nielsen
doaj   +1 more source
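The duo Bregman construction above extends a classical identity: within a single exponential family with log-normalizer F, the KL divergence between two members is exactly a Bregman divergence on the natural parameters. A sketch of that standard identity for context (the entry's contribution is its generalization to two different families with nested supports):

```latex
% KL divergence within one exponential family with log-normalizer F
% reduces to a Bregman divergence on the natural parameters:
D_{\mathrm{KL}}(p_{\theta_1} \parallel p_{\theta_2})
  = B_F(\theta_2 : \theta_1)
  = F(\theta_2) - F(\theta_1)
    - \langle \theta_2 - \theta_1,\ \nabla F(\theta_1) \rangle .
```

The duo Fenchel–Young divergence of the entry recovers a formula of this shape when the two densities belong to different truncated families dominated by the same measure.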

The Kullback-Leibler Divergence Class in Decoding the Chest Sound Pattern [PDF]

open access: yes, Informatică economică, 2019
The Kullback-Leibler divergence, or relative entropy, is a special case of a broader family of divergences. It quantifies how one probability distribution diverges from a second, expected probability distribution. Kullback-Leibler divergence has a ...
Antonio CLIM, Razvan Daniel ZOTA
doaj   +1 more source
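The "calculation of how one probability distribution diverges from another" in the entry above can be made concrete for discrete distributions. A minimal sketch (function name and zero-handling conventions are my own):

```python
import math

def kl_divergence(p, q):
    """D(p || q) = sum_i p_i * log(p_i / q_i) for discrete distributions.

    Conventions: terms with p_i = 0 contribute 0; the divergence is
    infinite if some q_i = 0 while p_i > 0 (support mismatch).
    """
    total = 0.0
    for pi, qi in zip(p, q):
        if pi == 0.0:
            continue
        if qi == 0.0:
            return math.inf
        total += pi * math.log(pi / qi)
    return total

p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, q))  # asymmetric: differs from kl_divergence(q, p)
```

Note the asymmetry: D(p‖q) and D(q‖p) generally differ, which is exactly what the AIC entry above (symmetrizing the KL divergence) is concerned with.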

Chained Kullback-Leibler divergences [PDF]

open access: yes, 2016 IEEE International Symposium on Information Theory (ISIT), 2016
We define and characterize the "chained" Kullback-Leibler divergence min_w D(p‖w) + D(w‖q), minimized over all intermediate distributions w, and the analogous k-fold chained K-L divergence min D(p‖w_{k-1}) + … + D(w_2‖w_1) + D(w_1‖q), minimized over the entire path (w_1, …, w_{k-1}).
Dmitri S. Pavlichin, Tsachy Weissman
openaire   +2 more sources
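The chained divergence min_w D(p‖w) + D(w‖q) can be bounded numerically. A sketch that searches only mixtures w = (1−t)p + tq, which is an illustrative restriction of my own — the true minimizer need not lie on that segment. Taking w = q shows the chained value never exceeds the ordinary D(p‖q):

```python
import math

def kl(p, q):
    """Discrete KL divergence; inputs here have full support."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def chained_kl_mixture(p, q, steps=1000):
    """Upper-bound min_w D(p||w) + D(w||q) by a grid search over
    mixtures w_t = (1-t) p + t q. Illustrative only: the optimal
    intermediate distribution need not be a mixture of p and q."""
    best = math.inf
    for i in range(steps + 1):
        t = i / steps
        w = [(1 - t) * pi + t * qi for pi, qi in zip(p, q)]
        best = min(best, kl(p, w) + kl(w, q))
    return best

p, q = [0.5, 0.5], [0.9, 0.1]
print(chained_kl_mixture(p, q), kl(p, q))  # chained value <= D(p||q)
```

The endpoint t = 1 recovers D(p‖q) + 0, so the grid-search value is always at most D(p‖q); interior mixtures typically do strictly better.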

ESTIMATION OF THE KULLBACK-LEIBLER DIVERGENCE

open access: green, 2003
The Kullback-Leibler (KL) divergence K(O, P) between a set O of probability measures (PMs) on ℝ^d and some PM P cannot be estimated by K(O, P_n) when O contains PMs whose support is not included in the support of the empirical measure P_n. We propose an estimation procedure which avoids any smoothing of P_n.
Michel Broniatowski
openalex   +3 more sources
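For contrast with the entry's smoothing-free procedure, here is the naive plug-in approach it improves on: histogram both samples on a shared grid and compute KL between the empirical histograms (bin count, range, and function name are illustrative assumptions of mine). Bins where Q has no mass but P does make the estimate infinite — the support problem the abstract describes:

```python
import math
import random

def plugin_kl_from_samples(xs, ys, bins, lo, hi):
    """Naive plug-in estimate of D(P||Q) from samples via a shared histogram.

    A bin with zero Q-count but positive P-count yields an infinite
    estimate -- the empirical-support issue the entry above addresses.
    """
    def hist(samples):
        width = (hi - lo) / bins
        counts = [0] * bins
        for x in samples:
            k = int((x - lo) / width)
            k = max(0, min(k, bins - 1))  # clamp out-of-range samples to edge bins
            counts[k] += 1
        n = len(samples)
        return [c / n for c in counts]

    p, q = hist(xs), hist(ys)
    total = 0.0
    for pi, qi in zip(p, q):
        if pi == 0:
            continue
        if qi == 0:
            return math.inf
        total += pi * math.log(pi / qi)
    return total

random.seed(0)
xs = [random.gauss(0.0, 1.0) for _ in range(5000)]
ys = [random.gauss(0.5, 1.0) for _ in range(5000)]
print(plugin_kl_from_samples(xs, ys, bins=20, lo=-4.0, hi=4.0))
```

As a reference point, the true value here is D(N(0,1) ‖ N(0.5,1)) = (0.5)²/2 = 0.125; the plug-in estimate fluctuates around it when the edge bins happen to be populated for both samples, and blows up to infinity when they are not.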

Model Averaging Estimation Method by Kullback–Leibler Divergence for Multiplicative Error Model

open access: yes, Complexity, 2022
In this paper, we propose a model averaging estimation method for the multiplicative error model and construct the corresponding weight-choosing criterion based on the Kullback–Leibler divergence with a hyperparameter to avoid the problem of overfitting ...
Wanbo Lu, Wenhui Shi
doaj   +1 more source

On Voronoi Diagrams on the Information-Geometric Cauchy Manifolds

open access: yes, Entropy, 2020
We study the Voronoi diagrams of a finite set of Cauchy distributions and their dual complexes from the viewpoint of information geometry by considering the Fisher–Rao distance, the Kullback–Leibler divergence, the chi-square divergence, and a flat ...
Frank Nielsen
doaj   +1 more source

Kullback-Leibler Divergence and Mutual Information of Experiments in the Fuzzy Case

open access: yes, Axioms, 2017
The main aim of this contribution is to define the notions of Kullback-Leibler divergence and conditional mutual information in fuzzy probability spaces and to derive the basic properties of the suggested measures.
Dagmar Markechová
doaj   +1 more source

Kullback–Leibler Divergence and Mutual Information of Partitions in Product MV Algebras

open access: yes, Entropy, 2017
The purpose of the paper is to introduce, using the known results concerning the entropy in product MV algebras, the concepts of mutual information and Kullback–Leibler divergence for the case of product MV algebras and examine algebraic properties of ...
Dagmar Markechová, Beloslav Riečan
doaj   +1 more source
