Local inconsistency detection using the Kullback–Leibler divergence measure [PDF]
Background: The standard approach to local inconsistency assessment typically relies on testing the conflict between the direct and indirect evidence in selected treatment comparisons.
Loukia M. Spineli
doaj +2 more sources
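The divergence used in the entry above can be made concrete with a minimal sketch: assuming the direct and indirect evidence for a comparison are each summarized as normal approximations (the means and standard errors below are illustrative placeholders, not data from the paper), the closed-form Kullback–Leibler divergence between two normal distributions quantifies their conflict.

```python
import math

def kl_normal(mu1, sigma1, mu2, sigma2):
    """KL divergence D(N(mu1, sigma1^2) || N(mu2, sigma2^2)) in nats."""
    return (math.log(sigma2 / sigma1)
            + (sigma1**2 + (mu1 - mu2)**2) / (2 * sigma2**2)
            - 0.5)

# Hypothetical direct and indirect estimates of the same treatment effect
# (e.g. a log odds ratio with its standard error); numbers are illustrative only.
direct   = (0.45, 0.12)   # (mean, standard error)
indirect = (0.10, 0.20)

print(kl_normal(*direct, *indirect))   # divergence of direct from indirect
print(kl_normal(*indirect, *direct))   # note the asymmetry of the measure
```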
Some Order Preserving Inequalities for Cross Entropy and Kullback–Leibler Divergence [PDF]
Cross entropy and Kullback–Leibler (K-L) divergence are fundamental quantities of information theory, and they are widely used in many fields. Since cross entropy is the negated logarithm of likelihood, minimizing cross entropy is equivalent to ...
Mateu Sbert +3 more
doaj +2 more sources
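A quick numerical check of the decomposition this entry relies on, H(p, q) = H(p) + D(p‖q), so that minimizing cross entropy over q is equivalent to minimizing the K-L divergence; the two discrete distributions below are illustrative assumptions.

```python
import numpy as np

p = np.array([0.2, 0.5, 0.3])   # "true" distribution (illustrative)
q = np.array([0.1, 0.6, 0.3])   # model distribution (illustrative)

entropy       = -np.sum(p * np.log(p))       # H(p)
cross_entropy = -np.sum(p * np.log(q))       # H(p, q)
kl            =  np.sum(p * np.log(p / q))   # D(p || q)

# Cross entropy decomposes as H(p, q) = H(p) + D(p || q), so for fixed p,
# minimizing cross entropy over q is the same as minimizing D(p || q).
assert np.isclose(cross_entropy, entropy + kl)
print(entropy, cross_entropy, kl)
```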
Bounds for Kullback-Leibler divergence
Entropy, conditional entropy and mutual information for discrete-valued random variables play important roles in information theory. The purpose of this paper is to present new bounds for the relative entropy $D(p\|q)$ of two probability distributions
Pantelimon G. Popescu +3 more
doaj +2 more sources
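As a point of reference for such bounds, the sketch below verifies a classical lower bound on relative entropy (Pinsker's inequality), which is not necessarily one of the bounds derived in the paper; the distributions are illustrative.

```python
import numpy as np

def kl(p, q):
    """Relative entropy D(p || q) in nats for discrete distributions."""
    return float(np.sum(p * np.log(p / q)))

p = np.array([0.4, 0.4, 0.2])
q = np.array([0.25, 0.25, 0.5])

l1 = np.abs(p - q).sum()   # total variation distance is l1 / 2
# Pinsker's inequality: D(p || q) >= (1/2) * ||p - q||_1^2  (in nats)
print(kl(p, q), 0.5 * l1**2)
assert kl(p, q) >= 0.5 * l1**2
```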
By calculating the Kullback–Leibler divergence between two probability measures belonging to different exponential families dominated by the same measure, we obtain a formula that generalizes the ordinary Fenchel–Young divergence.
Frank Nielsen
doaj +1 more source
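For context, within a single exponential family with log-normalizer F the Kullback–Leibler divergence already reduces to a Bregman divergence in the natural parameters, which can be written in Fenchel–Young form; the formula described above generalizes this classical identity to two different families dominated by the same measure.

```latex
% Exponential family p_\theta(x) = \exp(\theta^\top t(x) - F(\theta)) w.r.t. a base measure.
% Classical single-family identity; the paper's formula generalizes it to two families.
\begin{align*}
D_{\mathrm{KL}}(p_{\theta_1} \,\|\, p_{\theta_2})
  &= F(\theta_2) - F(\theta_1) - (\theta_2 - \theta_1)^\top \nabla F(\theta_1)
   \;=\; B_F(\theta_2 : \theta_1) \\
  &= F(\theta_2) + F^*(\eta_1) - \theta_2^\top \eta_1,
  \qquad \eta_1 = \nabla F(\theta_1),
\end{align*}
```

where the last line is the Fenchel–Young divergence built from F and its convex conjugate F*.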
The Kullback-Leibler Divergence Class in Decoding the Chest Sound Pattern [PDF]
The Kullback-Leibler divergence class, or relative entropy, is a special case of a broader family of divergences. It quantifies how one probability distribution diverges from a second, expected probability distribution. Kullback-Leibler divergence has a ...
Antonio Clim, Razvan Daniel Zota
doaj +1 more source
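For reference, the standard discrete form of the quantity the entry describes, together with the asymmetry that makes it a divergence rather than a distance:

```latex
% Discrete Kullback–Leibler divergence of P from the reference distribution Q.
D_{\mathrm{KL}}(P \,\|\, Q) = \sum_i p_i \log \frac{p_i}{q_i} \;\ge\; 0,
\qquad
D_{\mathrm{KL}}(P \,\|\, Q) \neq D_{\mathrm{KL}}(Q \,\|\, P) \text{ in general.}
```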
Chained Kullback-Leibler divergences [PDF]
We define and characterize the "chained" Kullback-Leibler divergence $\min_{w} D(p\|w) + D(w\|q)$, minimized over all intermediate distributions $w$, and the analogous $k$-fold chained K-L divergence $\min D(p\|w_{k-1}) + \cdots + D(w_2\|w_1) + D(w_1\|q)$, minimized over the entire path $(w_1, \ldots, w_{k-1})$.
Dmitri S. Pavlichin, Tsachy Weissman
openaire +2 more sources
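A rough numerical sketch of the two-link chained divergence above for Bernoulli endpoints, minimizing D(p‖w) + D(w‖q) over Bernoulli intermediates w by grid search; the endpoint parameters and the grid are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def kl_bern(a, b):
    """KL divergence between Bernoulli(a) and Bernoulli(b) in nats."""
    return a * np.log(a / b) + (1 - a) * np.log((1 - a) / (1 - b))

p, q = 0.8, 0.2                          # endpoint Bernoulli parameters (illustrative)
grid = np.linspace(0.001, 0.999, 9999)   # candidate intermediate parameters w

chain = kl_bern(p, grid) + kl_bern(grid, q)   # D(p||w) + D(w||q) along the grid
w_star = grid[np.argmin(chain)]

print(w_star, chain.min(), kl_bern(p, q))     # chained value vs. direct D(p||q)
```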
Rényi Divergence and Kullback-Leibler Divergence [PDF]
Rényi divergence is related to Rényi entropy much like Kullback-Leibler divergence is related to Shannon's entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as Kullback-Leibler divergence, and depends on a parameter that is called its order.
Tim van Erven, Peter Harremoës
openaire +2 more sources
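The order-α Rényi divergence has the standard closed form below for discrete distributions, and letting the order tend to 1 recovers the Kullback-Leibler divergence; the distributions in this sketch are illustrative.

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Rényi divergence of order alpha (alpha > 0, alpha != 1) in nats."""
    return np.log(np.sum(p**alpha * q**(1 - alpha))) / (alpha - 1)

p = np.array([0.2, 0.5, 0.3])   # illustrative distributions
q = np.array([0.4, 0.4, 0.2])

kl = np.sum(p * np.log(p / q))  # Kullback-Leibler divergence D(p || q)

# As the order approaches 1, the Rényi divergence approaches the K-L divergence.
for alpha in (0.5, 0.9, 0.999, 2.0):
    print(alpha, renyi_divergence(p, q, alpha))
print("KL:", kl)
```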
On Voronoi Diagrams on the Information-Geometric Cauchy Manifolds
We study the Voronoi diagrams of a finite set of Cauchy distributions and their dual complexes from the viewpoint of information geometry by considering the Fisher-Rao distance, the Kullback-Leibler divergence, the chi-square divergence, and a flat ...
Frank Nielsen
doaj +1 more source
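Of the divergences listed above, the Kullback-Leibler divergence between two Cauchy densities can be evaluated by straightforward numerical quadrature; the location and scale parameters below are illustrative, and the apparent symmetry in the output is a reported property of Cauchy pairs rather than of the K-L divergence in general.

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import cauchy

def kl_cauchy(loc1, scale1, loc2, scale2):
    """D_KL between Cauchy(loc1, scale1) and Cauchy(loc2, scale2) via quadrature."""
    p = lambda x: cauchy.pdf(x, loc1, scale1)
    q = lambda x: cauchy.pdf(x, loc2, scale2)
    integrand = lambda x: p(x) * np.log(p(x) / q(x))
    value, _ = quad(integrand, -np.inf, np.inf)
    return value

# Illustrative parameters; the integral is finite because both densities share
# the same polynomial tail decay.
print(kl_cauchy(0.0, 1.0, 1.0, 2.0))
print(kl_cauchy(1.0, 2.0, 0.0, 1.0))
```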
Model Averaging Estimation Method by Kullback–Leibler Divergence for Multiplicative Error Model
In this paper, we propose a model averaging estimation method for the multiplicative error model and construct the corresponding weight-choosing criterion based on the Kullback–Leibler divergence with a hyperparameter to avoid the problem of overfitting ...
Wanbo Lu, Wenhui Shi
doaj +1 more source
Kullback-Leibler Divergence and Mutual Information of Experiments in the Fuzzy Case
The main aim of this contribution is to define the notions of Kullback-Leibler divergence and conditional mutual information in fuzzy probability spaces and to derive the basic properties of the suggested measures.
Dagmar Markechová
doaj +1 more source
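In the classical (non-fuzzy) setting, the mutual information this entry refers to is itself a Kullback-Leibler divergence, between the joint distribution and the product of the marginals; the paper develops fuzzy analogues of these notions.

```latex
% Classical relationship that the fuzzy-case definitions mirror:
I(X;Y) = D_{\mathrm{KL}}\!\left(P_{XY} \,\big\|\, P_X \otimes P_Y\right)
       = \sum_{x,y} p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)}.
```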

