Results 11 to 20 of about 58,822
Bounds for Kullback-Leibler divergence [PDF]
Entropy, conditional entropy and mutual information for discrete-valued random variables play important roles in information theory. The purpose of this paper is to present new bounds for the relative entropy $D(p||q)$ of two probability distributions
Pantelimon G. Popescu +3 more
doaj +2 more sources
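As a quick illustration of the quantity these bounds concern, the discrete relative entropy $D(p||q)$ can be computed directly from its definition (a minimal sketch, not code from the paper):

```python
import math

def kl_divergence(p, q):
    """Relative entropy D(p||q) in nats for discrete distributions.

    Assumes p and q are sequences of probabilities over the same
    support, with q[i] > 0 wherever p[i] > 0.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, q))  # ln(5/3) ≈ 0.5108
print(kl_divergence(p, p))  # 0.0: divergence vanishes iff p == q
```

Note that $D(p||q) \ne D(q||p)$ in general, which is why upper and lower bounds are stated for a fixed argument order.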
Local inconsistency detection using the Kullback–Leibler divergence measure [PDF]
Background The standard approach to local inconsistency assessment typically relies on testing the conflict between the direct and indirect evidence in selected treatment comparisons.
Loukia M. Spineli
doaj +2 more sources
Some Order Preserving Inequalities for Cross Entropy and Kullback–Leibler Divergence [PDF]
Cross entropy and Kullback–Leibler (K-L) divergence are fundamental quantities of information theory, and they are widely used in many fields. Since cross entropy is the negated logarithm of likelihood, minimizing cross entropy is equivalent to ...
Mateu Sbert +3 more
doaj +2 more sources
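The equivalence this snippet alludes to rests on the identity $H(p, q) = H(p) + D(p||q)$: since $H(p)$ does not depend on $q$, minimizing cross entropy in $q$ minimizes the K-L divergence. A small numerical check (illustrative only, not the paper's code):

```python
import math

def entropy(p):
    """Shannon entropy H(p) in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    """Cross entropy H(p, q) = -sum_i p_i log q_i, in nats."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

def kl(p, q):
    """Relative entropy D(p||q) from its definition."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p, q = [0.7, 0.3], [0.6, 0.4]
# H(p, q) = H(p) + D(p||q), up to floating-point rounding.
assert abs(cross_entropy(p, q) - (entropy(p) + kl(p, q))) < 1e-12
```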
By calculating the Kullback–Leibler divergence between two probability measures belonging to different exponential families dominated by the same measure, we obtain a formula that generalizes the ordinary Fenchel–Young divergence.
Frank Nielsen
doaj +1 more source
The Kullback-Leibler Divergence Class in Decoding the Chest Sound Pattern [PDF]
The Kullback-Leibler divergence, or relative entropy, is a special case of a broader class of divergences. It measures how one probability distribution diverges from a second, expected probability distribution. Kullback-Leibler divergence has a
Antonio CLIM, Razvan Daniel ZOTA
doaj +1 more source
Parameter Estimation Based on Cumulative Kullback–Leibler Divergence
In this paper, we propose some estimators for the parameters of a statistical model based on the Kullback–Leibler divergence of the survival function in the continuous setting and apply them to type I censored data.
Yaser Mehrali , Majid Asadi
doaj +1 more source
Statistical Estimation of the Kullback–Leibler Divergence
Asymptotic unbiasedness and L2-consistency are established, under mild conditions, for the estimates of the Kullback–Leibler divergence between two probability measures in Rd, absolutely continuous with respect to (w.r.t.) the Lebesgue measure.
Alexander Bulinski, Denis Dimitrov
doaj +1 more source
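The estimation problem this entry addresses (recovering $D(p||q)$ from samples of the two measures) can be illustrated with a naive histogram plug-in estimate. This is a toy sketch only; the paper studies estimators with proven asymptotic unbiasedness and L2-consistency, not this one:

```python
import math
import random

def kl_plugin(xs, ys, bins=20, lo=0.0, hi=1.0):
    """Naive plug-in estimate of D(p||q) from samples xs ~ p, ys ~ q.

    Histograms both samples on a common grid over [lo, hi] and applies
    the discrete KL formula; bins where either empirical mass is zero
    are skipped, which biases the estimate (illustration only).
    """
    width = (hi - lo) / bins
    def hist(zs):
        counts = [0] * bins
        for z in zs:
            counts[min(int((z - lo) / width), bins - 1)] += 1
        return [c / len(zs) for c in counts]
    p, q = hist(xs), hist(ys)
    return sum(pi * math.log(pi / qi)
               for pi, qi in zip(p, q) if pi > 0 and qi > 0)

random.seed(0)
xs = [random.random() for _ in range(5000)]        # samples from p = U(0, 1)
ys_same = [random.random() for _ in range(5000)]   # also from U(0, 1)
ys_diff = [random.betavariate(2, 5) for _ in range(5000)]  # from Beta(2, 5)
print(kl_plugin(xs, ys_same))  # near 0: same underlying distribution
print(kl_plugin(xs, ys_diff))  # clearly positive
```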
Model Averaging Estimation Method by Kullback–Leibler Divergence for Multiplicative Error Model
In this paper, we propose the model averaging estimation method for multiplicative error model and construct the corresponding weight choosing criterion based on the Kullback–Leibler divergence with a hyperparameter to avoid the problem of overfitting ...
Wanbo Lu, Wenhui Shi
doaj +1 more source
On Voronoi Diagrams on the Information-Geometric Cauchy Manifolds
We study the Voronoi diagrams of a finite set of Cauchy distributions and their dual complexes from the viewpoint of information geometry by considering the Fisher-Rao distance, the Kullback-Leibler divergence, the chi square divergence, and a flat ...
Frank Nielsen
doaj +1 more source
Kullback-Leibler Divergence and Mutual Information of Experiments in the Fuzzy Case
The main aim of this contribution is to define the notions of Kullback-Leibler divergence and conditional mutual information in fuzzy probability spaces and to derive the basic properties of the suggested measures.
Dagmar Markechová
doaj +1 more source

