Rényi Divergence and Kullback-Leibler Divergence [PDF]
Rényi divergence is related to Rényi entropy much like Kullback-Leibler divergence is related to Shannon's entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as Kullback-Leibler divergence, and depends on a parameter that is called its order.
Tim van Erven, Peter Harremoës
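For context, the order-$\alpha$ Rényi divergence of $P$ from $Q$ over a discrete alphabet is, in its standard form,

$$D_\alpha(P\|Q) = \frac{1}{\alpha - 1} \log \sum_i p_i^\alpha q_i^{1-\alpha}, \qquad \alpha \in (0,1) \cup (1,\infty),$$

and letting $\alpha \to 1$ recovers the Kullback-Leibler divergence $D(P\|Q) = \sum_i p_i \log (p_i / q_i)$.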
Kullback Leibler divergence in complete bacterial and phage genomes [PDF]
The amino acid content of the proteins encoded by a genome may predict the coding potential of that genome and may reflect lifestyle restrictions of the organism.
Sajia Akhter +5 more
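A minimal sketch of the kind of comparison the title suggests, assuming each genome is summarized by the amino-acid frequencies of its encoded proteins; the sequences and function names below are illustrative, not taken from the paper:

    from collections import Counter
    from math import log

    AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

    def aa_distribution(proteome, pseudocount=1.0):
        """Smoothed relative amino-acid frequencies of a proteome string."""
        counts = Counter(proteome)
        total = len(proteome) + pseudocount * len(AMINO_ACIDS)
        return [(counts[a] + pseudocount) / total for a in AMINO_ACIDS]

    def kl_divergence(p, q):
        """D(p || q) = sum_i p_i * log(p_i / q_i), in nats."""
        return sum(pi * log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

    # Hypothetical usage: concatenated protein sequences of two genomes.
    phage = "MKLV" * 250
    host = "MKIV" * 250
    print(kl_divergence(aa_distribution(phage), aa_distribution(host)))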
A decision cognizant Kullback–Leibler divergence
In decision making systems involving multiple classifiers there is a need to assess classifier (in)congruence, that is, to gauge the degree of agreement between their outputs. A commonly used measure for this purpose is the Kullback–Leibler (KL) divergence. We propose a variant of the KL divergence, named the decision cognizant Kullback–Leibler divergence ...
Ponti, M +4 more
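For reference, the baseline measure the abstract mentions is the KL divergence between the two classifiers' output distributions, which scipy.stats.entropy computes directly. The sketch below shows only a plain symmetrized version, not the decision cognizant variant the paper proposes; the probability values are illustrative:

    import numpy as np
    from scipy.stats import entropy

    # Posterior class probabilities from two classifiers (illustrative values).
    p1 = np.array([0.7, 0.2, 0.1])
    p2 = np.array([0.5, 0.3, 0.2])

    # scipy.stats.entropy(p, q) returns the KL divergence D(p || q).
    kl_12 = entropy(p1, p2)
    kl_21 = entropy(p2, p1)

    # A simple symmetrized incongruence score; the paper's decision
    # cognizant variant modifies the KL divergence further.
    print(0.5 * (kl_12 + kl_21))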
Local inconsistency detection using the Kullback–Leibler divergence measure [PDF]
Background: The standard approach to local inconsistency assessment typically relies on testing the conflict between the direct and indirect evidence in selected treatment comparisons.
Loukia M. Spineli
Balancing Reconstruction Error and Kullback-Leibler Divergence in Variational Autoencoders [PDF]
Likelihood-based generative frameworks are receiving increasing attention in the deep learning community, mostly on account of their strong probabilistic foundation.
Andrea Asperti, Matteo Trentin
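The balance the authors study is commonly exposed as a weight $\beta$ on the KL term of the evidence lower bound; a minimal PyTorch sketch under that assumption (the paper's own balancing scheme may differ):

    import torch
    import torch.nn.functional as F

    def vae_loss(x, x_recon, mu, logvar, beta=1.0):
        """Reconstruction term plus beta-weighted KL divergence between
        the encoder posterior q(z|x) = N(mu, sigma^2) and the prior N(0, I)."""
        recon = F.mse_loss(x_recon, x, reduction="sum")
        # Closed form for KL(N(mu, sigma^2) || N(0, 1)), summed over dimensions.
        kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
        return recon + beta * kl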
Some Order Preserving Inequalities for Cross Entropy and Kullback–Leibler Divergence [PDF]
Cross entropy and Kullback–Leibler (K-L) divergence are fundamental quantities of information theory, and they are widely used in many fields. Since cross entropy is the negated logarithm of likelihood, minimizing cross entropy is equivalent to ...
Mateu Sbert +3 more
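The equivalence alluded to above follows from a standard decomposition, stated here for context:

$$H(p,q) = -\sum_i p_i \log q_i = H(p) + D(p\|q),$$

so for a fixed target distribution $p$, minimizing cross entropy in $q$ is the same as minimizing the K-L divergence, and when $p$ is the empirical distribution of a sample the cross entropy equals the average negative log-likelihood.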
Bounds for Kullback-Leibler divergence
Entropy, conditional entropy and mutual information for discrete-valued random variables play important roles in information theory. The purpose of this paper is to present new bounds for the relative entropy $D(p||q)$ of two probability distributions ...
Pantelimon G. Popescu +3 more
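For orientation, two classical bounds of this kind (not the paper's new results) are Gibbs' inequality and Pinsker's inequality, here in nats:

$$D(p\|q) \ge 0, \qquad D(p\|q) \ge \frac{1}{2}\Big(\sum_i |p_i - q_i|\Big)^2.$$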
Optimism in reinforcement learning and Kullback-Leibler divergence [PDF]
We consider model-based reinforcement learning in finite Markov Decision Processes (MDPs), focussing on so-called optimistic strategies. In MDPs, optimism can be implemented by carrying out extended value iterations under a constraint of consistency with the estimated model transition probabilities. The UCRL2 algorithm by Auer, Jaksch and Ortner ...
Sarah Filippi +2 more
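One common way to formalize the optimistic step described above (an assumption about the general scheme, not a quotation of the paper's algorithm) is an inner maximization, for each state-action pair $(s,a)$, over transition models within a KL ball around the empirical estimate $\hat{p}_{s,a}$:

$$\max_{p \,:\, D(\hat{p}_{s,a} \| p) \le \epsilon_{s,a}} \; \sum_{s'} p(s')\, V(s'),$$

where the radius $\epsilon_{s,a}$ shrinks as more transitions from $(s,a)$ are observed.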
Measuring Synchronization between Spikes and Local Field Potential Based on the Kullback-Leibler Divergence. [PDF]
Yin L, Zhang G, Yin F.
Efficient ECG classification based on the probabilistic Kullback-Leibler divergence
Diagnostic systems of cardiac arrhythmias face early and accurate detection challenges due to the overlap of electrocardiogram (ECG) patterns. Additionally, these systems must manage a huge number of features.
Dhiah Al-Shammary +5 more

