
Rényi Divergence and Kullback-Leibler Divergence [PDF]

open access: yes, IEEE Transactions on Information Theory, 2014
Rényi divergence is related to Rényi entropy much like Kullback-Leibler divergence is related to Shannon's entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as Kullback-Leibler divergence, and depends on a parameter that is called its order.
van Erven, Tim, Harremoës, Peter
openaire   +4 more sources
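The relationship this abstract describes can be made concrete with a small numerical sketch (illustrative only, not the paper's code): the Rényi divergence of order α for discrete distributions, which approaches the Kullback-Leibler divergence as α approaches 1.

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Renyi divergence D_alpha(P || Q) of order alpha != 1
    for discrete distributions given as probability vectors."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P || Q), the alpha -> 1 limit."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.sum(p * np.log(p / q))

p = [0.6, 0.3, 0.1]
q = [0.4, 0.4, 0.2]
# As alpha approaches 1, the Renyi divergence approaches the KL divergence.
print(renyi_divergence(p, q, 0.999), kl_divergence(p, q))
```

Taking α close to 1 shows the two quantities agreeing to several decimal places.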

Kullback Leibler divergence in complete bacterial and phage genomes [PDF]

open access: yes, PeerJ, 2017
The amino acid content of the proteins encoded by a genome may predict the coding potential of that genome and may reflect lifestyle restrictions of the organism.
Sajia Akhter   +5 more
doaj   +3 more sources

A decision cognizant Kullback–Leibler divergence

open access: yes, Pattern Recognition, 2017
In decision making systems involving multiple classifiers there is the need to assess classifier (in)congruence, that is to gauge the degree of agreement between their outputs. A commonly used measure for this purpose is the Kullback–Leibler (KL) divergence. We propose a variant of the KL divergence, named decision cognizant Kullback–Leibler divergence
Ponti, M   +4 more
openaire   +7 more sources

Local inconsistency detection using the Kullback–Leibler divergence measure [PDF]

open access: yes, Systematic Reviews
Background: The standard approach to local inconsistency assessment typically relies on testing the conflict between the direct and indirect evidence in selected treatment comparisons.
Loukia M. Spineli
doaj   +2 more sources

Balancing Reconstruction Error and Kullback-Leibler Divergence in Variational Autoencoders [PDF]

open access: gold, IEEE Access, 2020
Likelihood-based generative frameworks are receiving increasing attention in the deep learning community, mostly on account of their strong probabilistic foundation.
Andrea Asperti, Matteo Trentin
doaj   +2 more sources
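The reconstruction/KL trade-off this abstract refers to can be illustrated with the standard variational autoencoder objective. The sketch below is plain NumPy, not the paper's implementation: it uses the closed-form KL between a diagonal Gaussian posterior and a standard normal prior, with a hypothetical `beta` weight standing in for whatever balancing scheme one adopts.

```python
import numpy as np

def gaussian_kl(mu, log_var):
    """Closed-form KL(N(mu, diag(sigma^2)) || N(0, I)) for a diagonal
    Gaussian posterior -- the regularizer in a standard VAE loss."""
    return 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var)

def weighted_vae_loss(recon_err, mu, log_var, beta=1.0):
    """Hypothetical balanced objective: reconstruction error plus a
    beta-scaled KL term (beta-VAE-style weighting, for illustration)."""
    return recon_err + beta * gaussian_kl(mu, log_var)

# When the posterior equals the prior (mu = 0, log_var = 0) the KL term vanishes.
print(gaussian_kl(np.zeros(4), np.zeros(4)))
```

Varying `beta` shifts the optimum between faithful reconstruction (small `beta`) and a posterior close to the prior (large `beta`).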

Some Order Preserving Inequalities for Cross Entropy and Kullback–Leibler Divergence [PDF]

open access: yes, Entropy, 2018
Cross entropy and Kullback–Leibler (K-L) divergence are fundamental quantities of information theory, and they are widely used in many fields. Since cross entropy is the negated logarithm of likelihood, minimizing cross entropy is equivalent to ...
Mateu Sbert   +3 more
doaj   +2 more sources
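The connection the abstract draws between cross entropy and K-L divergence is the standard identity H(p, q) = H(p) + D(p‖q); a quick numerical check (illustrative, not from the paper) confirms it for discrete distributions.

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) in nats."""
    p = np.asarray(p, float)
    return -np.sum(p * np.log(p))

def cross_entropy(p, q):
    """Cross entropy H(p, q) in nats."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return -np.sum(p * np.log(q))

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) in nats."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.sum(p * np.log(p / q))

p, q = [0.7, 0.2, 0.1], [0.5, 0.3, 0.2]
# Identity: H(p, q) = H(p) + D(p || q), so minimizing cross entropy
# over q (with p fixed) is the same as minimizing the KL divergence.
print(cross_entropy(p, q), entropy(p) + kl(p, q))
```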

Bounds for Kullback-Leibler divergence

open access: yes, Electronic Journal of Differential Equations, 2016
Entropy, conditional entropy and mutual information for discrete-valued random variables play important roles in information theory. The purpose of this paper is to present new bounds for the relative entropy $D(p||q)$ of two probability distributions
Pantelimon G. Popescu   +3 more
doaj   +2 more sources
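The paper's new bounds are not reproduced here, but the flavor of such results can be shown with the classical Pinsker inequality, which lower-bounds $D(p||q)$ by twice the squared total variation distance (a standard result, used here purely for illustration).

```python
import numpy as np

def kl(p, q):
    """Relative entropy D(p || q) in nats."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.sum(p * np.log(p / q))

def total_variation(p, q):
    """Total variation distance: half the L1 distance between p and q."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return 0.5 * np.sum(np.abs(p - q))

p, q = [0.6, 0.3, 0.1], [0.3, 0.4, 0.3]
# Classical Pinsker bound (in nats): D(p || q) >= 2 * TV(p, q)^2
print(kl(p, q), 2.0 * total_variation(p, q) ** 2)
```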

Optimism in reinforcement learning and Kullback-Leibler divergence [PDF]

open access: green, 2010 48th Annual Allerton Conference on Communication, Control, and Computing (Allerton), 2010
We consider model-based reinforcement learning in finite Markov Decision Processes (MDPs), focussing on so-called optimistic strategies. In MDPs, optimism can be implemented by carrying out extended value iterations under a constraint of consistency with the estimated model transition probabilities. The UCRL2 algorithm by Auer, Jaksch and Ortner (
Sarah Filippi   +2 more
openalex   +4 more sources

Efficient ECG classification based on the probabilistic Kullback-Leibler divergence

open access: gold, Informatics in Medicine Unlocked
Diagnostic systems of cardiac arrhythmias face early and accurate detection challenges due to the overlap of electrocardiogram (ECG) patterns. Additionally, these systems must manage a huge number of features.
Dhiah Al-Shammary   +5 more
doaj   +2 more sources
