Results 171 to 180 of about 10,137
Some of the following articles may not be open access.

Kullback–Leibler Divergence Metric Learning

IEEE Transactions on Cybernetics, 2022
The Kullback-Leibler divergence (KLD), which is widely used to measure the similarity between two distributions, plays an important role in many applications. In this article, we address the KLD metric-learning task, which aims at learning the best KLD-type metric from the distributions of datasets.
Shuyi Ji   +5 more
openaire   +2 more sources
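
As background for the entries on this page, a minimal Python sketch of the discrete Kullback–Leibler divergence the abstract refers to (NumPy only; the paper's metric-learning procedure itself is not reproduced here):

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Discrete KL divergence D(p || q) = sum_i p_i * log(p_i / q_i).

    p and q are nonnegative arrays, normalized here to sum to 1;
    eps guards against log(0) and division by zero.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

# Example: KL is asymmetric, so D(p||q) != D(q||p) in general,
# which is why "KLD-type metric" learning is nontrivial.
p = [0.7, 0.2, 0.1]
q = [0.5, 0.3, 0.2]
print(kl_divergence(p, q), kl_divergence(q, p))
```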

The fractional Kullback–Leibler divergence

Journal of Physics A: Mathematical and Theoretical, 2021
The Kullback–Leibler divergence or relative entropy is generalised by deriving its fractional form. The conventional Kullback–Leibler divergence as well as other formulations emerge as special cases. It is shown that the fractional divergence encapsulates different relative entropy states via the manipulation of the fractional ...
openaire   +1 more source
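
For reference, the conventional divergence that the article generalises (the fractional form itself is derived in the paper and is not reproduced here): for densities p and q,

```latex
D_{\mathrm{KL}}(p \,\|\, q) = \int p(x)\,\ln\frac{p(x)}{q(x)}\,\mathrm{d}x ,
```

which, per the abstract, emerges from the fractional divergence as a special case.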

Kullback–Leibler divergence for evaluating bioequivalence

Statistics in Medicine, 2003
In this paper we propose a methodology for evaluating the bioequivalence of two formulations of a drug that encompasses not only average bioequivalence (ABE), but also the more recently introduced measures of population bioequivalence (PBE) and individual bioequivalence (IBE). The latter two measures are concerned with prescribability (PBE) and ...
Vladimir Dragalin   +3 more
openaire   +2 more sources
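
Bioequivalence criteria are typically evaluated on log-transformed pharmacokinetic measures, often modeled as normal. As background, a minimal Python sketch of the closed-form KL divergence between two univariate normals (an illustration only, not the paper's ABE/PBE/IBE criteria; the function name and numbers are made up):

```python
import math

def kl_normal(mu1, sigma1, mu2, sigma2):
    """Closed-form D( N(mu1, sigma1^2) || N(mu2, sigma2^2) )."""
    return (math.log(sigma2 / sigma1)
            + (sigma1**2 + (mu1 - mu2)**2) / (2 * sigma2**2)
            - 0.5)

# Hypothetical test vs. reference formulation on the log scale.
print(kl_normal(mu1=0.05, sigma1=0.30, mu2=0.00, sigma2=0.25))
```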

Kullback–Leibler divergence: A quantile approach

Statistics & Probability Letters, 2016
P.G. Sankaran   +2 more
openaire   +2 more sources

Kullback-Leibler Divergence Revisited

Proceedings of the ACM SIGIR International Conference on Theory of Information Retrieval, 2017
The KL divergence is the most commonly used measure for comparing query and document language models in the language modeling framework for ad hoc retrieval. Since KL is rank-equivalent to a specific weighted geometric mean, we examine alternative weighted means for language-model comparison, as well as alternative divergence measures.
Fiana Raiber, Oren Kurland
openaire   +1 more source
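
The rank equivalence noted in the abstract follows in one line. With $\theta_Q$ and $\theta_D$ the query and document unigram language models (standard notation, assumed here),

```latex
-D_{\mathrm{KL}}(\theta_Q \,\|\, \theta_D)
  = -\sum_w p(w \mid \theta_Q)\log p(w \mid \theta_Q)
    + \sum_w p(w \mid \theta_Q)\log p(w \mid \theta_D) .
```

The first sum is constant for a fixed query, so ranking by negative KL is equivalent to ranking by $\sum_w p(w \mid \theta_Q)\log p(w \mid \theta_D) = \log \prod_w p(w \mid \theta_D)^{p(w \mid \theta_Q)}$, i.e., by a geometric mean of document-model probabilities weighted by the query model.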

Kullback-Leibler Divergence-Based Visual Servoing

2021 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM), 2021
This paper proposes a Kullback-Leibler (K-L) divergence-based visual servoing scheme. K-L divergence, also known as relative entropy, is a measure of the difference between two probability distributions. By employing the K-L divergence as a new error metric to evaluate the similarity between the actual and desired images, and then formulating the ...
Xiangfei Li, Huan Zhao, Han Ding
openaire   +1 more source
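
A minimal sketch of an error metric in the spirit of the abstract, assuming grayscale images are flattened and normalized into probability distributions before comparison (the direction of the divergence and the function name are choices made here, not taken from the paper):

```python
import numpy as np

def image_kl_error(actual, desired, eps=1e-12):
    """KL divergence between two images treated as distributions.

    Each image is a nonnegative 2-D intensity array, flattened and
    normalized to sum to 1. Returns D(desired || actual) as the servoing
    error; it is zero exactly when the two distributions coincide.
    (Whether to use D(desired||actual) or the reverse is a modeling
    choice assumed here.)
    """
    p = np.asarray(desired, dtype=float).ravel()
    q = np.asarray(actual, dtype=float).ravel()
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))
```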

Model parameter learning using Kullback–Leibler divergence

Physica A: Statistical Mechanics and its Applications, 2018
Chungwei Lin   +4 more
openaire   +1 more source

Acoustic environment identification by Kullback–Leibler divergence

Forensic Science International, 2017
This paper presents a forensic methodology that determines, from among a set of candidate recording places, the probable place where a disputed digital audio recording was allegedly made. The methodology models digital audio recordings as noisy signals comprising two noise components.
G. Delgado-Gutiérrez   +3 more
openaire   +2 more sources
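
One way to operationalize "choose the most probable recording place" with KL divergence, sketched under the simplifying assumption that each environment's noise is summarized by a single Gaussian over some scalar feature (the paper's two-component noise model is not reproduced; all names and numbers below are illustrative):

```python
import math

def kl_normal(mu1, sigma1, mu2, sigma2):
    """Closed-form KL divergence between two univariate Gaussians."""
    return (math.log(sigma2 / sigma1)
            + (sigma1**2 + (mu1 - mu2)**2) / (2 * sigma2**2)
            - 0.5)

def most_likely_environment(query_stats, reference_stats):
    """Return the reference environment closest in KL to the query.

    query_stats is a (mean, std) summary of the disputed recording's
    noise; reference_stats maps environment names to (mean, std).
    """
    mu_q, sigma_q = query_stats
    return min(reference_stats,
               key=lambda env: kl_normal(mu_q, sigma_q, *reference_stats[env]))

# Hypothetical reference set (illustrative only).
refs = {"office": (0.2, 0.05), "car": (0.8, 0.20), "street": (0.5, 0.30)}
print(most_likely_environment((0.25, 0.06), refs))
```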

Blind Deblurring of Barcodes via Kullback-Leibler Divergence

IEEE Transactions on Pattern Analysis and Machine Intelligence, 2021
Barcode encoding schemes impose symbolic constraints which fix certain segments of the image. We present, implement, and assess a method for blind deblurring and denoising based entirely on Kullback-Leibler divergence. The method is designed to incorporate and exploit the full strength of barcode symbologies.
Gabriel Rioux   +4 more
openaire   +2 more sources
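
As background on deblurring "based entirely on Kullback-Leibler divergence": the classical Richardson-Lucy iteration minimizes the generalized KL divergence between the observed image and the re-blurred estimate for a known kernel. A minimal non-blind 1-D sketch of that building block follows (the paper's blind, symbology-constrained method is considerably more involved):

```python
import numpy as np

def richardson_lucy_1d(observed, kernel, n_iter=50, eps=1e-12):
    """Non-blind 1-D Richardson-Lucy deconvolution.

    Each multiplicative update decreases the generalized KL divergence
    between the observed signal and the re-blurred estimate; all
    signals are assumed nonnegative.
    """
    observed = np.asarray(observed, dtype=float)
    kernel = np.asarray(kernel, dtype=float)
    estimate = np.full_like(observed, observed.mean())
    flipped = kernel[::-1]  # correlation = convolution with flipped kernel
    for _ in range(n_iter):
        blurred = np.convolve(estimate, kernel, mode="same")
        ratio = observed / (blurred + eps)
        estimate = estimate * np.convolve(ratio, flipped, mode="same")
    return estimate
```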
