
Use of Kullback–Leibler divergence for forgetting

International Journal of Adaptive Control and Signal Processing, 2008
The non‐symmetric Kullback–Leibler divergence (KLD) measures the proximity of probability density functions (pdfs). Bernardo (Ann. Stat. 1979; 7(3):686–690) showed its unique role in the approximation of pdfs. The order of the KLD arguments is also implied by his methodological result.
Kárný, Miroslav, Andrýsek, Josef
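For reference, the standard (non-symmetric) form of the KLD between pdfs \(f\) and \(g\) is

\[ D(f \,\|\, g) = \int f(x) \ln \frac{f(x)}{g(x)} \, dx , \]

which in general differs from \(D(g \,\|\, f)\); this asymmetry is why the order of the arguments matters in approximation tasks.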

Distributions of the Kullback–Leibler divergence with applications

British Journal of Mathematical and Statistical Psychology, 2011
The Kullback–Leibler divergence (KLD) is a widely used measure of the fit between two distributions. In general, the distribution of the KLD itself is unknown. Under reasonable assumptions common in psychometrics, the KLD is shown to be asymptotically distributed as a scaled (non‐central) chi‐square with one degree of freedom.
Belov, Dmitry I., Armstrong, Ronald D.
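As a rough illustration of this kind of result (a minimal sketch under a Bernoulli model, not the paper's setting): for a Bernoulli MLE, \(2n\) times the plug-in KLD from the estimate to the truth is asymptotically chi-square with one degree of freedom.

    import numpy as np

    rng = np.random.default_rng(0)
    p_true, n, reps = 0.3, 5000, 20000  # illustrative parameters, not from the paper

    def kld_bernoulli(a, b):
        # KLD between Bernoulli(a) and Bernoulli(b)
        return a * np.log(a / b) + (1 - a) * np.log((1 - a) / (1 - b))

    # 2n * KLD(MLE, truth) should be approximately chi-square(1)
    phat = rng.binomial(n, p_true, size=reps) / n
    stat = 2 * n * kld_bernoulli(phat, p_true)
    print(stat.mean(), stat.var())  # expect roughly 1 and 2, the chi-square(1) moments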

Quantile-based cumulative Kullback–Leibler divergence

Statistics, 2017
The paper introduces a quantile-based cumulative Kullback–Leibler divergence and studies its various properties. Unlike the distribution-function approach, the quantile-based measure possesses some unique properties. Many quantile functions used in applied work do not have tractable distribution functions, in which case the proposed measure is a ...
S. M. Sunoj et al.
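As background (a standard change of variables, not necessarily the paper's exact construction): quantile-based measures typically arise by rewriting an integral over the support in terms of the quantile function \(Q(u) = F^{-1}(u)\) and the quantile density \(q(u) = Q'(u)\),

\[ \int_{-\infty}^{\infty} \phi\big(F(x), G(x)\big)\, dx = \int_0^1 \phi\big(u, G(Q(u))\big)\, q(u)\, du , \]

which lets such a measure be evaluated for models specified only through their quantile functions.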

Information Filtering Using Kullback-Leibler Divergence

IEEJ Transactions on Electronics, Information and Systems, 2005
In this paper we describe an information filtering system that uses the Kullback-Leibler divergence. Many information filtering systems have been proposed to cope with the flood of information. Since almost all information filtering systems are developed with techniques from information retrieval, machine learning, and pattern recognition, they often use ...
Hidekazu Yanagimoto, Sigeru Omatu
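A minimal sketch of such a filter (hypothetical names and smoothing choice, not the paper's method): score each incoming document by the KLD from its smoothed term distribution to a user-profile distribution, and keep documents whose score is small.

    import math
    from collections import Counter

    def term_dist(tokens, vocab, alpha=0.1):
        # Laplace-smoothed term distribution over a fixed vocabulary
        counts = Counter(tokens)
        total = len(tokens) + alpha * len(vocab)
        return {w: (counts[w] + alpha) / total for w in vocab}

    def kld(p, q):
        return sum(p[w] * math.log(p[w] / q[w]) for w in p)

    # hypothetical example tokens
    profile_tokens = ["kullback", "leibler", "divergence", "filtering"]
    doc_tokens = ["divergence", "filtering", "spam", "offer"]
    vocab = set(profile_tokens) | set(doc_tokens)

    score = kld(term_dist(doc_tokens, vocab), term_dist(profile_tokens, vocab))
    print(score)  # lower = closer to the user's profile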

Estimating the Kullback–Leibler Divergence

2014
We now investigate how the KLD rate can be estimated from a single empirical stationary trajectory, obtained from a stationary stochastic process whose dynamics are unknown. We assume that the empirical stationary trajectory contains \(n\) data points for one or several random variables denoted by the letter \(X\).
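One common plug-in approach in this literature (a sketch under assumptions; the chapter's estimator may differ) targets the KLD rate between the observed sequence and its time reverse, using empirical \(k\)-block frequencies:

    import math
    import random
    from collections import Counter

    def kld_rate_reverse(seq, k):
        # Plug-in estimate of the KLD rate between a stationary symbol
        # sequence and its time reverse, from empirical k-block frequencies.
        blocks = Counter(tuple(seq[i:i + k]) for i in range(len(seq) - k + 1))
        n = sum(blocks.values())
        d = 0.0
        for b, c in blocks.items():
            c_rev = blocks.get(b[::-1], 0)
            # (c/n)*log((c/n)/(c_rev/n)) = (c/n)*log(c/c_rev);
            # blocks unseen in reverse are skipped, which biases the estimate
            if c_rev > 0:
                d += (c / n) * math.log(c / c_rev)
        return d / k

    # An i.i.d. sequence is statistically reversible, so the estimate should be near 0.
    random.seed(1)
    seq = [random.choice("ab") for _ in range(10000)]
    print(kld_rate_reverse(seq, 3))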

Robust Active Stereo Vision Using Kullback-Leibler Divergence

IEEE Transactions on Pattern Analysis and Machine Intelligence, 2012
Active stereo vision is a method of 3D surface scanning in which a series of light patterns is projected and captured, and depth is derived from correspondences between the observed and projected patterns. In contrast, passive stereo vision recovers depth from correspondences between textured images from two or more cameras.
Wang, Yongchang, et al.

Estimation of Kullback–Leibler Divergence by Local Likelihood

Annals of the Institute of Statistical Mathematics, 2006
Lee, Young Kyung, Park, Byeong U.

Source Resolvability with Kullback-Leibler Divergence

2018 IEEE International Symposium on Information Theory (ISIT), 2018
The first- and second-order optimum achievable rates in the source resolvability problem are considered for general sources. In the literature, the achievable rates in the resolvability problem with respect to the variational distance, as well as the normalized Kullback-Leibler (KL) divergence, have already been analyzed. In this study, on the other hand, ...
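For context, and as an assumption about notation rather than the paper's exact setup: a resolvability scheme driven by uniform random bits outputs \(\tilde{X}^n\), and its accuracy is judged by requiring either

\[ d\big(P_{\tilde{X}^n}, P_{X^n}\big) \to 0 \qquad \text{or} \qquad \frac{1}{n} D\big(P_{\tilde{X}^n} \,\big\|\, P_{X^n}\big) \to 0 , \]

where \(d(\cdot,\cdot)\) denotes the variational distance.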

Generalized Kullback-Leibler Divergence

2017
The Kullback-Leibler divergence is a measure of how one probability distribution diverges from a second, expected probability distribution. It is also called relative entropy or the K-L distance. However, the word "distance" is used loosely here, since the K-L divergence lacks some metric properties: it is not symmetric and does not satisfy the triangle inequality.
Pokaz, Dora, Pečarić, Josip
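A quick numeric check of that asymmetry (illustrative values only), comparing two Bernoulli distributions:

    import math

    def kl(p, q):
        # KLD between Bernoulli(p) and Bernoulli(q); example values are illustrative
        return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

    print(kl(0.5, 0.9))  # ~0.511
    print(kl(0.9, 0.5))  # ~0.368 -> not symmetric, hence not a metric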

Parameter identifiability with Kullback–Leibler information divergence criterion

International Journal of Adaptive Control and Signal Processing, 2008
We study the problem of parameter identifiability under the Kullback–Leibler information divergence (KLID) criterion. KLID‐identifiability is defined and related to many other concepts of identifiability, such as identifiability under Fisher's information-matrix criterion, identifiability under the least-squares criterion, and ...
Chen, Badong, et al.
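One natural way to formalize such a criterion (our own sketch, not necessarily the paper's definition): a parameter \(\theta_0\) is KLID-identifiable when the divergence separates it from all other parameters,

\[ D\big(p_{\theta_0} \,\big\|\, p_{\theta}\big) = 0 \iff \theta = \theta_0 . \]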
