The Jensen-Shannon divergence is a renowned bounded symmetrization of the unbounded Kullback-Leibler divergence which measures the total Kullback-Leibler divergence to the average mixture distribution. However, the Jensen-Shannon divergence between Gaussian ...
Nielsen, Frank
core +3 more sources
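The snippet above characterizes the Jensen-Shannon divergence as the average Kullback-Leibler divergence to the mixture distribution m = (p + q)/2. A minimal sketch for discrete distributions (the function names `kl` and `jsd` are illustrative, not from the cited paper):

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence KL(p || q), in bits, for discrete distributions."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0  # 0 * log(0/q) = 0 by convention
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

def jsd(p, q):
    """Jensen-Shannon divergence: average KL divergence to the mixture m = (p+q)/2."""
    m = 0.5 * (np.asarray(p, float) + np.asarray(q, float))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p = [0.9, 0.1, 0.0]
q = [0.1, 0.1, 0.8]
print(jsd(p, q))  # symmetric in p and q, and bounded by 1 bit
```

Unlike the plain Kullback-Leibler divergence, this quantity is symmetric and stays finite even where the supports of p and q differ, since the mixture m is positive wherever either distribution is.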
ON THE JENSEN-SHANNON DIVERGENCE AND THE VARIATION DISTANCE FOR CATEGORICAL PROBABILITY DISTRIBUTIONS [PDF]
We establish a decomposition of the Jensen-Shannon divergence into a linear combination of a scaled Jeffreys' divergence and a reversed Jensen-Shannon divergence.
Corander, Jukka +2 more
core +3 more sources
We generalize the Jensen-Shannon divergence and the Jensen-Shannon diversity index by considering a variational definition with respect to a generic mean, thereby extending the notion of Sibson’s information radius.
Frank Nielsen
doaj +1 more source
In this work, we first consider the discrete version of the Fisher information measure and then propose Jensen–Fisher information to develop some associated results.
Omid Kharazmi +1 more
doaj +1 more source
Quantifying the Dissimilarity of Texts
Quantifying the dissimilarity of two texts is an important aspect of a number of natural language processing tasks, including semantic information retrieval, topic classification, and document clustering.
Benjamin Shade, Eduardo G. Altmann
doaj +1 more source
Refined Young Inequality and Its Application to Divergences
We give bounds on the difference between the weighted arithmetic mean and the weighted geometric mean. These imply refined Young inequalities and reverses of the Young inequality. We also study some properties of the difference between the weighted ...
Shigeru Furuichi, Nicuşor Minculete
doaj +1 more source
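The inequality being refined is the weighted arithmetic mean–geometric mean inequality, which is the classical form of Young's inequality: for positive reals and a weight ν,

```latex
a^{\nu}\, b^{1-\nu} \;\le\; \nu a + (1-\nu) b, \qquad a, b > 0,\; \nu \in [0, 1].
```

Refined and reversed versions bound the nonnegative gap \(\nu a + (1-\nu) b - a^{\nu} b^{1-\nu}\) from below and above, which is what yields the divergence bounds the abstract refers to.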
Fault Detection Based on Multi-Dimensional KDE and Jensen–Shannon Divergence
Weak fault signals, high coupling data, and unknown faults commonly exist in fault diagnosis systems, causing low detection and identification performance of fault diagnosis methods based on T2 statistics or cross entropy. This paper proposes a new fault ...
Juhui Wei +4 more
doaj +1 more source
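The detection idea of comparing density estimates with the Jensen-Shannon divergence can be sketched in one dimension (a simplifying assumption: the paper's method is multi-dimensional, and the names `kde_1d` and `jsd_from_samples`, the fixed bandwidth, and the grid discretization are illustrative choices, not the paper's):

```python
import numpy as np

def kde_1d(sample, grid, bandwidth=0.5):
    """Fixed-bandwidth Gaussian kernel density estimate evaluated on a grid."""
    z = (grid[:, None] - sample[None, :]) / bandwidth
    return np.exp(-0.5 * z**2).mean(axis=1) / (bandwidth * np.sqrt(2.0 * np.pi))

def jsd_from_samples(x, y, grid_size=512, bandwidth=0.5):
    """Jensen-Shannon divergence (in bits) between KDE estimates of two 1-D
    samples, discretized on a shared grid."""
    grid = np.linspace(min(x.min(), y.min()), max(x.max(), y.max()), grid_size)
    p = kde_1d(x, grid, bandwidth)
    q = kde_1d(y, grid, bandwidth)
    p, q = p / p.sum(), q / q.sum()  # treat grid values as a discrete distribution
    m = 0.5 * (p + q)
    eps = 1e-300                     # guard against log(0)
    kl_pm = float(np.sum(p * np.log2((p + eps) / (m + eps))))
    kl_qm = float(np.sum(q * np.log2((q + eps) / (m + eps))))
    return 0.5 * kl_pm + 0.5 * kl_qm

rng = np.random.default_rng(0)
normal_op = rng.normal(0.0, 1.0, 1000)  # healthy operating data
faulty_op = rng.normal(3.0, 1.0, 1000)  # mean shift, simulating a fault
print(jsd_from_samples(normal_op, faulty_op))  # well above the near-zero healthy baseline
```

A fault is flagged when the divergence between the current window's density estimate and the healthy reference exceeds a threshold; the boundedness of the Jensen-Shannon divergence makes such a threshold easier to set than one on an unbounded cross-entropy statistic.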
On a Generalization of the Jensen–Shannon Divergence and the Jensen–Shannon Centroid
The Jensen–Shannon divergence is a renowned bounded symmetrization of the Kullback–Leibler divergence which does not require probability densities to have matching supports. In this paper, we introduce a vector-skew generalization of the scalar ...
Frank Nielsen
doaj +1 more source
A method for continuous-range sequence analysis with Jensen-Shannon divergence
Mutual Information (MI) is a useful Information Theory tool for the recognition of mutual dependence between data sets. Several methods have been developed for the estimation of MI when both data sets are of the discrete type or when both are of the ...
Miguel Ángel Ré +1 more
doaj +1 more source
The performance of a free-space optical (FSO) communications link suffers from the deleterious effects of weather conditions and atmospheric turbulence. In order to better estimate the reliability and availability of an FSO link, a suitable distribution ...
Antonios Lionis +3 more
doaj +1 more source