Results 1 to 10 of about 18,515

On a generalization of the Jensen-Shannon divergence and the JS-symmetrization of distances relying on abstract means

open access: yes (Entropy, 2020)
The Jensen-Shannon divergence is a renowned bounded symmetrization of the unbounded Kullback-Leibler divergence which measures the total Kullback-Leibler divergence to the average mixture distribution. However, the Jensen-Shannon divergence between Gaussian ...
Nielsen, Frank
core   +3 more sources
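The first two entries above revolve around the same core definition: the Jensen-Shannon divergence is the average Kullback-Leibler divergence of each distribution to their mixture. A minimal sketch for discrete distributions (not taken from any of the listed papers; function names are our own) might look like:

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence KL(p || q) for discrete distributions.

    Terms with p_i == 0 contribute nothing, by the convention 0 * log(0/q) = 0.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jsd(p, q):
    """Jensen-Shannon divergence: average KL divergence to the mixture m = (p+q)/2."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p = [0.5, 0.5]
q = [0.9, 0.1]
# Unlike KL, jsd is symmetric and bounded above by log(2) (in nats).
print(jsd(p, q))
```

Because the mixture m dominates both p and q, the two KL terms are always finite, which is why the JSD needs no matching-support assumption (the point made in the last entry on this page).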

ON THE JENSEN-SHANNON DIVERGENCE AND THE VARIATION DISTANCE FOR CATEGORICAL PROBABILITY DISTRIBUTIONS [PDF]

open access: yes, 2021
We establish a decomposition of the Jensen-Shannon divergence into a linear combination of a scaled Jeffreys' divergence and a reversed Jensen-Shannon divergence.
Corander, Jukka   +2 more
core   +3 more sources

On a Variational Definition for the Jensen-Shannon Symmetrization of Distances Based on the Information Radius

open access: yes (Entropy, 2021)
We generalize the Jensen-Shannon divergence and the Jensen-Shannon diversity index by considering a variational definition with respect to a generic mean, thereby extending the notion of Sibson’s information radius.
Frank Nielsen
doaj   +1 more source

Discrete Versions of Jensen–Fisher, Fisher and Bayes–Fisher Information Measures of Finite Mixture Distributions

open access: yes (Entropy, 2021)
In this work, we first consider the discrete version of Fisher information measure and then propose Jensen–Fisher information, to develop some associated results.
Omid Kharazmi   +1 more
doaj   +1 more source

Quantifying the Dissimilarity of Texts

open access: yes (Information, 2023)
Quantifying the dissimilarity of two texts is an important aspect of a number of natural language processing tasks, including semantic information retrieval, topic classification, and document clustering.
Benjamin Shade, Eduardo G. Altmann
doaj   +1 more source

Refined Young Inequality and Its Application to Divergences

open access: yes (Entropy, 2021)
We give bounds on the difference between the weighted arithmetic mean and the weighted geometric mean. These imply refined Young inequalities and reverses of the Young inequality. We also study some properties of the difference between the weighted ...
Shigeru Furuichi, Nicuşor Minculete
doaj   +1 more source

Fault Detection Based on Multi-Dimensional KDE and Jensen–Shannon Divergence

open access: yes (Entropy, 2021)
Weak fault signals, high coupling data, and unknown faults commonly exist in fault diagnosis systems, causing low detection and identification performance of fault diagnosis methods based on T2 statistics or cross entropy. This paper proposes a new fault ...
Juhui Wei   +4 more
doaj   +1 more source

On a Generalization of the Jensen–Shannon Divergence and the Jensen–Shannon Centroid

open access: yes (Entropy, 2020)
The Jensen–Shannon divergence is a renowned bounded symmetrization of the Kullback–Leibler divergence which does not require probability densities to have matching supports. In this paper, we introduce a vector-skew generalization of the scalar ...
Frank Nielsen
doaj   +1 more source

A method for continuous-range sequence analysis with Jensen-Shannon divergence

open access: yes (Papers in Physics, 2021)
Mutual Information (MI) is a useful Information Theory tool for the recognition of mutual dependence between data sets. Several methods have been developed for the estimation of MI when both data sets are of the discrete type or when both are of the ...
Miguel Ángel Ré   +1 more
doaj   +1 more source
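For the fully discrete case mentioned in this abstract, MI has a closed form as the KL divergence between the joint distribution and the product of its marginals. A self-contained sketch (our own illustration, not the paper's method) might be:

```python
import math

def mutual_information(joint):
    """MI (in nats) of a discrete joint distribution given as a 2D list of probabilities.

    MI = sum_{i,j} p(x_i, y_j) * log( p(x_i, y_j) / (p(x_i) * p(y_j)) ).
    """
    px = [sum(row) for row in joint]          # marginal of X (row sums)
    py = [sum(col) for col in zip(*joint)]    # marginal of Y (column sums)
    mi = 0.0
    for i, row in enumerate(joint):
        for j, pxy in enumerate(row):
            if pxy > 0:
                mi += pxy * math.log(pxy / (px[i] * py[j]))
    return mi

# Independent variables give MI = 0; perfectly correlated ones give MI = log(2).
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))
```

The hard part, which the paper addresses, is the continuous-range case, where such plug-in estimates require binning or density estimation rather than a direct sum.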

RSSI Probability Density Functions Comparison Using Jensen-Shannon Divergence and Pearson Distribution

open access: yes (Technologies, 2021)
The performance of a free-space optical (FSO) communications link suffers from the deleterious effects of weather conditions and atmospheric turbulence. In order to better estimate the reliability and availability of an FSO link, a suitable distribution ...
Antonios Lionis   +3 more
doaj   +1 more source
