The purpose of this study is to investigate the relationship between the Shannon entropy procedure and the Jensen–Shannon divergence (JSD), both of which are used as item selection criteria in cognitive diagnostic computerized adaptive testing (CD-CAT).
Wenyi Wang +4 more
doaj +2 more sources
On the Jensen-Shannon Divergence and the Variation Distance for Categorical Probability Distributions [PDF]
We establish a decomposition of the Jensen-Shannon divergence into a linear combination of a scaled Jeffreys' divergence and a reversed Jensen-Shannon divergence.
Corander, Jukka +2 more
core +3 more sources
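The categorical setting of the entry above can be made concrete with a short sketch: a minimal base-2 Jensen-Shannon divergence alongside the variation (total variation) distance for probability vectors. This is an illustrative implementation under standard definitions, not code from the cited paper; the function names are my own.

```python
from math import log2

def jsd(p, q):
    """Base-2 Jensen-Shannon divergence between two categorical
    distributions given as equal-length probability vectors."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]  # the mixture (p + q) / 2

    def kl(a, b):
        # Kullback-Leibler divergence, skipping zero-probability terms
        return sum(ai * log2(ai / bi) for ai, bi in zip(a, b) if ai > 0)

    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def variation_distance(p, q):
    """Total variation distance: half the L1 distance between the vectors."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))
```

With base-2 logarithms the JSD is bounded in [0, 1]; disjoint distributions such as `[1, 0]` and `[0, 1]` attain the maximum of 1 bit.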
In this work, we first consider the discrete version of Fisher information measure and then propose Jensen–Fisher information, to develop some associated results.
Omid Kharazmi +1 more
doaj +1 more source
Quantifying the Dissimilarity of Texts
Quantifying the dissimilarity of two texts is an important aspect of a number of natural language processing tasks, including semantic information retrieval, topic classification, and document clustering.
Benjamin Shade, Eduardo G. Altmann
doaj +1 more source
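A common way to quantify text dissimilarity, in the spirit of the entry above, is the Jensen-Shannon divergence between the word-frequency distributions of the two texts. The following is a minimal sketch assuming a unigram model with whitespace tokenization; both simplifications, and the helper names, are mine rather than the paper's.

```python
from collections import Counter
from math import log2

def word_dist(text):
    """Unigram probability distribution over the words of a text."""
    counts = Counter(text.lower().split())
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def text_jsd(text_a, text_b):
    """Base-2 Jensen-Shannon divergence between the unigram
    distributions of two texts: 0 for identical distributions,
    at most 1 bit for disjoint vocabularies."""
    p, q = word_dist(text_a), word_dist(text_b)
    vocab = set(p) | set(q)
    m = {w: (p.get(w, 0) + q.get(w, 0)) / 2 for w in vocab}

    def kl(dist):
        # KL divergence from dist to the mixture m
        return sum(pw * log2(pw / m[w]) for w, pw in dist.items() if pw > 0)

    return 0.5 * kl(p) + 0.5 * kl(q)
```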
Refined Young Inequality and Its Application to Divergences
We give bounds on the difference between the weighted arithmetic mean and the weighted geometric mean. These imply refined Young inequalities and reverses of the Young inequality. We also study some properties of the difference between the weighted ...
Shigeru Furuichi, Nicuşor Minculete
doaj +1 more source
Natural history of liver disease in a large international cohort of children with Alagille syndrome: Results from the GALA study
Background and Aims: Alagille syndrome (ALGS) is a multisystem disorder characterized by cholestasis. Existing outcome data are largely derived from tertiary centers, and real-world data are lacking.
Shannon M. Vandriel +93 more
wiley +1 more source
A method for continuous-range sequence analysis with Jensen-Shannon divergence
Mutual Information (MI) is a useful information-theoretic tool for recognizing mutual dependence between data sets. Several methods have been developed for estimating MI when both data sets are of the discrete type or when both are of the ...
Miguel Ángel Ré +1 more
doaj +1 more source
The Representation Jensen-Shannon Divergence
Quantifying the difference between probability distributions is crucial in machine learning. However, estimating statistical divergences from empirical samples is challenging due to unknown underlying distributions. This work proposes the representation Jensen-Shannon divergence (RJSD), a novel measure inspired by the traditional Jensen-Shannon ...
Hoyos-Osorio, Jhoan K. +1 more
openaire +2 more sources
Jensen–Shannon divergence and non-linear quantum dynamics [PDF]
Molladavoudi, Saeid +2 more
openaire +2 more sources
On the Jensen–Shannon Divergence and Variational Distance [PDF]
Summary: We study the distance between two probability distributions via two different metrics: a new metric induced from the Jensen-Shannon divergence, and the well-known \(L_1\)-metric. We show that several important results and constructions in computational complexity under the \(L_1\)-metric carry over to the new metric, such as Yao ...
Tsai, Shi-Chun +2 more
openaire +1 more source
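The metric induced from the Jensen-Shannon divergence mentioned in this entry is, in the standard construction, the square root of the JSD (the JSD itself violates the triangle inequality, while its square root satisfies it). A small sketch comparing it with the \(L_1\)-metric on categorical distributions; the helper names and the three example distributions are illustrative:

```python
from math import log2, sqrt

def jsd(p, q):
    """Base-2 Jensen-Shannon divergence of two probability vectors."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]

    def kl(a, b):
        return sum(x * log2(x / y) for x, y in zip(a, b) if x > 0)

    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def jsd_metric(p, q):
    """sqrt(JSD): a true metric, unlike the JSD itself."""
    return sqrt(jsd(p, q))

def l1(p, q):
    """The L1 distance (twice the total variation distance)."""
    return sum(abs(x - y) for x, y in zip(p, q))

# The triangle inequality holds for the JSD-induced metric:
p, q, r = [0.7, 0.2, 0.1], [0.3, 0.4, 0.3], [0.1, 0.1, 0.8]
assert jsd_metric(p, r) <= jsd_metric(p, q) + jsd_metric(q, r)
```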

