Results 21 to 30 of about 10,137
Kullback–Leibler Divergence and Mutual Information of Partitions in Product MV Algebras
The purpose of this paper is to introduce, using known results concerning entropy in product MV algebras, the concepts of mutual information and Kullback–Leibler divergence for product MV algebras, and to examine algebraic properties of ...
Dagmar Markechová, Beloslav Riečan
doaj +1 more source
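In the classical probability setting that these product MV algebra results generalize, mutual information is exactly the Kullback–Leibler divergence from the joint distribution to the product of the marginals. A minimal NumPy sketch of that classical identity (the toy joint distribution is illustrative only, not from the paper):

```python
import numpy as np

def kl_divergence(p, q):
    """KL(p || q) for discrete distributions, with 0 * log(0/q) = 0."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Toy joint distribution of two binary variables (rows: X, columns: Y).
joint = np.array([[0.30, 0.10],
                  [0.20, 0.40]])
px = joint.sum(axis=1)           # marginal of X
py = joint.sum(axis=0)           # marginal of Y
independent = np.outer(px, py)   # product of the marginals

# Mutual information I(X;Y) = KL(joint || product of marginals).
print(kl_divergence(joint.ravel(), independent.ravel()))
```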
Kullback–Leibler divergence for Bayesian nonparametric model checking [PDF]
Bayesian nonparametric statistics is an area of considerable research interest. Although there has recently been extensive work on developing Bayesian nonparametric procedures for model checking, the use of the Dirichlet process, in its simplest form, together with the Kullback–Leibler divergence remains an open problem.
Luai Al-Labadi +3 more
openaire +3 more sources
Guaranteed Bounds on Information-Theoretic Measures of Univariate Mixtures Using Piecewise Log-Sum-Exp Inequalities
Information-theoretic measures, such as the entropy, the cross-entropy and the Kullback–Leibler divergence between two mixture models, are core primitives in many signal processing tasks.
Frank Nielsen, Ke Sun
doaj +1 more source
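The KL divergence between two mixture models admits no closed form in general, which is why such bounds are useful; in practice it is often estimated by Monte Carlo sampling. A minimal sketch for two univariate Gaussian mixtures, with made-up mixture parameters for illustration:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def mixture_pdf(x, weights, means, stds):
    """Density of a univariate Gaussian mixture at points x."""
    x = np.asarray(x, float)[:, None]
    return np.sum(weights * norm.pdf(x, means, stds), axis=1)

def kl_mc(p, q, sampler, n=100_000):
    """Monte Carlo estimate of KL(p || q) = E_p[log p(X) - log q(X)]."""
    x = sampler(n)
    return float(np.mean(np.log(p(x)) - np.log(q(x))))

# Illustrative two-component mixtures p and q.
wp, mp, sp = np.array([0.4, 0.6]), np.array([-1.0, 2.0]), np.array([0.5, 1.0])
wq, mq, sq = np.array([0.5, 0.5]), np.array([0.0, 2.5]), np.array([1.0, 1.0])

def sample_p(n):
    comp = rng.choice(len(wp), size=n, p=wp)
    return rng.normal(mp[comp], sp[comp])

p = lambda x: mixture_pdf(x, wp, mp, sp)
q = lambda x: mixture_pdf(x, wq, mq, sq)
print(kl_mc(p, q, sample_p))
```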
Some bounds for skewed α-Jensen-Shannon divergence
Based on the skewed Kullback–Leibler divergence introduced in natural language processing, we derive upper and lower bounds on the skewed version of the Jensen–Shannon divergence and investigate their properties.
Takuya Yamano
doaj +1 more source
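Conventions for the skewed Jensen–Shannon divergence vary between papers, so the following may not match the definition used here; one common form interpolates through the skewed mixture m_α = (1 − α)p + αq:

```python
import numpy as np

def kl(p, q):
    """KL(p || q) for discrete distributions, with 0 * log 0 = 0."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def skewed_js(p, q, alpha=0.5):
    """alpha-skewed Jensen-Shannon divergence, one common convention:
    JS_a(p, q) = (1 - a) KL(p || m_a) + a KL(q || m_a),
    with m_a = (1 - a) p + a q; a = 0.5 recovers the usual JS divergence."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    m = (1 - alpha) * p + alpha * q
    return (1 - alpha) * kl(p, m) + alpha * kl(q, m)

p = np.array([0.7, 0.2, 0.1])
q = np.array([0.1, 0.3, 0.6])
print(skewed_js(p, q, alpha=0.3))
```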
Dynamic fine‐tuning layer selection using Kullback–Leibler divergence
The selection of layers in the transfer-learning fine-tuning process ensures a pre-trained model's accuracy and adaptation in a new target domain. However, the selection process is still manual and lacks clearly defined criteria. If the wrong layers in ...
Raphael Ngigi Wanjiku +2 more
doaj +1 more source
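The snippet does not spell out the selection criterion, but one plausible reading of a KL-based rule is to compare each layer's activation distribution on source versus target data and rank layers by divergence. A hypothetical sketch, where the histogram binning, the smoothing constant, and the ranking rule are all assumptions rather than the paper's method:

```python
import numpy as np

def kl_from_samples(a, b, bins=32, eps=1e-9):
    """Approximate KL(P_a || P_b) from two sample sets by histogramming
    over a shared range, with additive smoothing to avoid empty bins."""
    lo, hi = min(a.min(), b.min()), max(a.max(), b.max())
    pa, _ = np.histogram(a, bins=bins, range=(lo, hi))
    pb, _ = np.histogram(b, bins=bins, range=(lo, hi))
    pa = (pa + eps) / (pa + eps).sum()
    pb = (pb + eps) / (pb + eps).sum()
    return float(np.sum(pa * np.log(pa / pb)))

# Hypothetical per-layer activations on source vs. target data.
rng = np.random.default_rng(0)
layers = {f"layer{i}": (rng.normal(0.0, 1.0, 5000),
                        rng.normal(0.1 * i, 1.0, 5000))
          for i in range(5)}

# Rank layers by divergence; under this assumed rule, the most-shifted
# layers would be the candidates for fine-tuning.
scores = {name: kl_from_samples(src, tgt) for name, (src, tgt) in layers.items()}
for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(name, round(score, 4))
```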
The family of α-divergences, which includes the oriented forward and reverse Kullback–Leibler divergences, is often used in signal processing, pattern recognition, and machine learning, among other fields.
Frank Nielsen
doaj +1 more source
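For context, one standard parameterization of the α-divergence family (conventions differ; this Amari-type form is an assumption about which one is meant) recovers the forward KL divergence as α → 1 and the reverse KL divergence as α → 0:

```python
import numpy as np

def alpha_divergence(p, q, alpha):
    """Amari-type alpha-divergence for discrete distributions (one common
    convention): D_a(p || q) = (1 - sum p^a q^(1-a)) / (a (1 - a)),
    with limits a -> 1 giving KL(p||q) and a -> 0 giving KL(q||p)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    if np.isclose(alpha, 1.0):
        return float(np.sum(p * np.log(p / q)))   # forward KL
    if np.isclose(alpha, 0.0):
        return float(np.sum(q * np.log(q / p)))   # reverse KL
    return float((1.0 - np.sum(p**alpha * q**(1.0 - alpha)))
                 / (alpha * (1.0 - alpha)))

p = np.array([0.7, 0.2, 0.1])
q = np.array([0.1, 0.3, 0.6])
for a in (0.0, 0.5, 0.999, 1.0):
    print(a, alpha_divergence(p, q, a))   # 0.999 approaches the a = 1 limit
```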
Markov-Switching Model Selection Using Kullback-Leibler Divergence [PDF]
Smith, Aaron +2 more
openaire +4 more sources
On the Jensen–Shannon Symmetrization of Distances Relying on Abstract Means
The Jensen–Shannon divergence is a renowned bounded symmetrization of the unbounded Kullback–Leibler divergence; it measures the total Kullback–Leibler divergence to the average mixture distribution.
Frank Nielsen
doaj +1 more source
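In its standard form, the Jensen–Shannon divergence averages the KL divergences of each distribution to their midpoint mixture, which makes it symmetric and bounded by log 2 in nats. A minimal sketch:

```python
import numpy as np

def kl(p, q):
    """KL(p || q) for discrete distributions, with 0 * log 0 = 0."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def js(p, q):
    """Jensen-Shannon divergence: symmetric and bounded by log 2 in nats."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Disjoint supports attain the log(2) upper bound; symmetry holds exactly.
p = np.array([1.0, 0.0])
q = np.array([0.0, 1.0])
print(js(p, q), np.log(2))   # both ~0.6931
print(js(q, p) == js(p, q))  # True
```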
A decision cognizant Kullback–Leibler divergence
In decision-making systems involving multiple classifiers, there is a need to assess classifier (in)congruence, that is, to gauge the degree of agreement between their outputs. A commonly used measure for this purpose is the Kullback–Leibler (KL) divergence. We propose a variant of the KL divergence, named the decision cognizant Kullback–Leibler divergence ...
Ponti, M +4 more
openaire +5 more sources
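The decision cognizant variant itself is defined in the paper; as background, the plain KL incongruence measure compares the class-posterior vectors produced by two classifiers for the same sample. A minimal sketch, where the smoothing constant is an assumption to keep the divergence finite:

```python
import numpy as np

def classifier_incongruence(p1, p2, eps=1e-12):
    """Plain KL-based (in)congruence between two classifiers' class
    posteriors for one sample; 0 means full agreement. The decision
    cognizant variant proposed in the paper adjusts this to be less
    sensitive to the minor, non-dominant classes."""
    p1 = np.clip(np.asarray(p1, float), eps, None)
    p2 = np.clip(np.asarray(p2, float), eps, None)
    p1, p2 = p1 / p1.sum(), p2 / p2.sum()
    return float(np.sum(p1 * np.log(p1 / p2)))

# Two classifiers that agree on the top class but differ on the tail.
print(classifier_incongruence([0.8, 0.15, 0.05], [0.75, 0.05, 0.20]))
```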
Fault-tolerant relative navigation based on Kullback–Leibler divergence
A fault-detection method for relative navigation based on the Kullback–Leibler divergence (KLD) is proposed. Unlike traditional χ²-based approaches, the KLD for a filter follows a hybrid distribution that combines the χ² distribution and the F ...
Jun Xiong +6 more
doaj +1 more source
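The hybrid χ²/F distribution of the test statistic is specific to the paper's derivation; as generic background, the KLD between two multivariate Gaussians, such as a filter innovation's nominal and estimated models, has a closed form, sketched below (the dimensions and covariances are illustrative):

```python
import numpy as np

def kl_gaussians(mu0, S0, mu1, S1):
    """Closed-form KL( N(mu0, S0) || N(mu1, S1) ) for k-dimensional Gaussians:
    0.5 * [tr(S1^-1 S0) + (mu1-mu0)^T S1^-1 (mu1-mu0) - k + ln(det S1 / det S0)]."""
    k = len(mu0)
    S1_inv = np.linalg.inv(S1)
    d = mu1 - mu0
    return 0.5 * (np.trace(S1_inv @ S0) + d @ S1_inv @ d - k
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

# Illustrative 2-D example: nominal vs. shifted/inflated innovation model.
mu0, S0 = np.zeros(2), np.eye(2)
mu1, S1 = np.array([0.5, 0.0]), 1.5 * np.eye(2)
print(kl_gaussians(mu0, S0, mu1, S1))
```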