Results 31 to 40 of about 11,282 (244)

Kullback–Leibler divergence for Bayesian nonparametric model checking [PDF]

open access: yesJournal of the Korean Statistical Society, 2020
Bayesian nonparametric statistics is an area of active research. While considerable recent effort has gone into developing Bayesian nonparametric procedures for model checking, the use of the Dirichlet process, in its simplest form, together with the Kullback–Leibler divergence remains an open problem.
Luai Al-Labadi   +3 more
openaire   +3 more sources

Guaranteed Bounds on Information-Theoretic Measures of Univariate Mixtures Using Piecewise Log-Sum-Exp Inequalities

open access: yesEntropy, 2016
Information-theoretic measures, such as the entropy, the cross-entropy and the Kullback–Leibler divergence between two mixture models, are core primitives in many signal processing tasks.
Frank Nielsen, Ke Sun
doaj   +1 more source
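As the abstract above notes, the KL divergence between two mixture models admits no closed form, which is why bounds (or numerical estimates) are needed. A minimal Monte Carlo sketch for univariate Gaussian mixtures, with purely illustrative parameters (not taken from the paper):

```python
import math
import random

def mixture_pdf(x, weights, mus, sigmas):
    # Density of a univariate Gaussian mixture at x.
    return sum(
        w * math.exp(-(x - mu) ** 2 / (2 * s * s)) / (s * math.sqrt(2 * math.pi))
        for w, mu, s in zip(weights, mus, sigmas)
    )

def sample_mixture(weights, mus, sigmas):
    # Draw one sample: pick a component by weight, then sample it.
    i = random.choices(range(len(weights)), weights)[0]
    return random.gauss(mus[i], sigmas[i])

def mc_kl(p_params, q_params, n=50_000):
    # Monte Carlo estimate of KL(p || q) = E_p[log p(x)/q(x)].
    random.seed(0)  # fixed seed for a reproducible estimate
    total = 0.0
    for _ in range(n):
        x = sample_mixture(*p_params)
        total += math.log(mixture_pdf(x, *p_params) / mixture_pdf(x, *q_params))
    return total / n

# Illustrative example: bimodal mixture p vs. a single standard Gaussian q.
p = ([0.5, 0.5], [-1.0, 1.0], [0.5, 0.5])
q = ([1.0], [0.0], [1.0])
print(mc_kl(p, q))  # positive, but the estimate carries Monte Carlo error
```

The estimator is unbiased but noisy; the paper's piecewise log-sum-exp inequalities instead give deterministic guaranteed bounds.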

Some bounds for skewed α-Jensen-Shannon divergence

open access: yesResults in Applied Mathematics, 2019
Based on the skewed Kullback–Leibler divergence introduced in natural language processing, we derive upper and lower bounds on the skewed version of the Jensen–Shannon divergence and investigate their properties.
Takuya Yamano
doaj   +1 more source

Dynamic fine‐tuning layer selection using Kullback–Leibler divergence

open access: yesEngineering Reports, 2023
The selection of layers in the transfer learning fine‐tuning process ensures a pre‐trained model's accuracy and adaptation in a new target domain. However, the selection process is still manual and without clearly defined criteria. If the wrong layers in
Raphael Ngigi Wanjiku   +2 more
doaj   +1 more source

Generalizing the Alpha-Divergences and the Oriented Kullback–Leibler Divergences with Quasi-Arithmetic Means

open access: yesAlgorithms, 2022
The family of α-divergences including the oriented forward and reverse Kullback–Leibler divergences is often used in signal processing, pattern recognition, and machine learning, among others.
Frank Nielsen
doaj   +1 more source

Markov-Switching Model Selection Using Kullback-Leibler Divergence [PDF]

open access: yesSSRN Electronic Journal, 2005
Smith, Aaron   +2 more
openaire   +4 more sources

On the Jensen–Shannon Symmetrization of Distances Relying on Abstract Means

open access: yesEntropy, 2019
The Jensen–Shannon divergence is a renowned bounded symmetrization of the unbounded Kullback–Leibler divergence which measures the total Kullback–Leibler divergence to the average mixture distribution.
Frank Nielsen
doaj   +1 more source
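The abstract above describes the Jensen–Shannon divergence as the total Kullback–Leibler divergence to the average mixture distribution. A minimal sketch for discrete distributions (the example vectors are illustrative):

```python
import math

def kl(p, q):
    # Kullback–Leibler divergence in nats; assumes q[i] > 0 wherever p[i] > 0.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js(p, q):
    # Jensen–Shannon divergence: mean KL of p and q to their average mixture m.
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p = [0.5, 0.5, 0.0]
q = [0.0, 0.5, 0.5]
print(js(p, q))  # symmetric and bounded above by ln 2, unlike kl(p, q)
```

Note that kl(p, q) is undefined here (q puts zero mass where p does not), while js(p, q) is always finite: this is exactly the boundedness the abstract refers to.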

Fault-tolerant relative navigation based on Kullback–Leibler divergence

open access: yesInternational Journal of Advanced Robotic Systems, 2020
A fault-detection method for relative navigation based on the Kullback–Leibler divergence (KLD) is proposed. Unlike traditional χ²-based approaches, the KLD for a filter follows a hybrid distribution that combines the χ² distribution and the F ...
Jun Xiong   +6 more
doaj   +1 more source
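Fault detection of this kind compares an observed residual distribution against the nominal one via the KLD. For univariate Gaussians the divergence has a well-known closed form; a minimal sketch (the means, variances, and threshold below are illustrative, not taken from the paper):

```python
import math

def kl_gauss(mu1, s1, mu2, s2):
    # Closed-form KL( N(mu1, s1^2) || N(mu2, s2^2) ) in nats.
    return math.log(s2 / s1) + (s1 ** 2 + (mu1 - mu2) ** 2) / (2 * s2 ** 2) - 0.5

# Nominal residuals: zero-mean, unit variance.
print(kl_gauss(0.0, 1.0, 0.0, 1.0))  # 0.0 — identical distributions

# A biased residual distribution pushes the divergence up; a fault is
# flagged when the KLD exceeds a chosen threshold.
print(kl_gauss(2.0, 1.0, 0.0, 1.0))  # 2.0
```

The asymmetry of the KLD matters here: the divergence of the observed distribution from the nominal one is the natural ordering for residual monitoring.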

Zipf–Mandelbrot law, f-divergences and the Jensen-type interpolating inequalities

open access: yesJournal of Inequalities and Applications, 2018
Motivated by the method of interpolating inequalities that makes use of the improved Jensen-type inequalities, in this paper we integrate this approach with the well known Zipf–Mandelbrot law applied to various types of f-divergences and distances, such ...
Neda Lovričević   +2 more
doaj   +1 more source

Android Malware Detection Using Kullback-Leibler Divergence

open access: yesAdvances in Distributed Computing and Artificial Intelligence Journal, 2015
Many recent reports suggest that malware applications run up high bills for victims by sending and receiving hidden SMS messages. Given that, there is a need to develop techniques to identify malicious SMS operations as well as differentiate ...
Vanessa N. Cooper   +2 more
doaj   +1 more source
