Results 31 to 40 of about 11,282
Kullback–Leibler divergence for Bayesian nonparametric model checking [PDF]
Bayesian nonparametric statistics is an area of considerable research interest. Although there has recently been extensive work on developing Bayesian nonparametric procedures for model checking, the use of the Dirichlet process, in its simplest form, together with the Kullback–Leibler divergence remains an open problem.
Luai Al-Labadi +3 more
openaire +3 more sources
Information-theoretic measures, such as the entropy, the cross-entropy and the Kullback–Leibler divergence between two mixture models, are core primitives in many signal processing tasks.
Frank Nielsen, Ke Sun
doaj +1 more source
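The abstract above notes that the Kullback–Leibler divergence between two mixture models is a core primitive; since it has no closed form for Gaussian mixtures, a common workaround is a Monte Carlo estimate. A minimal sketch in Python (the mixture parameterization and helper names here are illustrative, not from the paper):

```python
import math
import random

def mixture_pdf(x, weights, mus, sigmas):
    # Density of a one-dimensional Gaussian mixture at x.
    return sum(w * math.exp(-(x - m) ** 2 / (2 * s ** 2)) / (s * math.sqrt(2 * math.pi))
               for w, m, s in zip(weights, mus, sigmas))

def sample_mixture(rng, weights, mus, sigmas):
    # Draw one sample: pick a component by weight, then sample its Gaussian.
    i = rng.choices(range(len(weights)), weights=weights)[0]
    return rng.gauss(mus[i], sigmas[i])

def mc_kl(p_params, q_params, n=20000, seed=0):
    # Monte Carlo estimate of KL(p || q) = E_p[log p(X) - log q(X)].
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = sample_mixture(rng, *p_params)
        total += math.log(mixture_pdf(x, *p_params)) - math.log(mixture_pdf(x, *q_params))
    return total / n

p = ([0.5, 0.5], [0.0, 3.0], [1.0, 1.0])   # weights, means, std devs
q = ([0.7, 0.3], [0.5, 4.0], [1.0, 1.5])
print(mc_kl(p, q))
```

The estimator is unbiased but noisy; the cited work is concerned with deterministic guaranteed bounds, which this sketch does not provide.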
Some bounds for skewed α-Jensen-Shannon divergence
Based on the skewed Kullback–Leibler divergence introduced in natural language processing, we derive upper and lower bounds on the skewed version of the Jensen–Shannon divergence and investigate their properties.
Takuya Yamano
doaj +1 more source
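The skewed Kullback–Leibler divergence referenced above compares p not to q directly but to a mixture of the two, which keeps the value finite even when q assigns zero probability. A minimal sketch for discrete distributions, assuming the usual NLP convention D(p || αq + (1−α)p):

```python
import math

def kl(p, q):
    # Kullback–Leibler divergence D(p || q) for discrete distributions.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def skew_kl(p, q, alpha):
    # Skewed KL: KL of p against the mixture alpha*q + (1 - alpha)*p.
    m = [alpha * qi + (1 - alpha) * pi for pi, qi in zip(p, q)]
    return kl(p, m)

p = [0.5, 0.5]
q = [0.9, 0.1]
print(skew_kl(p, q, 0.99))
```

At α = 1 this recovers the ordinary KL divergence, and at α = 0 it is identically zero.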
Dynamic fine‐tuning layer selection using Kullback–Leibler divergence
The selection of layers during transfer-learning fine-tuning determines a pre-trained model's accuracy and adaptation to a new target domain. However, the selection process is still manual, with no clearly defined criteria. If the wrong layers in ...
Raphael Ngigi Wanjiku +2 more
doaj +1 more source
The family of α-divergences including the oriented forward and reverse Kullback–Leibler divergences is often used in signal processing, pattern recognition, and machine learning, among others.
Frank Nielsen
doaj +1 more source
Markov-Switching Model Selection Using Kullback-Leibler Divergence [PDF]
Smith, Aaron +2 more
openaire +4 more sources
On the Jensen–Shannon Symmetrization of Distances Relying on Abstract Means
The Jensen–Shannon divergence is a renowned bounded symmetrization of the unbounded Kullback–Leibler divergence which measures the total Kullback–Leibler divergence to the average mixture distribution.
Frank Nielsen
doaj +1 more source
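The abstract above characterizes the Jensen–Shannon divergence as the total Kullback–Leibler divergence to the average mixture distribution. A minimal sketch of that definition for discrete distributions (natural log, so the bound is ln 2):

```python
import math

def kl(p, q):
    # Kullback–Leibler divergence D(p || q) for discrete distributions.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jsd(p, q):
    # Jensen–Shannon divergence: average KL of p and q to their midpoint mixture.
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p = [0.5, 0.5]
q = [0.9, 0.1]
print(jsd(p, q))
```

Unlike the KL divergence, this quantity is symmetric and bounded, which is what the paper's abstract alludes to before generalizing the midpoint to abstract means.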
Fault-tolerant relative navigation based on Kullback–Leibler divergence
A fault-detection method for relative navigation based on the Kullback–Leibler divergence (KLD) is proposed. Unlike traditional χ²-based approaches, the KLD for a filter follows a hybrid distribution that combines the χ² distribution and the F ...
Jun Xiong +6 more
doaj +1 more source
Zipf–Mandelbrot law, f-divergences and the Jensen-type interpolating inequalities
Motivated by the method of interpolating inequalities that makes use of improved Jensen-type inequalities, in this paper we integrate this approach with the well-known Zipf–Mandelbrot law applied to various types of f-divergences and distances, such ...
Neda Lovričević +2 more
doaj +1 more source
Android Malware Detection Using Kullback-Leibler Divergence
Many recent reports suggest that malware applications cause high billing charges to victims by sending and receiving hidden SMS messages. Given that, there is a need to develop techniques to identify malicious SMS operations as well as differentiate ...
Vanessa N. Cooper +2 more
doaj +1 more source

