Results 21 to 30 of about 58,822
Kullback–Leibler Divergence and Mutual Information of Partitions in Product MV Algebras
The purpose of the paper is to introduce, building on known results concerning entropy in product MV algebras, the concepts of mutual information and Kullback–Leibler divergence for product MV algebras, and to examine their algebraic properties ...
Dagmar Markechová, Beloslav Riečan
doaj +1 more source
Time series irreversibility: a visibility graph approach [PDF]
We propose a method to measure real-valued time series irreversibility which combines two different tools: the horizontal visibility algorithm and the Kullback-Leibler divergence.
A. Nuñez +33 more
core +3 more sources
Some bounds for skewed α-Jensen-Shannon divergence
Based on the skewed Kullback-Leibler divergence introduced in natural language processing, we derive upper and lower bounds on the skewed version of the Jensen-Shannon divergence and investigate their properties.
Takuya Yamano
doaj +1 more source
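As context for the divergences these results concern, here is a minimal sketch of the Kullback-Leibler and skewed Jensen-Shannon divergences for discrete distributions. This is an illustrative convention (mixture weight `alpha`, natural-log units), not code from any of the listed papers; other skewing conventions exist in the literature.

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in nats for discrete
    distributions given as probability vectors over the same support."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js_divergence(p, q, alpha=0.5):
    """Skewed alpha-Jensen-Shannon divergence (one common convention):
    mix m = alpha*p + (1-alpha)*q, then average the two KL terms."""
    m = [alpha * pi + (1 - alpha) * qi for pi, qi in zip(p, q)]
    return alpha * kl_divergence(p, m) + (1 - alpha) * kl_divergence(q, m)
```

At `alpha = 0.5` this reduces to the ordinary (symmetric) Jensen-Shannon divergence; for other values of `alpha` the divergence is generally asymmetric in `p` and `q`.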
Monotonic decrease of the quantum nonadditive divergence by projective measurements [PDF]
Nonadditive (nonextensive) generalization of the quantum Kullback-Leibler divergence, termed the quantum q-divergence, is shown not to increase by projective measurements in an elementary manner. Comment: 10 pages, no figures.
Abe +27 more
core +2 more sources
Dynamic fine‐tuning layer selection using Kullback–Leibler divergence
The selection of layers in the transfer learning fine‐tuning process ensures a pre‐trained model's accuracy and adaptation in a new target domain. However, the selection process is still manual and lacks clearly defined criteria. If the wrong layers in ...
Raphael Ngigi Wanjiku +2 more
doaj +1 more source
The family of α-divergences including the oriented forward and reverse Kullback–Leibler divergences is often used in signal processing, pattern recognition, and machine learning, among others.
Frank Nielsen
doaj +1 more source
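The snippet above mentions the oriented forward and reverse Kullback–Leibler divergences; a small sketch can make the asymmetry concrete. The distributions below are made-up illustrative values, not data from the cited work.

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Two arbitrary discrete distributions on three outcomes.
p = [0.8, 0.15, 0.05]
q = [0.4, 0.4, 0.2]

forward = kl(p, q)   # forward KL: D(p || q)
reverse = kl(q, p)   # reverse KL: D(q || p)
```

Both quantities are non-negative, but they generally differ: KL divergence is not symmetric, which is precisely why the α-divergence family distinguishes the forward and reverse orientations.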
Fault-tolerant relative navigation based on Kullback–Leibler divergence
A fault-detection method for relative navigation based on Kullback–Leibler divergence (KLD) is proposed. Different from the traditional χ²-based approaches, the KLD for a filter follows a hybrid distribution that combines the χ² distribution and F ...
Jun Xiong +6 more
doaj +1 more source
Zipf–Mandelbrot law, f-divergences and the Jensen-type interpolating inequalities
Motivated by the method of interpolating inequalities that makes use of the improved Jensen-type inequalities, in this paper we integrate this approach with the well-known Zipf–Mandelbrot law applied to various types of f-divergences and distances, such ...
Neda Lovričević +2 more
doaj +1 more source
Android Malware Detection Using Kullback-Leibler Divergence
Many recent reports suggest that malware applications cause high billing to victims by sending and receiving hidden SMS messages. Given that, there is a need to develop techniques to identify malicious SMS operations as well as differentiate ...
Vanessa N. COOPER +2 more
doaj +1 more source
Rényi Divergence and Kullback-Leibler Divergence
Rényi divergence is related to Rényi entropy much like Kullback-Leibler divergence is related to Shannon's entropy, and comes up in many settings.
Tim van Erven, Peter Harremoës
core +2 more sources
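The relationship stated in this last snippet can be sketched numerically: the Rényi divergence of order α recovers the Kullback-Leibler divergence in the limit α → 1. The sketch below assumes discrete distributions with full support and α > 0, α ≠ 1; zero-probability edge cases are deliberately ignored.

```python
import math

def renyi_divergence(p, q, alpha):
    """Renyi divergence of order alpha (alpha > 0, alpha != 1), in nats:
    D_alpha(p || q) = 1/(alpha-1) * log(sum_i p_i^alpha * q_i^(1-alpha))."""
    s = sum(pi ** alpha * qi ** (1 - alpha) for pi, qi in zip(p, q) if pi > 0)
    return math.log(s) / (alpha - 1)

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) in nats (the alpha -> 1 limit)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
```

Evaluating `renyi_divergence` at α close to 1 gives values approaching `kl`; the Rényi divergence is also nondecreasing in α, so lower orders never exceed higher ones.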

