Results 51 to 60 of about 58,822
f-divergence Analysis of Generative Adversarial Network
We aim to establish estimation bounds for various divergences, including total variation, Kullback-Leibler (KL) divergence, Hellinger divergence, and Pearson χ2 divergence, within the GAN estimator.
Hasan Mahmud, Sang Hailin
doaj +1 more source
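The snippet above names four classical f-divergences (total variation, KL, Hellinger, Pearson χ²). As a minimal illustration of those quantities, not of the paper's GAN estimation bounds, here is a sketch that computes each one for a pair of strictly positive discrete distributions (the function name and interface are my own):

```python
import numpy as np

def f_divergences(p, q):
    """Compute several classical f-divergences between two strictly
    positive discrete probability vectors p and q (each summing to 1)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    tv = 0.5 * np.abs(p - q).sum()                      # total variation
    kl = np.sum(p * np.log(p / q))                      # Kullback-Leibler (nats)
    hellinger2 = np.sum((np.sqrt(p) - np.sqrt(q))**2)   # squared Hellinger
    chi2 = np.sum((p - q)**2 / q)                       # Pearson chi-squared
    return {"TV": tv, "KL": kl, "H2": hellinger2, "chi2": chi2}

print(f_divergences([0.5, 0.5], [0.9, 0.1]))
```

All four are instances of D_f(P‖Q) = Σ q_i f(p_i/q_i) for a suitable convex f, which is what makes a unified analysis across them possible.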
Uniqueness and Optimality of Dynamical Extensions of Divergences
We introduce an axiomatic approach for channel divergences and channel relative entropies that is based on three information-theoretic axioms of monotonicity under superchannels, i.e., generalized data processing inequality, additivity under tensor ...
Gilad Gour
doaj +1 more source
Some Properties of Weighted Tsallis and Kaniadakis Divergences
We are concerned with the weighted Tsallis and Kaniadakis divergences between two measures. More precisely, we find inequalities between these divergences and Tsallis and Kaniadakis logarithms, prove that they are limited by similar bounds with those ...
Răzvan-Cornel Sfetcu +2 more
doaj +1 more source
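For orientation, the plain (unweighted) Tsallis relative entropy that underlies the weighted variants studied above has a simple closed form recovering KL divergence as the order tends to 1. A minimal sketch (function names are my own; the paper's weighted versions are not reproduced here):

```python
import numpy as np

def tsallis_divergence(p, q, a):
    """Plain Tsallis relative entropy of order a != 1 between two strictly
    positive probability vectors; it converges to KL divergence as a -> 1."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return (np.sum(p ** a * q ** (1.0 - a)) - 1.0) / (a - 1.0)

def kl(p, q):
    """Kullback-Leibler divergence (in nats), the a -> 1 limit above."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.sum(p * np.log(p / q))
```

For example, `tsallis_divergence(p, q, 1.0001)` is numerically close to `kl(p, q)`, and the divergence vanishes when the two arguments coincide.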
Generalised Mixability, Constant Regret, and Bayesian Updating [PDF]
Mixability of a loss is known to characterise when constant regret bounds are achievable in games of prediction with expert advice through the use of Vovk's aggregating algorithm.
Frongillo, Rafael M. +2 more
core
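Vovk's aggregating algorithm, referenced in the snippet above, reduces for binary log loss (learning rate η = 1) to a Bayesian mixture of experts whose cumulative loss exceeds the best expert's by at most ln N. A self-contained sketch under that special case (the function name and array layout are my own assumptions):

```python
import numpy as np

def aggregate_log_loss(expert_probs, outcomes):
    """Exponential-weights mixture for binary log loss (aggregating
    algorithm with eta = 1). expert_probs: (T, N) array of each expert's
    predicted probability of outcome 1; outcomes: length-T 0/1 array.
    Returns (learner_loss, best_expert_loss); the gap is at most ln N."""
    expert_probs = np.asarray(expert_probs, float)
    outcomes = np.asarray(outcomes, int)
    T, N = expert_probs.shape
    log_w = np.zeros(N)                  # log-weights, uniform prior
    learner_loss = 0.0
    for t in range(T):
        w = np.exp(log_w - log_w.max())  # normalize in log-space for stability
        w /= w.sum()
        p = w @ expert_probs[t]          # mixture prediction
        y = outcomes[t]
        learner_loss += -np.log(p if y == 1 else 1.0 - p)
        # each expert's log loss this round updates its weight
        log_w -= -np.log(np.where(y == 1, expert_probs[t],
                                  1.0 - expert_probs[t]))
    best = (-np.log(np.where(outcomes[:, None] == 1, expert_probs,
                             1.0 - expert_probs))).sum(axis=0).min()
    return learner_loss, best
```

The constant-regret guarantee here is exactly the mixability property the paper generalises: for log loss the mixture's loss never exceeds `best + ln N`, independent of T.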
Bayesian optimization enabled the design of PA56 system with just 8 wt% additives, achieving limiting oxygen index 30.5%, tensile strength 80.9 MPa, and UL‐94 V‐0 rating. Without prior knowledge, the algorithm uncovered synergistic effects between aluminum diethyl‐phosphinate and nanoclay.
Burcu Ozdemir +4 more
wiley +1 more source
Rényi Entropy and Rényi Divergence in Product MV-Algebras
This article deals with new concepts in a product MV-algebra, namely, with the concepts of Rényi entropy and Rényi divergence. We define the Rényi entropy of order q of a partition in a product MV-algebra and its conditional version ...
Dagmar Markechová, Beloslav Riečan
doaj +1 more source
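The article works in product MV-algebras; for reference, the classical Rényi entropy of order q for an ordinary finite distribution (the special case those definitions generalise) is a one-liner. A minimal sketch, with the function name my own:

```python
import numpy as np

def renyi_entropy(p, q):
    """Classical Rényi entropy of order q (q > 0, q != 1) for a finite
    probability vector p; it recovers Shannon entropy as q -> 1."""
    p = np.asarray(p, float)
    p = p[p > 0]                         # drop zero-probability outcomes
    return np.log(np.sum(p ** q)) / (1.0 - q)

# Uniform distribution on 4 outcomes: every order q gives log 4
print(renyi_entropy([0.25] * 4, 2.0))
```

On a uniform distribution the order q drops out, so all Rényi entropies coincide at log of the support size.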
We provide optimal lower and upper bounds for the augmented Kullback-Leibler divergence in terms of the augmented total variation distance between two probability measures defined on two Euclidean spaces having different dimensions.
Michele Caprio
doaj +1 more source
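The classical prototype of such KL-versus-TV bounds (for measures on a common space, not the augmented divergences of the paper) is Pinsker's inequality, KL(P‖Q) ≥ 2·TV(P, Q)². A quick numerical spot-check of that classical bound:

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence (in nats) between discrete distributions."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.sum(p * np.log(p / q))

def tv(p, q):
    """Total variation distance (half the L1 distance)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return 0.5 * np.abs(p - q).sum()

# Spot-check Pinsker's inequality KL >= 2 * TV^2 on random distribution pairs
rng = np.random.default_rng(0)
for _ in range(1000):
    p, q = rng.dirichlet(np.ones(5)), rng.dirichlet(np.ones(5))
    assert kl(p, q) >= 2.0 * tv(p, q) ** 2
```

The reverse direction fails in general (KL can be infinite while TV stays below 1), which is why upper bounds of the kind the paper proves require extra structure.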
Maximally Divergent Intervals for Anomaly Detection
We present new methods for batch anomaly detection in multivariate time series. Our methods are based on maximizing the Kullback-Leibler divergence between the data distribution within and outside an interval of the time series.
Barz, Björn +7 more
core +1 more source
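The idea described above, scoring each candidate interval by the KL divergence between the data inside it and the data outside, can be sketched in a toy univariate form by fitting a Gaussian to each side and using the closed-form Gaussian KL (the function names, the Gaussian model choice, and the brute-force scan are my own simplifications, not the paper's method):

```python
import numpy as np

def gaussian_kl(mu1, var1, mu2, var2):
    """KL(N(mu1, var1) || N(mu2, var2)) in closed form."""
    return 0.5 * (np.log(var2 / var1) + (var1 + (mu1 - mu2) ** 2) / var2 - 1.0)

def most_divergent_interval(x, min_len=5, max_len=50):
    """Scan all intervals [a, b) of a 1-D series and return the one whose
    empirical Gaussian fit diverges most (in KL) from the fit of the rest."""
    x = np.asarray(x, float)
    best, best_score = None, -np.inf
    for a in range(len(x) - min_len + 1):
        for b in range(a + min_len, min(a + max_len, len(x)) + 1):
            inside = x[a:b]
            outside = np.concatenate([x[:a], x[b:]])
            if len(outside) < 2:
                continue
            score = gaussian_kl(inside.mean(), inside.var() + 1e-9,
                                outside.mean(), outside.var() + 1e-9)
            if score > best_score:
                best, best_score = (a, b), score
    return best, best_score
```

On a standard-normal series with a strongly shifted segment, the maximising interval lands on the shifted region, since the mean-shift term dominates the KL score.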
This work presents a novel generative artificial intelligence (AI) framework for inverse alloy design through operations (optimization and diffusion) within learned compact latent space from variational autoencoder (VAE). The proposed work addresses challenges of limited data, nonuniqueness solutions, and high‐dimensional spaces.
Mohammad Abu‐Mualla +4 more
wiley +1 more source
Kullback–Leibler divergence research for the simulation of the credit scoring [PDF]
This paper presents a theoretical study of the relationship between the classical Kullback-Leibler distance and commonly used statistical indicators, reflecting at least two directions of practical application in binary classification problems, in particular in
Soloshenko, Oleksandr +1 more
core

