Results 71 to 80 of about 11,282

Quantizations Preserving Kullback-Leibler Divergence

open access: yes, 2018
International Zurich Seminar on Information and Communication (IZS 2018)
Huleihel, Wasim   +2 more
openaire   +2 more sources

VAE+DDPG: An Attention‐Enhanced Variational Autoencoder for Deep Reinforcement Learning‐Based Autonomous Navigation in Low‐Light Environments

open access: yes, Advanced Intelligent Systems, EarlyView.
Variational Autoencoder+Deep Deterministic Policy Gradient addresses low‐light failures of infrared depth sensing for indoor robot navigation. Stage 1 pretrains an attention‐enhanced Variational Autoencoder (Convolutional Block Attention Module+Feature Pyramid Network) to map dark depth frames to a well‐lit reconstruction, yielding a 128‐D latent code ...
Uiseok Lee   +7 more
wiley   +1 more source
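The snippet above mentions a 128‑D latent code produced by the pretrained VAE. As a minimal hedged sketch of the reparameterization step common to all VAEs (dimensions and values here are illustrative, not taken from the paper):

```python
import math
import random

def sample_latent(mu, log_var):
    """VAE reparameterization: z = mu + sigma * eps with eps ~ N(0, I),
    so the sampling step stays differentiable in mu and log_var."""
    return [m + math.exp(0.5 * lv) * random.gauss(0.0, 1.0)
            for m, lv in zip(mu, log_var)]

# A 128-D latent code, matching the dimensionality quoted in the abstract.
mu = [0.0] * 128
log_var = [math.log(0.25)] * 128   # sigma = 0.5 per dimension
z = sample_latent(mu, log_var)
```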

Decoupled Kullback-Leibler Divergence Loss

open access: yes, Advances in Neural Information Processing Systems 37
In this paper, we delve deeper into the Kullback-Leibler (KL) Divergence loss and mathematically prove that it is equivalent to the Decoupled Kullback-Leibler (DKL) Divergence loss that consists of 1) a weighted Mean Square Error (wMSE) loss and 2) a Cross-Entropy loss incorporating soft labels. Thanks to the decomposed formulation of DKL loss, we have ...
Cui, Jiequan   +5 more
openaire   +2 more sources
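The snippet above is about decomposing the KL loss. As a hedged illustration (this is the standard identity KL(p‖q) = CE(p, q) − H(p), not the paper's specific wMSE + soft-label decomposition), the decomposition idea can be checked numerically:

```python
import math

def entropy(p):
    """Shannon entropy H(p) in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    """CE(p, q) = -sum_i p_i log q_i."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

def kl(p, q):
    """KL(p || q) = sum_i p_i log(p_i / q_i)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.7, 0.2, 0.1]   # "soft label" target distribution
q = [0.5, 0.3, 0.2]   # model prediction

# KL splits into cross-entropy minus the (constant-in-q) target entropy,
# so minimizing KL in q is minimizing a soft-label cross-entropy.
assert abs(kl(p, q) - (cross_entropy(p, q) - entropy(p))) < 1e-9
```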

Model Fusion with Kullback–Leibler Divergence

open access: yes, 2020
We propose a method to fuse posterior distributions learned from heterogeneous datasets. Our algorithm relies on a mean field assumption for both the fused model and the individual dataset posteriors and proceeds using a simple assign-and-average approach.
Claici, Sebastian   +3 more
openaire   +2 more sources
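The snippet above describes fusing posteriors from heterogeneous datasets under a mean-field assumption. As a hedged sketch (precision-weighted Gaussian fusion is a standard mean-field rule, not necessarily the authors' assign-and-average algorithm):

```python
def fuse_gaussians(posteriors):
    """Fuse independent Gaussian posteriors (mu, var) for the same parameter
    by multiplying their densities: the fused precision is the sum of
    precisions, and the fused mean is the precision-weighted mean average."""
    prec = sum(1.0 / var for _, var in posteriors)
    mean = sum(mu / var for mu, var in posteriors) / prec
    return mean, 1.0 / prec

# Posteriors for one parameter learned on two separate datasets.
fused_mu, fused_var = fuse_gaussians([(1.0, 0.5), (2.0, 0.5)])
# With equal variances the fused mean is the midpoint and variance is halved.
```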

Ellipticity and Circularity Measuring via Kullback–Leibler Divergence [PDF]

open access: yes, Journal of Mathematical Imaging and Vision, 2015
Misztal, Krzysztof, Tabor, Jacek
openaire   +2 more sources

Real‐Time Sampling‐Based Model Predictive Control Based on Reverse Kullback–Leibler Divergence and Its Adaptive Acceleration

open access: yes, Advanced Intelligent Systems, EarlyView.
This study presents a new sampling‐based model predictive control minimizing reverse Kullback‐Leibler divergence to quickly find a local optimum. In addition, a modified Nesterov's acceleration method is introduced for faster convergence. The method is effective for real‐time simulations and real‐world operability improvement on a force‐driven mobile ...
Taisuke Kobayashi, Kota Fukumoto
wiley   +1 more source
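The snippet above concerns sampling-based model predictive control. As a hedged sketch of the generic loop such methods share (a cross-entropy-method-style update on a toy 1-D integrator; this is not the paper's reverse-KL objective or its Nesterov acceleration):

```python
import random

def sampling_mpc(cost, horizon=5, n_samples=64, iters=10, sigma=1.0):
    """Generic sampling-based MPC: sample control sequences around the
    current mean, keep the lowest-cost elites, refit the mean to them."""
    mean = [0.0] * horizon
    for _ in range(iters):
        samples = [[m + random.gauss(0.0, sigma) for m in mean]
                   for _ in range(n_samples)]
        samples.sort(key=cost)
        elites = samples[: n_samples // 10]
        mean = [sum(u[t] for u in elites) / len(elites)
                for t in range(horizon)]
        sigma *= 0.8   # shrink the search radius over iterations
    return mean

# Toy cost: drive a 1-D integrator from x = 0 toward x = 1 with small inputs.
def cost(u_seq):
    x, c = 0.0, 0.0
    for u in u_seq:
        x += 0.2 * u
        c += (x - 1.0) ** 2 + 0.01 * u ** 2
    return c

plan = sampling_mpc(cost)
```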

Restricted Tweedie stochastic block models

open access: yes, Canadian Journal of Statistics, EarlyView.
Abstract The stochastic block model (SBM) is a widely used framework for community detection in networks, where the network structure is typically represented by an adjacency matrix. However, conventional SBMs are not directly applicable to an adjacency matrix that consists of nonnegative zero‐inflated continuous edge weights.
Jie Jian, Mu Zhu, Peijun Sang
wiley   +1 more source
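The snippet above builds on the stochastic block model. The restricted Tweedie variant handles zero-inflated continuous edge weights; the standard binary SBM it generalizes can be sketched as follows (a minimal generator, names and parameters illustrative):

```python
import random

def sample_sbm(labels, B):
    """Sample an undirected binary SBM adjacency matrix: edge (i, j) is
    Bernoulli with probability B[labels[i]][labels[j]]."""
    n = len(labels)
    A = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if random.random() < B[labels[i]][labels[j]]:
                A[i][j] = A[j][i] = 1
    return A

labels = [0] * 10 + [1] * 10            # two communities of 10 nodes
B = [[0.8, 0.05], [0.05, 0.8]]          # dense within, sparse between
A = sample_sbm(labels, B)
```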

On the symmetrized s-divergence

open access: yes, Open Mathematics, 2020
In this study, we work with the relative divergence of type s, s ∈ ℝ, which includes the Kullback–Leibler divergence and the Hellinger and χ² distances as particular cases.
Simić Slavko   +2 more
doaj   +1 more source
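The snippet above references a one-parameter divergence family containing KL and χ² as special cases. One common parameterization is the power-divergence family shown below (the paper's exact normalization may differ); s → 1 recovers KL(p‖q) and s = 2 gives half the χ² distance:

```python
import math

def power_divergence(p, q, s):
    """Relative divergence of type s (power-divergence family):
    D_s(p||q) = (sum_i p_i^s q_i^(1-s) - 1) / (s (s - 1)), s not in {0, 1}.
    The limit s -> 1 recovers KL(p||q); the limit s -> 0 recovers KL(q||p)."""
    if s in (0.0, 1.0):
        raise ValueError("take the KL limit at s in {0, 1}")
    return (sum(pi ** s * qi ** (1 - s)
                for pi, qi in zip(p, q)) - 1) / (s * (s - 1))

def kl(p, q):
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.6, 0.3, 0.1]
q = [0.4, 0.4, 0.2]

# s = 2 gives half the chi-squared distance; s near 1 approaches KL(p || q).
chi2 = sum((pi - qi) ** 2 / qi for pi, qi in zip(p, q))
assert abs(power_divergence(p, q, 2.0) - chi2 / 2) < 1e-9
assert abs(power_divergence(p, q, 1.001) - kl(p, q)) < 1e-2
```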

Covariance Structure Modeling of Engineering Demand Parameters in Cloud‐Based Seismic Analysis

open access: yes, Earthquake Engineering & Structural Dynamics, EarlyView.
ABSTRACT Probabilistic seismic demand modeling aims to estimate structural demand as a function of ground motion intensity—a critical stage in seismic risk assessment. Although many models exist to describe the structural demand, few consider the covariance among engineering demand parameters, potentially overlooking a key factor in improving the ...
Archie Rudman   +3 more
wiley   +1 more source
