Results 61 to 70 of about 58,822
In this paper, we introduce properly invariant diagonality measures of Hermitian positive-definite matrices. These diagonality measures are defined as distances or divergences between a given positive-definite matrix and its diagonal part.
Alyani, Khaled +2 more
core +3 more sources
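One concrete instance of such a measure (our own illustrative choice, not necessarily the paper's) is the KL divergence between zero-mean Gaussians with covariances A and Diag(A). Because tr(Diag(A)^{-1} A) equals the dimension n, the Gaussian KL formula collapses to half the log-ratio of det Diag(A) to det A, which is nonnegative by Hadamard's inequality and zero exactly when A is diagonal:

```python
import numpy as np

def kl_diagonality(A):
    """KL-based diagonality measure KL(N(0, A) || N(0, Diag(A))).

    For D = Diag(A), tr(D^{-1} A) = n, so the Gaussian KL formula
    reduces to 0.5 * (log det Diag(A) - log det A), which is >= 0 by
    Hadamard's inequality and 0 exactly when A is diagonal.
    """
    _, logdet_A = np.linalg.slogdet(A)
    return 0.5 * (np.sum(np.log(np.diag(A))) - logdet_A)
```

For A = [[2, 0.5], [0.5, 1]], this gives 0.5·ln(2/1.75) ≈ 0.067; a diagonal input returns 0 up to rounding.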
This work introduces a novel framework for identifying non‐small cell lung cancer biomarkers from hundreds of volatile organic compounds in breath, analyzed via gas chromatography‐mass spectrometry. This method integrates generative data augmentation and multi‐view feature selection, providing a stable and accurate solution for biomarker discovery in ...
Guancheng Ren +10 more
wiley +1 more source
Information theoretical approach to detecting quantum gravitational corrections
In this paper, we investigate the scales at which quantum gravitational corrections can be detected in a black hole using information theory. This is done by calculating the Kullback-Leibler divergence for the probability distributions obtained from the ...
Behnam Pourhassan +7 more
doaj +1 more source
A Hybrid Transfer Learning Framework for Brain Tumor Diagnosis
A novel hybrid transfer learning approach for brain tumor classification achieves 99.47% accuracy using magnetic resonance imaging (MRI) images. By combining image preprocessing, ensemble deep learning, and explainable artificial intelligence (XAI) techniques like gradient‐weighted class activation mapping and SHapley Additive exPlanations (SHAP), the ...
Sadia Islam Tonni +11 more
wiley +1 more source
Gaussian Approximations of Small Noise Diffusions in Kullback-Leibler Divergence [PDF]
We study Gaussian approximations to the distribution of a diffusion. The approximations are easy to compute: they are defined by two simple ordinary differential equations for the mean and the covariance.
Sanz-Alonso, Daniel, Stuart, Andrew M.
core +1 more source
Information Measures: the Curious Case of the Binary Alphabet
Four problems related to information divergence measures defined on finite alphabets are considered. In three of the cases we consider, we illustrate a contrast which arises between the binary-alphabet and larger-alphabet settings.
Courtade, Thomas +4 more
core +1 more source
A Variational Autoencoder + Deep Deterministic Policy Gradient pipeline addresses low‐light failures of infrared depth sensing for indoor robot navigation. Stage 1 pretrains an attention‐enhanced Variational Autoencoder (Convolutional Block Attention Module + Feature Pyramid Network) to map dark depth frames to a well‐lit reconstruction, yielding a 128‐D latent code ...
Uiseok Lee +7 more
wiley +1 more source
This study presents a new sampling‐based model predictive control minimizing reverse Kullback‐Leibler divergence to quickly find a local optimum. In addition, a modified Nesterov's acceleration method is introduced for faster convergence. The method is effective for real‐time simulations and real‐world operability improvement on a force‐driven mobile ...
Taisuke Kobayashi, Kota Fukumoto
wiley +1 more source
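A generic sampling-based MPC update in the MPPI style gives a feel for this family of methods: sample control perturbations, score them, and take an exponentially weighted (softmin) average. This is an illustrative stand-in, not the paper's reverse-KL algorithm or its Nesterov acceleration:

```python
import numpy as np

def sampling_mpc_step(cost_fn, u_nom, n_samples=256, sigma=0.5, lam=1.0):
    """One softmin-weighted update of a sampled control sequence:
    draw Gaussian perturbations around u_nom, score them with cost_fn,
    weight by exp(-cost / lam), and return the weighted mean control."""
    rng = np.random.default_rng(0)  # fixed seed for a reproducible sketch
    eps = rng.normal(0.0, sigma, size=(n_samples,) + u_nom.shape)
    costs = np.array([cost_fn(u_nom + e) for e in eps])
    w = np.exp(-(costs - costs.min()) / lam)  # shift for numerical stability
    w /= w.sum()
    return u_nom + np.tensordot(w, eps, axes=1)
```

Iterating this step on a quadratic cost drives the nominal control sequence toward the cost minimum; the temperature lam trades off greediness against averaging over samples.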
Restricted Tweedie stochastic block models
Abstract The stochastic block model (SBM) is a widely used framework for community detection in networks, where the network structure is typically represented by an adjacency matrix. However, conventional SBMs are not directly applicable to an adjacency matrix that consists of nonnegative zero‐inflated continuous edge weights.
Jie Jian, Mu Zhu, Peijun Sang
wiley +1 more source
On the symmetrized s-divergence
In this study, we work with the relative divergence of type s, s ∈ ℝ, which includes the Kullback-Leibler divergence and the Hellinger and χ² distances as particular cases.
Simić Slavko +2 more
doaj +1 more source
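Under a common parametrization of the type-s relative divergence (an assumption; the paper's normalization may differ), the stated special cases fall out directly: KL as s → 1, half the χ² distance at s = 2, and a quantity proportional to the squared Hellinger distance at s = 1/2:

```python
import numpy as np

def divergence_s(p, q, s):
    """Relative divergence of type s between discrete distributions:

        D_s(p || q) = (sum_i p_i^s * q_i^(1-s) - 1) / (s * (s - 1))

    The limit s -> 1 recovers KL(p || q) and s -> 0 recovers KL(q || p);
    s = 2 gives half the chi-squared distance, and s = 1/2 is
    proportional to the squared Hellinger distance.
    """
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    if np.isclose(s, 1.0):
        return float(np.sum(p * np.log(p / q)))  # KL limit
    if np.isclose(s, 0.0):
        return float(np.sum(q * np.log(q / p)))  # reverse-KL limit
    return float((np.sum(p**s * q**(1.0 - s)) - 1.0) / (s * (s - 1.0)))
```

For p = (0.5, 0.5) and q = (0.25, 0.75), s = 2 gives ((1 + 1/3) − 1)/2 = 1/6, and values of s near 1 approach the KL divergence continuously.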

