Results 71 to 80 of about 10,137
Balancing Reconstruction Error and Kullback-Leibler Divergence in Variational Autoencoders
Likelihood-based generative frameworks are receiving increasing attention in the deep learning community, mostly on account of their strong probabilistic foundation.
Andrea Asperti, Matteo Trentin
doaj +1 more source
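The balance this title refers to is the weighting between the two terms of the standard VAE objective. As a hedged illustration only (the standard closed-form KL term for a diagonal Gaussian posterior against a unit Gaussian prior, with a hypothetical `beta` weight; not the paper's proposed method):

```python
import math

def gaussian_kl(mu, logvar):
    # KL( N(mu, exp(logvar)) || N(0, I) ), summed over latent dimensions:
    # 0.5 * sum( exp(logvar) + mu^2 - 1 - logvar )
    return sum(0.5 * (math.exp(lv) + m * m - 1.0 - lv)
               for m, lv in zip(mu, logvar))

def vae_loss(recon_error, mu, logvar, beta=1.0):
    # beta trades off reconstruction error against the KL term
    return recon_error + beta * gaussian_kl(mu, logvar)
```

When `mu = 0` and `logvar = 0` the posterior matches the prior and the KL term vanishes, so the loss reduces to the reconstruction error alone.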
This study presents a new sampling‐based model predictive control method that minimizes reverse Kullback‐Leibler divergence to quickly find a local optimum. In addition, a modified Nesterov's acceleration method is introduced for faster convergence. The method is effective for real‐time simulation and for improving real‐world operability of a force‐driven mobile ...
Taisuke Kobayashi, Kota Fukumoto
wiley +1 more source
Efficient ECG classification based on the probabilistic Kullback-Leibler divergence
Diagnostic systems of cardiac arrhythmias face early and accurate detection challenges due to the overlap of electrocardiogram (ECG) patterns. Additionally, these systems must manage a huge number of features.
Dhiah Al-Shammary +5 more
doaj +1 more source
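Classification by probabilistic divergence typically assigns a sample to the class whose reference distribution is closest in KL divergence. A minimal sketch of that idea (generic nearest-distribution assignment over discrete histograms, with a hypothetical smoothing constant `eps`; not the specific ECG pipeline of the paper):

```python
import math

def kl_divergence(p, q, eps=1e-12):
    # D_KL(p || q) for discrete distributions; eps avoids log(0)
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def classify(sample_hist, class_hists):
    # assign the sample to the class whose reference histogram
    # minimizes KL divergence from the sample distribution
    return min(class_hists, key=lambda c: kl_divergence(sample_hist, class_hists[c]))
```

For example, a sample histogram `[0.8, 0.2]` would be assigned to a class whose reference is `[0.9, 0.1]` rather than one at `[0.1, 0.9]`.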
The McMillan Theorem for Colored Branching Processes and Dimensions of Random Fractals
For the simplest colored branching process, we prove an analog to the McMillan theorem and calculate the Hausdorff dimensions of random fractals defined in terms of the limit behavior of empirical measures generated by finite genetic lines.
Victor Bakhtin
doaj +1 more source
Restricted Tweedie stochastic block models
Abstract The stochastic block model (SBM) is a widely used framework for community detection in networks, where the network structure is typically represented by an adjacency matrix. However, conventional SBMs are not directly applicable to an adjacency matrix that consists of nonnegative zero‐inflated continuous edge weights.
Jie Jian, Mu Zhu, Peijun Sang
wiley +1 more source
Covariance Structure Modeling of Engineering Demand Parameters in Cloud‐Based Seismic Analysis
ABSTRACT Probabilistic seismic demand modeling aims to estimate structural demand as a function of ground motion intensity—a critical stage in seismic risk assessment. Although many models exist to describe the structural demand, few consider the covariance among engineering demand parameters, potentially overlooking a key factor in improving the ...
Archie Rudman +3 more
wiley +1 more source
A Deep Learning Framework for Forecasting Medium‐Term Covariance in Multiasset Portfolios
ABSTRACT Forecasting the covariance matrix of asset returns is central to portfolio construction, risk management, and asset pricing. However, most existing models struggle at medium‐term horizons, several weeks to months, where shifting market regimes and slower dynamics prevail.
Pedro Reis, Ana Paula Serra, João Gama
wiley +1 more source
Generalized Kullback-Leibler Divergence Loss
Extension of our NeurIPS paper "Decoupled Kullback-Leibler Divergence Loss".
Cui, Jiequan +7 more
openaire +2 more sources
ABSTRACT This study examines the environmental, social and governance (ESG) scoring methodologies used by Bloomberg and S&P Global through the lens of Data Envelopment Analysis (DEA). It addresses a notable gap in the literature by identifying the underlying factors that shape ESG scores and providing practical insights for companies seeking to ...
Philipe Balan +4 more
wiley +1 more source
On Weighted Kullback–Leibler Divergence for Doubly Truncated Random Variables
In this communication, we study doubly truncated weighted Kullback–Leibler divergence (KLD) between two nonnegative random variables. The proposed measure is a generalization of the dynamic weighted KLD introduced by Yasaei Sekeh et al. (2013).
Rajesh Moharana, Suchandan Kayal
doaj +1 more source

