Testing Distributional Granger Causality With Entropic Optimal Transport
ABSTRACT We develop a novel nonparametric test for Granger causality in distribution based on entropic optimal transport. Unlike classical mean-based approaches, the proposed method directly compares the full conditional distributions of a response variable with and without the history of a candidate predictor.
Tao Wang
wiley +1 more source
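The abstract above compares full conditional distributions via entropic optimal transport. As a minimal illustrative sketch (not the authors' test; the Sinkhorn iteration, sample data, and regularization choice below are assumptions), entropic OT can quantify how far apart two empirical samples are:

```python
import numpy as np

def sinkhorn_cost(x, y, n_iter=500):
    """Entropic-OT transport cost <P, C> between two 1-D samples,
    computed with plain Sinkhorn iterations and uniform weights."""
    C = (x[:, None] - y[None, :]) ** 2       # squared-distance cost matrix
    eps = 0.05 * C.max()                     # regularization scaled to the cost
    K = np.exp(-C / eps)                     # Gibbs kernel
    a = np.full(len(x), 1.0 / len(x))        # uniform source weights
    b = np.full(len(y), 1.0 / len(y))        # uniform target weights
    u = np.ones_like(a)
    for _ in range(n_iter):                  # alternating marginal scaling
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]          # entropic coupling
    return float(np.sum(P * C))

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 200)
same = sinkhorn_cost(x, rng.normal(0.0, 1.0, 200))      # same distribution
shifted = sinkhorn_cost(x, rng.normal(2.0, 1.0, 200))   # shifted distribution
```

A distributional test of this flavor would compare such costs between conditional samples built with and without the candidate predictor's history; here the cost for the shifted sample exceeds that for a fresh sample from the same distribution.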
A Deep-Learning-Based Health Indicator Constructor Using Kullback-Leibler Divergence for Predicting the Remaining Useful Life of Concrete Structures. [PDF]
Nguyen TK, Ahmad Z, Kim JM.
europepmc +1 more source
Learning Based Non-rigid Multi-modal Image Registration Using Kullback-Leibler Divergence [PDF]
Christoph Guetter +3 more
openalex +1 more source
Is A Little Learning Dangerous?
ABSTRACT I argue that a little learning is often dangerous even for ideal reasoners who are operating in extremely simple scenarios and know all the relevant facts about how the evidence is generated. More precisely, I show that, on many plausible ways of assigning value to a credence in a hypothesis H, ideal Bayesians should sometimes expect other ...
Bernhard Salow
wiley +1 more source
Construction of an individualized brain metabolic network in patients with advanced non-small cell lung cancer by the Kullback-Leibler divergence-based similarity method: A study based on 18F-fluorodeoxyglucose positron emission tomography. [PDF]
Yu J +6 more
europepmc +1 more source
Computing the Kullback-Leibler Divergence between two Weibull Distributions [PDF]
Christian Bauckhage
openalex +1 more source
Tracking changes using Kullback-Leibler divergence for the continual learning [PDF]
Sebastián Basterrech, Michał Woźniak
openalex +1 more source
ABSTRACT Recent studies suggest that learners who are asked to predict the outcome of an event learn more than learners who are asked to evaluate it retrospectively or not at all. One possible explanation for this “prediction boost” is that it helps learners engage metacognitive reasoning skills that may not be spontaneously leveraged, especially for ...
Joseph A. Colantonio +4 more
wiley +1 more source
Ratio Divergence Learning Using Target Energy in Restricted Boltzmann Machines: Beyond Kullback--Leibler Divergence Learning [PDF]
Yuichi Ishida +4 more
openalex +1 more source
A Metrological Hyperspectral Texture Similarity Measure using Kullback-Leibler Divergence
Chu, Rui Jian +3 more
openalex +1 more source