Results 31 to 40 of about 4,515
An amended MaxEnt formulation for deriving Tsallis factors, and associated issues [PDF]
An amended MaxEnt formulation for systems displaced from the conventional MaxEnt equilibrium is proposed. This formulation involves the minimization of the Kullback-Leibler divergence to a reference $Q$ (or maximization of Shannon $Q$-entropy), subject ...
Bercher, Jean-François
core +7 more sources
Rényi divergence variational inference [PDF]
This paper introduces the $\textit{variational Rényi bound}$ (VR) that extends traditional variational inference to Rényi’s $\alpha$-divergences. This new family of variational methods unifies a number of existing approaches, and enables a smooth ...
Li, Y, Turner, RE
core +1 more source
Point Information Gain and Multidimensional Data Analysis
We generalize the Point information gain (PIG) and derived quantities, i.e. Point information entropy (PIE) and Point information entropy density (PIED), for the case of Rényi entropy and simulate the behavior of PIG for typical distributions.
Císař, Petr +6 more
core +2 more sources
Direct Estimation of Information Divergence Using Nearest Neighbor Ratios
We propose a direct estimation method for Rényi and f-divergence measures based on a new graph theoretical interpretation. Suppose that we are given two sample sets $X$ and $Y$, respectively with $N$ and $M$ samples, where $\eta:=M/N$ is a constant ...
Hero III, Alfred O. +3 more
core +1 more source
Sequence information gain based motif analysis [PDF]
Background: The detection of regulatory regions in candidate sequences is essential for the understanding of the regulation of a particular gene and the mechanisms involved.
Marco, Santiago +3 more
core +3 more sources
Scalable Hash-Based Estimation of Divergence Measures
We propose a scalable divergence estimation method based on hashing. Consider two continuous random variables $X$ and $Y$ whose densities have bounded support.
Hero III, Alfred O., Noshad, Morteza
core +1 more source
Variational Inference via Rényi Bound Optimization and Multiple-Source Adaptation
Variational inference provides a way to approximate probability densities through optimization. It does so by optimizing an upper or a lower bound of the likelihood of the observed data (the evidence).
Dana Zalman (Oshri), Shai Fine
doaj +1 more source
Objective Pain is the hallmark symptom of osteoarthritis (OA), and its biologic drivers remain poorly understood. Although the role of innate immunity in OA has been extensively studied, the involvement of adaptive immunity, in particular Treg cells, is not well understood.
Marie Binvignat +26 more
wiley +1 more source
Holographic second laws of black hole thermodynamics
Recently, it has been shown that for out-of-equilibrium systems, there are additional constraints on thermodynamical evolution besides the ordinary second law.
Alice Bernamonti +3 more
doaj +1 more source
In this paper, we discuss the Rényi entropy of k-records arising from any continuous distribution in detail. The relevance of constructing k-records from a random sample in the context of the information contained in a random variable has been described ...
Jitto Jose , E. I. Abdul Sathar
doaj +1 more source