Results 81 to 90 of about 4,515
On Dynamic Generalized Measures of Inaccuracy
Generalized information measures play an important role in quantifying the uncertainty of random variables in situations where ordinary uncertainty measures fail to apply.
Suchandan Kayal et al.
doaj
Quantifying Data Dependencies with Rényi Mutual Information and Minimum Spanning Trees
In this study, we present a novel method for quantifying dependencies in multivariate datasets, based on estimating the Rényi mutual information by minimum spanning trees (MSTs). The extent to which random variables are dependent is an important question ...
Anne Eggels, Daan Crommelin
doaj
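The MST route to Rényi quantities can be illustrated with a minimal sketch of a Hero-and-Michel-style Rényi entropy estimator. This is a generic instance of the technique, not the authors' exact method, and it omits the quasi-additive normalizing constant, so values are only comparable between samples of the same size and dimension:

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.sparse.csgraph import minimum_spanning_tree

def renyi_entropy_mst(x, alpha=0.5):
    """Estimate the Renyi entropy of order alpha (0 < alpha < 1) from
    samples x (shape n x d) via the power-weighted length of their MST.

    The sum of MST edge lengths, each raised to gamma = d * (1 - alpha),
    grows like n**alpha * exp((1 - alpha) * H_alpha) up to a constant,
    which is omitted here.
    """
    n, d = x.shape
    gamma = d * (1.0 - alpha)
    dists = squareform(pdist(x))          # dense pairwise distance matrix
    mst = minimum_spanning_tree(dists)    # sparse matrix with n - 1 edges
    l_gamma = np.sum(mst.data ** gamma)   # power-weighted MST length
    return np.log(l_gamma / n ** alpha) / (1.0 - alpha)

rng = np.random.default_rng(0)
wide = rng.normal(scale=2.0, size=(400, 2))    # higher-entropy sample
narrow = rng.normal(scale=0.5, size=(400, 2))  # lower-entropy sample
h_wide, h_narrow = renyi_entropy_mst(wide), renyi_entropy_mst(narrow)
```

Because the constant is dropped, only the ordering of estimates is meaningful here: the wider Gaussian yields the larger value.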
Although Shannon mutual information has been widely used, its effective calculation is often difficult for many practical problems, including those in neural population coding.
Wentao Huang, Kechen Zhang
doaj
Information criteria for selecting parton distribution function solutions
In data-driven determination of Parton Distribution Functions (PDFs) in global QCD analyses, uncovering the true underlying distributions is complicated by a highly convoluted inverse problem.
Aurore Courtoy, Arturo Ibsen
doaj
Exponential families are statistical models that serve as workhorses in statistics, information theory, and machine learning, among other fields. An exponential family can be normalized either subtractively, by its cumulant or free-energy function, or ...
Frank Nielsen
doaj
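Subtractive normalization can be made concrete with the simplest example, which is an illustration and not taken from the paper: the Bernoulli family in its natural parameter theta has density p(x | theta) = exp(theta * x - A(theta)) for x in {0, 1}, with cumulant A(theta) = log(1 + e^theta) subtracted inside the exponent:

```python
import math

def log_normalizer(theta):
    """Cumulant (log-partition) function A(theta) of the Bernoulli
    exponential family p(x|theta) = exp(theta*x - A(theta)), x in {0,1}."""
    return math.log1p(math.exp(theta))

def pmf(x, theta):
    # subtractive normalization: A(theta) is subtracted in the exponent
    return math.exp(theta * x - log_normalizer(theta))

theta = 0.7
total = pmf(0, theta) + pmf(1, theta)  # sums to 1 by construction
mean = pmf(1, theta)                   # equals A'(theta), the sigmoid of theta
```

The derivative of the cumulant recovers the mean parameter, which is the standard duality these normalizations encode.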
In this work, we estimate different entropy and divergence measures, such as the Shannon entropy, Rényi divergence, and Csiszár divergence, using Jensen-type functionals.
Tasadduq Niaz et al.
doaj
Mechanisms for Robust Local Differential Privacy
We consider privacy mechanisms for releasing data X=(S,U), where S is sensitive and U is non-sensitive. We introduce the robust local differential privacy (RLDP) framework, which provides strong privacy guarantees, while preserving utility.
Milan Lopuhaä-Zwakenberg et al.
doaj
Finite-Sample Analysis of Fixed-k Nearest Neighbor Density Functional Estimators
We provide finite-sample analysis of a general framework for using k-nearest neighbor statistics to estimate functionals of a nonparametric continuous probability density, including entropies and divergences.
Shashank Singh, Barnabás Póczos
core
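The best-known member of this estimator family is the Kozachenko-Leonenko fixed-k nearest-neighbor estimator of differential Shannon entropy. The sketch below is that classic estimator as an illustration of the framework, not the paper's analysis:

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def kl_entropy(x, k=3):
    """Kozachenko-Leonenko fixed-k nearest-neighbor estimator of the
    differential Shannon entropy of samples x (shape n x d)."""
    n, d = x.shape
    tree = cKDTree(x)
    # query k+1 neighbors: the nearest "neighbor" of each point is itself
    dist, _ = tree.query(x, k=k + 1)
    eps = dist[:, -1]                       # distance to the k-th neighbor
    # log volume of the unit ball in R^d: pi^(d/2) / Gamma(d/2 + 1)
    log_unit_ball = (d / 2.0) * np.log(np.pi) - gammaln(d / 2.0 + 1.0)
    return digamma(n) - digamma(k) + log_unit_ball + (d / n) * np.sum(np.log(eps))

rng = np.random.default_rng(1)
x = rng.normal(size=(2000, 1))
h = kl_entropy(x)   # true N(0,1) entropy is 0.5*log(2*pi*e), about 1.419
```

Note that k stays fixed as n grows, which is exactly the regime whose finite-sample behavior the paper analyzes.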
In the two previous papers of this series, the main results on the asymptotic behavior of empirical divergence measures based on wavelet theory were established and particularized for important families of divergence measures, such as the Rényi and ...
Amadou Diadié Bâ et al.
doaj
Arimoto Channel Coding Converse and Rényi Divergence
Arimoto [1] proved a non-asymptotic upper bound on the probability of successful decoding achievable by any code on a given discrete memoryless channel.
Sergio Verdú, Yury Polyanskiy
core
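The Rényi divergence that appears in such converse bounds has a simple closed form for discrete distributions: D_alpha(P || Q) = (1 / (alpha - 1)) * log sum_i p_i^alpha q_i^(1 - alpha). A minimal helper, written as an illustration rather than anything from the paper:

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Renyi divergence D_alpha(P || Q) of order alpha (alpha > 0, != 1)
    between two discrete distributions given as probability vectors."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return np.log(np.sum(p ** alpha * q ** (1.0 - alpha))) / (alpha - 1.0)

d = renyi_divergence([0.5, 0.5], [0.9, 0.1], alpha=2.0)     # positive
zero = renyi_divergence([0.5, 0.5], [0.5, 0.5], alpha=2.0)  # identical: 0
```

As alpha tends to 1 this recovers the Kullback-Leibler divergence, and it is zero exactly when the two distributions coincide.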

