Results 81 to 90 of about 4,515

On Dynamic Generalized Measures of Inaccuracy

open access: yes (Statistica, 2017)
Generalized information measures play an important role in quantifying the uncertainty of certain random variables in situations where ordinary uncertainty measures fail to apply.
Suchandan Kayal et al.
doaj

Quantifying Data Dependencies with Rényi Mutual Information and Minimum Spanning Trees

open access: yes (Entropy, 2019)
In this study, we present a novel method for quantifying dependencies in multivariate datasets, based on estimating the Rényi mutual information by minimum spanning trees (MSTs). The extent to which random variables are dependent is an important question ...
Anne Eggels, Daan Crommelin
doaj
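As context for this entry: the core ingredient of such MST-based dependence measures is the total edge length of a minimum spanning tree built on the sample, which shrinks when points concentrate on a lower-dimensional structure. A minimal sketch of that statistic (assuming NumPy and SciPy; this is an illustration, not the authors' estimator):

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.sparse.csgraph import minimum_spanning_tree

def mst_length(points):
    """Total Euclidean edge length of the minimum spanning tree of a point set."""
    dist = squareform(pdist(points))       # dense pairwise distance matrix
    mst = minimum_spanning_tree(dist)      # sparse matrix holding the MST edge weights
    return mst.sum()

rng = np.random.default_rng(0)
# A strongly dependent pair (y tracks x) versus an independent pair:
x = rng.standard_normal((500, 1))
dep = np.hstack([x, x + 0.05 * rng.standard_normal((500, 1))])
ind = rng.standard_normal((500, 2))
# Dependent samples lie near a curve, so their MST is markedly shorter.
print(mst_length(dep) < mst_length(ind))  # → True
```

Turning this length into a Rényi mutual information estimate requires the normalization developed in the paper.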

Approximations of Shannon Mutual Information for Discrete Variables with Applications to Neural Population Coding

open access: yes (Entropy, 2019)
Although Shannon mutual information has been widely used, its effective calculation is often difficult for many practical problems, including those in neural population coding.
Wentao Huang, Kechen Zhang
doaj
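For small discrete joints the exact Shannon mutual information is a direct sum; the difficulty the paper addresses arises when alphabets or populations make this intractable. A minimal exact computation for reference (plain NumPy, not the paper's approximation):

```python
import numpy as np

def mutual_information(joint):
    """Exact Shannon mutual information I(X;Y) in nats from a joint pmf matrix."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)    # marginal of X
    py = joint.sum(axis=0, keepdims=True)    # marginal of Y
    mask = joint > 0                         # 0 log 0 = 0 convention
    return float((joint[mask] * np.log(joint[mask] / (px @ py)[mask])).sum())

print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # independent → 0.0
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # fully correlated → log 2
```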

Information criteria for selecting parton distribution function solutions

open access: yes (European Physical Journal C: Particles and Fields)
In data-driven determination of Parton Distribution Functions (PDFs) in global QCD analyses, uncovering the true underlying distributions is complicated by a highly convoluted inverse problem.
Aurore Courtoy, Arturo Ibsen
doaj

Divergences Induced by the Cumulant and Partition Functions of Exponential Families and Their Deformations Induced by Comparative Convexity

open access: yes (Entropy)
Exponential families are statistical models which are the workhorses in statistics, information theory, and machine learning, among others. An exponential family can either be normalized subtractively by its cumulant or free energy function, or ...
Frank Nielsen
doaj

Estimation of Different Entropies via Taylor One Point and Taylor Two Points Interpolations Using Jensen Type Functionals

open access: yes (International Journal of Analysis and Applications, 2019)
In this work, we estimate the Shannon entropy and the Rényi and Csiszár divergences using Jensen-type functionals.
Tasadduq Niaz et al.
doaj

Mechanisms for Robust Local Differential Privacy

open access: yes (Entropy)
We consider privacy mechanisms for releasing data X=(S,U), where S is sensitive and U is non-sensitive. We introduce the robust local differential privacy (RLDP) framework, which provides strong privacy guarantees, while preserving utility.
Milan Lopuhaä-Zwakenberg et al.
doaj
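For orientation, the textbook baseline that local differential privacy frameworks generalize is binary randomized response; the sketch below (plain NumPy) illustrates ε-LDP and the debiasing of aggregate statistics, and is not the RLDP mechanism of the paper:

```python
import numpy as np

def randomized_response(bit, eps, rng):
    """Classic randomized response: report the true bit with probability
    e^eps / (e^eps + 1), otherwise flip it. Satisfies eps-local DP."""
    p_truth = np.exp(eps) / (np.exp(eps) + 1.0)
    return bit if rng.random() < p_truth else 1 - bit

def debiased_mean(reports, eps):
    """Unbiased estimate of the true mean from the noisy reports:
    E[report] = (2p - 1) * mu + (1 - p), solved for mu."""
    p = np.exp(eps) / (np.exp(eps) + 1.0)
    return (np.mean(reports) - (1 - p)) / (2 * p - 1)

rng = np.random.default_rng(1)
true_bits = (rng.random(20000) < 0.3).astype(int)   # 30% of users hold '1'
reports = [randomized_response(b, eps=1.0, rng=rng) for b in true_bits]
print(debiased_mean(reports, eps=1.0))              # estimate close to 0.3
```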

Finite-Sample Analysis of Fixed-k Nearest Neighbor Density Functional Estimators

open access: yes (2016)
We provide finite-sample analysis of a general framework for using k-nearest neighbor statistics to estimate functionals of a nonparametric continuous probability density, including entropies and divergences.
Barnabás Póczos, Shashank Singh
core  
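The best-known member of the fixed-k family analyzed here is the Kozachenko–Leonenko entropy estimator, which uses only the distance from each point to its k-th nearest neighbor. A minimal sketch (assuming NumPy and SciPy; a standard textbook form, not the paper's general framework):

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def kl_entropy(x, k=3):
    """Kozachenko-Leonenko fixed-k nearest neighbor entropy estimate (nats):
    H_hat = psi(n) - psi(k) + log(c_d) + (d/n) * sum_i log(eps_i)."""
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    tree = cKDTree(x)
    # Distance to the k-th neighbor of each point (the query returns self first).
    eps = tree.query(x, k=k + 1)[0][:, k]
    log_cd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)  # log volume of the unit d-ball
    return digamma(n) - digamma(k) + log_cd + d * np.mean(np.log(eps))

rng = np.random.default_rng(0)
sample = rng.standard_normal((4000, 1))
# True differential entropy of N(0,1) is 0.5*log(2*pi*e), about 1.419 nats.
print(kl_entropy(sample))
```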

Divergence Measures Estimation and Its Asymptotic Normality Theory Using Wavelets Empirical Processes III

open access: yes (Journal of Statistical Theory and Applications, JSTA)
In the two previous papers of this series, the main results on the asymptotic behaviors of empirical divergence measures based on wavelets theory have been established and particularized for important families of divergence measures like Rényi and ...
Amadou Diadié Bâ et al.
doaj

Arimoto Channel Coding Converse and Rényi Divergence

open access: yes
Arimoto [1] proved a non-asymptotic upper bound on the probability of successful decoding achievable by any code on a given discrete memoryless channel.
Sergio Verdú, Yury Polyanskiy
core  
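For reference, the Rényi divergence that this converse is phrased in has a one-line form for discrete distributions; a minimal sketch (plain NumPy, assuming Q puts mass wherever P does):

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Renyi divergence D_alpha(P || Q) in nats for discrete distributions:
    D_alpha = (1 / (alpha - 1)) * log sum_i p_i^alpha * q_i^(1 - alpha).
    Assumes q_i > 0 wherever p_i > 0."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    if np.isclose(alpha, 1.0):   # the limit alpha -> 1 is the KL divergence
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))
    return float(np.log(np.sum(p ** alpha * q ** (1 - alpha))) / (alpha - 1))

p, q = [0.5, 0.5], [0.75, 0.25]
# D_alpha is nondecreasing in alpha; alpha = 1 recovers KL divergence.
print(renyi_divergence(p, q, 0.5), renyi_divergence(p, q, 1.0), renyi_divergence(p, q, 2.0))
```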
