Results 41 to 50 of about 1,440,604
Conditionally independent random variables [PDF]
In this paper we investigate the notion of conditional independence and prove several information inequalities for conditionally independent random variables.
arxiv
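For context (a standard definition, not taken from the listed abstract): random variables X and Y are conditionally independent given Z when
\[ p(x, y \mid z) = p(x \mid z)\, p(y \mid z) \quad \text{for all } z \text{ with } p(z) > 0, \]
which is equivalent to the vanishing of the conditional mutual information, \( I(X; Y \mid Z) = 0 \); inequalities of the kind mentioned in the abstract typically constrain the joint entropies of such variables.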
Information Theory and the IrisCode
Iris recognition has legendary resistance to false matches, and the tools of information theory can help to explain why. The concept of entropy is fundamental to understanding biometric collision avoidance.
J. Daugman
semanticscholar
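As a reminder of the quantity involved (a standard definition, not quoted from the paper), the Shannon entropy of a discrete code X is
\[ H(X) = -\sum_{x} p(x) \log_2 p(x) \ \text{bits}, \]
and, loosely, higher entropy of the code across the population means a smaller chance that two different irises produce nearly identical codes.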
Asymptotic Estimates in Information Theory with Non-Vanishing Error Probabilities [PDF]
This monograph presents a unified treatment of single- and multi-user problems in Shannon's information theory where we depart from the requirement that the error probability decays asymptotically in the blocklength.
Tan, Vincent Y. F.
core
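A representative result in this fixed-error regime (stated from general background, not quoted from the monograph) is the normal approximation for a discrete memoryless channel:
\[ \log M^*(n, \varepsilon) = nC - \sqrt{nV}\, Q^{-1}(\varepsilon) + O(\log n), \]
where \( M^*(n, \varepsilon) \) is the maximum code size at blocklength n and error probability \( \varepsilon \), C is the capacity, V the channel dispersion, and \( Q^{-1} \) the inverse Gaussian tail function.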
Wiretap Channel With Side Information [PDF]
This submission has been withdrawn by the author.
arxiv
"Infographics" team: Selecting Control Parameters via Maximal Fisher Information [PDF]
Team description paper for RoboCup 2014 Soccer Simulation League 2D.
arxiv
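For reference (the standard definition, not taken from the team description paper): the Fisher information about a parameter \( \theta \) carried by an observation X with density \( f(x; \theta) \) is
\[ I(\theta) = \mathbb{E}\!\left[\left(\frac{\partial}{\partial \theta} \log f(X; \theta)\right)^{2}\right], \]
so selecting control parameters to maximize \( I(\theta) \) means choosing settings under which the observed data are most informative about \( \theta \).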
Two theorems on distribution of Gaussian quadratic forms [PDF]
New results on comparison of distributions of Gaussian quadratic forms are ...
arxiv
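The objects in question (standard setup, not quoted from the paper) are quadratic forms
\[ Q = \xi^{\top} A\, \xi, \qquad \xi \sim \mathcal{N}(0, \Sigma), \ A \ \text{symmetric}, \]
whose distribution is that of a weighted sum of independent one-degree-of-freedom chi-squared variables with weights given by the eigenvalues of \( A\Sigma \); comparison theorems order the tail probabilities of two such forms.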
Mutual information is copula entropy [PDF]
We prove that mutual information is actually negative copula entropy, based on which a method for mutual information estimation is proposed.
arxiv
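The claimed identity can be sketched directly (under the usual assumption of continuous marginals, via Sklar's theorem, not copied from the paper): with \( u = F_X(x) \), \( v = F_Y(y) \) and copula density \( c(u, v) \),
\[ I(X; Y) = \int c(u, v)\, \log c(u, v)\, du\, dv = -H_c(U, V), \]
so mutual information equals the negative of the entropy of the copula density, which is why estimating the copula suffices for estimating mutual information.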
The main scope of this chapter is metrics defined for coding and decoding purposes, mainly for block codes.
arxiv
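A canonical example of such a metric (a textbook definition, not specific to this chapter) is the Hamming distance between two length-n words:
\[ d_H(x, y) = \left|\{\, i : x_i \neq y_i \,\}\right|, \]
and the minimum distance of a block code, \( d_{\min} = \min_{c \neq c'} d_H(c, c') \), governs how many errors the code can detect and correct.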
"Compressed" Compressed Sensing
The field of compressed sensing has shown that a sparse but otherwise arbitrary vector can be recovered exactly from a small number of randomly constructed linear projections (or samples).
Gastpar, Michael, Reeves, Galen
core
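In the usual formulation (standard background, not quoted from this submission), one observes \( m \ll n \) random linear measurements of a k-sparse vector x:
\[ y = \Phi x, \qquad \Phi \in \mathbb{R}^{m \times n}, \]
and recovers x, for instance, as the solution of \( \min_{\hat x} \|\hat x\|_1 \) subject to \( \Phi \hat x = y \), which succeeds with high probability once \( m = O(k \log(n/k)) \).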
Compressing Probability Distributions [PDF]
We show how to store good approximations of probability distributions in small space.
arxiv
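One natural way to make "good approximation" precise (an illustrative choice, not necessarily the one used in the paper) is to store a distribution \( \tilde p \) close to p in total variation:
\[ d_{TV}(p, \tilde p) = \tfrac{1}{2} \sum_x |p(x) - \tilde p(x)| \le \varepsilon, \]
with the space used measured in bits as a function of the support size and \( \varepsilon \).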