Pretty good measures in quantum information theory [PDF]
Quantum generalizations of Rényi's entropies are a useful tool to describe a variety of operational tasks in quantum information processing. Two families of such generalizations turn out to be particularly useful: the Petz quantum Rényi divergence Dα and …
Raban Iten, J. Renes, David Sutter
semanticscholar
A Geometric View on Constrained M-Estimators [PDF]
We study the estimation error of constrained M-estimators, and derive explicit upper bounds on the expected estimation error determined by the Gaussian width of the constraint set.
Cevher, Volkan et al.
core
The Third-Order Term in the Normal Approximation for the AWGN Channel
This paper shows that, under the average error probability formalism, the third-order term in the normal approximation for the additive white Gaussian noise channel with a maximal or equal power constraint is at least $\frac{1}{2} \log n + O(1)$.
Tan, Vincent Y. F., Tomamichel, Marco
core
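For reference, the normal approximation mentioned in this abstract expands the logarithm of the maximal code size $M^*(n,\varepsilon)$ at blocklength $n$ and average error probability $\varepsilon$ in terms of the channel capacity $C$ and dispersion $V$ (notation as standard in the finite-blocklength literature; the lower bound on the third-order term is the abstract's stated result):

```latex
\log M^*(n,\varepsilon) = nC - \sqrt{nV}\, Q^{-1}(\varepsilon) + \rho(n,\varepsilon),
\qquad \rho(n,\varepsilon) \;\ge\; \tfrac{1}{2}\log n + O(1),
```

where $Q^{-1}$ is the inverse of the Gaussian tail function $Q(x) = \int_x^\infty \frac{1}{\sqrt{2\pi}} e^{-t^2/2}\, dt$.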
Euclidean and Hermitian LCD MDS codes
Linear codes with complementary duals (abbreviated LCD) are linear codes whose intersection with their dual is trivial. When they are binary, they play an important role in armoring implementations against side-channel attacks and fault injection attacks.
Carlet, Claude et al.
core
Compressed Sensing Performance Analysis via Replica Method using Bayesian framework [PDF]
Compressive sensing (CS) is a methodology for capturing signals at a lower rate than the Nyquist sampling rate when the signals are sparse, or sparse in some transform domain.
Barzideh, Faraz et al.
core
Scampi: a robust approximate message-passing framework for compressive imaging
Reconstruction of images from noisy linear measurements is a core problem in image processing, for which convex optimization methods based on total variation (TV) minimization have been the long-standing state-of-the-art.
Barbier, Jean et al.
core
On Exponentially Concave Functions and Their Impact in Information Theory
Concave functions play a central role in optimization. So-called exponentially concave functions are of similar importance in information theory. In this paper, we comprehensively discuss mathematical properties of the class of exponentially concave functions.
G. Alirezaei, R. Mathar
semanticscholar
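As a quick orientation (this is the standard definition, not a quote from the paper): a function $f$ is exponentially concave when composing it with the exponential yields a concave function,

```latex
f \ \text{is exponentially concave} \iff x \mapsto e^{f(x)} \ \text{is concave.}
```

For example, $f(x) = \log x$ is exponentially concave, since $e^{\log x} = x$ is linear and hence concave. Every exponentially concave function is in particular concave, because $f = \log(e^{f})$ is the logarithm of a positive concave function.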
An Information Identity for State-dependent Channels with Feedback [PDF]
In this technical note, we investigate information quantities of state-dependent communication channels with corrupted information fed back from the receiver. We derive an information identity which can be interpreted as a law of conservation of information flows.
arxiv
Understanding Shannon's Entropy metric for Information [PDF]
Shannon's metric of "Entropy" of information is a foundational concept of information theory. This article is a primer for novices that presents an intuitive way of understanding, remembering, and/or reconstructing Shannon's Entropy metric for information.
arxiv
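The intuitive reading of Shannon's entropy, $H = -\sum_i p_i \log_2 p_i$ bits, that this primer builds is easy to check numerically. A minimal sketch (the helper name `shannon_entropy` is ours, not the article's):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of information per toss.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, hence less informative.
print(shannon_entropy([0.9, 0.1]))   # ≈ 0.469
# A certain outcome carries no information at all.
print(shannon_entropy([1.0]))        # 0.0
```

The `if p > 0` guard implements the usual convention $0 \log 0 = 0$, which keeps degenerate distributions well-defined.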