Results 21 to 30 of about 548

Entropy and a generalisation of "Poincaré's Observation"

open access: yes, 2002
Consider a sphere of radius √n in n dimensions, and consider X, a random variable uniformly distributed on its surface. Poincaré's Observation states that for large n, the distribution of the first k coordinates of X is close in total variation ...
Johnson, Oliver
core   +1 more source
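As a hedged sketch (not taken from the paper), Poincaré's Observation can be checked numerically: sample uniformly on the sphere of radius √n in R^n by normalising Gaussian vectors, and the first coordinate should look like a standard normal. The choices of n and sample size below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000      # ambient dimension (illustrative choice)
m = 20000     # number of sample points

# Sample uniformly on the sphere of radius sqrt(n) in R^n:
# normalise standard Gaussian vectors, then rescale to radius sqrt(n).
g = rng.standard_normal((m, n))
x = np.sqrt(n) * g / np.linalg.norm(g, axis=1, keepdims=True)

# The first coordinate should be approximately N(0, 1).
first = x[:, 0]
print(first.mean(), first.var())   # both should be near 0 and 1 respectively
```

With n large, the empirical mean and variance of the first coordinate settle near 0 and 1, consistent with the stated total-variation closeness.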

Optimal Concentration of Information Content For Log-Concave Densities

open access: yes, 2015
An elementary proof is provided of sharp bounds for the varentropy of random vectors with log-concave densities, as well as for deviations of the information content from its mean.
A. Prékopa   +18 more
core   +1 more source
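For orientation (a hedged illustration, not the paper's proof): the varentropy is Var(-log f(X)), the variance of the information content. For Exp(1) the information content is exactly X, so the varentropy is 1; for N(0,1) it is Var(X²)/4 = 1/2. A quick Monte Carlo check:

```python
import numpy as np

rng = np.random.default_rng(1)
m = 200000

# Exponential(1): f(x) = exp(-x), so the information content -log f(X) = X.
x = rng.exponential(1.0, m)
v_exp = np.var(x)            # Monte Carlo varentropy estimate; true value 1

# Standard normal: -log f(X) = 0.5*log(2*pi) + X**2 / 2.
z = rng.standard_normal(m)
info = 0.5 * np.log(2 * np.pi) + z**2 / 2
v_norm = np.var(info)        # true value Var(X**2)/4 = 1/2

print(v_exp, v_norm)
```

Both densities are log-concave, and the estimates land near the exact values 1 and 1/2.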

An entropic uncertainty principle for positive operator valued measures

open access: yes, 2011
Extending a recent result by Frank and Lieb, we show an entropic uncertainty principle for mixed states in a Hilbert space relatively to pairs of positive operator valued measures that are independent in some sense.
D. Petz   +9 more
core   +1 more source

An alternate measure of the cumulative residual Sharma-Taneja-Mittal entropy

open access: yes, Analele Stiintifice ale Universitatii Ovidius Constanta: Seria Matematica
We define a new alternate measure of the cumulative residual Sharma-Taneja-Mittal entropy. For this measure, upper and lower bounds are given, a consistent test based on the uniform distribution is introduced, and some concrete numerical examples ...
Sfetcu Răzvan-Cornel   +2 more
doaj   +1 more source
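For context (a hedged sketch of the classical quantity this entry generalises, not the Sharma-Taneja-Mittal variant itself): the standard cumulative residual entropy is CRE(X) = -∫ F̄(x) log F̄(x) dx, where F̄ is the survival function. For Uniform(0,1) it has the closed form 1/4, which a midpoint-rule integration reproduces:

```python
import numpy as np

# Survival function of Uniform(0,1): Fbar(x) = 1 - x on [0, 1].
dx = 1e-5
x = (np.arange(100000) + 0.5) * dx      # midpoints of [0, 1]
fbar = 1.0 - x
cre = float(np.sum(-fbar * np.log(fbar)) * dx)

# Closed form: -integral of (1-x)*ln(1-x) over [0, 1] = 1/4.
print(cre)
```

The numerical value agrees with 1/4 to high precision.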

Complexities of information sources. [PDF]

open access: yesJ Appl Stat, 2023
Sayyari Y, Molaei MR, Mehrpooya A.
europepmc   +1 more source

Logical Entropy of Information Sources. [PDF]

open access: yesEntropy (Basel), 2022
Xu P, Sayyari Y, Butt SI.
europepmc   +1 more source
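As a hedged aside on the title's terminology (the paper itself is not summarised here): logical entropy of a distribution p is commonly defined as h(p) = 1 - Σ p_i², the probability that two independent draws from p give different outcomes. A minimal sketch:

```python
import numpy as np

def logical_entropy(p):
    """Logical entropy h(p) = 1 - sum(p_i**2): the chance that two
    independent draws from p land in different outcomes."""
    p = np.asarray(p, dtype=float)
    return float(1.0 - np.sum(p**2))

print(logical_entropy([0.5, 0.5]))   # 0.5
print(logical_entropy([1.0, 0.0]))   # 0.0
```

A fair coin gives 0.5 (two tosses differ half the time), while a deterministic source gives 0.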
