Results 31 to 40 of about 105,427

Quantifying Information via Shannon Entropy in Spatially Structured Optical Beams [PDF]

open access: gold - Research (Wash D C), 2021
Solyanik-Gorgone M   +5 more
europepmc   +3 more sources

Few generalized entropic relations related to Rydberg atoms

open access: yes - Scientific Reports, 2022
We calculate the analytical and numerical values of the position space Shannon entropy, momentum space Shannon entropy, and total Shannon entropy, $S_\rho$, $S_\gamma$, and $S_T$, respectively, of free and trapped Rydberg hydrogen ...
Kirtee Kumar, Vinod Prasad
doaj   +1 more source
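For context (these definitions are not displayed in the indexed snippet, but they are the standard ones the notation refers to): $\rho$ and $\gamma$ denote the position- and momentum-space probability densities, and

```latex
S_\rho = -\int \rho(\mathbf{r}) \ln \rho(\mathbf{r}) \, d\mathbf{r}, \qquad
S_\gamma = -\int \gamma(\mathbf{p}) \ln \gamma(\mathbf{p}) \, d\mathbf{p}, \qquad
S_T = S_\rho + S_\gamma .
```

The total entropy obeys the Bialynicki-Birula-Mycielski uncertainty relation $S_T \ge 3(1 + \ln \pi) \approx 6.43$ in three dimensions.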

Estimating the variance of Shannon entropy [PDF]

open access: yes - Physical Review E, 2021
The statistical analysis of data stemming from dynamical systems, including, but not limited to, time series, routinely relies on the estimation of information theoretical quantities, most notably Shannon entropy. To this purpose, possibly the most widespread tool is provided by the so-called plug-in estimator, whose statistical properties in terms of ...
Ricci, Leonardo   +2 more
openaire   +3 more sources
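For context: the plug-in estimator referenced above substitutes empirical frequencies for the unknown probabilities. A minimal sketch in Python (the function name is mine, not the paper's code):

```python
import numpy as np

def plugin_entropy(samples):
    """Plug-in (maximum-likelihood) estimate of Shannon entropy in nats.

    Replaces the unknown probabilities with empirical frequencies
    p_hat_i = n_i / n and evaluates -sum_i p_hat_i * log(p_hat_i).
    """
    _, counts = np.unique(samples, return_counts=True)
    p_hat = counts / counts.sum()
    return -np.sum(p_hat * np.log(p_hat))

# Example: a fair 4-sided die has entropy ln(4) ~ 1.386 nats;
# for small n the plug-in estimate is biased low.
rng = np.random.default_rng(0)
print(plugin_entropy(rng.integers(0, 4, size=10_000)))
```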

Adaptive estimation of Shannon entropy [PDF]

open access: yes - 2015 IEEE International Symposium on Information Theory (ISIT), 2015
We consider estimating the Shannon entropy of a discrete distribution $P$ from $n$ i.i.d. samples. Recently, Jiao, Venkat, Han, and Weissman, and Wu and Yang constructed approximation theoretic estimators that achieve the minimax $L_2$ rates in estimating entropy.
Han, Yanjun   +2 more
openaire   +2 more sources
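The polynomial-approximation estimators referenced in this abstract are intricate; as a simpler, classical illustration of the bias problem they address, here is the Miller-Madow correction to the plug-in estimator (a substitute technique, not the estimator from the paper; the function name is mine):

```python
import numpy as np

def miller_madow_entropy(samples):
    """Miller-Madow bias-corrected entropy estimate in nats.

    Adds (m - 1) / (2n) to the plug-in estimate, where m is the
    number of distinct observed symbols and n the sample size; this
    corrects the leading O(1/n) downward bias of the plug-in estimator.
    """
    _, counts = np.unique(samples, return_counts=True)
    n = counts.sum()
    p_hat = counts / n
    plugin = -np.sum(p_hat * np.log(p_hat))
    return plugin + (len(counts) - 1) / (2 * n)
```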

On the Characterization of Shannon’s Entropy by Shannon’s Inequality [PDF]

open access: yes - Journal of the Australian Mathematical Society, 1973
1. In [2, 5, 6, 7], among others, several interpretations of the inequality $-\sum_{i=1}^{n} p_i \log p_i \le -\sum_{i=1}^{n} p_i \log q_i$ for all $(q_1, \ldots, q_n)$ such that $q_i > 0$ and $\sum_{i=1}^{n} q_i = 1$ were given, and the following was proved.
Aczél, J., Ostrowski, A. M.
openaire   +2 more sources
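For the reader's convenience, a one-line derivation of Shannon's inequality (standard material, not quoted from the paper; it rests on $\ln x \le x - 1$ for $x > 0$):

```latex
% Shannon's (Gibbs') inequality for probability vectors (p_i), (q_i):
%   -\sum_i p_i \log p_i \le -\sum_i p_i \log q_i .
% Proof sketch, summing over indices with p_i > 0:
\sum_i p_i \ln \frac{q_i}{p_i}
  \le \sum_i p_i \left( \frac{q_i}{p_i} - 1 \right)
  = \sum_i q_i - \sum_i p_i \le 0 .
```

Equality holds exactly when $q_i = p_i$ for all $i$, which is what lets the inequality characterize the entropy functional.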

Shannon Entropy Estimation for Linear Processes [PDF]

open access: yes - Journal of Risk and Financial Management, 2020
In this paper, we estimate the Shannon entropy $S(f) = -E[\log(f(x))]$ of a one-sided linear process with probability density function $f(x)$. We employ the integral estimator $S_n(f)$, which utilizes the standard kernel density estimator $f_n(x)$ of $f(x)$. We show that $S_n(f)$ converges to $S(f)$ almost surely and in $L_2$ under reasonable conditions.
Timothy Fortune, Hailin Sang
openaire   +3 more sources
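A minimal sketch of an estimator of this shape, pairing a Gaussian kernel density estimate with numerical quadrature (the integration window, grid, and names are my choices; the paper's $S_n(f)$ is defined with its own kernel and conditions):

```python
import numpy as np
from scipy.stats import gaussian_kde

def integral_entropy(samples, grid_size=2048):
    """Estimate S(f) = -E[log f(X)] by integrating -f_n log f_n.

    f_n is a Gaussian kernel density estimate; the integral is taken
    numerically over a window slightly wider than the sample range.
    """
    f_n = gaussian_kde(samples)
    pad = 3 * samples.std()
    x = np.linspace(samples.min() - pad, samples.max() + pad, grid_size)
    dens = f_n(x)
    mask = dens > 0  # guard against floating-point underflow in the tails
    return -np.trapz(dens[mask] * np.log(dens[mask]), x[mask])

# Example: the standard normal has entropy 0.5 * ln(2*pi*e) ~ 1.4189 nats.
rng = np.random.default_rng(1)
print(integral_entropy(rng.standard_normal(5_000)))
```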

Shannon Entropy Reinterpreted [PDF]

open access: yes - Reports on Mathematical Physics, 2018
In this paper we remark that Shannon entropy can be expressed as a function of the self-information (i.e. the logarithm) and the inverse of the Lambert $W$ function. That is, Shannon entropy has the trace form $-k \sum_{i} (W^{-1} \circ \ln)(p_{i})$; since $W^{-1}(x) = x e^{x}$, each term reduces to the familiar $p_i \ln p_i$.
openaire   +2 more sources
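The identity is easy to check: because $W(x e^{x}) = x$, the inverse is $W^{-1}(x) = x e^{x}$, so $(W^{-1} \circ \ln)(p_i) = \ln(p_i)\, e^{\ln p_i} = p_i \ln p_i$. A quick numerical verification (a sketch, taking $k = 1$, i.e. nats):

```python
import numpy as np

def w_inverse(x):
    """Inverse of the Lambert W function: W^{-1}(x) = x * exp(x)."""
    return x * np.exp(x)

p = np.array([0.5, 0.25, 0.125, 0.125])

# Trace form from the abstract (k = 1): -sum_i (W^{-1} o ln)(p_i)
trace_form = -np.sum(w_inverse(np.log(p)))

# Usual Shannon entropy in nats
shannon = -np.sum(p * np.log(p))

print(trace_form, shannon)  # both ~ 1.2130
```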
