Results 31 to 40 of about 105,427
Analysis of Meditation vs. Sensory Engaged Brain States Using Shannon Entropy and Pearson's First Skewness Coefficient Extracted from EEG Data. [PDF]
Davis JJJ, Kozma R, Schübeler F.
europepmc +3 more sources
Quantifying Information via Shannon Entropy in Spatially Structured Optical Beams. [PDF]
Solyanik-Gorgone M +5 more
europepmc +3 more sources
Few generalized entropic relations related to Rydberg atoms
We calculate the analytical and numerical values of the position-space Shannon entropy, momentum-space Shannon entropy, and total Shannon entropy, $S_\rho$, $S_\gamma$, and $S_T$, respectively, of free and trapped Rydberg hydrogen ...
Kirtee Kumar, Vinod Prasad
doaj +1 more source
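As a rough illustration of the position-space quantity $S_\rho = -\int \rho(\mathbf{r}) \ln \rho(\mathbf{r})\, d^3r$ mentioned in the snippet above, here is a minimal numerical sketch for the hydrogen 1s ground state in atomic units. This is an illustrative special case, not the paper's Rydberg setup; the grid bounds and resolution are assumptions. The analytic value for this state is $3 + \ln \pi \approx 4.1447$.

```python
import numpy as np

# Minimal sketch: position-space Shannon entropy S_rho = -integral of rho*ln(rho) d^3r
# for the hydrogen 1s state in atomic units (illustrative special case only;
# the paper treats free and trapped Rydberg states more generally).
r = np.linspace(1e-6, 30.0, 200_000)        # radial grid in Bohr radii
dr = r[1] - r[0]
rho = np.exp(-2.0 * r) / np.pi              # |psi_100|^2 = e^(-2r)/pi
integrand = -rho * np.log(rho) * 4.0 * np.pi * r**2
S_rho = np.sum(integrand) * dr              # simple Riemann sum over the grid
print(S_rho, 3.0 + np.log(np.pi))           # both approx 4.1447
```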
Estimating the variance of Shannon entropy [PDF]
The statistical analysis of data stemming from dynamical systems, including, but not limited to, time series, routinely relies on the estimation of information theoretical quantities, most notably Shannon entropy. To this purpose, possibly the most widespread tool is provided by the so-called plug-in estimator, whose statistical properties in terms of ...
Ricci, Leonardo +2 more
openaire +3 more sources
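For context, a minimal sketch of the plug-in estimator this abstract refers to: substitute empirical frequencies into $-\sum_i p_i \log p_i$. The alphabet and sample size below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def plugin_entropy(samples):
    """Plug-in (maximum-likelihood) Shannon entropy estimate, in nats:
    empirical frequencies substituted into -sum(p * log p)."""
    _, counts = np.unique(samples, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(0)
x = rng.integers(0, 8, size=10_000)         # i.i.d. draws from Uniform{0,...,7}
print(plugin_entropy(x), np.log(8))         # estimate vs. true entropy ln 8
```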
Analyzing boron oxide networks through Shannon entropy and Pearson correlation coefficient. [PDF]
Huang R +4 more
europepmc +3 more sources
Adaptive estimation of Shannon entropy [PDF]
We consider estimating the Shannon entropy of a discrete distribution $P$ from $n$ i.i.d. samples. Recently, Jiao, Venkat, Han, and Weissman, and Wu and Yang constructed approximation theoretic estimators that achieve the minimax $L_2$ rates in estimating entropy.
Han, Yanjun +2 more
openaire +2 more sources
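The minimax-rate estimators of Jiao, Venkat, Han, and Weissman and of Wu and Yang rely on polynomial approximation over low-probability symbols and are too involved to sketch here. As a stand-in, the classic Miller-Madow correction below illustrates the downward bias of the plug-in estimate that such estimators address; it is explicitly not the paper's estimator, and the support and sample sizes are illustrative assumptions.

```python
import numpy as np

def miller_madow_entropy(samples):
    """Miller-Madow bias-corrected entropy estimate, in nats.
    NOTE: this is *not* the minimax estimator of Jiao et al. / Wu-Yang
    (those use polynomial approximation on rare symbols); it is a classic
    first-order correction shown only to illustrate the plug-in bias."""
    _, counts = np.unique(samples, return_counts=True)
    n = counts.sum()
    p = counts / n
    plugin = -np.sum(p * np.log(p))
    return plugin + (len(counts) - 1) / (2 * n)  # add (K - 1)/(2n) bias term

rng = np.random.default_rng(1)
x = rng.integers(0, 100, size=500)          # small n relative to support size
print(miller_madow_entropy(x), np.log(100)) # corrected estimate vs. ln 100
```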
On the Characterization of Shannon’s Entropy by Shannon’s Inequality [PDF]
1. In [2, 5, 6, 7], among others, several interpretations of the inequality $-\sum_{i=1}^{n} p_i \log p_i \le -\sum_{i=1}^{n} p_i \log q_i$ for all $(q_1, \ldots, q_n)$ such that $q_i > 0$ and $\sum_{i=1}^{n} q_i = 1$ were given, and the following was proved.
Aczél, J., Ostrowski, A. M.
openaire +2 more sources
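For reference, Shannon's inequality named in the title, with the standard one-line argument via $\ln x \le x - 1$; this is the textbook formulation, and the paper's exact notation may differ.

```latex
% Shannon's (Gibbs') inequality: for probability vectors p and q with q_i > 0,
% the cross term bounds the entropy from above, with equality iff p = q.
\[
  -\sum_{i=1}^{n} p_i \log p_i \;\le\; -\sum_{i=1}^{n} p_i \log q_i ,
  \qquad q_i > 0, \quad \sum_{i=1}^{n} q_i = 1 .
\]
% One-line proof via ln x <= x - 1:
\[
  \sum_{i=1}^{n} p_i \ln \frac{q_i}{p_i}
  \;\le\; \sum_{i=1}^{n} p_i \left( \frac{q_i}{p_i} - 1 \right)
  \;=\; \sum_{i=1}^{n} q_i - \sum_{i=1}^{n} p_i \;=\; 0 .
\]
```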
Shannon Entropy Estimation for Linear Processes [PDF]
In this paper, we estimate the Shannon entropy $S(f) = -\mathbb{E}[\log f(X)]$ of a one-sided linear process with probability density function $f(x)$. We employ the integral estimator $S_n(f)$, which utilizes the standard kernel density estimator $f_n(x)$ of $f(x)$. We show that $S_n(f)$ converges to $S(f)$ almost surely and in $L^2$ under reasonable conditions.
Timothy Fortune, Hailin Sang
openaire +3 more sources
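A minimal sketch of an integral-type estimator in the spirit of $S_n(f) = -\int f_n(x) \log f_n(x)\, dx$, using a Gaussian kernel density estimate. The AR(1) process, grid, cutoff, and default bandwidth below are illustrative assumptions rather than the paper's exact construction.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Sketch of an integral-type entropy estimator -integral of f_n(x)*log(f_n(x)) dx,
# with f_n a Gaussian kernel density estimate of the marginal density.
rng = np.random.default_rng(2)
n = 5_000
x = np.empty(n)
x[0] = rng.normal()
for t in range(1, n):                       # one-sided linear (AR(1)) process
    x[t] = 0.5 * x[t - 1] + rng.normal()

f_n = gaussian_kde(x)                       # standard kernel density estimator
grid = np.linspace(x.min() - 1.0, x.max() + 1.0, 4_000)
vals = f_n(grid)
mask = vals > 1e-12                         # keep the log away from zero tails
S_n = -np.sum(vals[mask] * np.log(vals[mask])) * (grid[1] - grid[0])

# Stationary marginal is N(0, 1/(1 - 0.25)); its exact entropy for comparison:
print(S_n, 0.5 * np.log(2.0 * np.pi * np.e * (1.0 / 0.75)))
```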
Shannon Entropy Reinterpreted [PDF]
In this paper we remark that Shannon entropy can be expressed as a function of the self-information (i.e., the logarithm) and the inverse of the Lambert $W$ function. This means that Shannon entropy has the trace form $-k \sum_{i} W^{-1} \circ \ln(p_{i})$.
openaire +2 more sources
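The identity behind this trace form is elementary to verify: writing $W^{-1}(x) = x e^x$ for the functional inverse of the Lambert $W$ function gives $W^{-1}(\ln p_i) = p_i \ln p_i$, so the sum reduces to the usual $-k \sum_i p_i \ln p_i$. A quick numerical check (the distribution and $k$ are arbitrary choices for illustration):

```python
import numpy as np

# Check of the trace form: with W^{-1}(x) = x * exp(x) (the functional inverse
# of the Lambert W function), W^{-1}(ln p) = p * ln p, so the claimed sum
# recovers the ordinary Shannon entropy exactly.
def w_inv(x):
    return x * np.exp(x)

p = np.array([0.5, 0.25, 0.125, 0.125])
k = 1.0
shannon = -k * np.sum(p * np.log(p))
via_lambert_w = -k * np.sum(w_inv(np.log(p)))
print(shannon, via_lambert_w)               # identical values
```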

