Results 21 to 30 of about 105,087 (310)
Estimating the variance of Shannon entropy [PDF]
The statistical analysis of data stemming from dynamical systems, including, but not limited to, time series, routinely relies on the estimation of information theoretical quantities, most notably Shannon entropy. To this purpose, possibly the most widespread tool is provided by the so-called plug-in estimator, whose statistical properties in terms of ...
Ricci, Leonardo +2 more
openaire +3 more sources
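For reference, the plug-in estimator this abstract refers to is the empirical-frequency substitution: replace each probability by its observed relative frequency and evaluate the entropy formula directly. A minimal sketch (in nats):

```python
import math
from collections import Counter

def plugin_entropy(samples):
    """Plug-in (maximum-likelihood) estimate of Shannon entropy:
    substitute the empirical frequency n_i / n for each p_i."""
    n = len(samples)
    counts = Counter(samples)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

# A fair coin has entropy log 2 ≈ 0.6931 nats; a perfectly balanced
# sample recovers it exactly.
print(plugin_entropy(["H", "T"] * 500))
```

The paper's contribution concerns the statistical properties (e.g. variance) of this estimator, which the sketch does not address.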
Analyzing boron oxide networks through Shannon entropy and Pearson correlation coefficient. [PDF]
Huang R +4 more
europepmc +3 more sources
Shannon entropy: axiomatic characterization and application [PDF]
We have presented a new axiomatic derivation of Shannon entropy for a discrete probability distribution on the basis of the postulates of additivity and concavity of the entropy function. We have then modified Shannon entropy to take account of observational uncertainty. The modified entropy reduces, in the limiting case, to the form of Shannon ...
C. G. Chakrabarti, Indranil Chakrabarty
doaj +3 more sources
Adaptive estimation of Shannon entropy [PDF]
We consider estimating the Shannon entropy of a discrete distribution $P$ from $n$ i.i.d. samples. Recently, Jiao, Venkat, Han, and Weissman, and Wu and Yang constructed approximation-theoretic estimators that achieve the minimax $L_2$ rates in estimating entropy.
Han, Yanjun +2 more
openaire +2 more sources
On Shannon entropy and its applications
A brief and intuitive introduction to Shannon entropy is presented, including some of its properties. The application of this measure is exemplified in two contexts different from that of its genesis: biological diversity and an original study on ...
Paulo Saraiva
doaj +1 more source
On the Characterization of Shannon’s Entropy by Shannon’s Inequality [PDF]
1. In [2,5,6,7] a.o. several interpretations of the inequality … for all … such that … were given, and the following was proved.
Aczél, J., Ostrowski, A. M.
openaire +2 more sources
Shannon Entropy Estimation for Linear Processes [PDF]
In this paper, we estimate the Shannon entropy S(f) = −E[log(f(X))] of a one-sided linear process with probability density function f(x). We employ the integral estimator Sn(f), which utilizes the standard kernel density estimator fn(x) of f(x). We show that Sn(f) converges to S(f) almost surely and in L2 under reasonable conditions.
Timothy Fortune, Hailin Sang
openaire +3 more sources
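The integral estimator described in this abstract can be sketched numerically. The sketch below uses i.i.d. Gaussian data rather than a linear process, a Gaussian kernel with Silverman's rule-of-thumb bandwidth, and a simple Riemann sum on a grid in place of the paper's exact integral, so it illustrates only the general construction:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=2000)            # samples from N(0, 1)

def gaussian_kde(x_eval, data, h):
    """Standard kernel density estimator f_n with a Gaussian kernel."""
    u = (x_eval[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u**2).mean(axis=1) / (h * np.sqrt(2 * np.pi))

h = 1.06 * x.std() * len(x) ** (-1 / 5)   # Silverman's rule of thumb
grid = np.linspace(-5.0, 5.0, 1001)
fn = gaussian_kde(grid, x, h)

# Integral estimator S_n(f) = -∫ f_n(x) log f_n(x) dx, via a Riemann sum.
dx = grid[1] - grid[0]
Sn = -np.sum(fn * np.log(fn)) * dx

# True entropy of N(0, 1) is 0.5 * log(2 * pi * e) ≈ 1.4189 nats.
print(Sn)
```

The kernel smoothing inflates the estimate slightly (the KDE behaves like the true density convolved with the kernel), which is one reason the paper's asymptotic analysis is nontrivial.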
Shannon Entropy Reinterpreted [PDF]
In this paper we remark that Shannon entropy can be expressed as a function of the self-information (i.e. the logarithm) and the inverse of the Lambert $W$ function. It means that we consider that Shannon entropy has the trace form: $-k \sum_{i} W^{-1} \circ \mathsf{ln}(p_{i})$.
openaire +2 more sources
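The identity behind this trace form is easy to check: the inverse of the Lambert $W$ function is $W^{-1}(x) = x e^x$, so $W^{-1}(\ln p_i) = p_i \ln p_i$ and the claimed sum reduces to the usual Shannon entropy. A quick numerical verification:

```python
import math

def w_inv(x):
    """Inverse of the Lambert W function: W^{-1}(x) = x * exp(x)."""
    return x * math.exp(x)

p = [0.5, 0.25, 0.125, 0.125]

# Trace form from the paper (with k = 1): H = -sum_i W^{-1}(ln p_i)
H_trace = -sum(w_inv(math.log(pi)) for pi in p)

# Ordinary Shannon entropy for comparison
H_shannon = -sum(pi * math.log(pi) for pi in p)

print(H_trace, H_shannon)  # both ≈ 1.75 * log 2 ≈ 1.2130 nats
```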
Multi-Level Wavelet Shannon Entropy-Based Method for Single-Sensor Fault Location
In actual application, sensors are prone to failure because of harsh environments, battery drain, and sensor aging. Sensor fault location is an important step for follow-up sensor fault detection.
Qiaoning Yang, Jianlin Wang
doaj +1 more source
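The general idea of a wavelet Shannon entropy feature (not the authors' specific multi-level formulation) can be sketched in pure NumPy with a Haar transform: decompose the signal, normalize the per-level coefficient energies into a distribution, and take its Shannon entropy. A noisy or faulty sensor spreads energy across levels and so tends to score higher:

```python
import numpy as np

def haar_level(a):
    """One level of the Haar wavelet transform: approximation and detail."""
    a = a[: len(a) // 2 * 2]
    approx = (a[0::2] + a[1::2]) / np.sqrt(2)
    detail = (a[0::2] - a[1::2]) / np.sqrt(2)
    return approx, detail

def wavelet_shannon_entropy(signal, levels=3):
    """Shannon entropy (nats) of the normalized energy distribution of
    wavelet coefficients across decomposition levels."""
    energies = []
    a = np.asarray(signal, dtype=float)
    for _ in range(levels):
        a, d = haar_level(a)
        energies.append(np.sum(d ** 2))
    energies.append(np.sum(a ** 2))      # final approximation energy
    p = np.array(energies) / np.sum(energies)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 256)
clean = np.sin(2 * np.pi * 5 * t)
noisy = clean + 0.5 * rng.normal(size=t.size)
print(wavelet_shannon_entropy(clean), wavelet_shannon_entropy(noisy))
```

The signal names, bandwidth of the noise, and the three-level Haar choice are illustrative assumptions, not details taken from the paper.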

