
First Digits’ Shannon Entropy [PDF]

open access: yesEntropy, 2022
Applied to the letters of an alphabet, entropy is the average number of binary digits required to transmit one character. Checking tables of statistical data, one finds that, in the first position of the numbers, the digits 1 to 9 occur ...
Welf Alfred Kreiner
doaj   +4 more sources
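The non-uniform occurrence of leading digits that the abstract refers to is described by Benford's law. A minimal sketch (assuming Benford-distributed first digits, which is the setting the paper examines, not its actual computation) of the resulting first-digit Shannon entropy:

```python
import math

# Benford's law: P(d) = log10(1 + 1/d) for leading digits d = 1..9
benford = [math.log10(1 + 1 / d) for d in range(1, 10)]

# Shannon entropy in bits: the average number of binary digits per symbol
H = -sum(p * math.log2(p) for p in benford)
print(f"{H:.4f} bits")  # below log2(9) ≈ 3.17 bits for uniform digits
```

The gap between this value and log2(9) reflects how strongly low digits dominate the first position.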

Shannon entropy and particle decays [PDF]

open access: yesNuclear Physics B, 2018
We deploy Shannon's information entropy to the distribution of branching fractions in a particle decay. This serves to quantify how important a given new reported decay channel is, from the point of view of the information that it adds to the already ...
Pedro Carrasco Millán   +4 more
doaj   +8 more sources
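The idea can be illustrated with made-up branching fractions (not data from the paper): the Shannon entropy of the branching-fraction distribution rises when a new channel is reported, and the increase quantifies the information the channel adds.

```python
import math

def branching_entropy(fractions):
    """Shannon entropy (bits) of a decay's branching fractions."""
    return -sum(f * math.log2(f) for f in fractions if f > 0)

# Hypothetical particle: two known channels, then a small new one is reported
before = [0.6, 0.4]
after = [0.6, 0.39, 0.01]
gain = branching_entropy(after) - branching_entropy(before)
print(f"information gain: {gain:.4f} bits")
```

A rare channel contributes little entropy, matching the intuition that it adds little to what was already known.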

Perceptual Complexity as Normalized Shannon Entropy [PDF]

open access: yesEntropy
Complexity is one of the most important variables in how the brain performs decision making based on esthetic values. Multiple definitions of perceptual complexity have been proposed, one of the most fruitful being normalized Shannon entropy.
Norberto M. Grzywacz
doaj   +4 more sources
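A minimal sketch of normalized Shannon entropy as a complexity score. The definition H / log2(N) is standard; the mapping to perceptual stimuli is the paper's contribution and is not reproduced here.

```python
import math

def normalized_entropy(probs):
    """Shannon entropy divided by its maximum log2(N):
    0 = fully ordered, 1 = maximally complex (uniform)."""
    n = len(probs)
    if n < 2:
        return 0.0
    h = -sum(p * math.log2(p) for p in probs if p > 0)
    return h / math.log2(n)

print(normalized_entropy([0.25] * 4))      # uniform: maximal complexity
print(normalized_entropy([1.0, 0, 0, 0]))  # degenerate: no complexity
```

Dividing by log2(N) makes distributions over different numbers of categories comparable on a common [0, 1] scale.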

Infinite Shannon entropy [PDF]

open access: yesJournal of Statistical Mechanics: Theory and Experiment, 2013
Even if a probability distribution is properly normalizable, its associated Shannon (or von Neumann) entropy can easily be infinite. We carefully analyze conditions under which this phenomenon can occur.
Valentina Baccetti, Matt Visser
core   +2 more sources
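A standard example of the phenomenon (a textbook case, not necessarily one of the paper's): p_n ∝ 1/(n log² n) for n ≥ 2 is normalizable, yet its Shannon entropy diverges. Truncated numerically, the entropy keeps growing as the cutoff increases:

```python
import math

def truncated_entropy(N):
    """Shannon entropy (bits) of p_n ∝ 1/(n log^2 n), truncated to 2 <= n < N."""
    w = [1.0 / (n * math.log(n) ** 2) for n in range(2, N)]
    Z = sum(w)  # the full series converges, so the distribution is normalizable
    return -sum((x / Z) * math.log2(x / Z) for x in w)

# The second value exceeds the first; in the limit the entropy is infinite
print(truncated_entropy(10**4), truncated_entropy(10**6))
```

No finite computation proves divergence, of course; the sketch only shows the slow, unbounded growth the analysis predicts.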

Statistical estimation of conditional Shannon entropy [PDF]

open access: greenESAIM: Probability and Statistics, 2018
New estimates of the conditional Shannon entropy are introduced in the framework of a model describing a discrete response variable depending on a vector of d factors having a density w.r.t. the Lebesgue measure in ℝ^d. Namely, the mixed-pair model (X, Y) is considered, where X and Y take values in ℝ^d and an arbitrary finite set, respectively.
Alexander Bulinski, Alexey Kozhevin
openalex   +5 more sources
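The target quantity can be illustrated in the fully discrete case. The paper's mixed-pair setting, with X having a density on ℝ^d, requires the more delicate estimators it introduces; this sketch only shows the plug-in identity H(Y|X) = H(X,Y) − H(X):

```python
import math
from collections import Counter

def conditional_entropy(pairs):
    """Plug-in estimate of H(Y|X) = H(X,Y) - H(X), in bits,
    from a sample of discrete (x, y) pairs."""
    n = len(pairs)

    def entropy(counts):
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    return entropy(Counter(pairs)) - entropy(Counter(x for x, _ in pairs))

sample = [(0, 'a'), (0, 'a'), (1, 'a'), (1, 'b')]
print(conditional_entropy(sample))  # y is determined when x = 0, uncertain when x = 1
```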

On Convergence Properties of Shannon Entropy [PDF]

open access: yesProblems of Information Transmission, 2007
Convergence properties of Shannon entropy are studied. In the differential setting, it is shown that weak convergence of probability measures, or convergence in distribution, is not enough for convergence of the associated differential entropies.
A. Antos   +16 more
core   +2 more sources

Application of Positional Entropy to Fast Shannon Entropy Estimation for Samples of Digital Signals [PDF]

open access: goldEntropy, 2020
This paper introduces a new method of estimating Shannon entropy. The proposed method can be successfully used for large data samples and enables fast computations to rank the data samples according to their Shannon entropy.
Marcin Cholewa, Bartłomiej Płaczek
doaj   +2 more sources
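For reference, the baseline such methods are compared against is the naive plug-in estimator over the sample's empirical distribution (the paper's positional-entropy method itself is not reproduced here):

```python
import math
from collections import Counter

def plugin_entropy(sample):
    """Plug-in (maximum-likelihood) Shannon entropy estimate, in bits."""
    counts = Counter(sample)
    n = len(sample)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A small digital-signal sample with three distinct levels
print(plugin_entropy([0, 1, 0, 1, 2, 2, 2, 2]))
```

The plug-in estimator requires a full pass over the sample plus a histogram; fast approximations aim to rank many samples without paying that cost for each.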

Shannon Entropy Loss in Mixed-Radix Conversions [PDF]

open access: yesEntropy, 2021
This paper models a translation for base-2 pseudorandom number generators (PRNGs) to mixed-radix uses such as card shuffling. In particular, we explore a shuffler algorithm that relies on a sequence of uniformly distributed random inputs from a mixed ...
Amy Vennos, Alan Michaels
doaj   +2 more sources
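The entropy loss can be seen in the smallest case (a hypothetical example, not the paper's shuffler): reducing 3 uniform bits modulo 6 skews the output, so the resulting die roll carries less than the log2(6) bits a uniform roll would:

```python
import math
from collections import Counter

# Map 3 uniform bits (values 0..7) to a die roll (0..5) via mod 6:
# residues 0 and 1 occur with probability 2/8, residues 2..5 with 1/8.
counts = Counter(x % 6 for x in range(8))
probs = [c / 8 for c in counts.values()]
H = -sum(p * math.log2(p) for p in probs)
print(H, math.log2(6))  # the mod-6 output falls short of the uniform maximum
```

The shortfall is the Shannon entropy lost in the base-2 to mixed-radix conversion; rejection sampling avoids it at the cost of discarding inputs.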

Renyi extrapolation of Shannon entropy [PDF]

open access: greenOpen Systems & Information Dynamics, 2003
Relations between Shannon entropy and Rényi entropies of integer order are discussed. For any N-point discrete probability distribution for which the Rényi entropies of order two and three are known, we provide a lower and an upper bound for the Shannon entropy. The average of both bounds provide an explicit extrapolation for this quantity.
Karol Życzkowski
openalex   +5 more sources
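The quantities involved can be sketched as follows, together with the standard monotonicity H₃ ≤ H₂ ≤ H₁ that makes such extrapolation plausible (the paper's explicit bounds are not reproduced here):

```python
import math

def renyi(probs, alpha):
    """Rényi entropy of order alpha, in bits; alpha = 1 is the Shannon limit."""
    if alpha == 1:
        return -sum(p * math.log2(p) for p in probs if p > 0)
    return math.log2(sum(p ** alpha for p in probs)) / (1 - alpha)

p = [0.5, 0.25, 0.125, 0.125]
h1, h2, h3 = renyi(p, 1), renyi(p, 2), renyi(p, 3)
print(h1, h2, h3)  # Rényi entropies are non-increasing in the order
```

Since H₂ and H₃ bracket the Shannon entropy from below, knowing both constrains H₁ more tightly than either alone.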

Spatial distribution of the Shannon entropy for mass spectrometry imaging [PDF]

open access: yesPLoS ONE, 2023
Mass spectrometry imaging (MSI) allows us to visualize the spatial distribution of molecular components in a sample. A large amount of mass spectrometry data comprehensively provides molecular distributions.
Lili Xu   +13 more
doaj   +3 more sources
