First Digits’ Shannon Entropy [PDF]
Applied to the letters of an alphabet, entropy is the average number of binary digits required to transmit one character. Checking tables of statistical data, one finds that, in the first position of a number, the digits 1 to 9 occur ...
Welf Alfred Kreiner
doaj +4 more sources
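As a hedged illustration of the idea in this entry (not the paper's own calculation), the sketch below computes the Shannon entropy, in bits, of a first-digit distribution. The Benford-like logarithmic frequencies and the function names are assumptions introduced here for illustration.

```python
import math

def shannon_entropy_bits(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Assumed example: Benford-like first-digit frequencies P(d) = log10(1 + 1/d).
benford = [math.log10(1 + 1 / d) for d in range(1, 10)]

# Compare with the uniform case, where each of the 9 first digits is equally likely.
uniform = [1 / 9] * 9

print(f"Benford-like first digits: {shannon_entropy_bits(benford):.3f} bits")
print(f"Uniform first digits:      {shannon_entropy_bits(uniform):.3f} bits (= log2 9)")
```

The gap between the two values shows how much fewer bits, on average, are needed to transmit a first digit when the digits are not uniformly distributed.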
Shannon entropy and particle decays [PDF]
We apply Shannon's information entropy to the distribution of branching fractions in a particle decay. This quantifies how important a given newly reported decay channel is, in terms of the information that it adds to the already ...
Pedro Carrasco Millán +4 more
doaj +8 more sources
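A minimal sketch of the general idea, assuming a hypothetical set of branching fractions: the entropy of the distribution changes when a new channel is added, which is one way to quantify the information that channel contributes. The numbers below are made up for illustration and are not taken from the paper.

```python
import math

def entropy_bits(fractions):
    """Shannon entropy in bits of a set of branching fractions summing to 1."""
    return -sum(f * math.log2(f) for f in fractions if f > 0)

# Hypothetical decay with three known channels (fractions sum to 1).
known = [0.6, 0.3, 0.1]

# Hypothetical newly reported channel taking 5% of the total width;
# the existing fractions are rescaled so everything still sums to 1.
new_fraction = 0.05
updated = [f * (1 - new_fraction) for f in known] + [new_fraction]

print(f"Entropy before new channel: {entropy_bits(known):.3f} bits")
print(f"Entropy after new channel:  {entropy_bits(updated):.3f} bits")
```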
Perceptual Complexity as Normalized Shannon Entropy [PDF]
Complexity is one of the most important variables in how the brain makes decisions based on esthetic values. Multiple definitions of perceptual complexity have been proposed, with one of the most fruitful being Normalized Shannon Entropy.
Norberto M. Grzywacz
doaj +4 more sources
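In the generic sense, normalized Shannon entropy is the entropy divided by its maximum, log2 of the number of possible symbols. The sketch below applies that definition to an assumed grayscale-intensity input; whether the paper uses exactly this normalization or this kind of input is not confirmed by the excerpt.

```python
import math
from collections import Counter

def normalized_shannon_entropy(values, num_bins=256):
    """Shannon entropy of the value histogram, divided by its maximum log2(num_bins)."""
    counts = Counter(values)
    total = sum(counts.values())
    h = -sum((c / total) * math.log2(c / total) for c in counts.values())
    return h / math.log2(num_bins)

# Assumed toy input: 8-bit pixel intensities of a small image patch.
patch = [0, 0, 0, 128, 128, 255, 255, 255, 64, 64, 64, 64]
print(f"Normalized entropy: {normalized_shannon_entropy(patch):.3f} (0 = trivial, 1 = maximally varied)")
```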
Spatial distribution of the Shannon entropy for mass spectrometry imaging [PDF]
Mass spectrometry imaging (MSI) allows us to visualize the spatial distribution of molecular components in a sample. The large volume of mass spectrometry data provides a comprehensive picture of molecular distributions.
Lili Xu +13 more
doaj +3 more sources
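One plausible reading of this entry, not confirmed by the excerpt, is that each pixel's mass spectrum is normalized to a probability distribution over m/z bins and its Shannon entropy is then mapped over the image. A minimal sketch under that assumption, with toy data:

```python
import math

def spectrum_entropy(intensities):
    """Shannon entropy (bits) of a spectrum normalized to a probability distribution."""
    total = sum(intensities)
    return -sum((i / total) * math.log2(i / total) for i in intensities if i > 0)

# Assumed toy MSI data: a 2 x 2 grid of pixels, each with a 4-bin intensity spectrum.
pixels = {
    (0, 0): [10.0, 0.0, 0.0, 0.0],  # one dominant peak -> low entropy
    (0, 1): [5.0, 5.0, 0.0, 0.0],
    (1, 0): [4.0, 3.0, 2.0, 1.0],
    (1, 1): [2.5, 2.5, 2.5, 2.5],   # flat spectrum -> maximal entropy
}
entropy_map = {xy: spectrum_entropy(s) for xy, s in pixels.items()}
for xy, h in sorted(entropy_map.items()):
    print(xy, f"{h:.3f} bits")
```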
Shannon Entropy Loss in Mixed-Radix Conversions [PDF]
This paper models the translation of output from base-2 pseudorandom number generators (PRNGs) to mixed-radix uses such as card shuffling. In particular, we explore a shuffler algorithm that relies on a sequence of uniformly distributed random inputs from a mixed ...
Amy Vennos, Alan Michaels
doaj +2 more sources
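To illustrate the kind of entropy accounting discussed here (not the paper's specific shuffler), the sketch below compares the entropy supplied by uniform base-2 inputs with the entropy required to pick a uniformly random permutation of a deck, log2(n!); the per-card bit budget is an assumption made for this example.

```python
import math

def bits_required_for_permutations(n):
    """Entropy of a uniform distribution over all n! orderings: log2(n!)."""
    return math.log2(math.factorial(n))

deck_size = 52
needed = bits_required_for_permutations(deck_size)

# Assumed example: a shuffler consuming one 32-bit uniform PRNG word per card.
supplied = 32 * deck_size

print(f"Bits needed for a uniform 52-card shuffle: {needed:.1f}")
print(f"Bits supplied by 52 x 32-bit draws:        {supplied}")
print(f"Surplus (or deficit):                      {supplied - needed:.1f} bits")
```

If the supplied bits fall short of log2(n!), some permutations must be unreachable or non-uniform, which is one way entropy loss in a mixed-radix conversion shows up.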
Shannon Entropy in LS-Coupled Configuration Space for Ni-like Isoelectronic Sequence [PDF]
The Shannon entropy in an LS-coupled configuration space has been calculated through a transformation from that in a jj-coupled configuration space for a Ni-like isoelectronic sequence.
Jian-Jie Wan, Jie Gu, Jiao Li, Na Guo
doaj +2 more sources
From Chaos to Ordering: New Studies in the Shannon Entropy of 2D Patterns [PDF]
Properties of the Voronoi tessellations arising from random 2D distributions of points are reported. We applied an iterative procedure to the Voronoi diagrams generated by a set of points randomly placed on the plane. The procedure involved dividing the edges ...
Irina Legchenkova +5 more
doaj +2 more sources
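In this line of work, the Shannon entropy of a Voronoi tessellation is commonly defined over the fractions of polygons with a given number of edges, S = -sum(P_n ln P_n). The sketch below computes that quantity from an assumed, hypothetical list of cell edge counts and does not reproduce the paper's iterative procedure.

```python
import math
from collections import Counter

def voronoi_shannon_entropy(edge_counts):
    """S = -sum(P_n * ln(P_n)), where P_n is the fraction of cells with n edges."""
    counts = Counter(edge_counts)
    total = sum(counts.values())
    return -sum((c / total) * math.log(c / total) for c in counts.values())

# Assumed example: numbers of edges of the cells in a small tessellation.
cells = [6, 6, 5, 7, 6, 5, 6, 7, 4, 6, 6, 5]
print(f"Voronoi Shannon entropy: {voronoi_shannon_entropy(cells):.3f} nats")
```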
Application of Positional Entropy to Fast Shannon Entropy Estimation for Samples of Digital Signals [PDF]
This paper introduces a new method of estimating Shannon entropy. The proposed method can be used successfully for large data samples and enables fast computation for ranking data samples by their Shannon entropy.
Marcin Cholewa, Bartłomiej Płaczek
doaj +2 more sources
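For context, a common baseline (not the paper's positional-entropy method) is the plug-in histogram estimator of Shannon entropy for a sample of a digital signal, sketched below with assumed 8-bit signal samples.

```python
import math
from collections import Counter

def plugin_entropy_estimate(samples):
    """Plug-in (maximum-likelihood) Shannon entropy estimate in bits,
    using relative frequencies of the observed sample values."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Assumed example: two short signal samples to be ranked by entropy.
nearly_constant = [100] * 90 + [101] * 10
noisy = [i % 17 for i in range(100)]

ranked = sorted([nearly_constant, noisy], key=plugin_entropy_estimate, reverse=True)
print([round(plugin_entropy_estimate(s), 3) for s in ranked])
```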
Harnessing Shannon entropy-based descriptors in machine learning models to enhance the prediction accuracy of molecular properties [PDF]
Accurate prediction of molecular properties is essential in the screening and development of drug molecules and other functional materials. Traditionally, property-specific molecular descriptors are used in machine learning models.
Rajarshi Guha, Darrell Velegol
doaj +2 more sources
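As one hedged example of what an entropy-based molecular descriptor can look like (offered as an assumption about the flavor of descriptor, not a claim about the paper's exact definition), the sketch below computes the Shannon entropy of the character distribution of a molecule's SMILES string.

```python
import math
from collections import Counter

def smiles_shannon_entropy(smiles):
    """Shannon entropy (bits per character) of a SMILES string's character distribution."""
    counts = Counter(smiles)
    n = len(smiles)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Assumed examples: SMILES strings for aspirin and ethanol.
for name, smi in [("aspirin", "CC(=O)OC1=CC=CC=C1C(=O)O"), ("ethanol", "CCO")]:
    print(f"{name}: {smiles_shannon_entropy(smi):.3f} bits per character")
```

A scalar like this can then be fed to a machine learning model alongside, or in place of, property-specific descriptors.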
Maximal Shannon entropy in the vicinity of an exceptional point in an open microcavity [PDF]
The Shannon entropy as a measure of information content is investigated around an exceptional point (EP) in an open elliptical microcavity, a non-Hermitian system. The Shannon entropy is maximized near the EP in the parameter space for two interacting ...
Kyu-Won Park +3 more
doaj +2 more sources

