Results 71 to 80 of about 343,367 (193)
Properties of Entropy-Based Topological Measures of Fullerenes
A fullerene is a cubic three-connected graph whose faces are entirely composed of pentagons and hexagons. Entropy applied to graphs is one of the significant approaches to measuring the complexity of relational structures.
Modjtaba Ghorbani +2 more
On directed information theory and Granger causality graphs
Directed information theory deals with communication channels with feedback. When applied to networks, a natural extension based on causal conditioning is needed.
A Kaiser +44 more
Is the Voronoi Entropy a True Entropy? Comments on “Entropy, Shannon’s Measure of Information and Boltzmann’s H-Theorem”, Entropy 2017, 19, 48 [PDF]
The goal of this comment note is to express our considerations about the recent paper by A. Ben Naim (Entropy 2017, 19, 48). We strongly support the distinction, suggested in that paper, between the Shannon measure of information and the thermodynamic entropy.
Edward Bormashenko +2 more
Comparing Security Notions of Secret Sharing Schemes
Different security notions for secret sharing schemes have been proposed based on different information measures. Entropies, such as Shannon entropy and min-entropy, are frequently used to define security notions for secret sharing schemes.
Songsong Dai, Donghui Guo
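The two information measures named in the abstract above differ in a way that matters for security: Shannon entropy averages surprise over the whole distribution, while min-entropy is determined by the single most likely outcome (the best guess of an adversary). A minimal sketch of both, not taken from the indexed paper:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(X) = -sum_i p_i * log2(p_i), in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def min_entropy(p):
    """Min-entropy H_min(X) = -log2(max_i p_i), in bits."""
    return -math.log2(max(p))

# A biased distribution: min-entropy is never larger than Shannon entropy,
# so security notions based on it are more conservative.
p = [0.5, 0.25, 0.125, 0.125]
print(shannon_entropy(p))  # 1.75 bits
print(min_entropy(p))      # 1.0 bits
```

For the uniform distribution the two coincide; the gap between them on skewed distributions is one reason the security notions they induce can differ.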
Network information and connected correlations
Entropy and information provide natural measures of correlation among elements in a network. We construct here the information theoretic analog of connected correlation functions: irreducible $N$-point correlation is measured by a decrease in entropy ...
A. Soofi +18 more
On Relations Between the Relative Entropy and $\chi^2$-Divergence, Generalizations and Applications
The relative entropy and chi-squared divergence are fundamental divergence measures in information theory and statistics. This paper is focused on a study of integral relations between the two divergences, the implications of these relations, their ...
Nishiyama, Tomohiro, Sason, Igal
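The two divergences compared in the entry above are both easy to compute for discrete distributions, and one classical integral-type relation between them is the bound $D(P\|Q) \le \log(1 + \chi^2(P\|Q))$. A small illustrative sketch (not code from the indexed paper):

```python
import math

def kl_divergence(p, q):
    """Relative entropy D(P||Q) = sum_i p_i * log(p_i / q_i), in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def chi2_divergence(p, q):
    """Chi-squared divergence chi^2(P||Q) = sum_i (p_i - q_i)^2 / q_i."""
    return sum((pi - qi) ** 2 / qi for pi, qi in zip(p, q))

# Check the classical bound D(P||Q) <= log(1 + chi^2(P||Q)) on an example.
p = [0.4, 0.6]
q = [0.5, 0.5]
d = kl_divergence(p, q)
c = chi2_divergence(p, q)
assert d <= math.log(1 + c)
```

The bound follows from Jensen's inequality applied to the concave logarithm; the paper's integral relations refine this kind of comparison.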
The mutual information between stimulus and spike-train response is commonly used to monitor neural coding efficiency, but neuronal computation broadly conceived requires more refined and targeted information measures of input-output joint processes.
Crutchfield, James P. +2 more
First Principles Calculation of the Entropy of Liquid Aluminum
The information required to specify a liquid structure equals, in suitable units, its thermodynamic entropy. Hence, an expansion of the entropy in terms of multi-particle correlation functions can be interpreted as a hierarchy of information measures ...
Michael Widom, Michael Gao
Entropy, Shannon’s Measure of Information and Boltzmann’s H-Theorem [PDF]
We start with a clear distinction between Shannon's Measure of Information (SMI) and the Thermodynamic Entropy. The first is defined on any probability distribution, and is therefore a very general concept. Entropy, on the other hand, is defined on a very special set of distributions. Next we show that the Shannon Measure of Information (SMI) provides ...
The distribution of information for sEMG signals in the rectal cancer treatment process
The electrical activity of the external anal sphincter can be registered with surface electromyography. These signals are known to be highly complex and nonlinear.
Machura, Lukasz +3 more

