Results 1 to 10 of about 343,367
A Characterization of Entropy in Terms of Information Loss
There are numerous characterizations of Shannon entropy and Tsallis entropy as measures of information obeying certain properties. Using work by Faddeev and Furuichi, we derive a very simple characterization.
Tom Leinster, John C. Baez, Tobias Fritz
doaj +1 more source
Generalizing Information to the Evolution of Rational Belief
Information theory provides a mathematical foundation to measure uncertainty in belief. Belief is represented by a probability distribution that captures our understanding of an outcome’s plausibility.
Jed A. Duersch, Thomas A. Catanach
doaj +1 more source
Shannon entropy and complexity measures for Bohr Hamiltonian with triaxial nuclei
The Shannon entropy, disequilibrium, and complexity measures for n=0,1 in position and momentum space are investigated within the framework of Bohr Hamiltonian for triaxial nuclei with Davidson potential. The effect of the angular momentum quantum number ...
P.O. Amadi +8 more
doaj +1 more source
We introduce notions of information/entropy and information loss associated to exponentiable motivic measures. We show that they satisfy appropriate analogs to the Khinchin-type properties that characterize information loss in the context of measures on ...
Marcolli, Matilde
core +1 more source
Relations Between Conditional Shannon Entropy and Expectation of $\ell_{\alpha}$-Norm
The paper examines relationships between the conditional Shannon entropy and the expectation of $\ell_{\alpha}$-norm for joint probability distributions.
Iwata, Ken-ichi, Sakai, Yuta
core +1 more source
JIDT: An information-theoretic toolkit for studying the dynamics of complex systems [PDF]
Complex systems are increasingly being viewed as distributed information processing systems, particularly in the domains of computational neuroscience, bioinformatics and Artificial Life. This trend has resulted in a strong uptake in the use of (Shannon) ...
Lizier, Joseph T.
core +2 more sources
A modified belief entropy in Dempster-Shafer framework. [PDF]
How to quantify uncertain information in the framework of Dempster-Shafer evidence theory is still an open issue. Quite a few uncertainty measures have been proposed in the Dempster-Shafer framework; however, the existing studies mainly focus on the mass ...
Deyun Zhou, Yongchuan Tang, Wen Jiang
doaj +1 more source
Quantifying Information Leakage in Finite Order Deterministic Programs [PDF]
Information flow analysis is a powerful technique for reasoning about the sensitive information exposed by a program during its execution. While past work has proposed information theoretic metrics (e.g., Shannon entropy, min-entropy, guessing entropy ...
Srivatsa, Mudhakar, Zhu, Ji
core
On Convergence Properties of Shannon Entropy
Convergence properties of Shannon entropy are studied. In the differential setting, it is shown that weak convergence of probability measures, or convergence in distribution, is not enough for convergence of the associated differential entropies.
A. Antos +16 more
core +1 more source
Does Amount of Information Support Aesthetic Values?
Obtaining information from the world is important for survival. The brain, therefore, has special mechanisms to extract as much information as possible from sensory stimuli.
Norberto M. Grzywacz +3 more
doaj +1 more source
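Several of the results above (the Leinster-Baez-Fritz characterization in particular) concern Shannon entropy and the information lost under a deterministic map, where the loss of a measure-preserving map is the entropy difference H(p) - H(f_* p). A minimal sketch of these two quantities, assuming discrete distributions over `range(len(p))` (the function names and the pairwise merge map here are illustrative, not taken from any listed paper):

```python
from math import log2

def shannon_entropy(p):
    """Shannon entropy H(p) in bits, skipping zero-probability outcomes."""
    return -sum(x * log2(x) for x in p if x > 0)

def pushforward(p, f, n_out):
    """Distribution of f(X) when X ~ p over range(len(p))."""
    q = [0.0] * n_out
    for i, x in enumerate(p):
        q[f(i)] += x
    return q

# Information loss of a deterministic map f: H(p) - H(f_* p).
p = [0.25, 0.25, 0.25, 0.25]             # uniform on 4 outcomes: H(p) = 2 bits
q = pushforward(p, lambda i: i // 2, 2)  # merge outcomes pairwise: H(q) = 1 bit
loss = shannon_entropy(p) - shannon_entropy(q)  # 1 bit lost
```

The characterization result cited in the first entry singles out such a loss functional (up to a scalar multiple) by continuity and additivity-type axioms, with Tsallis entropy arising from a deformed variant.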