
A Characterization of Entropy in Terms of Information Loss

open access: yes, Entropy, 2011
There are numerous characterizations of Shannon entropy and Tsallis entropy as measures of information obeying certain properties. Using work by Faddeev and Furuichi, we derive a very simple characterization.
Tom Leinster, John C. Baez, Tobias Fritz
doaj   +1 more source

Generalizing Information to the Evolution of Rational Belief

open access: yes, Entropy, 2020
Information theory provides a mathematical foundation to measure uncertainty in belief. Belief is represented by a probability distribution that captures our understanding of an outcome’s plausibility.
Jed A. Duersch, Thomas A. Catanach
doaj   +1 more source

Shannon entropy and complexity measures for Bohr Hamiltonian with triaxial nuclei

open access: yes, Results in Physics, 2022
The Shannon entropy, disequilibrium, and complexity measures for n=0,1 in position and momentum space are investigated within the framework of Bohr Hamiltonian for triaxial nuclei with Davidson potential. The effect of the angular momentum quantum number ...
P.O. Amadi   +8 more
doaj   +1 more source

Motivic Information [PDF]

open access: yes, 2017
We introduce notions of information/entropy and information loss associated to exponentiable motivic measures. We show that they satisfy appropriate analogs to the Khinchin-type properties that characterize information loss in the context of measures on ...
Marcolli, Matilde
core   +1 more source

Relations Between Conditional Shannon Entropy and Expectation of $\ell_{\alpha}$-Norm

open access: yes, 2016
The paper examines relationships between the conditional Shannon entropy and the expectation of $\ell_{\alpha}$-norm for joint probability distributions.
Iwata, Ken-ichi, Sakai, Yuta
core   +1 more source

JIDT: An information-theoretic toolkit for studying the dynamics of complex systems [PDF]

open access: yes, 2014
Complex systems are increasingly being viewed as distributed information processing systems, particularly in the domains of computational neuroscience, bioinformatics and Artificial Life. This trend has resulted in a strong uptake in the use of (Shannon) ...
Lizier, Joseph T.
core   +2 more sources

A modified belief entropy in Dempster-Shafer framework. [PDF]

open access: yes, PLoS ONE, 2017
How to quantify the uncertain information in the framework of Dempster-Shafer evidence theory is still an open issue. Quite a few uncertainty measures have been proposed in the Dempster-Shafer framework; however, the existing studies mainly focus on the mass ...
Deyun Zhou, Yongchuan Tang, Wen Jiang
doaj   +1 more source

Quantifying Information Leakage in Finite Order Deterministic Programs [PDF]

open access: yes, 2010
Information flow analysis is a powerful technique for reasoning about the sensitive information exposed by a program during its execution. While past work has proposed information theoretic metrics (e.g., Shannon entropy, min-entropy, guessing entropy ...
Srivatsa, Mudhakar, Zhu, Ji
core  

On Convergence Properties of Shannon Entropy

open access: yes, 2007
Convergence properties of Shannon Entropy are studied. In the differential setting, it is shown that weak convergence of probability measures, or convergence in distribution, is not enough for convergence of the associated differential entropies.
A. Antos   +16 more
core   +1 more source

Does Amount of Information Support Aesthetic Values?

open access: yes, Frontiers in Neuroscience, 2022
Obtaining information from the world is important for survival. The brain, therefore, has special mechanisms to extract as much information as possible from sensory stimuli.
Norberto M. Grzywacz   +3 more
doaj   +1 more source
