Rényi Entropies of Dynamical Systems: A Generalization Approach [PDF]
Entropy measures have received considerable attention in quantifying the structural complexity of real-world systems and are also used as measures of the information obtained from a realization of the experiments under consideration. In the present study, new notions ...
Zahra Eslami Giski +2 more
doaj +1 more source
Properties of Risk Measures of Generalized Entropy in Portfolio Selection
This paper systematically investigates the properties of six kinds of entropy-based risk measures: Information Entropy and Cumulative Residual Entropy in the probability space, Fuzzy Entropy, Credibility Entropy and Sine Entropy in the fuzzy space, and ...
Rongxi Zhou +3 more
doaj +1 more source
Universality Classes and Information-Theoretic Measures of Complexity via Group Entropies [PDF]
We introduce a class of information measures based on group entropies, allowing us to describe the information-theoretical properties of complex systems. These entropic measures are nonadditive, and are mathematically deduced from a series of natural axioms.
Piergiulio Tempesta +1 more
openaire +6 more sources
The Information Loss of a Stochastic Map
We provide a stochastic extension of the Baez–Fritz–Leinster characterization of the Shannon information loss associated with a measure-preserving function. This recovers the conditional entropy and a closely related information-theoretic measure that we ...
James Fullwood, Arthur J. Parzygnat
doaj +1 more source
Informational Measure of Symmetry vs. Voronoi Entropy and Continuous Measure of Entropy of the Penrose Tiling. Part II of the “Voronoi Entropy vs. Continuous Measure of Symmetry of the Penrose Tiling” [PDF]
The notion of the informational measure of symmetry is introduced as H_sym(G) = −∑_{i=1}^{k} P(G_i) ln P(G_i), where P(G_i) is the probability of appearance of the symmetry operation G_i within the given 2D pattern. H_sym(G) is interpreted as an averaged uncertainty in the presence of symmetry elements from the group G in the given pattern. The informational ...
Edward Bormashenko +4 more
openaire +2 more sources
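The informational measure of symmetry defined above is a Shannon-type entropy over the probabilities of symmetry operations. A minimal sketch of the computation (the function name and example distributions are illustrative, not from the paper):

```python
import math

def informational_symmetry_measure(probs):
    """H_sym(G) = -sum_i P(G_i) ln P(G_i) over the probabilities of
    symmetry operations G_i observed in a 2D pattern.
    Zero-probability operations contribute nothing (p ln p -> 0 as p -> 0)."""
    if not math.isclose(sum(probs), 1.0, abs_tol=1e-9):
        raise ValueError("probabilities must sum to 1")
    return -sum(p * math.log(p) for p in probs if p > 0)

# Four equally likely symmetry operations: maximal uncertainty, H_sym = ln 4
h_uniform = informational_symmetry_measure([0.25, 0.25, 0.25, 0.25])

# A skewed distribution of symmetry operations carries less uncertainty
h_skewed = informational_symmetry_measure([0.7, 0.1, 0.1, 0.1])
```

As with ordinary Shannon entropy, H_sym is maximized when all symmetry operations are equally probable and shrinks as the distribution concentrates on a few operations.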
Adaptive Multiscale Weighted Permutation Entropy for Rolling Bearing Fault Diagnosis [PDF]
Bearing vibration signals contain non-linear and non-stationary features due to ...
Huo, Zhiqiang +3 more
core +2 more sources
Entropy Measures of Probabilistic Linguistic Term Sets
Probabilistic linguistic term sets (PLTSs) are a powerful tool for handling hesitant linguistic situations in which each provided linguistic term carries a probability.
Hongbin Liu, Le Jiang, Zeshui Xu
doaj +1 more source
Alternative Entropy Measures and Generalized Khinchin–Shannon Inequalities
The Khinchin–Shannon generalized inequalities for entropy measures in Information Theory are a paradigm which can be used to test the Synergy of the distributions of probabilities of occurrence in physical systems.
Rubem P. Mondaini +1 more
doaj +1 more source
A step beyond Tsallis and Renyi entropies [PDF]
Tsallis and Rényi entropy measures are two possible different generalizations of the Boltzmann–Gibbs entropy (or Shannon's information) but are not generalizations of each other.
Abe +15 more
core +1 more source
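The two generalizations named in this entry have standard closed forms: the Tsallis entropy S_q = (1 − ∑ p_i^q)/(q − 1) and the Rényi entropy H_q = ln(∑ p_i^q)/(1 − q), both of which recover the Shannon entropy in the limit q → 1. A minimal sketch (function names are illustrative):

```python
import math

def tsallis_entropy(probs, q):
    """Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1), natural-log convention.
    At q = 1 it reduces to the Shannon entropy."""
    if q == 1:
        return -sum(p * math.log(p) for p in probs if p > 0)
    return (1 - sum(p ** q for p in probs)) / (q - 1)

def renyi_entropy(probs, q):
    """Renyi entropy H_q = ln(sum p_i^q) / (1 - q), natural-log convention.
    At q = 1 it also reduces to the Shannon entropy."""
    if q == 1:
        return -sum(p * math.log(p) for p in probs if p > 0)
    return math.log(sum(p ** q for p in probs)) / (1 - q)

p = [0.5, 0.25, 0.25]
shannon = -sum(x * math.log(x) for x in p)
s2 = tsallis_entropy(p, 2)   # 1 - (0.25 + 0.0625 + 0.0625) = 0.625
h2 = renyi_entropy(p, 2)     # -ln(0.375)
```

Although neither is a generalization of the other, for fixed q they are monotone functions of one another: H_q = ln(1 + (1 − q) S_q)/(1 − q), so they induce the same ordering of distributions at each q.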
Tsallis Mutual Information for Document Classification
Mutual information is one of the most widely used measures for evaluating image similarity. In this paper, we investigate the application of three different Tsallis-based generalizations of mutual information to analyze the similarity between scanned ...
Màrius Vila +3 more
doaj +1 more source

