Adaptive Multiscale Weighted Permutation Entropy for Rolling Bearing Fault Diagnosis [PDF]
© 2020 The Author(s). This work is licensed under a Creative Commons Attribution 4.0 License. For more information, see https://creativecommons.org/licenses/by/4.0/.
Bearing vibration signals contain non-linear and non-stationary features due to ...
Huo, Zhiqiang +3 more
core +2 more sources
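Permutation entropy is the ordinal-pattern building block underlying the adaptive multiscale weighted variant studied in this paper; a minimal sketch of the plain (unweighted, single-scale) version, omitting the weighting and coarse-graining steps:

```python
import math
from collections import Counter

def permutation_entropy(x, m=3, tau=1):
    """Normalized permutation entropy of a 1-D sequence.

    Each length-m window (with delay tau) is mapped to its ordinal
    pattern (the argsort of its values); the result is the Shannon
    entropy of the pattern distribution, normalized by ln(m!).
    """
    patterns = Counter()
    for i in range(len(x) - (m - 1) * tau):
        window = x[i : i + m * tau : tau]
        pattern = tuple(sorted(range(m), key=window.__getitem__))
        patterns[pattern] += 1
    total = sum(patterns.values())
    h = -sum((c / total) * math.log(c / total) for c in patterns.values())
    return h / math.log(math.factorial(m))  # normalized to [0, 1]
```

A strictly monotone series produces a single ordinal pattern and hence entropy 0, while long i.i.d. noise approaches 1.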
Rényi Entropies of Dynamical Systems: A Generalization Approach [PDF]
Entropy measures have received considerable attention in quantifying the structural complexity of real-world systems and are also used as measures of information obtained from a realization of the considered experiments. In the present study, new notions ...
Zahra Eslami Giski +2 more
doaj +1 more source
Universality Classes and Information-Theoretic Measures of Complexity via Group Entropies [PDF]
We introduce a class of information measures based on group entropies, allowing us to describe the information-theoretical properties of complex systems. These entropic measures are nonadditive, and are mathematically deduced from a series of natural axioms.
Piergiulio Tempesta +1 more
openaire +6 more sources
The Information Loss of a Stochastic Map
We provide a stochastic extension of the Baez–Fritz–Leinster characterization of the Shannon information loss associated with a measure-preserving function. This recovers the conditional entropy and a closely related information-theoretic measure that we ...
James Fullwood, Arthur J. Parzygnat
doaj +1 more source
Informational Measure of Symmetry vs. Voronoi Entropy and Continuous Measure of Entropy of the Penrose Tiling. Part II of the “Voronoi Entropy vs. Continuous Measure of Symmetry of the Penrose Tiling” [PDF]
The notion of the informational measure of symmetry is introduced according to: Hsym(G) = −∑_{i=1}^{k} P(G_i) ln P(G_i), where P(G_i) is the probability of appearance of the symmetry operation G_i within the given 2D pattern. Hsym(G) is interpreted as an averaged uncertainty in the presence of symmetry elements from the group G in the given pattern. The informational ...
Edward Bormashenko +4 more
openaire +2 more sources
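The quantity Hsym(G) in the snippet above is a Shannon-type entropy over symmetry-operation probabilities; a minimal sketch:

```python
import math

def informational_symmetry_measure(probs):
    """Hsym(G) = -sum_i P(G_i) ln P(G_i), where probs holds the
    probabilities P(G_i) of each symmetry operation in the pattern.
    Zero-probability operations contribute nothing to the sum."""
    return -sum(p * math.log(p) for p in probs if p > 0)
```

As with any Shannon-type entropy, k equiprobable symmetry operations give the maximum value ln k, and a pattern dominated by a single operation gives a value near 0.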
The information shared among observables representing processes of interest is traditionally evaluated in terms of macroscale measures characterizing aggregate properties of the underlying processes and their interactions.
Perdigão, Rui A. P.
core +4 more sources
A step beyond Tsallis and Renyi entropies [PDF]
Tsallis and Rényi entropy measures are two possible different generalizations of the Boltzmann-Gibbs entropy (or Shannon's information) but are not generalizations of each other.
Abe +15 more
core +1 more source
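For a distribution {p_i} and parameter q ≠ 1, the two generalizations compared above are the Tsallis entropy S_q = (1 − Σ p_i^q)/(q − 1) and the Rényi entropy H_q = ln(Σ p_i^q)/(1 − q); both recover the Shannon entropy in the limit q → 1. A minimal sketch:

```python
import math

def tsallis_entropy(probs, q):
    """Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1), q != 1."""
    s = sum(p ** q for p in probs if p > 0)
    return (1.0 - s) / (q - 1.0)

def renyi_entropy(probs, q):
    """Renyi entropy H_q = ln(sum p_i^q) / (1 - q), q != 1."""
    s = sum(p ** q for p in probs if p > 0)
    return math.log(s) / (1.0 - q)
```

One concrete way the two differ: the Rényi entropy of a uniform distribution over k outcomes is ln k for every q, while the corresponding Tsallis value depends on q.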
Aspects of Holographic Entanglement at Finite Temperature and Chemical Potential [PDF]
We investigate the behavior of entanglement entropy at finite temperature and chemical potential for strongly coupled large-N gauge theories in d dimensions (d ≥ 3) that are dual to anti-de Sitter-Reissner-Nordström geometries in (d+1) dimensions ...
Kundu, Sandipan, Pedraza, Juan F.
core +3 more sources
Measures of entropy and complexity in altered states of consciousness [PDF]
Quantification of complexity in neurophysiological signals has been studied using different methods, especially those from information or dynamical system theory.
D. Mateos +3 more
semanticscholar +1 more source
Tsallis Mutual Information for Document Classification
Mutual information is one of the most widely used measures for evaluating image similarity. In this paper, we investigate the application of three different Tsallis-based generalizations of mutual information to analyze the similarity between scanned ...
Màrius Vila +3 more
doaj +1 more source
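The paper above compares three Tsallis-based generalizations of mutual information; as an illustration of the general idea (an assumed variant for this sketch, not necessarily one of the three studied), here is the simplest additive combination I_q(X;Y) = S_q(X) + S_q(Y) - S_q(X,Y), with S_q the Tsallis entropy:

```python
def tsallis_mutual_information(joint, q):
    """I_q(X;Y) = S_q(X) + S_q(Y) - S_q(X,Y) for q != 1.
    `joint` maps (x, y) outcome pairs to joint probabilities."""
    px, py = {}, {}
    for (x, y), p in joint.items():  # accumulate the two marginals
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p

    def s_q(probs):  # Tsallis entropy of a probability collection
        return (1.0 - sum(p ** q for p in probs if p > 0)) / (q - 1.0)

    return s_q(px.values()) + s_q(py.values()) - s_q(joint.values())
```

Because Tsallis entropy is non-additive, this quantity is not exactly zero for independent variables unless q = 1, which is one reason several competing generalizations exist.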

