Results 41 to 50 of about 361,322
Alternative Entropy Measures and Generalized Khinchin–Shannon Inequalities
The Khinchin–Shannon generalized inequalities for entropy measures in Information Theory are a paradigm which can be used to test the synergy of the probability distributions of occurrence in physical systems.
Rubem P. Mondaini +1 more
doaj +1 more source
The necessity and policy of the eco-economy stimulate enterprises to attain sustainability by executing supply chain management. Generally, the evaluation process of sustainable recycling partner (SRP) selection is treated as a multi-criteria decision-making ...
A. Mishra, Pratibha Rani
semanticscholar +1 more source
Information Distances versus Entropy Metric
Information distance has become an important tool in a wide variety of applications. Various types of information distance have been proposed over the years.
Bo Hu, Lvqing Bi, Songsong Dai
doaj +1 more source
On using Shannon entropy measure for formulating new weighted exponential distribution
In this article, a new class of weighted exponential distribution called the Entropy-Based Exponential weighted distribution (EBEWD) is proposed. The main idea of the new lifetime distribution is to use the Shannon entropy measure as a weight function ...
Amjad D. Al-Nasser +2 more
doaj +1 more source
JIDT: An information-theoretic toolkit for studying the dynamics of complex systems [PDF]
Complex systems are increasingly being viewed as distributed information processing systems, particularly in the domains of computational neuroscience, bioinformatics and Artificial Life. This trend has resulted in a strong uptake in the use of (Shannon) ...
Lizier, Joseph T.
core +2 more sources
Entropy, mutual information, and systematic measures of structured spiking neural networks [PDF]
The aim of this paper is to investigate various information-theoretic measures, including entropy, mutual information, and some systematic measures based on mutual information, for a class of structured spiking neuronal networks. In order to analyze and compute these information-theoretic measures for large networks, we coarse-grained the data by ...
Wenjie Li, Yao Li
openaire +3 more sources
Information and Divergence Measures
The present Special Issue of Entropy, entitled Information and Divergence Measures, covers various aspects and applications in the general area of Information and Divergence Measures [...]
Alex Karagrigoriou, Andreas Makrides
doaj +1 more source
Entropy Measures for Probabilistic Hesitant Fuzzy Information
The probabilistic hesitant fuzzy set (PHFS), which is remarkable in describing the practical condition, has attracted great attention and been applied to many areas.
Zhan Su +4 more
semanticscholar +1 more source
New Information Measures for the Generalized Normal Distribution
We introduce a three-parameter generalized normal distribution, which belongs to the Kotz type distribution family, to study the generalized entropy type measures of information. For this generalized normal, the Kullback-Leibler information is evaluated, ...
Christos P. Kitsos, Thomas L. Toulias
doaj +1 more source
Smooth Renyi Entropies and the Quantum Information Spectrum
Many of the traditional results in information theory, such as the channel coding theorem or the source coding theorem, are restricted to scenarios where the underlying resources are independent and identically distributed (i.i.d.) over a large number of ...
Datta, Nilanjana, Renner, Renato
core +1 more source