Results 11 to 20 of about 361,322
Properties of Risk Measures of Generalized Entropy in Portfolio Selection
This paper systematically investigates the properties of six kinds of entropy-based risk measures: Information Entropy and Cumulative Residual Entropy in the probability space, Fuzzy Entropy, Credibility Entropy and Sine Entropy in the fuzzy space, and ...
Rongxi Zhou +3 more
doaj +2 more sources
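Of the six measures this entry names, the probability-space Information Entropy is the simplest to illustrate: applied to portfolio weights, higher entropy indicates a more evenly spread (diversified) allocation. A minimal sketch (illustrative only; the function name and interpretation are ours, not the paper's):

```python
import math

def portfolio_entropy(weights):
    """Shannon entropy of portfolio weights in nats.

    Higher entropy means the allocation is closer to uniform,
    i.e. more diversified; zero-weight assets contribute nothing.
    """
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    return -sum(w * math.log(w) for w in weights if w > 0)

concentrated = portfolio_entropy([0.97, 0.01, 0.01, 0.01])
uniform = portfolio_entropy([0.25] * 4)  # maximal for 4 assets: log(4)
```

The uniform four-asset portfolio attains the maximum value log(4), while a concentrated portfolio scores much lower.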
Empirical Estimation of Information Measures: A Literature Guide
We give a brief survey of the literature on the empirical estimation of entropy, differential entropy, relative entropy, mutual information and related information measures.
Sergio Verdú
doaj +2 more sources
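The simplest estimator covered by such surveys is the plug-in (maximum-likelihood) estimator: substitute empirical frequencies into the entropy formula. It is negatively biased for finite samples, which corrections such as Miller-Madow address. A minimal sketch (our own illustration, not code from the survey):

```python
from collections import Counter
import math

def plugin_entropy(samples):
    """Plug-in estimate of Shannon entropy in nats:
    H_hat = -sum over symbols of (c/n) * log(c/n),
    where c is the symbol count and n the sample size."""
    n = len(samples)
    return -sum((c / n) * math.log(c / n) for c in Counter(samples).values())

def miller_madow(samples):
    """Miller-Madow bias correction: add (K - 1) / (2n),
    where K is the number of distinct symbols observed."""
    n, k = len(samples), len(set(samples))
    return plugin_entropy(samples) + (k - 1) / (2 * n)
```

For a fair two-symbol source, the plug-in estimate on a balanced sample is exactly log(2); the correction adds 1/(2n).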
Entropy Measures of Probabilistic Linguistic Term Sets
Probabilistic linguistic term sets (PLTSs) are a powerful tool for handling hesitant linguistic situations in which each provided linguistic term carries a probability.
Hongbin Liu, Le Jiang, Zeshui Xu
doaj +2 more sources
Entropy, Shannon’s Measure of Information and Boltzmann’s H-Theorem [PDF]
We start with a clear distinction between Shannon’s Measure of Information (SMI) and the thermodynamic entropy. The first is defined on any probability distribution and is therefore a very general concept; entropy, on the other hand, is defined on a very special set of distributions. Next we show that the Shannon Measure of Information (SMI) provides ...
A. Ben-Naim
openaire +3 more sources
On Dynamical Measures of Quantum Information [PDF]
In this work, we use the theory of quantum states over time to define joint entropy for timelike-separated quantum systems. For timelike-separated systems that admit a dual description as being spacelike-separated, our notion of entropy recovers the ...
James Fullwood, Arthur J. Parzygnat
doaj +2 more sources
Quantifying Aleatoric and Epistemic Uncertainty in Machine Learning: Are Conditional Entropy and Mutual Information Appropriate Measures? [PDF]
The quantification of aleatoric and epistemic uncertainty in terms of conditional entropy and mutual information, respectively, has recently become quite common in machine learning.
Eyke Hüllermeier
semanticscholar +1 more source
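The decomposition this entry questions is the standard entropy-based one for an ensemble of predictive distributions: total predictive entropy splits into expected conditional entropy (the aleatoric part) plus the mutual information between prediction and model (the epistemic part). A minimal sketch of that decomposition (our own illustration):

```python
import math

def entropy(p):
    """Shannon entropy in nats of a discrete distribution."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def uncertainty_decomposition(member_preds):
    """Split total predictive entropy of an ensemble into:
    aleatoric  = mean entropy of the member predictions, and
    epistemic  = mutual information I(y; model) = total - aleatoric."""
    m = len(member_preds)
    n_classes = len(member_preds[0])
    mean = [sum(p[c] for p in member_preds) / m for c in range(n_classes)]
    total = entropy(mean)
    aleatoric = sum(entropy(p) for p in member_preds) / m
    return total, aleatoric, total - aleatoric

# Two confident but disagreeing members: all uncertainty is epistemic.
total, aleatoric, epistemic = uncertainty_decomposition([[1.0, 0.0], [0.0, 1.0]])
```

When the members disagree maximally, the aleatoric term vanishes and the epistemic term equals log(2); when they agree exactly, the epistemic term is zero.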
Identifying influential nodes in complex networks has attracted the attention of many researchers in recent years. However, due to the high time complexity, methods based on global attributes have become unsuitable for large-scale complex networks.
Jinhua Zhang +3 more
semanticscholar +1 more source
Some Information Measures Properties of the GOS-Concomitants from the FGM Family
In this paper we recall, extend and compute some information measures for the concomitants of the generalized order statistics (GOS) from the Farlie–Gumbel–Morgenstern (FGM) family.
Florentina Suter +2 more
doaj +1 more source
Information Entropy as a Reliable Measure of Nanoparticle Dispersity [PDF]
Nanoparticle size impacts properties vital to applications ranging from drug delivery to diagnostics and catalysis. As such, evaluating nanoparticle size dispersity is of fundamental importance. Conventional approaches, such as standard deviation, usually require the nanoparticle population to follow a known distribution and are ill-equipped to deal ...
Niamh Mac Fhionnlaoich, Stefan Guldin
openaire +4 more sources
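The idea of using information entropy for dispersity can be sketched in a few lines: bin the measured diameters and take the Shannon entropy of the histogram, with lower values indicating a narrower, more monodisperse population. This is an illustrative simplification under our own binning assumptions, not the paper's exact procedure:

```python
import math
from collections import Counter

def size_dispersity_entropy(diameters_nm, bin_width_nm=1.0):
    """Shannon entropy (nats) of the binned particle-size histogram.

    Makes no assumption about the underlying size distribution;
    lower entropy = more monodisperse sample."""
    n = len(diameters_nm)
    bins = Counter(int(d // bin_width_nm) for d in diameters_nm)
    return -sum((c / n) * math.log(c / n) for c in bins.values())
```

A sample whose diameters all fall in one bin has entropy zero; a sample spread evenly over k bins has entropy log(k).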
Recent developments in quantitative graph theory: information inequalities for networks. [PDF]
In this article, we tackle a challenging problem in quantitative graph theory. We establish relations between graph entropy measures representing the structural information content of networks.
Matthias Dehmer, Lavanya Sivakumar
doaj +1 more source
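One of the simplest graph entropy measures of the kind studied in this line of work is the Shannon entropy of a network's degree distribution. A minimal sketch (an illustrative choice of measure; the article covers a broader family):

```python
import math
from collections import Counter

def degree_distribution_entropy(edges, n_nodes):
    """Shannon entropy (nats) of the degree distribution of an
    undirected graph given as a list of (u, v) edges.

    Degree-regular graphs score zero; heterogeneous degree
    sequences score higher."""
    deg = Counter()
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    dist = Counter(deg[i] for i in range(n_nodes))
    return -sum((c / n_nodes) * math.log(c / n_nodes) for c in dist.values())

# A 4-cycle is 2-regular, so its degree distribution carries no information.
cycle = degree_distribution_entropy([(0, 1), (1, 2), (2, 3), (3, 0)], 4)
# A star has one hub and three leaves, hence nonzero entropy.
star = degree_distribution_entropy([(0, 1), (0, 2), (0, 3)], 4)
```

Information inequalities of the kind the article establishes relate such measures across different structural representations of the same network.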

