Results 31 to 40 of about 528,328

Relative entropy, Haar measures and relativistic canonical velocity distributions [PDF]

open access: yes, 2007
The thermodynamic maximum principle for the Boltzmann-Gibbs-Shannon (BGS) entropy is reconsidered by combining elements from group and measure theory. Our analysis starts by noting that the BGS entropy is a special case of relative entropy.
Banavar, J., Maritan, A.   +17 more
core   +2 more sources

EDAS method for multiple attribute group decision making with probabilistic dual hesitant fuzzy information and its application to suppliers selection

open access: yes, Technological and Economic Development of Economy, 2023
The probabilistic dual hesitant fuzzy set (PDHFS), a generalization of the hesitant fuzzy set (HFS) and the dual HFS (DHFS), is a powerful and important tool for describing uncertain information; it not only reflects the hesitant attitude of decision-makers ...
Baoquan Ning   +3 more
doaj   +1 more source

Quantum Information and Entropy [PDF]

open access: yes, 2006
Thermodynamic entropy is not an entirely satisfactory measure of information of a quantum state. This entropy for an unknown pure state is zero, although repeated measurements on copies of such a pure state do communicate information. In view of this, we ...
A. C. Elitzur   +5 more
core   +3 more sources

Measurement Invariance, Entropy, and Probability [PDF]

open access: yes, Entropy, 2010
We show that the natural scaling of measurement for a particular problem defines the most likely probability distribution of observations taken from that measurement scale. Our approach extends the method of maximum entropy to use measurement scale as a type of information constraint.
Frank, Steven A., Smith, D. Eric
openaire   +6 more sources

Entropy and the variational principle for actions of sofic groups [PDF]

open access: yes, 2010
Recently Lewis Bowen introduced a notion of entropy for measure-preserving actions of a countable sofic group on a standard probability space admitting a generating partition with finite entropy.
A.N. Kolmogorov   +15 more
core   +3 more sources

A Dual Measure of Uncertainty: The Deng Extropy [PDF]

open access: yes, 2020
The extropy has recently been introduced as the dual concept of entropy. Moreover, in the context of the Dempster–Shafer evidence theory, Deng studied a new measure of discrimination, named the Deng entropy.
Buono, Francesco, Longobardi, Maria
core   +2 more sources

Entropy measures vs. algorithmic information [PDF]

open access: yes, 2010 IEEE International Symposium on Information Theory, 2010
Algorithmic entropy and Shannon entropy are two conceptually different information measures, as the former is based on the size of programs and the latter on probability distributions. However, it is known that, for any recursive probability distribution, the expected value of algorithmic entropy equals its Shannon entropy, up to a constant that depends ...
Teixeira, Andreia   +3 more
openaire   +2 more sources

New Entropy-Based Similarity Measure between Interval-Valued Intuitionistic Fuzzy Sets

open access: yes, Axioms, 2019
In this paper, we propose a new approach to constructing similarity measures using the entropy measure for Interval-Valued Intuitionistic Fuzzy Sets. In addition, we provide several illustrative examples to demonstrate the practicality and effectiveness ...
Saida S. Mohamed   +2 more
doaj   +1 more source

Quantum measurement and entropy production [PDF]

open access: yes, Physics Letters A, 2001
5 pages, 2 ...
Grigolini, P., Pala, M., Palatella, L.
openaire   +3 more sources

Approximate entropy of network parameters [PDF]

open access: yes, 2011
We study the notion of approximate entropy within the framework of network theory. Approximate entropy is an uncertainty measure originally proposed in the context of dynamical systems and time series.
Lacasa, Lucas   +3 more
core   +3 more sources