Results 41 to 50 of about 7,498,456
Entropy measures vs. algorithmic information [PDF]
Algorithmic entropy and Shannon entropy are two conceptually different information measures: the former is based on the size of programs, the latter on probability distributions. However, it is known that, for any recursive probability distribution, the expected value of algorithmic entropy equals its Shannon entropy, up to a constant that depends ...
Teixeira, Andreia +3 more
openaire +2 more sources
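The Shannon side of the relationship quoted in this abstract can be illustrated directly; a minimal sketch (the distributions are made-up examples, not from the paper):

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum p_i * log2(p_i), in bits.

    Terms with p_i = 0 are skipped, following the convention 0*log(0) = 0.
    """
    return -sum(q * math.log2(q) for q in p if q > 0)

# A fair coin carries exactly one bit of information.
print(shannon_entropy([0.5, 0.5]))        # 1.0
# A uniform distribution over 4 outcomes carries two bits.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
```

The theorem cited in the abstract says that the expected algorithmic (Kolmogorov) entropy under a recursive distribution matches this quantity up to an additive constant; the constant depends on the distribution's description, not on the outcomes.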
Quantum measurement and entropy production [PDF]
5 pages, 2 ...
Grigolini P, Pala M, Palatella L
openaire +3 more sources
A New Development of Entropy and Similarity Measures in Temporal Complex Neutrosophic Environments for Tourist Destination Selection [PDF]
Decision-making is a routine part of human life, and various techniques have been devised to tackle decision-making problems in practical situations.
Florentin Smarandache +3 more
doaj +1 more source
The linguistic Pythagorean fuzzy set (LPFS) is an important tool for modeling uncertain and imprecise information. In this paper, a novel TOPSIS (Technique for Order of Preference by Similarity to Ideal Solution) method is proposed for LPFSs ...
Mingwei Lin, Chao Huang, Zeshui Xu
semanticscholar +1 more source
Measures of maximal relative entropy [PDF]
Given an irreducible subshift of finite type X, a subshift Y, a factor map π: X → Y, and an ergodic invariant measure ν on Y, there can exist more than one ergodic measure on X which projects to ν and has maximal entropy among all measures in the fiber, but there is an explicit bound on the number of such maximal-entropy preimages.
Petersen, K +2 more
openaire +3 more sources
ENTROPY OF DISCRETE FUZZY MEASURES [PDF]
The concept of entropy of a discrete fuzzy measure has been recently introduced in two different ways. A first definition was proposed by Marichal in the aggregation framework, and a second one by Yager in the framework of uncertain variables. We present a comparative study between these two proposals and point out their properties.
Marichal, Jean-Luc, Roubens, Marc
openaire +4 more sources
Jensen–Inaccuracy Information Measure
The purpose of the paper is to introduce the Jensen–inaccuracy measure and examine its properties. Furthermore, some results on the connections between the inaccuracy and Jensen–inaccuracy measures and some other well-known information measures are ...
Omid Kharazmi +3 more
doaj +1 more source
IN-cross Entropy Based MAGDM Strategy under Interval Neutrosophic Set Environment [PDF]
The cross entropy measure is one of the best ways to quantify the divergence of a variable from a prior one. We define a new cross entropy measure in the interval neutrosophic set (INS) environment, which we call the IN-cross entropy measure and ...
Shyamal Dalapati +4 more
doaj +1 more source
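The classical cross entropy that this abstract generalizes can be sketched for ordinary probability distributions; a minimal illustration of the baseline notion, not the paper's IN-cross entropy measure:

```python
import math

def cross_entropy(p, q):
    """Cross entropy H(p, q) = -sum p_i * log2(q_i), in bits.

    Since H(p, q) = H(p) + KL(p || q), it is minimized over q exactly
    when q matches p, which is why it serves as a divergence measure.
    """
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]
print(cross_entropy(p, p))           # 1.0 -- reduces to H(p) when q = p
print(cross_entropy(p, [0.9, 0.1]))  # larger, since q diverges from p
```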
Sequential measurements and entropy [PDF]
Abstract We sketch applications of the so-called J-equation to quantum information theory, concerning fundamental properties of the von Neumann entropy. The J-equation has recently been proposed as a sort of progenitor of the various versions of the Jarzynski equation.
Schmidt, Heinz-Jürgen, Gemmer, Jochen
openaire +2 more sources
On the Measure Entropy of Additive Cellular Automata f∞
We show that for an additive one-dimensional cellular automaton f∞ on the space of all doubly infinite sequences with values in a finite set S = {0, 1, 2, ..., r-1}, determined by an additive automaton rule [equation] (mod r), and an f∞-invariant ...
Hasan Akın
doaj +1 more source

