Results 41 to 50 of about 528,328
Measures of maximal relative entropy [PDF]
Given an irreducible subshift of finite type X, a subshift Y, a factor map π: X → Y, and an ergodic invariant measure ν on Y, there can exist more than one ergodic measure on X that projects to ν and has maximal entropy among all measures in the fiber, but there is an explicit bound on the number of such maximal-entropy preimages.
Petersen, K +2 more
openaire +3 more sources
A New Development of Entropy and Similarity Measures in Temporal Complex Neutrosophic Environments for Tourist Destination Selection [PDF]
Making decisions is a common part of human life, and various techniques have been devised to tackle decision-making problems in practical situations.
Florentin Smarandache +3 more
doaj +1 more source
Information measure for financial time series: quantifying short-term market heterogeneity [PDF]
A well-interpretable measure of information has been recently proposed based on a partition obtained by intersecting a random sequence with its moving average.
Carbone, Anna, Ponta, Linda
core +2 more sources
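The measure sketched in this abstract partitions a series at its crossings with a moving average and quantifies the resulting segment (cluster) lengths. A minimal Python sketch under that reading, with the Shannon entropy of the empirical cluster-length distribution as the information measure; function names and the window parameter `n` are illustrative, not taken from the paper:

```python
import math

def moving_average(x, n):
    """Simple moving average with window n (length len(x) - n + 1)."""
    return [sum(x[i:i + n]) / n for i in range(len(x) - n + 1)]

def cluster_lengths(x, n):
    """Lengths of consecutive runs where the series stays on one side
    of its moving average (the 'clusters' of the partition)."""
    ma = moving_average(x, n)
    signs = [1 if x[i + n - 1] >= ma[i] else -1 for i in range(len(ma))]
    lengths, run = [], 1
    for a, b in zip(signs, signs[1:]):
        if a == b:
            run += 1
        else:
            lengths.append(run)
            run = 1
    lengths.append(run)
    return lengths

def cluster_entropy(x, n):
    """Shannon entropy of the empirical cluster-length distribution."""
    lengths = cluster_lengths(x, n)
    total = len(lengths)
    freq = {}
    for ell in lengths:
        freq[ell] = freq.get(ell, 0) + 1
    return -sum((c / total) * math.log(c / total) for c in freq.values())
```

A monotone series never re-crosses its moving average, so it forms a single cluster and the entropy is zero; a heterogeneous series produces a spread of cluster lengths and a positive entropy.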
ENTROPY OF DISCRETE FUZZY MEASURES [PDF]
The concept of entropy of a discrete fuzzy measure has been recently introduced in two different ways. A first definition was proposed by Marichal in the aggregation framework, and a second one by Yager in the framework of uncertain variables. We present a comparative study between these two proposals and point out their properties.
Marichal, Jean-Luc, Roubens, Marc
openaire +4 more sources
Jensen–Inaccuracy Information Measure
The purpose of the paper is to introduce the Jensen–inaccuracy measure and examine its properties. Furthermore, some results on the connections between the inaccuracy and Jensen–inaccuracy measures and some other well-known information measures are ...
Omid Kharazmi +3 more
doaj +1 more source
Zero Krengel Entropy does not kill Poisson Entropy [PDF]
We prove that the notions of Krengel entropy and Poisson entropy for infinite-measure-preserving transformations do not always coincide: We construct a conservative infinite-measure-preserving transformation with zero Krengel entropy (the induced ...
De La Rue, Thierry, Janvresse, Élise
core +4 more sources
IN-cross Entropy Based MAGDM Strategy under Interval Neutrosophic Set Environment [PDF]
A cross-entropy measure is one of the best ways to quantify how far a variable diverges from a prior one. We define a new cross-entropy measure in the interval neutrosophic set (INS) environment, which we call the IN-cross entropy measure, and ...
Shyamal Dalapati +4 more
doaj +1 more source
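The IN-cross entropy described above builds on the classical cross-entropy divergence between discrete distributions. For orientation, a minimal sketch of the classical version only (the interval-neutrosophic extension defined in the paper is not shown):

```python
import math

def cross_entropy(p, q):
    """Classical cross entropy H(p, q) = -sum_i p_i * log q_i (in nats)."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

def kl_divergence(p, q):
    """KL divergence D(p || q) = H(p, q) - H(p, p); zero iff p equals q."""
    return cross_entropy(p, q) - cross_entropy(p, p)
```

The divergence is zero exactly when the variable matches the prior, which is the property the abstract's "divergence from the prior" phrasing refers to.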
Sequential measurements and entropy [PDF]
Abstract We sketch applications of the so-called J-equation to quantum information theory concerning fundamental properties of the von Neumann entropy. The J-equation has recently been proposed as a sort of progenitor of the various versions of the Jarzynski equation.
Schmidt, Heinz-Jürgen, Gemmer, Jochen
openaire +2 more sources
An Entropy Measure of Non-Stationary Processes
Shannon’s source entropy formula is not appropriate to measure the uncertainty of non-stationary processes. In this paper, we propose a new entropy measure for non-stationary processes, which is greater than or equal to Shannon’s source entropy.
Ling Feng Liu +3 more
doaj +1 more source
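Shannon's source entropy, which this abstract takes as the stationary baseline that the proposed measure dominates, can be sketched in a few lines (illustrative only; the non-stationary measure itself is not reproduced here):

```python
import math

def shannon_entropy(probs):
    """Shannon source entropy H = -sum_i p_i * log2(p_i), in bits.

    Terms with p_i == 0 contribute nothing, by the usual convention
    0 * log 0 = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)
```

For a fair coin the entropy is 1 bit, and for a deterministic source it is 0; the paper's claim is that its non-stationary measure is always at least this value.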
In statistics, the correlation coefficient measures how strong the linear relationship between two variables is. Sometimes, however, the data collected concern everyday-life problems whose values are uncertain.
Latifa Khairunnisa +2 more
doaj +1 more source
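The linear-relationship strength mentioned in this abstract is the classical Pearson correlation coefficient; a minimal sketch of that classical quantity (the uncertain-data extension the entry studies is not shown):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between paired samples x and y."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

The coefficient is +1 for a perfect increasing linear relationship and -1 for a perfect decreasing one.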