Logical Entropy: Introduction to Classical and Quantum Logical Information Theory [PDF]
Logical information theory is the quantitative version of the logic of partitions just as logical probability theory is the quantitative version of the dual Boolean logic of subsets.
David Ellerman
doaj +7 more sources
Logical Entropy of Information Sources [PDF]
In this paper, we present the concept of the logical entropy of order m, logical mutual information, and the logical entropy for information sources. We found upper and lower bounds for the logical entropy of a random variable by using convex functions ...
Peng Xu, Yamin Sayyari, Saad Ihsan Butt
doaj +3 more sources
Logical Divergence, Logical Entropy, and Logical Mutual Information in Product MV-Algebras [PDF]
In the paper we propose, using the logical entropy function, a new kind of entropy in product MV-algebras, namely the logical entropy and its conditional version.
Dagmar Markechová +2 more
doaj +3 more sources
SE-MSLC: Semantic Entropy-Driven Keyword Analysis and Multi-Stage Logical Combination Recall for Search Engine [PDF]
Information retrieval serves as a critical methodology for accurately and efficiently obtaining the required information from massive amounts of data. In this paper, we propose an information retrieval framework (SE-MSLC) that utilizes information theory
Haihua Lu +3 more
doaj +3 more sources
Quantum logical entropy: fundamentals and general properties [PDF]
Logical entropy gives a measure, in the sense of measure theory, of the distinctions of a given partition of a set, an idea that can be naturally generalized to classical probability distributions.
Tamir Boaz +3 more
doaj +2 more sources
Logical entropy – special issue
Entropy is a fundamental quantity in many areas of knowledge, from physics to information science to biology. Originally put forward in the nineteenth century for very practical purposes (to quantify the reversibility of thermodynamic cycles, hence of ...
Manfredi Giovanni
doaj +2 more sources
Logical entropy and negative probabilities in quantum mechanics [PDF]
The concept of Logical Entropy, $S_{\mathrm{L}} = 1 - \sum_{i=1}^{n} p_i^2$, where the $p_i$ are normalized probabilities, was introduced by David Ellerman in a series of recent papers.
Manfredi Giovanni
doaj +2 more sources
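The formula quoted in the entry above can be illustrated directly. The following is a minimal sketch (not taken from any of the listed papers) that computes the logical entropy $S_L = 1 - \sum_i p_i^2$ of a normalized probability distribution; the function name is ours, chosen for illustration.

```python
def logical_entropy(probs):
    """Logical entropy S_L = 1 - sum(p_i^2) of a normalized distribution."""
    # The p_i must be normalized probabilities, as in Ellerman's definition.
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return 1.0 - sum(p * p for p in probs)

# A uniform distribution over n outcomes gives S_L = 1 - 1/n,
# so four equally likely outcomes yield 0.75.
print(logical_entropy([0.25, 0.25, 0.25, 0.25]))  # 0.75

# A deterministic distribution has no distinctions, so S_L = 0.
print(logical_entropy([1.0]))  # 0.0
```

Note that, unlike Shannon entropy, $S_L$ is bounded above by 1 and reaches its maximum $1 - 1/n$ on the uniform distribution.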
Logical entropy of dynamical systems
The main purpose of the paper is to extend the results of Ellerman (Int. J. Semant. Comput. 7:121–145, 2013) to the case of dynamical systems. We define the logical entropy and conditional logical entropy of finite measurable partitions and derive the ...
Dagmar Markechová +2 more
doaj +3 more sources
Logical Entropy of Fuzzy Dynamical Systems
Recently the logical entropy was suggested by D. Ellerman (2013) as a new information measure. The present paper deals with studying logical entropy and logical mutual information and their properties in a fuzzy probability space.
Dagmar Markechová, Beloslav Riečan
doaj +2 more sources
Logical Entropy and Logical Mutual Information of Experiments in the Intuitionistic Fuzzy Case
In this contribution, we introduce the concepts of logical entropy and logical mutual information of experiments in the intuitionistic fuzzy case, and study the basic properties of the suggested measures. Subsequently, by means of the suggested notion of
Dagmar Markechová, Beloslav Riečan
doaj +2 more sources