Results 241 to 250 of about 128,139
Some of the following articles may not be open access.
Unifying Entropies of Quantum Logic Based on Rényi Entropies
Reports on Mathematical Physics, 2019. zbMATH Open Web Interface contents unavailable due to conflicting licenses.
Giski, Zahra Eslami +1 more
openaire +2 more sources
Logical Network Cost and Entropy
IEEE Transactions on Computers, 1973. A measure of the minimum cost of a logical network is important in the evaluation of such networks. A hypothesis is investigated which states that the average minimum cost depends on both the number of input variables and the entropy of the function ("entropy" here being based on the probability of a "1" in the function's truth table) ... (see the sketch below)
Robert W. Cook, Michael J. Flynn
openaire +1 more source
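As a rough illustration of the entropy measure described in the entry above, here is a minimal Python sketch (the function name truth_table_entropy is illustrative, not from the paper) that computes the binary entropy from the fraction of 1s in a Boolean function's truth table:

import math

def truth_table_entropy(truth_table):
    """Binary entropy of a Boolean function, based on the probability
    of a '1' appearing in its truth table (as described in the entry above)."""
    p = sum(truth_table) / len(truth_table)      # probability of a '1'
    if p in (0.0, 1.0):                          # constant function: zero entropy
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Example: 2-input XOR has truth table [0, 1, 1, 0], so p = 0.5 and entropy = 1 bit.
print(truth_table_entropy([0, 1, 1, 0]))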
Conditional Entropy of Partitions on Quantum Logic
Communications in Theoretical Physics, 2007. Summary: A construction of conditional entropy of partitions on quantum logic is given, and the properties of this conditional entropy are investigated (see the sketch below).
Zhao, Yue-Xu, Ma, Zhi-Hao
openaire +2 more sources
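The paper's quantum-logic construction is not reproduced here; as classical background only, the following sketch computes the ordinary conditional entropy H(P|Q) of two partitions of a finite probability space (all names are illustrative):

import math
from itertools import product

def conditional_entropy(P, Q, prob):
    """Classical conditional entropy H(P|Q) of two partitions P, Q of a finite
    sample space; prob maps each point to its probability.
    H(P|Q) = -sum over A in P, B in Q of p(A & B) * log2(p(A & B) / p(B))."""
    def mass(S):
        return sum(prob[x] for x in S)
    h = 0.0
    for A, B in product(P, Q):
        pAB = mass(A & B)
        pB = mass(B)
        if pAB > 0:
            h -= pAB * math.log2(pAB / pB)
    return h

# Two partitions of {0, 1, 2, 3} under the uniform distribution.
prob = {x: 0.25 for x in range(4)}
P = [frozenset({0, 1}), frozenset({2, 3})]
Q = [frozenset({0, 2}), frozenset({1, 3})]
print(conditional_entropy(P, Q, prob))   # 1.0 bit: P and Q are independent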
Tsallis Entropy of Partitions in Quantum Logics
International Journal of Theoretical Physics, 2018. zbMATH Open Web Interface contents unavailable due to conflicting licenses.
Ebrahimzadeh, Abolfazl +1 more
openaire +2 more sources
Logical entropy and aggregation of fuzzy orthopartitions
zbMATH Open Web Interface contents unavailable due to conflicting licenses.
Stefania Boffa, Davide Ciucci
openalex +2 more sources
Further Developments of Logical Entropy
2021. This chapter develops the multivariate (i.e., three or more variables) entropies. The Shannon mutual information is negative in the standard probability-theory example of three random variables that are pair-wise independent but not mutually independent (a short numerical check follows this entry).
openaire +1 more source
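The negativity mentioned above can be checked directly with the standard example: X and Y independent fair bits and Z = X XOR Y are pair-wise independent but not mutually independent, and the interaction information I(X;Y;Z) evaluates to -1 bit. A short Python check (variable names are illustrative):

import math
from itertools import product

# Joint distribution of (X, Y, Z) with X, Y independent fair bits and Z = X XOR Y.
joint = {(x, y, x ^ y): 0.25 for x, y in product((0, 1), repeat=2)}

def H(indices):
    """Shannon entropy (bits) of the marginal on the given coordinate indices."""
    marg = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in indices)
        marg[key] = marg.get(key, 0.0) + p
    return -sum(p * math.log2(p) for p in marg.values() if p > 0)

# Interaction information I(X;Y;Z) via inclusion-exclusion of entropies.
I3 = (H([0]) + H([1]) + H([2])
      - H([0, 1]) - H([0, 2]) - H([1, 2])
      + H([0, 1, 2]))
print(I3)   # -1.0: the triple mutual information is negative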
Fuzzy orthopartitions and their logical entropy [PDF]
In this paper, we present special families of intuitionistic fuzzy sets called fuzzy orthopartitions, which generalize standard fuzzy partitions and are useful for modeling situations where both vagueness and uncertainty are involved. Moreover, we define and explain how to compute the lower and upper entropy, which measure the quantity of information ... (a brief background sketch follows this entry)
Boffa S., Ciucci D.
openaire
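The paper's lower and upper entropies for fuzzy orthopartitions are not reproduced here; as background only, the logical entropy of an ordinary partition with block probabilities p_B is h = 1 - sum of p_B squared, i.e., the chance that two independent draws land in different blocks. A minimal sketch (illustrative names):

def logical_entropy(block_probs):
    """Logical entropy h = 1 - sum(p_B^2): the probability that two
    independent draws fall in different blocks of the partition."""
    return 1.0 - sum(p * p for p in block_probs)

# A three-block partition with block probabilities 1/2, 1/4, 1/4.
print(logical_entropy([0.5, 0.25, 0.25]))   # 0.625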
Kolmogorov–Sinai type logical entropy for generalized simultaneous measurements
Reports on Mathematical Physics, 2021. zbMATH Open Web Interface contents unavailable due to conflicting licenses.
Shukla, Anurag +2 more
openaire +1 more source
Information, Entropy and Inductive Logic
Philosophy of Science, 1954. It has been shown by several authors that in operations involving information a quantity appears which is the negative of the quantity usually defined as entropy in similar situations. This quantity ℜ = −KI has been termed "negentropy", and it has been shown that the negentropy of information and the physical entropy S are mirrorlike representations of ...
openaire +1 more source
Simplicity, Entropy and Inductive Logic
1966. Publisher Summary: This chapter discusses simplicity that leads to an entropy-like measure and helps in settling problems of the application of inductive logic. It focuses on the explication of the notion of simplicity of state descriptions. The principle of simplicity plays an important role as a guide of inductive behavior: in situations where a ...
openaire +1 more source

