Results 11 to 20 of about 977,806

Logical Entropy of Dynamical Systems—A General Model [PDF]

open access: yesMathematics, 2017
In the paper by Riečan and Markechová (Fuzzy Sets Syst. 96, 1998), some fuzzy modifications of Shannon’s and Kolmogorov-Sinai’s entropy were studied and the general scheme involving the presented models was introduced.
Abolfazl Ebrahimzadeh   +2 more
doaj   +4 more sources

Logical entropy of quantum dynamical systems

open access: yesOpen Physics, 2016
This paper introduces the concepts of logical entropy and conditional logical entropy of finite partitions on a quantum logic. Some of their ergodic properties are presented.
Ebrahimzadeh Abolfazl
doaj   +3 more sources

Introduction to Logical Entropy and its Relationship to Shannon Entropy [PDF]

open access: yesSSRN Electronic Journal, 2021
We live in the information age. Claude Shannon, as the father of the information age, gave us a theory of communications that quantified an “amount of information,” but, as he pointed out, “no concept of information itself was defined.” Logical entropy ...
D. Ellerman
semanticscholar   +6 more sources

Logical entropy of dynamical systems in product MV-algebras and general scheme

open access: yesAdvances in Difference Equations, 2019
The present paper is aimed at studying the entropy of dynamical systems in product MV-algebras. First, by using the concept of logical entropy of a partition in a product MV-algebra introduced and studied by Markechová et al.
Dagmar Markechová, Beloslav Riečan
doaj   +3 more sources

Evaluation and optimization of resource matching for perception services in power communication networks [PDF]

open access: yesScientific Reports
In the cloud–edge–end communication architecture of the new power system, heterogeneous perception services face a fundamental and long-standing demand–supply mismatch with multi-dimensional resources (computing, storage, spectrum/bandwidth, and power ...
Lei Wei   +4 more
doaj   +2 more sources

On Defining Expressions for Entropy and Cross-Entropy: The Entropic Transreals and Their Fracterm Calculus

open access: yesEntropy
Classic formulae for entropy and cross-entropy contain operations x·0 and log₂ x that are not defined on all inputs. This can lead to calculations with problematic subexpressions such as 0·log₂ 0 and uncertainties in large-scale calculations; partiality also ...
Jan A. Bergstra, John V. Tucker
doaj   +2 more sources

Common Neighbor Completion with Information Entropy for Link Prediction in Social Networks

open access: yesData Science and Engineering
Link prediction is essential for identifying hidden relationships within network data, with significant implications for fields such as social network analysis and bioinformatics.
Zhengyun Zhou, Guojia Wan, Bo Du
doaj   +2 more sources

Relative model of the logical entropy of sub-$\sigma_\Theta$-algebras [PDF]

open access: yesJournal of Mahani Mathematical Research, 2023
In the context of observers, any mathematical model according to the viewpoint of an observer $\Theta$ is called a relative model. The purpose of the present paper is to study the relative model of logical entropy. Given an observer $\Theta$, ...
Uosef Mohammadi
doaj   +1 more source

An Introduction to Logical Entropy and its Relation to Shannon Entropy [PDF]

open access: yesInternational Journal of Semantic Computing, 2013
The logical basis for information theory is the newly developed logic of partitions that is dual to the usual Boolean logic of subsets. The key concept is a "distinction" of a partition, an ordered pair of elements in distinct blocks of the partition. The logical concept of entropy based on partition logic is the normalized counting measure of the set ...
D. Ellerman
semanticscholar   +2 more sources

Introduction to logical entropy and its relationship to Shannon entropy

open access: yes4 open, 2022
We live in the information age. Claude Shannon, as the father of the information age, gave us a theory of communications that quantified an “amount of information,” but, as he pointed out, “no concept of information itself was defined.” Logical entropy ...
Ellerman David
doaj   +1 more source