Results 31 to 40 of about 749,941
Information Distances versus Entropy Metric
Information distance has become an important tool in a wide variety of applications. Various types of information distance have been proposed over the years.
Bo Hu, Lvqing Bi, Songsong Dai
doaj +1 more source
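One widely used distance of this kind is the entropy metric on pairs of random variables, d(X, Y) = H(X|Y) + H(Y|X) = 2H(X, Y) - H(X) - H(Y). The abstract does not spell out its definitions, so the sketch below assumes this standard form (the joint table is an invented example):

```python
import numpy as np

def entropy_metric(p_xy):
    """Entropy distance d(X, Y) = H(X|Y) + H(Y|X) = 2*H(X,Y) - H(X) - H(Y),
    computed from a joint probability table p_xy (rows: X, columns: Y)."""
    p_xy = np.asarray(p_xy, dtype=float)
    p_x = p_xy.sum(axis=1)
    p_y = p_xy.sum(axis=0)

    def H(p):
        p = p[p > 0]                  # 0 * log(0) is taken as 0
        return -np.sum(p * np.log2(p))

    h_xy = H(p_xy.ravel())
    return 2 * h_xy - H(p_x) - H(p_y)

# Example: a correlated pair of binary variables (values are illustrative).
joint = [[0.4, 0.1],
         [0.1, 0.4]]
print(entropy_metric(joint))  # 0 would mean X and Y determine each other
```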
Real-time driving monitoring systems can use self-powered sensors based on artificial intelligence (AI) and triboelectric nanogenerators (TENGs). Here, we created a TENG-based self-powered intelligent steering wheel that can detect hand gripping.
Fengguang Fan +9 more
doaj +1 more source
Information Entropy As a Basic Building Block of Complexity Theory
What is information? What role does information entropy play in this age of information explosion, especially in understanding emergent behaviors of complex systems?
Jing Hu +4 more
doaj +1 more source
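For reference, the information entropy in question is Shannon's H(p) = -Σ_i p_i log2 p_i; a minimal computation (the distributions below are only illustrations):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in bits: H(p) = -sum_i p_i * log2(p_i)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # 0 * log(0) is taken as 0
    return -np.sum(p * np.log2(p))

print(shannon_entropy([0.5, 0.5]))   # 1.0 bit: a fair coin
print(shannon_entropy([0.9, 0.1]))   # ~0.47 bits: a biased coin
```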
Area Entropy and Quantized Mass of Black Holes from Information Theory
In this paper, we present a derivation of the black hole area entropy from the relationship between entropy and information. The curved space of a black hole allows objects to be imaged in the same way as a camera lens does.
Dongshan He, Qingyu Cai
doaj +1 more source
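The standard starting point for such a derivation is the Bekenstein-Hawking relation S = k_B c^3 A / (4Għ), one quarter of the horizon area in Planck units. The sketch below only evaluates this textbook formula for a solar-mass black hole; it is not the paper's derivation:

```python
import math

# Physical constants (SI units, rounded)
G     = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c     = 2.998e8        # speed of light, m/s
hbar  = 1.055e-34      # reduced Planck constant, J s
k_B   = 1.381e-23      # Boltzmann constant, J/K
M_sun = 1.989e30       # solar mass, kg

def bekenstein_hawking_entropy(M):
    """Horizon entropy S = k_B * c^3 * A / (4 * G * hbar) for a Schwarzschild
    black hole of mass M, with horizon area A = 4*pi*(2GM/c^2)^2."""
    r_s = 2 * G * M / c**2            # Schwarzschild radius
    A = 4 * math.pi * r_s**2          # horizon area
    return k_B * c**3 * A / (4 * G * hbar)

print(bekenstein_hawking_entropy(M_sun))  # ~1e54 J/K for one solar mass
```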
Computability of entropy and information in classical Hamiltonian systems [PDF]
We consider the computability of entropy and information in classical Hamiltonian systems. We define the information part and total information capacity part of entropy in classical Hamiltonian systems using relative information under a computable ...
Aberth +23 more
core +1 more source
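"Relative information" here presumably refers to the relative entropy (Kullback-Leibler divergence) D(p‖q) = Σ_i p_i log(p_i/q_i); a minimal sketch of that standard quantity, not of the paper's specific construction:

```python
import numpy as np

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p || q) = sum_i p_i * log2(p_i / q_i).
    Assumes q_i > 0 wherever p_i > 0 (otherwise the divergence is infinite)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

# Illustrative distributions over three outcomes.
print(relative_entropy([0.5, 0.3, 0.2], [1/3, 1/3, 1/3]))  # ~0.10 bits
```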
Quantum Information Dimension and Geometric Entropy
Geometric quantum mechanics, through its differential-geometric underpinning, provides additional tools of analysis and interpretation that bring quantum mechanics closer to classical mechanics: state spaces in both are continuous and equipped with ...
Fabio Anza, James P. Crutchfield
doaj +1 more source
Logical Entropy: Introduction to Classical and Quantum Logical Information theory [PDF]
Logical information theory is the quantitative version of the logic of partitions, just as logical probability theory is the quantitative version of the dual Boolean logic of subsets.
Ellerman, David
core +2 more sources
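In logical information theory, the logical entropy of a partition with block probabilities p_i is h(p) = 1 - Σ_i p_i^2, the probability that two independent draws land in different blocks; a minimal sketch:

```python
import numpy as np

def logical_entropy(p):
    """Logical entropy h(p) = 1 - sum_i p_i**2: the probability that two
    independent draws from p fall into different blocks/outcomes."""
    p = np.asarray(p, dtype=float)
    return 1.0 - np.sum(p**2)

print(logical_entropy([0.5, 0.5]))    # 0.5 for a fair coin
print(logical_entropy([0.25] * 4))    # 0.75 for a uniform 4-way partition
```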
Partition-Symmetrical Entropy Functions
Let $\mathcal{N}=\{1,\dots,n\}$. The entropy function $\mathbf{h}$ of a set of $n$ discrete random variables $\{X_i : i \in \mathcal{N}\}$ is a $2^n$-dimensional vector whose entries are $\mathbf{h}(\mathcal{A}) \triangleq H(X_{\mathcal{A}})$, $\mathcal{A} \subseteq \mathcal{N}$, the (joint ...
Chen, Qi, Yeung, Raymond W.
core +1 more source
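Concretely, the entropy function assigns to each subset $\mathcal{A} \subseteq \mathcal{N}$ the joint entropy $H(X_{\mathcal{A}})$. The sketch below enumerates these entries for an explicit joint distribution (the three-variable XOR example is invented for illustration):

```python
import numpy as np
from itertools import combinations

def entropy_bits(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def entropy_function(p_joint):
    """Return {A: H(X_A)} for every non-empty subset A of the n variables,
    given the full joint distribution as an n-dimensional probability array."""
    n = p_joint.ndim
    h = {}
    for k in range(1, n + 1):
        for A in combinations(range(n), k):
            other = tuple(i for i in range(n) if i not in A)
            marginal = p_joint.sum(axis=other) if other else p_joint
            h[A] = entropy_bits(marginal.ravel())
    return h

# Example: three binary variables, X2 = X0 XOR X1 with X0, X1 fair and independent.
p = np.zeros((2, 2, 2))
for x0 in (0, 1):
    for x1 in (0, 1):
        p[x0, x1, x0 ^ x1] = 0.25
for A, value in sorted(entropy_function(p).items()):
    print(A, round(value, 3))
```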
Information Entropy Production of Maximum Entropy Markov Chains from Spike Trains
The spiking activity of neuronal networks follows laws that are not time-reversal symmetric; the notions of pre-synaptic and post-synaptic neurons, stimulus correlations, and noise correlations have a clear time order.
Rodrigo Cofré, Cesar Maldonado
doaj +1 more source
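That time asymmetry is commonly quantified by the entropy production rate of the stationary chain, σ = Σ_{x,y} π_x P_{xy} log(π_x P_{xy} / (π_y P_{yx})), which is zero exactly when detailed balance holds. A minimal sketch under that standard definition (the transition matrix is an arbitrary example):

```python
import numpy as np

def entropy_production_rate(P):
    """Entropy production rate of a stationary Markov chain with transition
    matrix P: sigma = sum_{x,y} pi_x P_xy * log(pi_x P_xy / (pi_y P_yx)).
    Assumes P is irreducible and P_yx > 0 whenever P_xy > 0."""
    P = np.asarray(P, dtype=float)
    # Stationary distribution: left eigenvector of P with eigenvalue 1.
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmin(np.abs(w - 1))])
    pi = pi / pi.sum()

    sigma = 0.0
    n = P.shape[0]
    for x in range(n):
        for y in range(n):
            if P[x, y] > 0:
                sigma += pi[x] * P[x, y] * np.log(pi[x] * P[x, y] / (pi[y] * P[y, x]))
    return sigma

# A 3-state chain with a net cyclic drift, so detailed balance fails.
P = np.array([[0.1, 0.8, 0.1],
              [0.1, 0.1, 0.8],
              [0.8, 0.1, 0.1]])
print(entropy_production_rate(P))  # > 0: the chain is time-irreversible
```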
Information and entropy in quantum Brownian motion: Thermodynamic entropy versus von Neumann entropy
We compare the thermodynamic entropy of a quantum Brownian oscillator derived from the partition function of the subsystem with the von Neumann entropy of its reduced density matrix.
A. Lenard +38 more
core +1 more source
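For context, the von Neumann entropy being compared is S = -Tr(ρ ln ρ), evaluated on the oscillator's reduced density matrix. The sketch below computes it from the eigenvalues of a density matrix, using an invented truncated thermal state as the input:

```python
import numpy as np

def von_neumann_entropy(rho):
    """Von Neumann entropy S = -Tr(rho ln rho), in units of k_B,
    computed from the eigenvalues of the density matrix rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # drop numerical zeros
    return -np.sum(evals * np.log(evals))

# Illustrative state: a truncated thermal oscillator, p_n ~ exp(-n * x).
n_levels, x = 30, 0.5                     # x = hbar*omega / (k_B*T), made-up value
p = np.exp(-x * np.arange(n_levels))
p /= p.sum()
rho = np.diag(p)
print(von_neumann_entropy(rho))           # entropy of the thermal state, in k_B
```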

