Results 81 to 90 of about 343,367 (193)
An Information-Theoretic Foundation for the Weighted Updating Model
Weighted Updating generalizes Bayesian updating, allowing for biased beliefs by weighting the likelihood function and prior distribution with positive real exponents.
Zinn, Jesse Aaron
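The update rule described in this abstract — raising the likelihood and the prior to positive real exponents before normalizing — can be sketched generically as follows. This is an illustrative implementation, not the paper's own code; the exponent names `alpha` and `beta` are assumptions.

```python
import numpy as np

def weighted_update(prior, likelihood, alpha=1.0, beta=1.0):
    """Weighted updating: posterior proportional to likelihood**alpha * prior**beta.

    alpha = beta = 1 recovers standard Bayesian updating; alpha < 1 models
    under-weighting of evidence, alpha > 1 over-weighting (base-rate neglect
    and related biases correspond to beta != 1).
    """
    unnormalized = (likelihood ** alpha) * (prior ** beta)
    return unnormalized / unnormalized.sum()

prior = np.array([0.5, 0.5])          # uniform prior over two hypotheses
likelihood = np.array([0.8, 0.2])     # P(data | hypothesis)

print(weighted_update(prior, likelihood))             # standard Bayes: [0.8, 0.2]
print(weighted_update(prior, likelihood, alpha=0.5))  # damped evidence: [2/3, 1/3]
```

With `alpha=0.5` the evidence is under-weighted, pulling the posterior back toward the prior, which is the kind of biased belief the weighted updating model is built to capture.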
A key tenet of the Transactional Interpretation of Quantum Mechanics is the idea that photon absorption localizes the absorbing material system. In doing so, it measures the location of the absorber and hence reduces information entropy which in turn ...
Andreas Schlatter, R. E. Kastner
The analogues of entropy and of Fisher's information measure in free probability theory [PDF]
In Part I [Commun. Math. Phys. 155, No. 1, 71-92 (1993; Zbl 0781.60006)] we studied the free entropy \(\Sigma(X)\) of a self-adjoint random variable \(X\) equal to minus the logarithmic energy of its distribution. We also introduced a generalization \(\Sigma(X_ 1,\dots, X_ n)\) which, however, does not have the necessary properties for \(n \geq 2\) to ...
Measuring Dynamical Uncertainty With Revealed Dynamics Markov Models
Concepts and measures of time series uncertainty and complexity have been applied across domains for behavior classification, risk assessments, and event detection/prediction.
Aaron Bramson et al.
Many disciplines, including environmental sciences, reliability engineering, hydrology, and information theory, rely heavily on information measures derived from statistical distributions for decision-making, risk assessment, and system design. Motivated ...
I. A. Husseiny et al.
Dynamics of Information Quantifiers in the Damped Rabi Oscillator
This study examines the time evolution of structural and informational quantifiers in a damped Rabi oscillator, specifically focusing on fidelity, entropy, disequilibrium, and Fisher information.
Flavia Pennini, Angelo Plastino
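Two of the quantifiers named in this abstract, Shannon entropy and disequilibrium, can be illustrated for a discrete probability distribution. This is a generic sketch of the standard definitions, not the paper's implementation, and the test distribution is made up for illustration.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum p_i ln p_i (natural log), skipping zero entries."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def disequilibrium(p):
    """Disequilibrium D(p): squared Euclidean distance from the uniform distribution."""
    n = len(p)
    return np.sum((p - 1.0 / n) ** 2)

p = np.array([0.7, 0.2, 0.1])
print(shannon_entropy(p))   # ≈ 0.8018
print(disequilibrium(p))    # ≈ 0.2067
```

Entropy is maximal and disequilibrium zero at the uniform distribution, so the two quantifiers move in opposite directions; their product is the LMC-style statistical complexity often studied alongside Fisher information.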
Though the Shannon entropy of a probability measure $P$, defined as $- \int_{X} \frac{\mathrm{d}P}{\mathrm{d}\mu} \ln \frac{\mathrm{d}P}{\mathrm{d}\mu} \, \mathrm{d}\mu$ on a measure space $(X, \mathfrak{M}, \mu)$, does not qualify as an information measure (it is not a natural extension of the discrete case), maximum entropy (ME) prescriptions in the measure-theoretic case are ...
Dukkipati, Ambedkar, et al.
Social Entropy: An Information Measure of Institutional Complexity
Information theory has impacted disciplines across science and technology, including the 20th century’s IT revolution. However, information theory has been applied much less to the social sciences and has not been applied at all to theoretical social psychology.
On Dynamical Measures of Quantum Information
In this work, we use the theory of quantum states over time to define joint entropy for timelike-separated quantum systems. For timelike-separated systems that admit a dual description as being spacelike-separated, our notion of entropy recovers the ...
James Fullwood, Arthur J. Parzygnat
Complex networks play a vital role in various real-world systems, including marketing, information dissemination, transportation, biological systems, and epidemic modeling.
Ramya D. Shetty et al.

