Results 31 to 40 of about 247,902 (260)
Distribution of mutual information [PDF]
In the analysis of time series from nonlinear sources, mutual information (MI) is used as a nonlinear statistical criterion for the selection of an appropriate time delay in time delay reconstruction of the state space. MI is a statistic over the sets of sequences associated with the dynamical source, and we examine here the distribution of MI, thus ...
Abarbanel, Henry D. I. +3 more
openaire +2 more sources
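The time-delay selection procedure described in the abstract above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it uses a simple histogram estimator of MI (function names `mutual_information` and `first_minimum_delay` are mine), and follows the common heuristic of choosing the first local minimum of I(x_t; x_{t+τ}) as the reconstruction delay.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram estimate of mutual information I(X;Y), in nats."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)  # marginal p(x)
    py = pxy.sum(axis=0, keepdims=True)  # marginal p(y)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def first_minimum_delay(series, max_lag=50, bins=16):
    """Choose the first local minimum of I(x_t; x_{t+tau}) as the delay."""
    mi = [mutual_information(series[:-tau], series[tau:], bins)
          for tau in range(1, max_lag + 1)]
    for i in range(1, len(mi) - 1):
        if mi[i] < mi[i - 1] and mi[i] < mi[i + 1]:
            return i + 1  # lags are 1-indexed
    return int(np.argmin(mi)) + 1  # fall back to the global minimum
```

Histogram estimators are biased for small samples, which is exactly why the distribution of the MI statistic (the subject of the paper) matters when judging whether a minimum is significant.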
DEM REGISTRATION BASED ON MUTUAL INFORMATION [PDF]
The registration process forms an important step in the integration of Digital Elevation Models (DEMs) of differing resolution. DEM registration invariably involves what is effectively an 'image matching' process utilising similarity measures over a number of prospective image ...
M. Ravanbakhsh, C. S. Fraser
doaj +1 more source
Mutual information superadditivity and unitarity bounds
We derive the property of strong superadditivity of mutual information arising from the Markov property of the vacuum state in a conformal field theory and strong subadditivity of entanglement entropy. We show this inequality encodes unitarity bounds for
Horacio Casini +2 more
doaj +1 more source
Mutual Information, Fisher Information, and Population Coding [PDF]
In the context of parameter estimation and model selection, it is only quite recently that a direct link between the Fisher information and information-theoretic quantities has been exhibited. We give an interpretation of this link within the standard framework of information theory.
Brunel, Nicolas, Nadal, Jean-Pierre
openaire +3 more sources
Estimating Information Processing of Human Fast Continuous Tapping from Trajectories
Fitts studied the problem of information capacity and transfer in the speed–accuracy motor paradigm using a theoretical approach developed from Shannon and Weaver’s information theory.
Hiroki Murakami, Norimasa Yamada
doaj +1 more source
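The information-capacity framing mentioned above rests on Fitts' index of difficulty; a minimal sketch, using the standard Shannon formulation of the index (log2(D/W + 1)) rather than whatever estimator the paper applies to trajectories:

```python
import math

def index_of_difficulty(distance, width):
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(distance / width + 1)

def throughput(distance, width, movement_time):
    """Information-processing rate of the movement, in bits per second."""
    return index_of_difficulty(distance, width) / movement_time
```

For example, a target of width 1 at distance 7 has an index of difficulty of 3 bits; reached in 0.5 s, that corresponds to a throughput of 6 bits/s.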
Lower bounds on mutual information [PDF]
We correct claims about lower bounds on mutual information (MI) between real-valued random variables made in A. Kraskov {\it et al.}, Phys. Rev. E {\bf 69}, 066138 (2004). We show that non-trivial lower bounds on MI in terms of linear correlations depend on the marginal (single variable) distributions.
Foster, D.V., Grassberger, P.
openaire +4 more sources
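The correlation-based bound under discussion is exact in the bivariate Gaussian case, where MI is a closed-form function of the Pearson correlation ρ; the correction above is that this relation does not bound MI from below for arbitrary marginals. A one-line sketch of the Gaussian case (function name `gaussian_mi` is mine):

```python
import math

def gaussian_mi(rho):
    """Exact MI (in nats) of a bivariate Gaussian with correlation rho."""
    return -0.5 * math.log(1.0 - rho ** 2)
```

Note that the expression vanishes at ρ = 0 and diverges as |ρ| → 1, but for non-Gaussian marginals a variable pair can have ρ = 0 and arbitrarily large MI, which is the sense in which the bound depends on the marginal distributions.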
Mutual Information: A way to quantify correlations
Within the framework of Information Theory, the existence of correlations between two random variables means that we can obtain information about one of them, just by measuring or observing the other random variable.
Marcelo Tisoc, Jhosep Victorino Beltrán
doaj +1 more source
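The quantification described above is, for discrete variables, the standard expression I(X;Y) = Σ p(x,y) log[p(x,y) / (p(x)p(y))]. A self-contained sketch (the function name `discrete_mi` is mine, not from the paper):

```python
import numpy as np

def discrete_mi(joint):
    """I(X;Y) in bits, from a joint probability table p(x, y)."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)  # marginal p(x)
    py = joint.sum(axis=0, keepdims=True)  # marginal p(y)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (px * py)[nz])))
```

Two fair bits that always agree share exactly 1 bit of information, while an independent pair shares 0 bits, matching the intuition that observing one variable tells us about the other only when they are correlated.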
Relationship between Information Scrambling and Quantum Darwinism
A quantum system interacting with a multipartite environment can induce redundant encoding of the system's information into the environment, which is the essence of quantum Darwinism.
Feng Tian +4 more
doaj +1 more source
Natural image segmentation based on mutual information [PDF]
Natural image segmentation plays an important role in the fields of image processing and computer vision. Clustering-based segmentation is an important class of unsupervised image segmentation algorithms.
Zhu Yiwei
doaj +1 more source
MULTI-FEATURE MUTUAL INFORMATION IMAGE REGISTRATION
Nowadays, information-theoretic similarity measures, especially the mutual information and its derivatives, are one of the most frequently used measures of global intensity feature correspondence in image registration.
Dejan Tomaževič +2 more
doaj +1 more source
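The MI-as-similarity-measure idea behind entries like the two registration papers above can be sketched in a few lines: estimate MI from the joint intensity histogram of the overlapping region, and search for the transform that maximises it. This toy version (function names `image_mi` and `best_x_shift` are mine) restricts the search to integer horizontal shifts; real registration methods optimise over richer transforms and multiple features.

```python
import numpy as np

def image_mi(a, b, bins=32):
    """MI (nats) between two equally sized grayscale images."""
    hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = hist / hist.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px * py)[nz])))

def best_x_shift(fixed, moving, max_shift=10):
    """Exhaustive 1-D search: the horizontal shift maximising MI overlap."""
    best, best_mi = 0, -np.inf
    for s in range(-max_shift, max_shift + 1):
        if s > 0:
            f, m = fixed[:, :-s], moving[:, s:]
        elif s < 0:
            f, m = fixed[:, -s:], moving[:, :s]
        else:
            f, m = fixed, moving
        mi = image_mi(f, m)
        if mi > best_mi:
            best, best_mi = s, mi
    return best
```

Because MI depends only on the joint intensity statistics, not on intensities matching directly, the same search works across modalities where correlation-based measures fail.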