Mutual Information, Fisher Information, and Efficient Coding
Neural Computation, 2016
Fisher information is generally believed to represent a lower bound on mutual information (Brunel & Nadal, 1998), a result that is frequently used in the assessment of neural coding efficiency. However, we demonstrate that the relation between these two quantities is more nuanced than previously thought.
Wei, Xue-Xin, Stocker, Alan A.
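For context, the relation at issue is usually stated in the following form (a sketch of the standard Brunel–Nadal bound from the efficient-coding literature, not necessarily this paper's notation): for a stimulus \(\theta\) with prior \(p(\theta)\), response \(R\), and Fisher information \(J(\theta)\),

\[
I(\theta; R) \;\ge\; H(\theta) - \frac{1}{2}\int p(\theta)\,\log_2\!\frac{2\pi e}{J(\theta)}\,d\theta,
\]

with the bound becoming tight in the asymptotic (large-population, low-noise) regime; the article's point is that this relation can fail outside that regime.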
Maximum independence and mutual information
IEEE Transactions on Information Theory, 2002
Summary: If \(I_1, I_2,\ldots, I_k\) are random Boolean variables and the joint probabilities up to the \((k-1)\)th order are known, the \(k\)th-order probabilities that maximize the overall entropy define the maximum independence estimate. In this article, some contributions deriving from the definition of maximum independence ...
Mutual Information and Categorical Perception
Psychological Science, 2021
Categorical perception refers to the enhancement of perceptual sensitivity near category boundaries, generally along dimensions that are informative about category membership. However, it remains unclear exactly which dimensions are treated as informative and why.
Mechanical Systems and Signal Processing, 2010
Abstract: Three new mutual information algorithms are proposed for selecting the time delay in the phase space reconstruction process. First, Cellucci's mutual information algorithm is analyzed, based on partitioning the plane constructed from a pair of equal-length Lorenz series into four and then sixteen grid elements of equal probability ...
Ai-Hua Jiang et al.
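The delayed-mutual-information approach that the abstract describes can be sketched as follows (a minimal histogram-based estimator, using equal-width rather than the paper's equal-probability bins; function names are illustrative, not from the paper):

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Estimate I(X; Y) in bits from a 2-D histogram with equal-width bins."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x (column vector)
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y (row vector)
    nz = pxy > 0                          # empty cells contribute nothing
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

def embedding_delay(series, max_delay=50, bins=16):
    """First local minimum of I(x_t; x_{t+tau}) -- a common heuristic for
    choosing the time delay in phase space reconstruction."""
    mi = [mutual_information(series[:-tau], series[tau:], bins)
          for tau in range(1, max_delay + 1)]
    for i in range(1, len(mi) - 1):
        if mi[i] < mi[i - 1] and mi[i] < mi[i + 1]:
            return i + 1                  # mi[i] corresponds to delay i + 1
    return int(np.argmin(mi)) + 1         # fall back to the global minimum
```

The choice of bin count matters: too few bins wash out structure, too many leave the joint histogram sparse and bias the estimate upward.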
Multifeature mutual information
SPIE Proceedings, 2004
In the last decade, information-theoretic similarity measures, especially mutual information and its derivatives, have proven to be accurate measures for rigid and non-rigid, mono- and multi-modal image registration. However, these measures are sometimes not robust enough, especially in cases of poor image quality. This is most likely due to the lack of ...
Dejan Tomazevic et al.
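As a minimal illustration of information-theoretic registration of the kind this line of work builds on (a plain normalized-mutual-information criterion with a brute-force integer-shift search; this is a generic sketch, not the authors' multifeature measure, and all names are illustrative):

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy in bits; zero entries are ignored."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def nmi(a, b, bins=32):
    """Normalized mutual information (H(A) + H(B)) / H(A, B):
    about 1 for independent images, 2 for identical images after binning."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    return (entropy_bits(pxy.sum(axis=1)) +
            entropy_bits(pxy.sum(axis=0))) / entropy_bits(pxy.ravel())

def register_translation(fixed, moving, search=5, bins=32):
    """Exhaustive search over integer shifts, keeping the one maximizing NMI."""
    best_shift, best_score = (0, 0), -np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            score = nmi(fixed, np.roll(moving, (dy, dx), axis=(0, 1)), bins)
            if score > best_score:
                best_shift, best_score = (dy, dx), score
    return best_shift, best_score
```

Because NMI depends only on the joint intensity histogram, the same criterion applies even when the two images come from different modalities; the robustness problems the abstract mentions arise when that histogram is poorly populated.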
Information acquisition and mutual funds
Journal of Economic Theory, 2005
García, Diego, Vanden, Joel M.
Mutual-information-based registration of medical images: a survey
IEEE Transactions on Medical Imaging, 2003
J. B. A. Maintz, Max A. Viergever

