Results 31 to 40 of about 1,878,628
Mutual Information: A way to quantify correlations
Within the framework of Information Theory, the existence of correlations between two random variables means that we can obtain information about one of them, just by measuring or observing the other random variable.
Marcelo Tisoc, Jhosep Victorino Beltrán
doaj +1 more source
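For readers skimming this list, the quantity the entry above refers to is the standard Shannon mutual information; the definition below is textbook background, not material taken from the paper itself:

    I(X;Y) \;=\; \sum_{x,y} p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)} \;=\; H(X) - H(X \mid Y)

It is non-negative, vanishes exactly when X and Y are independent, and equals the reduction in uncertainty about X gained by observing Y, which is the sense in which observing one variable yields information about the other.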
MIHash: Online Hashing with Mutual Information [PDF]
Learning-based hashing methods are widely used for nearest neighbor retrieval, and recently, online hashing methods have demonstrated good performance-complexity trade-offs by learning hash functions from streaming data. In this paper, we first address a ...
Bargal, Sarah Adel +3 more
core +1 more source
Relationship between Information Scrambling and Quantum Darwinism
A quantum system interacting with a multipartite environment can induce redundant encoding of the system's information in the environment, which is the essence of quantum Darwinism.
Feng Tian +4 more
doaj +1 more source
Mutual information on the fuzzy sphere [PDF]
We numerically calculate entanglement entropy and mutual information for a massive free scalar field on commutative (ordinary) and noncommutative (fuzzy) spheres.
Sabella-Garnier, Philippe
core +2 more sources
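As background on the quantity computed in the entry above (the relation below is standard, not specific to the fuzzy-sphere calculation), the mutual information of two regions A and B is built from their entanglement entropies:

    I(A:B) \;=\; S(A) + S(B) - S(A \cup B)

where S denotes the von Neumann entropy of the reduced state on the indicated region. For disjoint, separated regions the leading divergent boundary contributions cancel in this combination, which is one reason mutual information is a convenient quantity to compare across the ordinary and fuzzy spheres.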
Multi-Feature Mutual Information Image Registration
Nowadays, information-theoretic similarity measures, especially mutual information and its derivatives, are among the most frequently used measures of global intensity feature correspondence in image registration.
Dejan Tomaževič +2 more
doaj +1 more source
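To make the role of mutual information as a registration similarity measure concrete, here is a minimal single-feature sketch under assumed inputs (the function name mi_similarity, the bin count, and the random test images are illustrative; this is generic background, not the multi-feature method of the paper):

import numpy as np

def mi_similarity(img_a, img_b, bins=32):
    """Mutual information (nats) of the joint intensity histogram of two aligned images."""
    hist, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = hist / hist.sum()                       # joint intensity distribution
    px = pxy.sum(axis=1, keepdims=True)           # marginal of image A intensities
    py = pxy.sum(axis=0, keepdims=True)           # marginal of image B intensities
    nz = pxy > 0                                  # skip empty histogram cells
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(0)
a = rng.random((64, 64))
# An image compared with itself scores far higher than against an unrelated image.
print(mi_similarity(a, a), mi_similarity(a, rng.random((64, 64))))

A registration loop would repeatedly resample one image under a candidate transformation and keep the transformation that maximizes this score.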
Mutual Information and Information Gating in Synfire Chains
Coherent neuronal activity is believed to underlie the transfer and processing of information in the brain. Coherent activity in the form of synchronous firing and oscillations has been measured in many brain regions and has been correlated with enhanced ...
Zhuocheng Xiao +3 more
doaj +1 more source
On the Estimation of Mutual Information
In this paper we focus on the estimation of mutual information from finite samples of (X × Y). The main concern with estimates of mutual information (MI) is their robustness under the class of transformations for which it remains invariant: ...
Nicholas Carrara, Jesse Ernst
doaj +1 more source
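As a concrete (and deliberately simple) illustration of the invariance issue raised above: a plug-in estimator built on equal-count (quantile) bins depends only on the ranks of the samples, so its estimate is unchanged by strictly monotone transformations of either variable. The function name mi_plugin, bin count, and test data below are assumptions for illustration, not the estimators studied in the paper:

import numpy as np

def mi_plugin(x, y, bins=16):
    """Plug-in MI estimate (nats) using equal-count (quantile) bins per variable.

    Quantile binning depends only on sample ranks, so the estimate is unchanged
    by any strictly monotone transformation of x or y.
    """
    qx = np.quantile(x, np.linspace(0, 1, bins + 1))       # bin edges for x
    qy = np.quantile(y, np.linspace(0, 1, bins + 1))       # bin edges for y
    ix = np.clip(np.searchsorted(qx, x, side="right") - 1, 0, bins - 1)
    iy = np.clip(np.searchsorted(qy, y, side="right") - 1, 0, bins - 1)
    pxy = np.zeros((bins, bins))
    np.add.at(pxy, (ix, iy), 1.0)                           # joint bin counts
    pxy /= pxy.sum()
    px, py = pxy.sum(axis=1, keepdims=True), pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(1)
x = rng.normal(size=5000)
y = x + 0.5 * rng.normal(size=5000)                         # correlated samples
# Identical estimates: a monotone map of y does not change the rank-based value.
print(mi_plugin(x, y), mi_plugin(x, np.exp(y)))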
Mutual information for testing gene-environment interaction. [PDF]
Despite current enthusiasm for investigation of gene-gene interactions and gene-environment interactions, the essential issue of how to define and detect gene-environment interactions remains unresolved.
Xuesen Wu, Li Jin, Momiao Xiong
doaj +1 more source
Quantifying Synergistic Mutual Information [PDF]
Quantifying cooperation or synergy among random variables in predicting a single target random variable is an important problem in many complex systems. We review three prior information-theoretic measures of synergy and introduce a novel synergy measure defined as the difference between the whole and the union of its parts.
Griffith, Virgil, Koch, Christof
openaire +3 more sources
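The canonical toy example behind such synergy measures is the XOR gate: the pair (X1, X2) determines Y completely, yet each input alone is independent of Y, so all of the information is synergistic. The sketch below (generic background, not the specific measure proposed in the paper) simply evaluates the relevant mutual informations:

import numpy as np

def mi(joint):
    """Mutual information (bits) of a discrete joint distribution given as a 2-D table."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (px @ py)[nz])))

# Joint distribution p(x1, x2, y) for y = x1 XOR x2 with independent fair inputs.
p = np.zeros((2, 2, 2))
for x1 in (0, 1):
    for x2 in (0, 1):
        p[x1, x2, x1 ^ x2] = 0.25

whole = mi(p.reshape(4, 2))   # I(X1, X2 ; Y): treat the input pair as one variable
part1 = mi(p.sum(axis=1))     # I(X1 ; Y)
part2 = mi(p.sum(axis=0))     # I(X2 ; Y)
print(whole, part1, part2)    # 1.0 0.0 0.0: the whole exceeds its parts, so the information is synergistic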
Structure Learning of Bayesian Network Based on Adaptive Thresholding
Direct dependencies and conditional dependencies are the two basic kinds of dependencies in restricted Bayesian network classifiers (BNCs). Traditional approaches, such as filter and wrapper, have proved beneficial for identifying non-significant ...
Yang Zhang +3 more
doaj +1 more source

