
Neural activity responsiveness by maturation of inhibition underlying critical period plasticity. [PDF]

open access: yes
Front Neural Circuits
Matsumoto I   +7 more
europepmc   +1 more source

MIMR-DGSA: Unsupervised hyperspectral band selection based on information theory and a modified discrete gravitational search algorithm

open access: yes
Information Fusion, 2019
Julius Tschannerl   +7 more
semanticscholar   +1 more source
Some of the following articles may not be open access.

Electromagnetic Signal and Information Theory

IEEE BITS the Information Theory Magazine, 2023
In this article, we present electromagnetic signal and information theory (ESIT). ESIT is an interdisciplinary scientific discipline, which amalgamates electromagnetic theory, signal processing theory, and information theory.
M. D. Renzo, M. Migliore
semanticscholar   +1 more source

Network Information Theory

2021 IEEE 3rd International Conference on Advanced Trends in Information Theory (ATIT), 2021
Course Description: This course covers information theory as it relates to networked communication systems. The course will broadly be broken into three parts: (1) Study of single-hop networks, such as the multiple-access channel, broadcast channel ...
Leeanne Sagona
semanticscholar   +1 more source
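
A standard example of the single-hop networks mentioned in this description (not taken from the course listing itself) is the two-user Gaussian multiple-access channel with transmit powers $P_1, P_2$ and noise variance $N$, whose capacity region is
\[
R_1 \le C\!\Big(\tfrac{P_1}{N}\Big), \qquad R_2 \le C\!\Big(\tfrac{P_2}{N}\Big), \qquad R_1 + R_2 \le C\!\Big(\tfrac{P_1 + P_2}{N}\Big), \qquad C(x) = \tfrac{1}{2}\log(1 + x).
\]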

Common randomness in information theory and cryptography - I: Secret sharing

IEEE Transactions on Information Theory, 1993
As the first part of a study of problems involving common randomness at distant locations, information-theoretic models of secret sharing (generating a common random key at two terminals, without letting an eavesdropper obtain information about this key) ...
R. Ahlswede, I. Csiszár
semanticscholar   +1 more source
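
As a point of reference, in the simplest source model of this kind (the two terminals observe correlated sources $X$ and $Y$, and the eavesdropper sees only the public discussion, with no correlated side information of its own) the secret-key capacity equals the mutual information between the terminals' observations:
\[
C_{SK} = I(X;Y).
\]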

Information Theory and Statistics: A Tutorial

Foundations and Trends in Communications and Information Theory, 2004
This tutorial is concerned with applications of information theory concepts in statistics, in the finite alphabet setting. The information measure known as information divergence or Kullback-Leibler distance or relative entropy plays a key role, often ...
I. Csiszár, P. Shields
semanticscholar   +1 more source
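
For quick reference, the relative entropy (information divergence, Kullback-Leibler distance) named in this abstract is standardly defined, for distributions $P$ and $Q$ on a finite alphabet $\mathcal{X}$, as
\[
D(P\|Q) = \sum_{x \in \mathcal{X}} P(x) \log \frac{P(x)}{Q(x)},
\]
which is nonnegative and equals zero if and only if $P = Q$.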

Common Randomness in Information Theory and Cryptography - Part II: CR Capacity

IEEE Transactions on Information Theory, 1998
For Part I, see ibid., vol. 39, p. 1121, 1993. The common randomness (CR) capacity of a two-terminal model is defined as the maximum rate of common randomness that the terminals can generate using resources specified by the given model.
R. Ahlswede, I. Csiszár
semanticscholar   +1 more source
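
Schematically, and omitting the model-specific resource constraints, the definition quoted above can be written as
\[
C_{\mathrm{CR}} = \sup\Big\{ R : \exists\, \big(K_1^{(n)}, K_2^{(n)}\big) \ \text{with}\ \Pr\big[K_1^{(n)} \ne K_2^{(n)}\big] \to 0 \ \text{and}\ \tfrac{1}{n} H\big(K_1^{(n)}\big) \to R \Big\},
\]
where $K_i^{(n)}$ denotes the random variable generated at terminal $i$ after $n$ uses of the model's resources.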

Information Distance

IEEE Transactions on Information Theory, 1998
While Kolmogorov (1965) complexity is the accepted absolute measure of information content in an individual finite object, a similarly absolute notion is needed for the information distance between two individual objects, for example, two pictures.
Charles H. Bennett   +4 more
semanticscholar   +1 more source
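
The information distance studied in this line of work is, up to an additive logarithmic term, the maximum of the two conditional Kolmogorov complexities:
\[
E(x,y) = \max\{K(x \mid y),\, K(y \mid x)\},
\]
where $K(\cdot \mid \cdot)$ denotes conditional Kolmogorov complexity; the abstract above motivates this as the absolute counterpart, for pairs of objects, of Kolmogorov complexity for single objects.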

Strong Converse Bounds in Quantum Network Information Theory

IEEE Transactions on Information Theory, 2021
In this paper, we develop the first method for finding strong converse bounds in quantum network information theory. The general scheme relies on a recently obtained result in the field of non-commutative functional inequalities, namely the tensorization ...
Hao-Chung Cheng, N. Datta, C. Rouzé
semanticscholar   +1 more source
