Results 331 to 340 of about 21,680,450 (363)
Some of the following articles may not be open access.
British Journal of Applied Physics, 1953
The “information rate” of a communication system might be defined as the average gain of information per unit time at the receiving end, and one of the most engaging problems of communication theory is to calculate the maximum value of the information rate under various conditions.
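For the standard band-limited channel with additive white Gaussian noise, the maximum information rate the abstract alludes to is given by the Shannon–Hartley theorem; the formula below illustrates the general problem and is not quoted from the 1953 paper:

```latex
C = B \log_2\!\left(1 + \frac{S}{N}\right)
```

where $C$ is the maximum information rate in bits per second, $B$ the bandwidth in hertz, and $S/N$ the signal-to-noise power ratio.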
1993
This chapter discusses the concepts of information and the related quantity entropy. Mutual and self-information concepts are applied to various models for communications channels. In nature, it is found that self-contained systems change from more highly organized structures to less organized ones, that is, from states of higher ...
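The entropy and mutual-information quantities this chapter covers can be sketched numerically; the distributions below are made-up illustrations, not data from the chapter:

```python
import math

def entropy(p):
    """Shannon entropy H(X) = -sum p log2 p, in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def mutual_information(joint):
    """I(X;Y) = sum p(x,y) log2 [ p(x,y) / (p(x) p(y)) ], in bits."""
    px = [sum(row) for row in joint]                 # marginal of X
    py = [sum(col) for col in zip(*joint)]           # marginal of Y
    return sum(
        pxy * math.log2(pxy / (px[i] * py[j]))
        for i, row in enumerate(joint)
        for j, pxy in enumerate(row)
        if pxy > 0
    )

# A fair coin carries one bit per toss; a biased coin is more
# predictable and so carries less.
print(entropy([0.5, 0.5]))   # 1.0
print(entropy([0.9, 0.1]))   # ~0.469

# Independent variables share no information.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
```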
The Integrated Information Theory of Consciousness
, 2017
The integrated information theory (IIT) starts from phenomenology and makes critical use of thought experiments to claim that consciousness is integrated information. Specifically: (1) the quantity of consciousness is given by the amount of integrated ...
G. Tononi
The Nature of Theory in Information Systems
MIS Q., 2006
The aim of this research essay is to examine the structural nature of theory in Information Systems. Despite the importance of theory, questions relating to its form and structure are neglected in comparison with questions relating to epistemology.
S. Gregor
Information theory and genetics [PDF]
A description of the kinds of systems susceptible to information theoretical analysis is given. By means of an example, certain common fallacies in the application of communication theory to biology are illustrated. The entropy-information analogy is discussed.
A history of the theory of information
Proceedings of the IEE - Part III: Radio and Communication Engineering, 1951
The paper mentions first some essential points about the early development of languages, codes and symbolism, picking out those fundamental points in human communication which have recently been summarized by precise mathematical theory. A survey of telegraphy and telephony development leads to the need for “economy,” which has given rise to various ...
Algorithmic Information Theory [PDF]
Algorithmic information theory uses the notion of algorithm to measure the amount of information in a finite object. The corresponding definition was suggested in the 1960s by Ray Solomonoff, Andrei Kolmogorov, Gregory Chaitin and others: the amount of information in a finite object, or its complexity, was defined as the minimal length of a program that ...
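The minimal program length this abstract describes (Kolmogorov complexity) is uncomputable, but the length of any compressed encoding gives a computable upper bound; a minimal sketch using Python's standard `zlib`, as an illustration rather than anything from the paper:

```python
import random
import zlib

def compressed_size(data: bytes) -> int:
    """Length of a zlib encoding of `data`: a crude, computable
    upper bound on its algorithmic information content."""
    return len(zlib.compress(data, 9))

# A highly regular string admits a very short description,
# while random-looking bytes resist compression.
random.seed(0)
regular = b"ab" * 500
noisy = bytes(random.randrange(256) for _ in range(1000))

print(compressed_size(regular) < compressed_size(noisy))  # True
```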
On Electromagnetics and Information Theory
IEEE Transactions on Antennas and Propagation, 2008
Some connections are described between electromagnetic theory and information theory, identifying unavoidable limitations imposed by the laws of electromagnetism on communication systems. Starting from this result, the role of the degrees of freedom of the field in radiating systems is investigated.
Complexity and Information Theory
1975
The concept of “information” appeared in physics in connection with the concept of “entropy”. It was observed (Boltzmann, 1896), in the framework of statistical thermodynamics, that the entropy is proportional to the logarithm of the number of alternatives (or microscopic states) which are possible for a physical system knowing all the macroscopic ...
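The Boltzmann observation the abstract sketches is usually written in its modern form (with $W$ the number of accessible microstates and $k_B$ Boltzmann's constant):

```latex
S = k_B \ln W
```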