The entropy function $H(X)$ is a measure of the uncertainty of X, in formula:

$$H(X) = -\sum_{a} p_X(a)\,\log_2 p_X(a),$$
where $p_X(a) = \Pr[X = a]$ denotes the probability that the random variable X takes on the value a. The interpretation is that with probability $p_X(a)$, X can be described by $-\log_2 p_X(a)$ bits of information.
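For illustration, a fair coin has entropy $-\frac{1}{2}\log_2\frac{1}{2} - \frac{1}{2}\log_2\frac{1}{2} = 1$ bit, whereas a biased bit with $p_X(1) = 1/4$ (an arbitrary value chosen for this example) has

$$H(X) = -\tfrac{1}{4}\log_2\tfrac{1}{4} - \tfrac{3}{4}\log_2\tfrac{3}{4} \approx 0.811 \text{ bits},$$

so the biased source is less uncertain, hence more predictable, than the fair one.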
The conditional entropy or equivocation (Shannon 1948) $H(X \mid Y)$ denotes the uncertainty of X provided Y is known:

$$H(X \mid Y) = -\sum_{a,b} p_{X,Y}(a,b)\,\log_2 p_{X \mid Y}(a \mid b),$$
where $p_{X,Y}(a,b) \stackrel{\text{def}}{=} \Pr[(X = a) \wedge (Y = b)]$ and $p_{X \mid Y}(a \mid b)$ obeys Bayes' rule for conditional probabilities:

$$p_{X \mid Y}(a \mid b) = \frac{p_{X,Y}(a,b)}{p_Y(b)}, \qquad p_Y(b) = \sum_a p_{X,Y}(a,b).$$
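These definitions are straightforward to evaluate numerically. The following Python sketch computes $H(X)$ and $H(X \mid Y)$ for a small joint distribution; the distribution and variable names are arbitrary illustrative choices.

    import math

    # Joint distribution p_{X,Y}(a, b); an arbitrary example summing to 1.
    p_xy = {
        ("x0", "y0"): 0.25, ("x0", "y1"): 0.15,
        ("x1", "y0"): 0.40, ("x1", "y1"): 0.20,
    }

    # Marginals: p_X(a) = sum_b p_{X,Y}(a, b), p_Y(b) = sum_a p_{X,Y}(a, b).
    p_x, p_y = {}, {}
    for (a, b), p in p_xy.items():
        p_x[a] = p_x.get(a, 0.0) + p
        p_y[b] = p_y.get(b, 0.0) + p

    # H(X) = -sum_a p_X(a) log2 p_X(a).
    h_x = -sum(p * math.log2(p) for p in p_x.values() if p > 0)

    # H(X|Y) = -sum_{a,b} p_{X,Y}(a, b) log2 p_{X|Y}(a|b),
    # with p_{X|Y}(a|b) = p_{X,Y}(a, b) / p_Y(b) as above.
    h_x_given_y = -sum(p * math.log2(p / p_y[b])
                       for (_, b), p in p_xy.items() if p > 0)

    print(f"H(X)   = {h_x:.4f} bits")          # approx. 0.9710
    print(f"H(X|Y) = {h_x_given_y:.4f} bits")  # approx. 0.9696

As expected, $H(X \mid Y) \le H(X)$: on average, knowledge of Y can never increase the uncertainty about X.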
Recommended Reading
Bauer FL (1997) Decrypted secrets: methods and maxims of cryptology. Springer, Berlin
McEliece RJ (1977) The theory of information and coding. Encyclopedia of mathematics and its applications, vol 3. Addison-Wesley, Reading
Shannon CE (1948) A mathematical theory of communication. Bell Syst Tech J 27:379–423 (July), 623–656 (October). https://people.math.harvard.edu/~ctm/home/text/others/shannon/entropy/entropy.pdf