Abstract
Section 14.1 presents the definitions and key properties of information and entropy. Section 14.2 discusses the entropy of a (stationary) finite Markov chain. The Law of Large Numbers is proved for the amount of information contained in a message that is a long sequence of successive states of a Markov chain, and the asymptotic behaviour of the number of the most common states in a sequence of successive values of the chain is established. Applications of this result to coding are discussed.
The notions of the “amount of information” and “entropy” were introduced by C.E. Shannon in 1948. For some special situations the notion of amount of information had also been considered in earlier papers (e.g., by R.V.L. Hartley, 1928). The exposition in Sect. 14.2 of this chapter is substantially based on the paper of A.Ya. Khinchin [21].
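The chapter's central quantity, the entropy of a stationary finite Markov chain, has the well-known closed form H = −Σᵢ πᵢ Σⱼ pᵢⱼ log₂ pᵢⱼ, where π is the stationary distribution and P = (pᵢⱼ) the transition matrix. As a minimal illustration (not taken from the chapter itself; the two-state matrix and its stationary distribution below are chosen for the example), this can be computed directly:

```python
import math

def entropy_rate(P, pi):
    """Entropy rate H = -sum_i pi_i * sum_j p_ij * log2(p_ij)
    of a stationary finite Markov chain, in bits per step."""
    return -sum(
        pi[i] * sum(p * math.log2(p) for p in row if p > 0)
        for i, row in enumerate(P)
    )

# Example two-state chain (hypothetical numbers, chosen for illustration).
P = [[0.9, 0.1],
     [0.2, 0.8]]
# pi solves pi P = pi; for this P, pi = (2/3, 1/3).
pi = [2/3, 1/3]

print(entropy_rate(P, pi))
```

By the law of large numbers proved in Sect. 14.2, a long sequence of n successive states splits into roughly 2^{nH} "typical" sequences carrying almost all the probability, which is what makes H the relevant quantity for coding.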
Notes
1. See, e.g., [11].
References
Feinstein, A.: Foundations of Information Theory. McGraw-Hill, New York (1958)
Khinchin, A.Ya.: Ponyatie entropii v teorii veroyatnostei (The concept of entropy in probability theory). Usp. Mat. Nauk 8, 3–20 (1953) (in Russian)
Copyright information
© 2013 Springer-Verlag London
Cite this chapter
Borovkov, A.A. (2013). Information and Entropy. In: Probability Theory. Universitext. Springer, London. https://doi.org/10.1007/978-1-4471-5201-9_14
Publisher Name: Springer, London
Print ISBN: 978-1-4471-5200-2
Online ISBN: 978-1-4471-5201-9
eBook Packages: Mathematics and Statistics (R0)