1 Shannon's Measure of Information

https://doi.org/10.1016/S0076-5392(08)62730-7

Publisher Summary

This chapter discusses Shannon's measure of information. The concept of the entropy of an experiment, introduced by Shannon, is fundamental in information theory. A function assigns to every finite probability distribution a real number, called the Shannon entropy of the probability distribution of the experiment. The chapter describes a connection between the Shannon entropies of incomplete and of complete probability distributions that can be applied in passing from one to the other. There are several ways to introduce the Shannon entropy. Some treatments start from a problem of information theory in which the expression arises and call it entropy; others first introduce the expression and then show by example that it plays a distinguished role as a measure of information. Another way of introducing the Shannon entropy is to characterize it axiomatically, in terms of natural properties that are essential from the point of view of information theory. In the axiomatic treatment of measures of information, it is shown that only the entropy has these properly selected properties.
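The entropy the summary refers to is, in its standard form, H(p) = -Σ pᵢ log₂ pᵢ for a finite probability distribution (p₁, …, pₙ). As a minimal sketch (the function name and the choice of base-2 logarithm are illustrative assumptions, not taken from the chapter):

```python
from math import log2

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum(p_i * log2(p_i)) of a finite
    probability distribution p, in bits.

    Terms with p_i == 0 are skipped, following the usual convention
    that 0 * log 0 = 0.
    """
    return -sum(pi * log2(pi) for pi in p if pi > 0)

# A fair coin carries one bit of information:
print(shannon_entropy([0.5, 0.5]))   # → 1.0

# A certain outcome carries none:
print(shannon_entropy([1.0]))        # → 0.0
```

For an *incomplete* distribution (one with Σ pᵢ < 1), one common device for passing to a complete one is to renormalize the pᵢ by their sum; the chapter's precise connection between the two entropies should be taken from the text itself.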


