Entropy is a mathematical measure of the information in a system. For a single item of data that can take a number of possible values, each with probability pᵢ, the entropy is defined as follows:
entropy = − Σᵢ pᵢ log₂ pᵢ
This is at a maximum when all the pᵢ are equal, which, slightly paradoxically, can be seen as a state of maximum disorder, in the sense that there is no preference for one value over another. The term originated in the study of thermodynamics in physics, but was coined as an information measure by Shannon and Weaver in the late 1940s.
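As a small illustration (a minimal Python sketch of the formula above, not taken from the source; the coin probabilities are hypothetical example values), the entropy of a discrete distribution can be computed directly, and the equal-probability case gives the maximum for two outcomes:

    import math

    def entropy(probabilities):
        # Shannon entropy in bits: -sum(p * log2(p)), skipping zero-probability values
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    # A fair coin (all p_i equal) has the maximum entropy for two outcomes: 1 bit.
    print(entropy([0.5, 0.5]))   # 1.0
    # A biased coin is more predictable, so it carries less information.
    print(entropy([0.9, 0.1]))   # approximately 0.469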
Defined on page 206
Used on pages 99, 101, 188, 206, 522
Also known as entropy measure, information entropy
Links:
- Wikipedia: Entropy (information theory)