entropy

Terms from Artificial Intelligence: humans at the heart of algorithms

Page numbers refer to the draft copy at present; they will be replaced with the correct numbers when the final book is formatted. Chapter numbers are correct and will not change.

Entropy is a mathematical measure of the information in a system. For a single item of data with a number of possible values, each with probability pᵢ, the entropy is defined as follows:
    entropy = − Σᵢ pᵢ × log₂ pᵢ
This is at a maximum when all the pᵢ are equal, which, slightly paradoxically, can be seen as a state of maximum disorder, in the sense that there is no preference for one value over another. The term originated in the study of thermodynamics in physics, but was adopted as an information measure by Shannon and Weaver in the late 1940s.
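
As a quick illustration (a minimal sketch in Python; the function name and the example distributions are assumptions for illustration, not taken from the book), the formula can be computed directly from a list of probabilities, and the uniform distribution gives the largest value:

    import math

    def entropy(probabilities):
        # Shannon entropy, in bits, of a discrete probability distribution.
        # Terms with p = 0 contribute nothing (the limit of p * log p is 0).
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    print(entropy([0.25, 0.25, 0.25, 0.25]))  # uniform over 4 values: 2.0 bits (the maximum)
    print(entropy([0.7, 0.1, 0.1, 0.1]))      # skewed distribution: about 1.36 bits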

Defined on page 206

Used in Chap. 5: pages 98, 101; Chap. 9: page 188; Chap. 10: page 206; Chap. 21: page 522

Also known as entropy measure, information entropy