entropy

Terms from Artificial Intelligence: humans at the heart of algorithms

Entropy is a mathematical measure of the information in a system. For a single item of data with a number of possible values, each occurring with probability pᵢ, the entropy is defined as follows:
entropy = − Σᵢ pᵢ × log₂ pᵢ
This is at a maximum when all the pᵢ are equal, which, slightly paradoxically, can be seen as a state of maximum disorder, in the sense that no one value is preferred over another. The term originated in the study of thermodynamics in physics, but was adopted as an information measure by Shannon and Weaver in the late 1940s.
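To make the definition concrete, here is a minimal Python sketch of the formula above; the function name entropy and the sample distributions are illustrative choices, not from the text:

    import math

    def entropy(probabilities):
        # Shannon entropy in bits: -sum of p * log2(p), skipping zero-probability values
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    # Four equally likely values give the maximum possible entropy of 2 bits...
    print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0

    # ...while a skewed distribution over the same four values has lower entropy
    print(entropy([0.7, 0.1, 0.1, 0.1]))      # approximately 1.357

The two calls illustrate the point about maximum disorder: the uniform distribution, with no preferred value, yields the highest entropy, while the skewed one yields less.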

Used on pages 98, 101, 201

Also known as entropy measure, information entropy