variance

Variance is a statistical term that measures the mean squared difference between a value and its expected value (often its mean, μ). It is usually written σ², and it is the square of the standard deviation (σ), which is often a more meaningful measure, albeit less useful for calculations. For a simple series of n data items (xi) the variance is calculated by
      σ² = (1/n) Σi (xi − μ)²

However, for more complicated models the expected value for each data item may be different.
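As a concrete illustration (not part of the book's text), the calculation for a plain list of numbers can be sketched in a few lines of Python; the function name and the sample data below are invented for the example.

    def variance(xs):
        """Population variance: mean squared deviation from the mean."""
        mu = sum(xs) / len(xs)                       # expected value (here, the mean)
        return sum((x - mu) ** 2 for x in xs) / len(xs)

    data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
    print(variance(data))           # 4.0  (the variance, σ²)
    print(variance(data) ** 0.5)    # 2.0  (the standard deviation, σ)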

Defined on page 139

Used in Chap. 7: pages 139, 140, 141, 142; Chap. 10: page 207