variance

Variance is a statistical term that measures the mean squared difference between a value and its expected value (often its mean, μ). It is usually written σ², and it is the square of the standard deviation (σ), which is often a more meaningful measure, albeit less useful for calculations. For a simple series of N data items (xᵢ) the variance is calculated by
      σ² = (1/N) Σᵢ (xᵢ − μ)²

However, for more complicated models the expected value for each data item may be different.
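
As an illustration only (not part of the original entry), the following minimal Python sketch computes the variance and standard deviation of a simple series of data items using the formula above; the function name and sample data are assumptions.

    import math

    def variance(xs):
        """Population variance: the mean squared difference from the mean."""
        mu = sum(xs) / len(xs)                          # expected value (here, the mean)
        return sum((x - mu) ** 2 for x in xs) / len(xs)

    data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]     # hypothetical data series
    var = variance(data)                                # sigma squared
    sigma = math.sqrt(var)                              # standard deviation
    print(f"variance = {var}, standard deviation = {sigma}")

For this sample series the mean is 5, so the variance is 4 and the standard deviation is 2.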

Defined on page 139

Used on pages 139, 141