accuracy

Accuracy measures how close the output or prediction of an algorithm is to the true value. For numerical results it is often measured using the mean square error (the average of the squared differences) or the mean absolute error. For classifications there are several different measures that matter, including precision and recall for binary decisions, so the word 'accuracy' can be ambiguous.
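
As a rough illustration, a minimal sketch in Python with made-up values shows how these measures are computed:

# Numerical predictions: mean square error and mean absolute error.
y_true = [3.0, 5.0, 2.5, 7.0]   # hypothetical true values
y_pred = [2.5, 5.0, 4.0, 8.0]   # hypothetical predictions

mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
mae = sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

# Binary classification: accuracy, precision and recall.
labels = [1, 0, 1, 1, 0, 1]     # true classes
preds  = [1, 0, 0, 1, 1, 1]     # predicted classes

tp = sum(1 for l, p in zip(labels, preds) if l == 1 and p == 1)
fp = sum(1 for l, p in zip(labels, preds) if l == 0 and p == 1)
fn = sum(1 for l, p in zip(labels, preds) if l == 1 and p == 0)

accuracy = sum(1 for l, p in zip(labels, preds) if l == p) / len(labels)
precision = tp / (tp + fp)      # fraction of predicted positives that are correct
recall = tp / (tp + fn)         # fraction of actual positives that are found

Note how the same predictions can score well on one measure and poorly on another, which is why the single word is ambiguous for classification tasks.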

Used on page 174