accuracy

Terms from Artificial Intelligence: humans at the heart of algorithms

Accuracy is the difference between the output or prediction of an algorithm and the true value. For numerical results it is often measured using the mean square error (the average of the squared differences) or the mean absolute difference. For classifications there are several different kinds of accuracy measures that are important, including precision and recall for binary decisions, so the word 'accuracy' can be ambiguous.
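As an illustrative sketch (not from the book), the measures above can be computed directly; the function names here are my own:

```python
# Common accuracy measures for numeric predictions and binary decisions.

def mean_square_error(true_vals, preds):
    """Average of the squared differences between truth and prediction."""
    return sum((t - p) ** 2 for t, p in zip(true_vals, preds)) / len(true_vals)

def mean_absolute_error(true_vals, preds):
    """Average of the absolute differences between truth and prediction."""
    return sum(abs(t - p) for t, p in zip(true_vals, preds)) / len(true_vals)

def precision_recall(true_labels, pred_labels):
    """Precision and recall for binary (True/False) decisions.

    Precision: of the items predicted True, how many really are True.
    Recall: of the items that really are True, how many were predicted True.
    """
    tp = sum(1 for t, p in zip(true_labels, pred_labels) if p and t)
    fp = sum(1 for t, p in zip(true_labels, pred_labels) if p and not t)
    fn = sum(1 for t, p in zip(true_labels, pred_labels) if not p and t)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall
```

Note that precision and recall can disagree sharply: a classifier that predicts True for everything has perfect recall but poor precision, which is why a single 'accuracy' number can mislead.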

Used in Chap. 8: page 105; Chap. 9: pages 119, 124; Chap. 12: page 178; Chap. 14: page 212; Chap. 15: pages 228, 233; Chap. 18: pages 288, 290; Chap. 19: pages 299, 302, 304, 306; Chap. 20: pages 317, 323; Chap. 21: page 332; Chap. 23: pages 364, 368