accuracy

Terms from Artificial Intelligence: humans at the heart of algorithms

Page numbers refer to the draft copy at present; they will be replaced with the correct numbers when the final book is formatted. Chapter numbers are correct and will not change.

Accuracy is the difference between the output or prediction of an algorithm and the true value. For numerical results it is often measured using the mean square error (the average of the squared differences) or the mean absolute difference. For classifications there are several different kinds of accuracy measures that are important, including precision and recall for binary decisions, so the word 'accuracy' can be ambiguous.
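
As a minimal sketch of the measures mentioned above (the function names and example data are illustrative, not taken from the book):

    def mean_squared_error(true_values, predictions):
        # Average of the squared differences between predictions and true values.
        return sum((t - p) ** 2 for t, p in zip(true_values, predictions)) / len(true_values)

    def mean_absolute_error(true_values, predictions):
        # Average of the absolute differences.
        return sum(abs(t - p) for t, p in zip(true_values, predictions)) / len(true_values)

    def precision_recall(true_labels, predicted_labels):
        # Precision and recall for binary (True/False) decisions.
        tp = sum(1 for t, p in zip(true_labels, predicted_labels) if p and t)
        fp = sum(1 for t, p in zip(true_labels, predicted_labels) if p and not t)
        fn = sum(1 for t, p in zip(true_labels, predicted_labels) if not p and t)
        precision = tp / (tp + fp) if (tp + fp) else 0.0
        recall = tp / (tp + fn) if (tp + fn) else 0.0
        return precision, recall

    # Example usage
    print(mean_squared_error([1.0, 2.0, 3.0], [1.1, 1.9, 3.2]))                        # 0.02
    print(precision_recall([True, False, True, True], [True, True, False, True]))      # (0.667, 0.667)

The example illustrates why the word can be ambiguous: the same predictions give different 'accuracy' figures depending on which measure is chosen.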

Used on Chap. 8: page 157; Chap. 9: pages 178, 180, 187; Chap. 12: page 270; Chap. 14: page 329; Chap. 15: pages 355, 363; Chap. 18: pages 447, 450; Chap. 19: pages 465, 469, 472, 473, 476; Chap. 20: pages 493, 502; Chap. 21: page 516; Chap. 23: pages 565, 571, 572