gradient descent

Terms from Artificial Intelligence: humans at the heart of algorithms


Gradient descent tries to find the lowest point in a fitness landscape by repeatedly moving in the direction with the fastest drop in value. It is equivalent to gradient ascent, but with the fitness function inverted.
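The idea can be sketched in a few lines of code. This is a minimal illustration, assuming a made-up fitness function f(x, y) = x² + y² whose lowest point is the origin; the function and the learning-rate value are examples, not taken from the text.

```python
def gradient(x, y):
    # Analytic gradient of f(x, y) = x**2 + y**2:
    # the direction of fastest increase in value.
    return (2 * x, 2 * y)

def gradient_descent(x, y, learning_rate=0.1, steps=100):
    # Repeatedly step AGAINST the gradient, i.e. in the
    # direction with the fastest drop in value.
    for _ in range(steps):
        gx, gy = gradient(x, y)
        x -= learning_rate * gx
        y -= learning_rate * gy
    return x, y

x, y = gradient_descent(3.0, 4.0)
```

Starting from (3, 4), the loop converges towards the minimum at (0, 0); negating the fitness function and stepping with the gradient instead would give the equivalent gradient ascent.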
Simulated annealing is a form of probabilistic gradient descent where the gradient determines the probability of following a direction, rather than the search always taking the very fastest route downwards. Backpropagation can also be viewed as a form of gradient descent, minimising the difference between the actual and expected outputs of the neural network, but performed incrementally for each training example in turn.
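The probabilistic element of simulated annealing can be sketched as follows. This is an illustrative one-dimensional version, assuming the common exp(-Δ/T) acceptance rule and a hypothetical fitness function f(x) = x²; the step size, cooling rate, and seed are arbitrary choices for the example.

```python
import math
import random

def simulated_annealing(f, x, temperature=1.0, cooling=0.99, steps=1000):
    # Downhill moves are always accepted; uphill moves are accepted
    # with probability exp(-delta / temperature), so early in the run
    # the search can climb out of local minima, while late in the run
    # (low temperature) it behaves like plain descent.
    random.seed(0)  # fixed seed so the sketch is repeatable
    for _ in range(steps):
        candidate = x + random.uniform(-0.5, 0.5)
        delta = f(candidate) - f(x)
        if delta < 0 or random.random() < math.exp(-delta / temperature):
            x = candidate
        temperature *= cooling  # gradually reduce willingness to go uphill
    return x

best = simulated_annealing(lambda x: x * x, 5.0)
```

Starting from x = 5, the run settles close to the minimum at 0 as the temperature falls and uphill moves become vanishingly unlikely.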

Used in Chap. 7: page 96; Chap. 9: pages 122, 123, 124; Chap. 12: page 183