eigenvector

Terms from Artificial Intelligence: humans at the heart of algorithms


An eigenvector of a square matrix is a vector which, when multiplied by the matrix, ends up as a multiple of itself. In other words, the eigenvector is unchanged except for scale. Mathematically, if M is the matrix and v an eigenvector, Mv = λv. The scale factor λ is known as the eigenvalue of the eigenvector. Eigenvectors have many uses, both directly within algorithms and as a way to explain other processes. As an example of the former, if a cross-correlation matrix is calculated for a dataset and the eigenvectors with the largest eigenvalues are computed, these correspond to the directions in which the data has greatest variation: a process called principal component analysis. As an example of the latter (explaining another process), Google's PageRank algorithm chases links, spreading a measure of importance from page to page; this effectively computes the principal eigenvector (the one with the largest eigenvalue) of the web graph represented as a connection matrix.
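The definition and both uses above can be sketched in a few lines of NumPy. The matrix M below is a small illustrative example (not taken from the text): the code checks the defining property Mv = λv, and then uses power iteration, the repeated-multiplication idea underlying PageRank, to converge on the principal eigenvector.

```python
import numpy as np

# A small symmetric matrix, standing in for a cross-correlation
# or connection matrix (chosen for illustration only).
M = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and the eigenvectors
# (the eigenvectors are the columns of `vecs`).
vals, vecs = np.linalg.eig(M)

# Check the defining property M v = λ v for each eigenpair.
for lam, v in zip(vals, vecs.T):
    assert np.allclose(M @ v, lam * v)

# Power iteration: repeatedly multiplying a vector by M (and
# renormalising) converges on the principal eigenvector, the one
# with the largest eigenvalue -- the computation PageRank
# effectively performs on the web's connection matrix.
v = np.ones(2)
for _ in range(50):
    v = M @ v
    v = v / np.linalg.norm(v)

principal = vecs[:, np.argmax(vals)]
# Power iteration agrees with the principal eigenvector
# (up to an arbitrary choice of sign).
assert np.allclose(np.abs(v), np.abs(principal))
```

For principal component analysis, the same `np.linalg.eig` call on a data set's correlation matrix would yield the principal directions, with each eigenvalue measuring the variance along its eigenvector.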

Used in Chap. 7: pages 91, 92, 100, 101; Chap. 8: page 107

See also eigenvalue