word2vec

Terms from Artificial Intelligence: humans at the heart of algorithms

word2vec creates a vector representation of a word that captures certain aspects of the word's meaning. It works by first training a 'Skip-gram model' that predicts the likelihood of surrounding words, and then extracting an internal layer of this model to form a latent space. The value of a word in this latent space is called a word vector. The geometry of the latent space often captures crucial relationships, so that, for example, the vector difference between 'dog' and 'puppy' is similar to that between 'cat' and 'kitten'. These word vectors can be used in further steps of an NLP pipeline.
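The analogy property can be sketched with made-up toy vectors; real word2vec vectors have hundreds of learned dimensions, but the arithmetic is the same:

```python
# Toy 3-dimensional word vectors (values invented for illustration;
# real word2vec vectors are learned from a large corpus and typically
# have 100-300 dimensions).
vectors = {
    "dog":    [0.9, 0.1, 0.3],
    "puppy":  [0.9, 0.1, 0.8],
    "cat":    [0.1, 0.9, 0.3],
    "kitten": [0.1, 0.9, 0.8],
}

def vec_diff(a, b):
    """Component-wise difference of two word vectors."""
    return [round(x - y, 10) for x, y in zip(a, b)]

# The 'puppy' minus 'dog' offset matches the 'kitten' minus 'cat'
# offset: in this toy space the third dimension encodes the
# adult/young relationship.
offset_dog = vec_diff(vectors["puppy"], vectors["dog"])
offset_cat = vec_diff(vectors["kitten"], vectors["cat"])
print(offset_dog)  # [0.0, 0.0, 0.5]
print(offset_cat)  # [0.0, 0.0, 0.5]
```

In a trained model the same idea is used for analogy queries, e.g. finding the word whose vector is closest to vec('dog') - vec('puppy') + vec('kitten').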

Used on pages 314, 528