word2vec creates a vector representation of a word that captures certain aspects of the word's meaning. In its skip-gram form, it works by training a shallow neural network to predict the probability of the words surrounding a given word, and then extracting the weights of an internal layer of this model to form a latent space. The value of a word in this latent space is called a word vector. The geometry of the latent space often captures meaningful semantic relationships, so that, for example, the vector difference between 'dog' and 'puppy' is similar to that between 'cat' and 'kitten'. These word vectors can then be used in further steps of an NLP pipeline.
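
The workflow can be illustrated with the gensim library. The following is a minimal sketch, assuming gensim (version 4 or later) is installed; the toy corpus and hyperparameter values are illustrative only, and a corpus this small would not produce meaningful vectors in practice.

```python
from gensim.models import Word2Vec

# A tiny tokenized corpus for illustration; real applications train on
# millions of sentences.
sentences = [
    ["the", "dog", "chased", "the", "cat"],
    ["the", "puppy", "played", "with", "the", "kitten"],
    ["a", "dog", "is", "a", "grown", "puppy"],
    ["a", "cat", "is", "a", "grown", "kitten"],
]

# sg=1 selects the skip-gram architecture described above; window controls
# how many surrounding words the model tries to predict.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

# Each word now maps to a 50-dimensional word vector in the latent space.
vec_dog = model.wv["dog"]

# Analogy query: which word relates to 'cat' as 'puppy' relates to 'dog'?
# Computes puppy - dog + cat and returns the nearest words in the space.
print(model.wv.most_similar(positive=["puppy", "cat"], negative=["dog"]))
```

The `most_similar` call demonstrates the vector arithmetic mentioned above: because the offset between 'dog' and 'puppy' resembles the offset between 'cat' and 'kitten', adding that offset to 'cat' lands near 'kitten' in a well-trained latent space.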