Distillation is the process of training one machine learning model (the 'student') to reproduce the behaviour of another (the 'teacher'). It may be used to create a student that is smaller, more efficient, or better at generalisation than its parent.
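A minimal sketch of the common 'soft target' formulation of distillation (Hinton et al., 2015), assuming a classification setting in PyTorch: the student matches the teacher's temperature-softened output distribution while also learning from the true labels. The `temperature` and `alpha` values here are illustrative, not prescribed.

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    # Soften both output distributions with the temperature, then measure
    # how far the student's predictions are from the teacher's (KL divergence).
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    soft_loss = F.kl_div(soft_student, soft_teacher,
                         reduction="batchmean") * temperature ** 2
    # The student is also trained on the true labels as usual.
    hard_loss = F.cross_entropy(student_logits, labels)
    # alpha balances imitating the teacher against the ordinary objective.
    return alpha * soft_loss + (1 - alpha) * hard_loss
```

Higher temperatures flatten the teacher's distribution, exposing its relative confidence across wrong classes, which is where much of the transferable signal lives.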