Distillation is the process of training one machine learning model (the 'student') to reproduce the behaviour of another (the 'teacher'). It may be used to create a student that is smaller, more efficient, or better at generalisation than its teacher.
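As a minimal sketch of one common variant (Hinton-style soft-target distillation, which this entry does not specify), the student can be trained to match the teacher's temperature-softened output distribution; the temperature value and example logits below are illustrative assumptions:

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; higher temperature softens the distribution.
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # KL divergence from the softened teacher distribution to the
    # softened student distribution; minimised when they match.
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    return float(np.sum(p_teacher * (np.log(p_teacher) - np.log(p_student))))

teacher = np.array([4.0, 1.0, 0.5])   # hypothetical teacher logits
student = np.array([3.0, 1.5, 0.2])   # hypothetical student logits
loss = distillation_loss(student, teacher)
```

In practice this soft-target term is usually combined with the ordinary cross-entropy loss on the true labels, and gradients flow only through the student.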
Used in glossary entries: machine learning