Distillation is the process of training one machine learning model (the 'student') from another (the 'teacher'). It may be used to create a student that is smaller, more efficient, or better at generalisation than its parent.
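One common way to carry this out is soft-target distillation: the student is trained to match the teacher's temperature-softened output distribution rather than hard labels. The sketch below is a minimal, self-contained illustration of that loss; the function names and the example logits are illustrative assumptions, not part of any particular library.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature > 1 softens the distribution, exposing the teacher's
    # relative confidence across non-target classes.
    exps = [math.exp(l / temperature) for l in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # KL divergence between the softened teacher and student distributions;
    # minimising this pushes the student toward the teacher's behaviour.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Hypothetical logits for a three-class problem.
teacher = [4.0, 1.0, 0.2]
student = [0.5, 0.4, 0.1]

print(distillation_loss(teacher, teacher))   # a student matching the teacher incurs zero loss
print(distillation_loss(student, teacher))   # a mismatched student incurs a positive loss
```

In practice this soft-target term is usually combined with an ordinary cross-entropy loss on the true labels, weighted by a mixing coefficient.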