A mixture of experts model has a series of sub-models, each with expertise in a particular area, linked by a more general-purpose part that activates the relevant experts when required. It can be seen as a form of {ensemble method} and is used in the DeepSeek large language model to increase computational efficiency.
Used in Chap. 16: page 267; Chap. 23: page 393
Also known as MoE
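
Below is a minimal sketch of the idea in Python with NumPy: a gating step scores a set of simple "experts", selects the top-scoring few, and combines only their outputs. The expert count, the use of linear maps as experts, the top-k value, and the function names are illustrative assumptions, not details of DeepSeek or any particular implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

d_model, n_experts, top_k = 8, 4, 2  # illustrative sizes, not from any real model

# Each "expert" is a sub-model with its own specialization; here, just a linear map.
experts = [rng.normal(size=(d_model, d_model)) for _ in range(n_experts)]

# The general-purpose gating part scores how relevant each expert is to the input.
gate_weights = rng.normal(size=(d_model, n_experts))

def moe_forward(x):
    """Route input x through the top-k experts, weighted by normalized gate scores."""
    scores = x @ gate_weights                  # one relevance score per expert
    top = np.argsort(scores)[-top_k:]          # indices of the best-scoring experts
    weights = np.exp(scores[top])
    weights /= weights.sum()                   # softmax over the selected experts only
    # Only the selected experts are evaluated, which is where the computational
    # saving comes from compared with running every sub-model.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

x = rng.normal(size=d_model)
print(moe_forward(x))
```

In a large language model the same pattern is applied per token inside each MoE layer, so each token only pays the cost of the few experts the gate selects rather than the whole set.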