mixture-of-experts


A mixture of experts model has a series of sub-models, each with expertise in a particular area, linked by a more general-purpose component that activates the relevant experts when required. It can be seen as a form of {ensemble method} and is used in the DeepSeek large language model to increase computational efficiency.
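
The sketch below illustrates the idea in plain Python/NumPy: a gating network scores the experts for an input and only the top-k experts are evaluated, which is where the computational saving comes from. All dimensions, weight names, and the top-k routing rule are illustrative assumptions, not taken from any particular model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (hypothetical, chosen only for the demo).
d_model, d_hidden, n_experts, top_k = 8, 16, 4, 2

# Each "expert" is a small feed-forward sub-model with its own weights.
expert_w1 = rng.standard_normal((n_experts, d_model, d_hidden)) * 0.1
expert_w2 = rng.standard_normal((n_experts, d_hidden, d_model)) * 0.1

# The gating (router) network is the general-purpose part that scores experts.
gate_w = rng.standard_normal((d_model, n_experts)) * 0.1


def moe_layer(x):
    """Route input x through only the top-k experts chosen by the gate."""
    scores = x @ gate_w                           # one score per expert
    chosen = np.argsort(scores)[-top_k:]          # indices of the k highest-scoring experts
    weights = np.exp(scores[chosen])
    weights /= weights.sum()                      # softmax over the chosen experts only
    out = np.zeros_like(x)
    for w, e in zip(weights, chosen):
        hidden = np.maximum(x @ expert_w1[e], 0)  # the expert's own feed-forward pass
        out += w * (hidden @ expert_w2[e])        # weighted combination of expert outputs
    return out


x = rng.standard_normal(d_model)
print(moe_layer(x).shape)  # (8,) -- only top_k of the n_experts were evaluated
```

Because only the selected experts run for a given input, the cost per input grows with top_k rather than with the total number of experts, even though the full model may hold many more parameters.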

Used in Chap. 16: page 248; Chap. 23: page 370

Also known as MoE