mixture-of-experts

A mixture of experts model has a series of sub-models, each with expertise in a particular area, linked by a more general-purpose part that activates the relevant experts when required. It can be seen as a form of {{ensemble method}} and is used in the DeepSeek large language model to increase computational efficiency.
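
As a rough illustration (a minimal NumPy sketch, not DeepSeek's actual architecture; names such as MixtureOfExperts, gate and top_k are made up for the example), a gating network scores the experts for each input, only the highest-scoring experts are evaluated, and their outputs are combined using the gate's weights:

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over the last axis."""
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

class MixtureOfExperts:
    """Toy MoE layer: a gate routes each input to its top-k experts and
    combines their outputs, weighted by the gate's renormalised scores."""

    def __init__(self, dim_in, dim_out, n_experts=4, top_k=2, seed=0):
        rng = np.random.default_rng(seed)
        # Each "expert" here is just a linear map; real experts are sub-networks.
        self.experts = [rng.normal(scale=0.1, size=(dim_in, dim_out))
                        for _ in range(n_experts)]
        # The gate maps the input to one score per expert.
        self.gate = rng.normal(scale=0.1, size=(dim_in, n_experts))
        self.top_k = top_k

    def forward(self, x):
        scores = softmax(x @ self.gate)                      # (batch, n_experts)
        top = np.argsort(scores, axis=-1)[:, -self.top_k:]   # top-k expert indices
        out = np.zeros((x.shape[0], self.experts[0].shape[1]))
        for b in range(x.shape[0]):
            chosen = top[b]
            weights = scores[b, chosen]
            weights = weights / weights.sum()                # renormalise over chosen experts
            # Only the chosen experts are evaluated for this input.
            for w, e in zip(weights, chosen):
                out[b] += w * (x[b] @ self.experts[e])
        return out

moe = MixtureOfExperts(dim_in=8, dim_out=4)
print(moe.forward(np.random.default_rng(1).normal(size=(3, 8))).shape)  # (3, 4)
```

Because only top_k of the experts are run for each input, the computation per input grows with k rather than with the total number of experts, which is where the efficiency gain comes from.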

Used in Chap. 16: page 267; Chap. 23: page 393

Also known as MoE