MoE (Mixture of Experts)
Robert A. Jacobs, 1991

Machine learning architecture that uses multiple specialist models (experts) to handle different regions of the input space, coordinated by a gating network that decides which expert, or weighted combination of experts, to apply to each input.
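
A minimal sketch of the idea in Python, assuming linear experts and a dense softmax gate (the shapes, names, and gating scheme below are illustrative, not the original 1991 formulation):

import numpy as np

rng = np.random.default_rng(0)
n_experts, d_in, d_out = 4, 8, 1

# Each expert is a simple linear model: one (d_in, d_out) weight matrix.
expert_weights = rng.normal(size=(n_experts, d_in, d_out))

# The gate maps an input to one score per expert.
gate_weights = rng.normal(size=(d_in, n_experts))

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def moe_forward(x):
    # Weight each expert's output by the gate's probability for that expert.
    gate_probs = softmax(x @ gate_weights)                     # (batch, n_experts)
    expert_outs = np.einsum("bi,eio->beo", x, expert_weights)  # (batch, n_experts, d_out)
    return np.einsum("be,beo->bo", gate_probs, expert_outs)    # (batch, d_out)

x = rng.normal(size=(2, d_in))
print(moe_forward(x).shape)  # (2, 1)

In practice the experts and the gate are trained jointly, and large-scale variants keep computation sparse by routing each input to only the top-scoring experts.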

Generality: 705