1. "Expert Choice" Mixture of Experts (MoE)
Frontier AI Drawings by Hand ✍️
The first set of drawings introduces Expert-Choice Mixture of Experts (MoE) and contrasts it with traditional token-choice MoE routing. You’ll see why token-choice routing leads to load imbalance and how Expert-Choice allows each expert to select its top tokens, keeping computation balanced and capacity fully used. This routing strategy was introduced by researchers at Google.
Q: Why Expert Choice routing?
A: Because traditional MoE (Token Choice) suffers from load imbalance—some experts get overloaded with tokens while others stay idle—wasting capacity.
Q: How does Expert Choice fix this?
A: By letting each expert select its top-k tokens, every expert processes a fixed number of tokens per batch, so no expert is overloaded and computation stays balanced.
Q: Who invented Expert Choice routing?
A: Researchers at Google, in "Mixture-of-Experts with Expert Choice Routing" (Zhou et al., 2022).
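To make the contrast concrete, here is a minimal NumPy sketch of the two routing strategies. All shapes and names (`n_tokens`, `capacity`, the random router scores) are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
n_tokens, n_experts = 8, 4
capacity = 2  # tokens each expert selects (illustrative value)

# Router scores: each token's affinity for each expert (softmax over experts).
logits = rng.normal(size=(n_tokens, n_experts))
scores = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

# Token-choice (traditional): each token picks its top expert.
# Load per expert can be badly unbalanced.
token_choice = scores.argmax(axis=1)              # (n_tokens,)
token_load = np.bincount(token_choice, minlength=n_experts)

# Expert-choice: each expert picks its top-`capacity` tokens.
# Every expert processes exactly `capacity` tokens, so compute is
# balanced by construction.
expert_choice = np.argsort(-scores, axis=0)[:capacity]  # (capacity, n_experts)
expert_load = np.full(n_experts, capacity)

print("token-choice load per expert: ", token_load)
print("expert-choice load per expert:", expert_load)
```

Note the trade-off: under expert choice a token may be picked by several experts, or by none, but no expert ever exceeds its capacity.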
Drawings
The Frontier AI drawings are available to AI by Hand Academy members. You can become a member via a paid Substack subscription.