Mixtral 8x7B: Efficient Mixture of Experts
In December 2023, French startup Mistral AI released Mixtral 8x7B under the Apache 2.0 license: a sparse mixture-of-experts model that matched or beat GPT-3.5 on most benchmarks while activating only 2 of its 8 experts per token, roughly 13B of its 47B total parameters. It demonstrated that clever architecture could compete with brute-force scaling.
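The per-token efficiency comes from sparse routing: a small gating network scores the experts for each token, only the top-scoring experts actually run, and their outputs are combined with the (renormalized) gate weights. Below is a minimal PyTorch-style sketch of top-2 routing; the class name, dimensions, and expert structure are illustrative assumptions, not Mistral's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoELayer(nn.Module):
    """Toy sparse mixture-of-experts feed-forward layer with top-k routing."""

    def __init__(self, dim=64, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(dim, num_experts, bias=False)  # router
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.SiLU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        )

    def forward(self, x):                          # x: (tokens, dim)
        logits = self.gate(x)                      # (tokens, num_experts)
        weights, idx = logits.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)       # renormalize over the chosen experts
        out = torch.zeros_like(x)
        # Only the selected experts run for each token; the others are skipped,
        # so per-token compute stays far below the total parameter count.
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e
                if mask.any():
                    out[mask] += weights[mask, k].unsqueeze(-1) * expert(x[mask])
        return out

layer = SparseMoELayer()
tokens = torch.randn(4, 64)
print(layer(tokens).shape)  # torch.Size([4, 64])
```

In Mixtral, each transformer block replaces its dense feed-forward layer with a routed block along these lines, which is why the total parameter count (~47B) far exceeds the parameters touched by any single token (~13B).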