Efficiency
2 milestones in AI history
Open Source · Generative AI Revolution
Mixtral 8x7B: Efficient Mixture of Experts
French startup Mistral AI released Mixtral 8x7B, a sparse mixture-of-experts model that matched or beat GPT-3.5 on standard benchmarks while routing each token through only two of its eight expert networks, so per-token compute is a fraction of that of a comparably capable dense model (sketched below). It demonstrated that clever architecture could compete with brute-force scaling.
Mistral AI
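To make the efficiency point concrete, here is a minimal PyTorch sketch of sparse top-2 routing in the style Mixtral popularized: a learned router selects two of eight expert feed-forward networks for each token, so per-token compute stays near a quarter of what running all experts would cost. The class name, layer dimensions, and the simple per-expert loop below are illustrative assumptions, not Mistral AI's implementation; only the 8-expert, top-2 routing pattern reflects Mixtral's published design.

    # Minimal sketch of sparse mixture-of-experts routing (8 experts, top-2),
    # loosely in the spirit of Mixtral 8x7B. Dimensions and names are illustrative.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class SparseMoELayer(nn.Module):
        def __init__(self, dim=512, hidden=2048, num_experts=8, top_k=2):
            super().__init__()
            self.top_k = top_k
            self.gate = nn.Linear(dim, num_experts, bias=False)  # router
            self.experts = nn.ModuleList(
                nn.Sequential(nn.Linear(dim, hidden), nn.SiLU(), nn.Linear(hidden, dim))
                for _ in range(num_experts)
            )

        def forward(self, x):  # x: (tokens, dim)
            scores = self.gate(x)                         # (tokens, num_experts)
            weights, idx = scores.topk(self.top_k, dim=-1)
            weights = F.softmax(weights, dim=-1)          # renormalize over chosen experts
            out = torch.zeros_like(x)
            # Each token visits only top_k experts, so per-token compute stays
            # near top_k/num_experts of an equally parameterized dense layer.
            for slot in range(self.top_k):
                for e, expert in enumerate(self.experts):
                    mask = idx[:, slot] == e
                    if mask.any():
                        out[mask] += weights[mask, slot, None] * expert(x[mask])
            return out

    layer = SparseMoELayer()
    tokens = torch.randn(16, 512)
    print(layer(tokens).shape)  # torch.Size([16, 512])

The key property is that total parameter count grows with the number of experts while per-token compute grows only with the number of experts actually activated.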
Open Source · The Agentic Era
DeepSeek R1: Open-Source Reasoning
Chinese AI lab DeepSeek released R1, a reasoning model with openly released weights that approached OpenAI's o1-class performance at a fraction of the cost. Trained on a reportedly modest compute budget, it challenged the assumption that frontier reasoning required the massive investment programs of the largest Western labs.
DeepSeek