Scaling
Research · The Transformer Era
GPT-3: The 175 Billion Parameter Leap
OpenAI released GPT-3 with 175 billion parameters — more than 100x larger than GPT-2. Without any fine-tuning, GPT-3 could write essays, code, and poetry, translate languages, and answer questions through "few-shot learning": inferring a task from just a few examples given in the prompt. The API launched in beta, enabling thousands of applications.
Tom Brown · OpenAI
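The few-shot learning described above works purely in-context: demonstrations are placed in the prompt and the model continues the pattern, with no weight updates. A minimal sketch of how such a prompt is assembled (the function name and the translation task are illustrative, not from the source):

```python
def build_few_shot_prompt(examples, query):
    """Assemble a few-shot prompt: demonstration pairs, then the new query.

    `examples` is a list of (input, output) pairs shown in-context; the model
    is expected to infer the task from the pattern and complete the last line.
    """
    blocks = [f"Input: {inp}\nOutput: {out}" for inp, out in examples]
    blocks.append(f"Input: {query}\nOutput:")
    return "\n\n".join(blocks)

# Hypothetical English-to-French task: three demonstrations, one new input.
prompt = build_few_shot_prompt(
    [("cat", "chat"), ("dog", "chien"), ("house", "maison")],
    "bread",
)
print(prompt)
```

The resulting string would be sent as-is to the model; with enough scale, GPT-3 completes the final `Output:` line in the demonstrated style.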
Research · Generative AI Revolution
Gemini 1.5 Pro: Million-Token Context
Google released Gemini 1.5 Pro with a 1 million token context window (later extended to 2M) — able to process entire codebases, books, or hours of video in a single prompt. In needle-in-a-haystack retrieval tests, it achieved near-perfect recall across millions of tokens.
Google DeepMind