GPT-3: The 175 Billion Parameter Leap
What Happened
OpenAI released GPT-3 with 175 billion parameters, over 100x larger than GPT-2. Without any fine-tuning, GPT-3 could write essays, code, and poetry, translate between languages, and answer questions through 'few-shot learning': picking up a task from just a few examples included in the prompt. The API launched in beta, enabling thousands of applications.
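Few-shot learning here means no weight updates: the task is specified entirely inside the prompt as worked examples, and the model completes the pattern. A minimal sketch of how such a prompt is assembled, assuming an illustrative English-to-French translation task (the helper name and example pairs are not from the source):

```python
# Few-shot prompting sketch: the model infers the task from a handful of
# in-prompt examples; no gradient updates or fine-tuning are involved.
# The task description and example pairs below are illustrative.
examples = [
    ("sea otter", "loutre de mer"),
    ("cheese", "fromage"),
    ("peppermint", "menthe poivree"),
]

def build_few_shot_prompt(examples, query):
    """Assemble a few-shot prompt: a task description, a few worked
    examples, then the new query left for the model to complete."""
    lines = ["Translate English to French."]
    for en, fr in examples:
        lines.append(f"English: {en}\nFrench: {fr}")
    # The final line ends at "French:" so the model's continuation
    # is the answer.
    lines.append(f"English: {query}\nFrench:")
    return "\n\n".join(lines)

prompt = build_few_shot_prompt(examples, "plush giraffe")
print(prompt)
```

The same prompt string would then be sent to the model as an ordinary completion request; swapping the examples swaps the task, which is what made a single frozen model usable for so many applications.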
Why It Mattered
A paradigm shift. GPT-3 showed that sheer scale could produce emergent capabilities that no one had predicted, and it spawned an entire ecosystem of AI startups building on the API. The 'scaling hypothesis', the idea that simply making models bigger would make them smarter, gained enormous credibility.
Part of The Transformer Era (2018–2021).