BERT: Bidirectional Language Understanding
What Happened
In October 2018, Google researchers published BERT (Bidirectional Encoder Representations of Transformers), a model that conditions each word's representation on context from both its left and its right simultaneously. BERT set new state-of-the-art results on eleven NLP benchmarks, including GLUE and SQuAD. Google later integrated it into Search, where it initially affected about one in ten English-language queries in the US.
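To make "bidirectional" concrete: BERT is pre-trained as a masked language model, predicting a hidden token from the words on both sides of it. Below is a minimal sketch using the Hugging Face transformers library; the library choice and the bert-base-uncased checkpoint are illustrative assumptions, not details from this page.

```python
# A minimal sketch of BERT's masked-language-model pre-training objective,
# using the Hugging Face `transformers` library. The library and the
# `bert-base-uncased` checkpoint are illustrative assumptions.
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

# Mask one token; BERT predicts it from the words on BOTH sides,
# which is what "bidirectional" means here.
inputs = tokenizer("The capital of France is [MASK].", return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the [MASK] position and decode the model's top guess.
mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
predicted_id = logits[0, mask_pos].argmax(dim=-1)
print(tokenizer.decode(predicted_id))  # "paris"
```

Unlike a left-to-right language model, the prediction here can draw on "is" before the mask and the sentence-final period and surrounding structure after it, at the same time.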
Why It Mattered
BERT transformed NLP almost overnight. Pre-training on unlabeled text followed by task-specific fine-tuning became the dominant paradigm, and BERT showed that transformer-based language models could capture context with a depth that reset expectations for search and language understanding.
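The paradigm in practice: the pre-trained weights are reused wholesale, and only a small task head is trained on labeled data. Here is a hedged sketch, assuming binary sentiment classification as the downstream task; the task, example text, label scheme, and hyperparameters are illustrative, not taken from the page.

```python
# A hedged sketch of the pre-train-plus-fine-tune recipe. The downstream
# task (binary sentiment), example text, and hyperparameters are assumed
# for illustration; only the overall pattern reflects BERT's paradigm.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
# Reuse the pre-trained encoder wholesale; a small, randomly initialized
# classification head is the only new component.
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

# One toy gradient step on a single labeled example.
batch = tokenizer(["a delightful, sharply written film"], return_tensors="pt")
labels = torch.tensor([1])  # 1 = positive (assumed label scheme)

model.train()
loss = model(**batch, labels=labels).loss
loss.backward()
optimizer.step()
optimizer.zero_grad()
```

In the original paper, essentially this recipe, changing little beyond the output layer per task, produced the state-of-the-art results noted above.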
Key People
Jacob Devlin, Ming-Wei Chang, Kenton Lee, Kristina Toutanova (authors of the BERT paper)
Organizations
Google (Google AI Language)
Part of The Transformer Era (2018–2021)