Transformer
Two milestones in AI history
Research · Deep Learning Breakthrough
Attention Is All You Need: The Transformer
Eight researchers at Google published 'Attention Is All You Need,' introducing the Transformer architecture. It replaced recurrence with self-attention mechanisms that could process entire sequences in parallel. The paper's title was deliberately bold — and proved prescient.
Ashish Vaswani, Noam Shazeer · Google Brain, Google Research
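The self-attention mechanism the paper introduced can be sketched in a few lines. This is a minimal, single-head illustration of scaled dot-product attention using NumPy; the function and variable names (`self_attention`, `X`, `Wq`, `Wk`, `Wv`) are illustrative, not from the paper's code, and real implementations add multiple heads, masking, and learned projections inside a full network.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention (illustrative sketch).

    X: (seq_len, d_model) token embeddings; Wq/Wk/Wv: (d_model, d_k) projections.
    Every position attends to every other position at once -- this is the
    parallelism that let the Transformer drop recurrence.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = K.shape[-1]
    # Similarity of each query to every key, scaled to keep softmax gradients stable.
    scores = (Q @ K.T) / np.sqrt(d_k)
    # Row-wise softmax: each row is a probability distribution over positions.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output is a weighted mix of all value vectors.
    return weights @ V

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))        # 4 tokens, embedding dim 8
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8): one context-mixed vector per token
```

Because the attention weights for all positions are computed in one matrix product, the whole sequence is processed in parallel, unlike an RNN that must step through tokens one at a time.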
Research · The Transformer Era
BERT: Bidirectional Language Understanding
Google published BERT (Bidirectional Encoder Representations from Transformers), a model that reads context from both the left and the right of each word simultaneously. BERT set new state-of-the-art results on eleven NLP benchmarks. Google later integrated it into Search, where it initially affected about 10% of English queries.
Jacob Devlin · Google AI