Pre-Training
2 milestones in AI history
Research · The Transformer Era
BERT: Bidirectional Language Understanding
In 2018, Google published BERT (Bidirectional Encoder Representations from Transformers), a model that reads context from both the left and the right of a word simultaneously, unlike earlier left-to-right models. BERT shattered records on 11 NLP benchmarks, and Google later integrated it into Search, where it affected roughly 10% of queries.
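The value of bidirectional context can be sketched with a toy bigram-count model. Everything here (the corpus, the `fill_mask` helper) is a hypothetical illustration, not BERT's actual masked-language-model training:

```python
from collections import Counter

# Toy corpus standing in for pre-training text (hypothetical example).
corpus = [
    "the river bank was muddy",
    "a river bank can flood",
    "the savings bank was closed",
    "he sat on the bank",
]

bigrams = Counter()  # (prev_word, next_word) -> count
vocab = set()
for sentence in corpus:
    tokens = sentence.split()
    vocab.update(tokens)
    bigrams.update(zip(tokens, tokens[1:]))

def fill_mask(left, right):
    # Bidirectional: score each candidate word by its fit with BOTH
    # the word on its left and the word on its right.
    scores = {w: bigrams[(left, w)] * bigrams[(w, right)] for w in vocab}
    return max(scores, key=scores.get)

# "the [MASK] was ...": the left context alone is a three-way tie
# (bank / river / savings each follow "the" exactly once) ...
left_scores = {w: bigrams[("the", w)] for w in vocab}
best = max(left_scores.values())
tied = sorted(w for w, c in left_scores.items() if c == best)
print(tied)  # ['bank', 'river', 'savings']

# ... but adding the right-hand context "was" resolves the ambiguity.
prediction = fill_mask("the", "was")
print(prediction)  # 'bank'
```

A purely left-to-right model sees only `left_scores` and cannot break the tie; using both directions is the idea the card refers to, scaled down to counts.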
Jacob Devlin · Google AI
Research · The Transformer Era
GPT-1: Generative Pre-training
In 2018, OpenAI released GPT-1, demonstrating that a Transformer first trained on vast amounts of unlabeled text (unsupervised pre-training) could then be fine-tuned for specific NLP tasks with comparatively little labeled data. At 117 million parameters, it hinted at the gains that would come from scaling language models.
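The two-stage recipe the card describes can be sketched with a deliberately tiny stand-in: bigram counts play the role of the pre-trained language model, and a nearest-centroid step plays the role of task fine-tuning. The corpus, labels, and helpers are all hypothetical; nothing here is GPT-1's actual architecture or training:

```python
from collections import Counter

# Stage 1: unsupervised "pre-training" on unlabeled text.
# GPT-1 learns next-token prediction; as a crude stand-in we
# collect bigram statistics from an unlabeled toy corpus.
unlabeled = [
    "the movie was great and the acting was great",
    "the movie was dull and the plot was dull",
    "a great film with a great cast",
    "a dull film with a dull script",
]
bigrams = Counter()
for s in unlabeled:
    t = s.split()
    bigrams.update(zip(t, t[1:]))

def features(sentence):
    # Represent a sentence by how its bigrams score under the
    # pre-trained statistics -- a crude learned representation.
    t = sentence.split()
    return Counter({bg: bigrams[bg] for bg in zip(t, t[1:])})

# Stage 2: supervised "fine-tuning" on a tiny labeled set,
# reusing the pre-trained representation for a new task.
labeled = [("the acting was great", "pos"), ("the plot was dull", "neg")]
centroids = {}
for text, label in labeled:
    centroids.setdefault(label, Counter()).update(features(text))

def classify(sentence):
    # Pick the class whose centroid best matches the sentence features.
    f = features(sentence)
    return max(centroids, key=lambda c: sum(f[k] * centroids[c][k] for k in f))

print(classify("the film was great"))   # pos
print(classify("the script was dull"))  # neg
```

The point of the sketch is the workflow, not the model: knowledge gathered once from unlabeled text (stage 1) transfers to a downstream task with only two labeled examples (stage 2), which is the pre-train/fine-tune pattern GPT-1 popularized.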
Alec Radford · OpenAI