Alec Radford
2 milestones · 2018–2019
Explore Alec Radford's contributions to AI across 2 milestones from 2018 to 2019, both of them research breakthroughs.
Chronology
Research
GPT-1: Generative Pre-training
In June 2018, OpenAI released GPT-1, demonstrating that a Transformer trained on vast amounts of text with unsupervised pre-training could then be fine-tuned for specific NLP tasks. At 117 million parameters, it showed the potential of scaling language models.
Alec Radford · OpenAI
Research
GPT-2: 'Too Dangerous to Release'
In February 2019, OpenAI announced GPT-2 (1.5 billion parameters) but initially withheld the full model, calling it 'too dangerous' to release because of its ability to generate convincing fake text. The decision was controversial: some praised the caution, others called it a publicity stunt. The full model was eventually released in November 2019.
Alec Radford · OpenAI