
OpenAI Founded

What Happened

OpenAI was founded as a non-profit AI research lab with $1 billion in committed funding, aiming to ensure that artificial general intelligence benefits all of humanity. It was co-founded by Sam Altman (then president of Y Combinator), Elon Musk, and top researchers including Ilya Sutskever, who joined from Google Brain.

Why It Mattered

Created the organization that would produce GPT-2, GPT-3, GPT-4, DALL-E, and ChatGPT — arguably the most influential AI company of the 2020s. The non-profit-to-capped-profit transition would later become one of tech's most controversial governance stories.

Related Milestones

Infrastructure

Anthropic Founded

Former OpenAI VP of Research Dario Amodei and his sister Daniela, along with several other former OpenAI researchers, founded Anthropic — an AI safety company focused on building reliable, interpretable, and steerable AI systems.

Dario Amodei · Daniela Amodei · Anthropic
Research

GPT-1: Generative Pre-training

OpenAI released GPT-1, demonstrating that a Transformer pre-trained on vast amounts of unlabeled text could then be fine-tuned for specific NLP tasks. With 117 million parameters, it established the template that later GPT models would scale up.
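
The two-stage recipe is simple enough to sketch. Below is a toy PyTorch illustration (not OpenAI's code, with dimensions shrunk far below the real 12-layer, 768-dim, 117M-parameter model): pre-train a Transformer backbone on next-token prediction over unlabeled text, then attach a small head and fine-tune it on a labeled task.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, DIM = 1000, 64  # toy sizes; GPT-1 used a ~40k BPE vocab and 768 dims

class TinyGPT(nn.Module):
    def __init__(self, n_classes=2):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, DIM)
        layer = nn.TransformerEncoderLayer(DIM, nhead=4, batch_first=True)
        self.backbone = nn.TransformerEncoder(layer, num_layers=2)
        self.lm_head = nn.Linear(DIM, VOCAB)       # stage 1: next-token head
        self.cls_head = nn.Linear(DIM, n_classes)  # stage 2: task head

    def forward(self, tokens, classify=False):
        # Causal mask: each position may only attend to earlier positions.
        n = tokens.size(1)
        mask = torch.triu(torch.full((n, n), float("-inf")), diagonal=1)
        h = self.backbone(self.embed(tokens), mask=mask)
        return self.cls_head(h[:, -1]) if classify else self.lm_head(h)

model = TinyGPT()
batch = torch.randint(0, VOCAB, (8, 32))  # stand-in for tokenized raw text

# Stage 1: unsupervised pre-training, i.e. predict each next token.
logits = model(batch[:, :-1])
pretrain_loss = F.cross_entropy(logits.reshape(-1, VOCAB),
                                batch[:, 1:].reshape(-1))

# Stage 2: supervised fine-tuning, reusing the pretrained backbone.
labels = torch.randint(0, 2, (8,))
finetune_loss = F.cross_entropy(model(batch, classify=True), labels)
```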

Alec Radford · OpenAI
Research

GPT-2: 'Too Dangerous to Release'

OpenAI announced GPT-2 (1.5 billion parameters) but initially refused to release the full model, calling it 'too dangerous' due to its ability to generate convincing fake text. The decision was controversial — some praised the caution, others called it a publicity stunt. The full model was eventually released in November 2019.

Alec Radford · OpenAI
Research

GPT-3: The 175 Billion Parameter Leap

OpenAI released GPT-3 with 175 billion parameters — more than 100x larger than GPT-2. Without any fine-tuning, GPT-3 could write essays, code, and poetry, translate languages, and answer questions through 'few-shot learning' (learning from just a few examples in the prompt). The API launched in beta, enabling thousands of applications.
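
On the caller's side, few-shot learning needs no special machinery: the 'training examples' are written directly into the prompt, and the model continues the pattern with no gradient updates. A minimal sketch (the translation task and examples here are illustrative, not taken from OpenAI's paper):

```python
# Few-shot prompting: two solved examples, then the query. Sent verbatim
# to a completions endpoint, the model is expected to extend the pattern
# and emit the French translation of the final sentence.
few_shot_prompt = """Translate English to French.

English: The cat sat on the mat.
French: Le chat s'est assis sur le tapis.

English: I love programming.
French: J'adore programmer.

English: Where is the library?
French:"""

print(few_shot_prompt)
```

Zero-shot drops the solved examples and keeps only the instruction and the query; the paper's headline result was how much those few in-prompt examples improved accuracy across tasks.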

Tom Brown · OpenAI
Research

Word2Vec: Words as Vectors

Google researchers published Word2Vec, showing that relatively small neural networks could efficiently learn meaningful vector representations of words from large text corpora. The famous example `king - man + woman ≈ queen` made the idea vivid: semantic relationships could be captured geometrically in vector space.
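
The analogy can be checked directly against pretrained vectors. A short sketch using gensim's downloader (assuming the standard `word2vec-google-news-300` vectors; the exact similarity score varies with the model used):

```python
import gensim.downloader as api

# Pretrained Google News Word2Vec vectors (a large download on first use).
kv = api.load("word2vec-google-news-300")

# most_similar adds the 'positive' vectors, subtracts the 'negative' one,
# then ranks the whole vocabulary by cosine similarity to the result.
print(kv.most_similar(positive=["king", "woman"], negative=["man"], topn=1))
# -> [('queen', 0.71...)] with these vectors
```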

Tomas Mikolov · Google
