
TensorFlow Open-Sourced

What Happened

Google open-sourced TensorFlow, its internal machine learning framework. This gave every researcher and developer access to the same tools Google used internally. PyTorch (Facebook, 2016) followed, creating a healthy competition that accelerated the entire field.

Why It Mattered

Democratized deep learning: anyone with a laptop could now experiment with the same tooling behind state-of-the-art models. Removing that infrastructure barrier dramatically accelerated AI research.
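
What that access looks like in practice: a minimal sketch of defining and training a small classifier with the Keras API in TensorFlow 2. The layer sizes and synthetic data are illustrative only, and the original 2015 release used a lower-level graph API, so this shows the tooling, not period-accurate code.

    import numpy as np
    import tensorflow as tf

    # Illustrative synthetic data: 1,000 samples, 20 features, 2 classes.
    x = np.random.rand(1000, 20).astype("float32")
    y = np.random.randint(0, 2, size=(1000,))

    # A small feed-forward classifier built with the Keras API.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation="relu", input_shape=(20,)),
        tf.keras.layers.Dense(2, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(x, y, epochs=3, batch_size=32)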


[Figure: the Transformer architecture diagram from 'Attention Is All You Need']
Research

Attention Is All You Need: The Transformer

Eight researchers at Google published 'Attention Is All You Need,' introducing the Transformer architecture. It replaced recurrence with self-attention mechanisms that could process entire sequences in parallel. The paper's title was deliberately bold, and it proved prescient.
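
The core operation is compact enough to sketch directly. The snippet below is a minimal NumPy illustration of scaled dot-product self-attention over one sequence, under simplified assumptions (single head, no masking, random weights), not the full multi-head architecture from the paper.

    import numpy as np

    def self_attention(x, wq, wk, wv):
        # x: (seq_len, d_model) token embeddings
        # wq, wk, wv: (d_model, d_k) learned projection matrices
        q, k, v = x @ wq, x @ wk, x @ wv
        # Every position attends to every other position in a single
        # matrix multiply; this is the parallelism that replaced recurrence.
        scores = q @ k.T / np.sqrt(k.shape[-1])         # (seq_len, seq_len)
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
        return weights @ v                              # (seq_len, d_k)

    rng = np.random.default_rng(0)
    seq_len, d_model, d_k = 5, 16, 8
    x = rng.normal(size=(seq_len, d_model))
    wq, wk, wv = (rng.normal(size=(d_model, d_k)) for _ in range(3))
    print(self_attention(x, wq, wk, wv).shape)  # (5, 8)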

Ashish Vaswani, Noam Shazeer, Google Brain, Google Research
[Image: astronaut riding a horse, an iconic Stable Diffusion generation]
Open Source

Stable Diffusion: Open-Source Image Generation

Stable Diffusion was released as a text-to-image model that could run on consumer hardware, with its model weights openly downloadable rather than gated behind an API. Unlike DALL-E, which was accessible only through OpenAI's hosted service, anyone could download Stable Diffusion, run it locally, and build on top of it. An explosion of community modifications, fine-tunes, and applications followed.
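
Running it locally soon took only a few lines. Here is a minimal sketch using Hugging Face's diffusers library, assuming a v1.5 checkpoint and an available GPU; the checkpoint ID shown is one widely used repository, not the only distribution.

    import torch
    from diffusers import StableDiffusionPipeline

    # Download the open weights once; generation then runs entirely locally.
    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5",   # a widely used v1.5 checkpoint
        torch_dtype=torch.float16,
    )
    pipe = pipe.to("cuda")  # assumes a consumer GPU is available

    image = pipe("an astronaut riding a horse").images[0]
    image.save("astronaut.png")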

Emad Mostaque, Stability AI, CompVis (LMU Munich)
Open Source

Llama 2: Meta Opens the Floodgates

Meta released Llama 2, a family of widely available large language models (7B, 13B, 70B parameters) distributed as open weights under a custom license that allowed broad commercial use. While not open-source in the strict OSI sense, it gave companies and researchers access to a frontier-quality model they could run, customize, and deploy themselves.
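
In practice, "run, customize, and deploy" meant loading the weights directly. A minimal sketch with Hugging Face's transformers library, assuming the gated meta-llama/Llama-2-7b-hf repository has been unlocked by accepting Meta's license and that enough GPU memory is available:

    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "meta-llama/Llama-2-7b-hf"  # gated: requires accepting Meta's license
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    inputs = tokenizer("Open weights mean you can", return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=40)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))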

Mark Zuckerberg, Meta
Open Source

Mixtral 8x7B: Efficient Mixture of Experts

French startup Mistral AI released Mixtral 8x7B, a sparse mixture-of-experts model that matched or beat GPT-3.5 while using a fraction of the compute per token: each token is routed to just 2 of 8 expert networks per layer, so only roughly 13B of the model's roughly 47B parameters are active for any given token. It demonstrated that clever architecture could compete with brute-force scaling.
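
A minimal sketch of that routing idea in PyTorch; the dimensions, the dense per-expert loop, and the class name are illustrative, not Mixtral's actual implementation.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class Top2MoE(nn.Module):
        # Toy mixture-of-experts layer with top-2 routing.
        def __init__(self, d_model=64, d_ff=256, n_experts=8):
            super().__init__()
            self.router = nn.Linear(d_model, n_experts)
            self.experts = nn.ModuleList(
                nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(),
                              nn.Linear(d_ff, d_model))
                for _ in range(n_experts)
            )

        def forward(self, x):                      # x: (tokens, d_model)
            logits = self.router(x)                # (tokens, n_experts)
            top_w, top_i = logits.topk(2, dim=-1)  # pick 2 experts per token
            top_w = F.softmax(top_w, dim=-1)       # renormalize their weights
            out = torch.zeros_like(x)
            for e, expert in enumerate(self.experts):
                for slot in range(2):
                    mask = top_i[:, slot] == e     # tokens routed to expert e
                    if mask.any():
                        out[mask] += top_w[mask, slot, None] * expert(x[mask])
            return out

    x = torch.randn(10, 64)
    print(Top2MoE()(x).shape)  # torch.Size([10, 64])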

Mistral AI
Open Source

Llama 3: Open-Source Catches Up

Meta released Llama 3 (8B and 70B, later 405B), closing the gap with closed frontier models. The 405B release put near-frontier open-weight models into more developers' hands, even though Meta's licensing still sat outside a strict open-source definition.

Meta
