Gemini 1.5 Pro: Million-Token Context
What Happened
Google released Gemini 1.5 Pro with a 1 million token context window, later extended to 2 million, capable of processing entire codebases, books, or hours of video in a single prompt. In needle-in-a-haystack retrieval tests, it demonstrated near-perfect recall across millions of tokens of context.
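The needle-in-a-haystack evaluation mentioned above works by hiding a single distinctive fact (the "needle") at varying depths inside a long stretch of filler text, then asking the model to retrieve it. The sketch below shows the prompt-construction side only; the function name, filler text, and word-count proxy for tokens are illustrative assumptions, not part of any published harness, and the actual model call is omitted.

```python
def build_needle_test(needle: str, filler: str, target_words: int, depth: float) -> str:
    """Build a long-context retrieval prompt.

    Repeats `filler` until roughly `target_words` words of haystack exist,
    then splices the `needle` sentence in at relative `depth` (0.0 = start,
    1.0 = end), and appends the retrieval question.
    """
    words: list[str] = []
    while len(words) < target_words:
        words.extend(filler.split())
    words = words[:target_words]

    insert_at = int(depth * len(words))
    haystack = " ".join(words[:insert_at] + needle.split() + words[insert_at:])
    question = "What is the magic number mentioned in the document?"
    return f"{haystack}\n\n{question}"

# A real evaluation sweeps both context length and needle depth,
# scoring whether the model's answer contains the hidden fact.
needle = "The magic number is 7481."
filler = "The quick brown fox jumps over the lazy dog. "
prompt = build_needle_test(needle, filler, target_words=5000, depth=0.5)
```

Sweeping `depth` across the context is what distinguishes this test from simple retrieval: many earlier long-context models recalled facts near the start or end of the window but missed those buried in the middle.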
Why It Mattered
Gemini 1.5 Pro redefined what was possible with context length. Processing entire codebases or lengthy documents in a single prompt opened entirely new use cases for AI assistants.
Organizations
Google

Part of the Generative AI Revolution (2022–2024) era