Word2Vec: Words as Vectors
What Happened
Google researchers published Word2Vec, showing that shallow, two-layer neural networks (the CBOW and skip-gram architectures) could efficiently learn meaningful vector representations of words from large text corpora. The famous example `king - man + woman ≈ queen` made the idea vivid: semantic relationships could be captured geometrically in vector space.
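The analogy arithmetic is easy to reproduce with modern tooling. Below is a minimal sketch using the gensim library (chosen for illustration; the original 2013 release was a standalone C tool), training a skip-gram model on a toy corpus and then running the `king - man + woman` query:

```python
# Minimal sketch: train a skip-gram Word2Vec model and query an analogy.
# Uses the gensim library for illustration; not the original C implementation.
from gensim.models import Word2Vec

# Toy corpus: in practice Word2Vec is trained on billions of tokens.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "man", "walks", "in", "the", "city"],
    ["the", "woman", "walks", "in", "the", "city"],
]

model = Word2Vec(
    sentences,
    vector_size=100,   # dimensionality of the word vectors
    window=5,          # context window size
    min_count=1,       # keep every word in this tiny corpus
    sg=1,              # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

# Vector arithmetic: king - man + woman ≈ ?
print(model.wv.most_similar(positive=["king", "woman"], negative=["man"], topn=3))
```

On a toy corpus like this the result is essentially noise; trained on a large corpus such as Google News, the nearest vector to `king - man + woman` is typically `queen`.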
Why It Mattered
Word2Vec made word embeddings practical at scale and gave NLP a shared representation layer that was easy to train, reuse, and reason about. It helped bridge older statistical NLP and the neural language-model era that eventually led to transformers.
Key People
Tomas Mikolov, Kai Chen, Greg Corrado, Jeffrey Dean
Organizations
Google
Part of the Deep Learning Breakthrough (2012–2017) era
