Word2Vec: Words as Vectors
Google researchers published Word2Vec, showing that shallow neural networks could efficiently learn meaningful vector representations of words from large text corpora. The famous example `king - man + woman ≈ queen` made the idea vivid: semantic relationships could be captured geometrically in vector space.
Tomas Mikolov, Google
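The analogy arithmetic can be sketched with a toy example. The 2-D vectors below are hand-crafted for illustration (real Word2Vec embeddings are learned and have hundreds of dimensions), but the nearest-neighbor lookup mirrors how the `king - man + woman ≈ queen` result is actually evaluated:

```python
import numpy as np

# Hand-crafted 2-D "embeddings" (illustrative, not learned):
# axis 0 roughly encodes gender, axis 1 roughly encodes royalty.
vecs = {
    "man":   np.array([ 1.0, 0.0]),
    "woman": np.array([-1.0, 0.0]),
    "king":  np.array([ 1.0, 1.0]),
    "queen": np.array([-1.0, 1.0]),
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def analogy(a, b, c):
    """Find the word closest to vec(a) - vec(b) + vec(c), excluding the inputs."""
    target = vecs[a] - vecs[b] + vecs[c]
    scores = {w: cosine(target, v) for w, v in vecs.items() if w not in (a, b, c)}
    return max(scores, key=scores.get)

print(analogy("king", "man", "woman"))  # queen
```

With real pretrained embeddings the same query is typically run via a library such as gensim (`model.most_similar(positive=["king", "woman"], negative=["man"])`), and "queen" appears among the nearest neighbors rather than as an exact match.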