[Image: LSTM recurrent neural network cell diagram]

Long Short-Term Memory (LSTM)

What Happened

In 1997, Sepp Hochreiter and Jürgen Schmidhuber published the LSTM architecture, addressing the vanishing gradient problem that plagued recurrent neural networks. LSTMs could learn long-range dependencies in sequential data by maintaining a memory cell with gates that controlled what was written, kept, and exposed.
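The gating mechanism can be sketched as a single time step in NumPy. This is an illustrative toy, not the paper's notation: the stacked weight layout, names, and shapes are assumptions of this sketch.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step. W stacks the four gate weight matrices,
    shape (4*hidden, n_input + hidden); b has shape (4*hidden,)."""
    hid = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b
    f = sigmoid(z[0 * hid:1 * hid])   # forget gate: what to erase from memory
    i = sigmoid(z[1 * hid:2 * hid])   # input gate: how much to write
    o = sigmoid(z[2 * hid:3 * hid])   # output gate: what to expose
    g = np.tanh(z[3 * hid:4 * hid])   # candidate values to write
    c = f * c_prev + i * g            # memory cell: the additive update is what
                                      # lets gradients survive long sequences
    h = o * np.tanh(c)                # hidden state passed onward
    return h, c
```

The additive cell update `c = f * c_prev + i * g` is the key: unlike a plain RNN, the gradient path through the cell is not repeatedly squashed by a nonlinearity.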

Why It Mattered

LSTMs became the dominant recurrent architecture for the next two decades, powering speech recognition, machine translation, and text generation until Transformers displaced them. The paper went on to become the most-cited neural network paper of the 20th century.

Key People

Sepp Hochreiter · Jürgen Schmidhuber

Related Milestones

Research

Support Vector Machines

Vapnik and Cortes published their work on Support Vector Machines (SVMs), a method for finding maximum-margin decision boundaries in high-dimensional spaces with unusually strong theoretical guarantees. SVMs quickly became one of the leading approaches for classification problems across text, vision, and bioinformatics.

Vladimir Vapnik · Corinna Cortes · AT&T Bell Labs
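To illustrate the maximum-margin idea, here is a toy linear classifier trained by sub-gradient descent on the regularized hinge loss. This optimizer is a simple stand-in chosen for brevity (the original work formulates SVMs as a quadratic program), and the data and names are this sketch's assumptions.

```python
import numpy as np

def linear_svm(X, y, lam=0.01, epochs=500, lr=0.05):
    """Linear max-margin classifier via sub-gradient descent on the
    hinge loss + L2 penalty. Labels y must be in {-1, +1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (w @ xi + b) < 1:       # inside the margin: push it out
                w += lr * (yi * xi - lam * w)
                b += lr * yi
            else:                           # safely classified: only shrink w
                w -= lr * lam * w           # (shrinking w widens the margin)
    return w, b

# Two linearly separable clusters
X = np.array([[2., 2.], [3., 3.], [-2., -2.], [-3., -3.]])
y = np.array([1., 1., -1., -1.])
w, b = linear_svm(X, y)
```

Minimizing the norm of `w` subject to all points clearing the margin is exactly what "maximum-margin decision boundary" means; kernels extend the same idea to nonlinear boundaries.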
[Image: Reinforcement learning agent-environment interaction diagram]
Research

TD-Gammon: Reinforcement Learning Plays Backgammon

Gerald Tesauro created TD-Gammon, a neural network that learned to play backgammon at expert level through self-play using temporal difference reinforcement learning. It discovered novel strategies that surprised human experts.

Gerald Tesauro · IBM
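The temporal difference rule behind TD-Gammon can be shown on a much smaller problem. The sketch below runs tabular TD(0) on a five-state random walk, the classic textbook example rather than anything from Tesauro's system; TD-Gammon applied the same bootstrapped update with a neural network as the value function and backgammon positions as states.

```python
import random

def td0_random_walk(episodes=5000, alpha=0.1, seed=0):
    """TD(0) value estimation on a 5-state random walk. States 0 and 6
    are terminal; only the right exit pays reward 1, so the true values
    of states 1..5 are 1/6, 2/6, ..., 5/6."""
    rng = random.Random(seed)
    V = [0.0] * 7                        # V[0] and V[6] stay 0 (terminal)
    for _ in range(episodes):
        s = 3                            # start in the middle
        while s not in (0, 6):
            s_next = s + rng.choice((-1, 1))
            r = 1.0 if s_next == 6 else 0.0
            # TD(0): move V(s) toward the bootstrapped target r + V(s')
            V[s] += alpha * (r + V[s_next] - V[s])
            s = s_next
    return V[1:6]
```

The agent never waits for the end of a game to learn: each step's estimate is pulled toward the next step's estimate, which is what made learning from millions of self-play games tractable.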
[Image: Artificial neural network diagram representing the McCulloch-Pitts neuron model]
Research

First Mathematical Model of Neural Networks

McCulloch and Pitts published 'A Logical Calculus of the Ideas Immanent in Nervous Activity,' creating the first mathematical model of an artificial neuron. They showed that simple binary neurons connected in networks could, in principle, compute any function computable by a Turing machine.

Warren McCulloch · Walter Pitts · University of Chicago
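A McCulloch-Pitts unit is simple enough to write down directly: it fires if and only if the weighted sum of its binary inputs reaches a threshold. The weights and thresholds below are chosen for illustration; composing such units yields the basic logic gates, which is the sense in which networks of them are computationally universal.

```python
def mp_neuron(inputs, weights, threshold):
    """McCulloch-Pitts unit: outputs 1 iff the weighted sum of its
    binary inputs reaches the threshold, else 0."""
    return int(sum(w * x for w, x in zip(weights, inputs)) >= threshold)

# Logic gates as single threshold units
AND = lambda a, b: mp_neuron((a, b), (1, 1), 2)
OR  = lambda a, b: mp_neuron((a, b), (1, 1), 1)
NOT = lambda a:    mp_neuron((a,),  (-1,),  0)
# Composition gives the rest, e.g. NAND, from which any Boolean
# function can be built
NAND = lambda a, b: NOT(AND(a, b))
```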
[Image: Frank Rosenblatt, inventor of the Perceptron]
Research

The Perceptron

Frank Rosenblatt built the Mark I Perceptron, the first hardware implementation of an artificial neural network. It could learn to classify simple visual patterns. The New York Times reported it as an 'Electronic Brain' that the Navy expected would 'be able to walk, talk, see, write, reproduce itself and be conscious of its existence.'

Frank Rosenblatt · Cornell Aeronautical Laboratory
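The learning rule the Mark I implemented in hardware can be sketched in a few lines of software. The training task below (the AND function) and the hyperparameters are illustrative choices, not Rosenblatt's setup.

```python
import numpy as np

def train_perceptron(X, y, epochs=20):
    """Rosenblatt's rule: on each misclassified point, nudge the
    weights toward it. Labels y must be in {-1, +1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (w @ xi + b) <= 0:   # misclassified (or on the boundary)
                w += yi * xi             # rotate the boundary toward the point
                b += yi
    return w, b

# Learn logical AND, which is linearly separable
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1])
w, b = train_perceptron(X, y)
```

For linearly separable data the rule provably converges in a finite number of mistakes; the later realization that a single perceptron cannot represent functions like XOR was a major driver of the first neural-network winter.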
[Image: ELIZA chatbot conversation example]
Research

ELIZA: The First Chatbot

Joseph Weizenbaum created ELIZA, a program that simulated a Rogerian psychotherapist using simple pattern matching. Despite being purely rule-based with no understanding, users became emotionally attached to it and insisted it truly understood them — a phenomenon Weizenbaum found deeply disturbing.

Joseph Weizenbaum · MIT
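ELIZA's pattern matching can be miniaturized to a few rules: match a template, reflect the captured phrase back as a question. The patterns and responses below are invented for illustration and only gesture at the structure of Weizenbaum's actual script format.

```python
import re

# (pattern, response template) pairs, tried in order
RULES = [
    (re.compile(r"\bI need (.+)", re.I), "Why do you need {0}?"),
    (re.compile(r"\bI am (.+)", re.I),   "How long have you been {0}?"),
    (re.compile(r"\bmy (.+)", re.I),     "Tell me more about your {0}."),
]

def eliza(utterance):
    """Tiny ELIZA-style responder: first matching rule wins; a stock
    phrase covers everything else. No understanding involved."""
    for pattern, template in RULES:
        m = pattern.search(utterance)
        if m:
            return template.format(m.group(1).rstrip(".!?"))
    return "Please go on."
```

That a mechanism this shallow elicited genuine emotional attachment is precisely what disturbed Weizenbaum.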
