Backpropagation Rediscovered
What Happened
In 1986, Rumelhart, Hinton, and Williams published "Learning Representations by Back-propagating Errors" in Nature, demonstrating that backpropagation could effectively train neural networks with hidden layers. The same year, the Parallel Distributed Processing (PDP) research group published its influential two-volume work on connectionism.
Why It Mattered
The result revived neural network research after its decade-long eclipse. Backpropagation became the standard training method for multi-layer neural networks and underpinned the deep learning wave that followed.
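The core idea can be sketched in a few lines: run a forward pass, measure the error, then propagate gradients backward through each layer via the chain rule and descend. Below is a minimal illustrative sketch in NumPy (not the paper's exact experiment) training a small sigmoid network on XOR, the classic task single-layer perceptrons cannot solve; the layer sizes, learning rate, and seed are arbitrary choices for the demo.

```python
import numpy as np

# XOR: a task that requires a hidden layer.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Small random weights for a 2 -> 4 -> 1 network (sizes chosen arbitrarily).
rng = np.random.default_rng(0)
W1 = rng.normal(scale=1.0, size=(2, 4)); b1 = np.zeros((1, 4))
W2 = rng.normal(scale=1.0, size=(4, 1)); b2 = np.zeros((1, 1))

lr = 1.0
losses = []
for step in range(5000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(np.mean((out - y) ** 2))

    # Backward pass: chain rule applied layer by layer
    # (constant factors of the MSE gradient folded into lr)
    d_out = (out - y) * out * (1 - out)   # error at output pre-activation
    d_W2 = h.T @ d_out
    d_b2 = d_out.sum(axis=0, keepdims=True)
    d_h = (d_out @ W2.T) * h * (1 - h)    # error propagated to hidden layer
    d_W1 = X.T @ d_h
    d_b1 = d_h.sum(axis=0, keepdims=True)

    # Gradient-descent update
    W2 -= lr * d_W2; b2 -= lr * d_b2
    W1 -= lr * d_W1; b1 -= lr * d_b1

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

The backward pass is the paper's contribution in miniature: the hidden layer's error signal `d_h` is computed from the output layer's error, which is what lets gradient descent adjust weights that do not connect directly to the output.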
Part of the Expert Systems Boom (1980–1987) era
