How AI researchers accidentally discovered that everything they thought about learning was wrong

18 Aug, 2025

The lottery ticket hypothesis explains why massive neural networks succeed despite centuries of theory predicting they should fail

Five years ago, suggesting that AI researchers train neural networks with trillions of parameters would have earned you pitying looks. It violated the most fundamental rule in machine learning: make your model too large, and it becomes a glorified photocopier, memorising training data whilst learning nothing useful. This wasn't mere convention; it was treated as mathematical law, backed by centuries of statistical theory. Every textbook showed the same inexorable curve: small models underfit, optimal models generalise, large models catastrophically overfit. End of story.

Yet today, those "impossible" massive models power ChatGPT, decode proteins, and have triggered a global arms race worth hundreds of billions. What changed wasn't just computing power: it was our understanding of learning itself. The story behind this transformation reveals how the biggest breakthrough in AI emerged from researchers bold enough to ignore their own field's foundational assumptions.

The iron law that ruled machine learning

For decades, one principle governed every learning system: the bias-variance tradeoff. The mathematics was elegant, the logic unassailable. Build a model too simple, and it misses crucial patterns. Build it too complex, and it memorises noise instead of signals.

Picture a student learning arithmetic. Show them thousands of addition problems with answers, and they might learn in two ways. The intelligent approach: grasp the underlying algorithm of carrying digits and place values. The foolish approach: memorise every single example. The second strategy delivers perfect scores on homework but complete failure on the exam.

Neural networks seemed especially vulnerable to this memorisation trap.
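The textbook curve described above is easy to reproduce. Here is a minimal sketch (not from the article, just an illustration) that fits polynomials of increasing degree to noisy samples of a smooth signal using NumPy: the simplest model underfits, a moderate one generalises, and the most flexible one interpolates the training points almost exactly yet does worse on held-out points.

```python
import numpy as np
from numpy.polynomial import Polynomial

rng = np.random.default_rng(0)

# Toy data: a smooth underlying signal plus noise.
signal = lambda x: np.sin(2 * np.pi * x)
x_train = np.linspace(0.0, 1.0, 15)
x_test = np.linspace(0.02, 0.98, 15)
y_train = signal(x_train) + rng.normal(0.0, 0.2, x_train.size)
y_test = signal(x_test) + rng.normal(0.0, 0.2, x_test.size)

def mse(degree):
    """Fit a degree-`degree` polynomial to the training data and
    return (training MSE, test MSE)."""
    p = Polynomial.fit(x_train, y_train, degree)
    train_err = np.mean((p(x_train) - y_train) ** 2)
    test_err = np.mean((p(x_test) - y_test) ** 2)
    return train_err, test_err

for d in (1, 3, 14):
    train_err, test_err = mse(d)
    print(f"degree {d:2d}: train MSE {train_err:.4f}, test MSE {test_err:.4f}")
```

With 15 training points, the degree-14 polynomial has enough parameters to pass through every one of them, driving training error to essentially zero while it oscillates between the points — exactly the "photocopier" failure mode the classical theory predicted.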
With millions of parameters, they could eas...