The collapse of complex societies

14 June 2025

Those who criticize so-called “AI doomers” often overlook that there is a broader, intellectually serious tradition of technological doomerism going back decades. To revisit these works is to wonder whether AI really presents new risks, or whether it is simply the manifestation of a risk previously foretold.

Of course, predictions of global apocalypse are as ancient as humanity. Given the historical track record (no global apocalypse yet!), those predicting apocalypse have traditionally had a rough time being taken seriously. Still, there is a big difference between predicting the arrival of apocalypse ex nihilo and making a reasoned argument that it necessarily emerges from specific human decisions and habits.

The Limits to Growth, published in 1972, was an early effort to quantitatively model the effects of technological change. I read it some years ago. As the title implies, the book considered how five key global factors would affect human development:

“If the present growth trends in world population, industrialization, pollution, food production, and resource depletion continue unchanged, the limits to growth on this planet will be reached sometime within the next one hundred years. The most probable result will be a rather sudden and uncontrollable decline in both population and industrial capacity.”

In support of their theories, and unusually for the era, the authors of The Limits to Growth relied extensively on software simulations. This allowed them to, say, model the course of human progress under realistic current assumptions. In that model, population decreases rapidly after around 2050.

But using a software model also allowed the authors to run simulations with different starting assumptions, such as a run with natural resources doubled. Perhaps counterintuitively, the model predicts that increasing resources causes the big drop in population to happen sooner.
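To get a feel for this kind of experiment, here is a minimal system-dynamics sketch in Python. It is emphatically not the World3 model the authors used; the stocks, flows, and coefficients below are invented for illustration. But it shows the basic mechanism: a few coupled feedback loops stepped forward in time, then rerun under different starting assumptions (here, a hypothetical resource_multiplier parameter that scales the initial resource stock).

```python
# A toy stock-and-flow model in the spirit of World3. This is NOT the
# actual World3 model; every coefficient here is invented for illustration.

def simulate(resource_multiplier=1.0, years=200, dt=1.0):
    """Step a crude stock-and-flow model forward and return its history."""
    population = 1.0                        # stocks, in arbitrary units
    resources = 10.0 * resource_multiplier
    pollution = 0.1

    history = []
    for step in range(int(years / dt)):
        # Industrial output rises with population but falls as the
        # remaining resources become harder to extract.
        extraction_ease = resources / (resources + 5.0)
        industry = population * extraction_ease

        # Flows: industry consumes resources and emits pollution;
        # pollution also decays slowly on its own.
        resources = max(resources - 0.05 * industry * dt, 0.0)
        pollution += (0.04 * industry - 0.02 * pollution) * dt

        # Population grows with industrial output (food, medicine) and
        # shrinks as pollution accumulates.
        births = 0.03 * population * extraction_ease
        deaths = 0.01 * population * (1.0 + pollution)
        population = max(population + (births - deaths) * dt, 0.0)

        history.append((step * dt, population, resources, pollution))
    return history

# Rerun the same model under different starting assumptions, e.g. with
# the initial resource stock doubled.
for multiplier in (1.0, 2.0):
    run = simulate(resource_multiplier=multiplier)
    peak_year, peak_population, *_ = max(run, key=lambda row: row[1])
    print(f"resources x{multiplier}: population peaks at "
          f"{peak_population:.2f} around year {peak_year:.0f}, then declines")
```

Even a toy like this reproduces the qualitative behavior the book describes: overshoot followed by decline, with the timing and severity of the drop shifting as the starting assumptions change.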