Civilization Self-Destruction: Calculate the Probability


With moderate risk estimates (10% nuclear, 5% AI, 3% bio, 5% climate, 1% nano per century), the probability of surviving a single century is about 78%, and the probability of surviving 10 centuries is roughly 8%. This represents a serious filter.

Formula

P(survive) = (1 - P_nuke)(1 - P_ai)(1 - P_bio)(1 - P_climate)(1 - P_nano)
P(survive T centuries) = P(survive)^T
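The formula above translates directly into code. A minimal sketch in Python (the function name and risk list are illustrative, not the calculator's actual source):

```python
def survival_probability(risks, centuries):
    """Probability of surviving `centuries` centuries, assuming the
    per-century risks are independent (a simplification)."""
    p_century = 1.0
    for p in risks:
        p_century *= (1.0 - p)
    return p_century ** centuries

# Default per-century risks: nuclear, AI, bio, climate, nano
default_risks = [0.10, 0.05, 0.03, 0.05, 0.01]

p1 = survival_probability(default_risks, 1)    # ~0.78 per century
p10 = survival_probability(default_risks, 10)  # ~0.08 over 1,000 years
```

Note that the exponent T compounds the per-century survival probability, which is why even modest annualized risks dominate over long horizons.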

The Great Filter Is Us?

Perhaps the answer to the Fermi Paradox is simple and terrifying: technological civilizations usually destroy themselves. Nuclear weapons, artificial intelligence, biological weapons, climate change — each of these risks may seem small over a single century, but compounded over millennia, they multiply into near-certain extinction.

This calculator models exactly that compounding effect. Each risk category has a per-century probability of causing civilizational collapse. The model assumes these risks are independent (a simplification — in reality they may interact). The key insight is multiplicative: even if each individual risk is modest, the combined probability of surviving all of them over many centuries drops precipitously.
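The compounding effect can also be checked empirically with a quick Monte Carlo simulation. This is a sketch under the same independence assumption; the function names and trial count are illustrative:

```python
import random

def simulate_lifetime(risks, max_centuries=100, rng=random):
    """Run one civilization forward century by century; return how many
    centuries it survives before any single risk triggers collapse."""
    for century in range(max_centuries):
        for p in risks:
            if rng.random() < p:
                return century  # collapse during this century
    return max_centuries  # survived the whole simulated horizon

risks = [0.10, 0.05, 0.03, 0.05, 0.01]
trials = 100_000
survived_10 = sum(simulate_lifetime(risks) >= 10 for _ in range(trials))
# The surviving fraction should hover near 0.78**10, i.e. about 0.083
```

Because the risks are sampled independently each century, the simulated survival fraction converges on the closed-form product raised to the power T.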

The Mathematics of Cumulative Risk

Consider the default parameters: a 10% chance of nuclear catastrophe, 5% for AI, 3% for bioweapons, 5% for climate collapse, and 1% for nanotechnology — all per century. The probability of surviving any single century is about 78%. That sounds manageable. But over 10 centuries (1,000 years), the survival probability drops to roughly 8%. Over 50 centuries (5,000 years), it plummets to about 0.0004% — effectively zero. And interstellar colonization likely requires civilizations to survive for thousands of years at minimum.
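One way to make the decay concrete is to ask how long it takes the odds to fall below a given threshold, by solving p^T < threshold for T. A small sketch (the threshold values are illustrative):

```python
import math

def centuries_until(threshold, p_century):
    """Smallest whole number of centuries T after which cumulative
    survival p_century**T first falls below `threshold`."""
    return math.ceil(math.log(threshold) / math.log(p_century))

p = 0.78  # per-century survival with the default risk settings
centuries_until(0.5, p)    # coin-flip odds after just 3 centuries
centuries_until(0.01, p)   # under 1% after 19 centuries
```

With the default settings, a civilization's survival odds are worse than a coin flip within three centuries of the calculator's starting point.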

Interpreting the Results

Adjust the sliders to explore different risk profiles. Try setting all risks to their minimum values and see how long a civilization can realistically survive. Then try more pessimistic estimates. The exercise reveals why the parameter L in the Drake Equation — the lifetime of a technological civilization — may be the single most important variable in determining whether the Galaxy is full of life or eerily empty. If L is short, the Fermi Paradox dissolves: civilizations simply do not last long enough to be detected or to colonize the stars.
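The link to Drake's L can be sketched directly: with an independent per-century collapse hazard q, civilization lifetime follows a geometric distribution with mean 1/q centuries. This is a back-of-envelope estimate under the model's assumptions, not the calculator's internal code:

```python
def expected_lifetime_years(risks):
    """Mean civilization lifetime in years under a geometric model:
    a collapse hazard of q per century gives a mean of 1/q centuries."""
    p_century = 1.0
    for p in risks:
        p_century *= (1.0 - p)
    q = 1.0 - p_century  # per-century collapse probability
    return 100.0 / q     # convert centuries to years

expected_lifetime_years([0.10, 0.05, 0.03, 0.05, 0.01])  # ~455 years
```

Under the default settings this yields an expected L of only a few centuries — far short of the millennia that interstellar expansion would seem to require.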

FAQ

What is the probability of humanity's self-destruction?

According to various estimates, the probability of a global catastrophe in the next 100 years ranges from 10% to 30%. The main risks are nuclear war, uncontrolled AI, biological weapons, and climate change.

Why is this related to the Fermi Paradox?

If technological civilizations typically self-destruct within the first 1,000 years after developing nuclear weapons or AI, this explains the absence of contact — most simply don't survive long enough to achieve interstellar travel.

What is the most serious threat?

According to Toby Ord (in his book 'The Precipice'), the greatest existential risk in the 21st century comes from uncontrolled AI (~10%) and engineered pandemics (~3%).


Embed

<iframe src="https://homo-deus.com/lab/fermi-paradox/self-destruction/embed" width="100%" height="400" frameborder="0"></iframe>
View source on GitHub