source: The Vulnerable World Hypothesis, by Nick Bostrom


vulnerable world — a world containing a technology that, once developed, almost certainly destroys civilization. Makes us reconsider:

  • ubiquitous surveillance
  • unipolar world order

The Urn and the Black Ball

Developing a technology is like pulling balls out of an urn. The majority of them are “white”, or benign. A black ball is a technology that will destroy civilization. We haven’t pulled any black balls out of the urn yet.

  • you can’t “put a ball back into the urn”: once a technology is invented, it can’t be uninvented
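A quick sketch of the urn intuition (my illustration, not from the paper; p and n are made-up numbers): if each new technology is an independent draw with some small probability p of being a black ball, the chance of having drawn at least one approaches certainty as draws accumulate, and no later draw undoes it.

  import random

  # Illustrative urn model (all numbers are assumptions, not Bostrom's).
  p = 0.001   # assumed per-technology probability of a "black ball"
  n = 2000    # assumed number of technologies ever developed

  # Analytically: P(at least one black ball in n draws) = 1 - (1 - p)^n
  print(f"analytic:  {1 - (1 - p) ** n:.3f}")

  # Monte Carlo check of the same quantity.
  rng = random.Random(0)
  trials = 10_000
  hits = sum(any(rng.random() < p for _ in range(n)) for _ in range(trials))
  print(f"simulated: {hits / trials:.3f}")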

easy nukes

Leo Szilard - physicist. Tried to persuade fellow physicists to keep nuclear research secret because nukes are dangerous. As science advances, dangerous insights become more accessible.

easy nukes - what if all you needed to make a nuke was to electrify a wire between two panes of glass? Someone would use it to nuke a city. Would we have to get rid of glass? Wire? Electricity? What would we do with people caught holding glass? Would society have to become authoritarian to keep people from getting glass?

Vulnerable world hypothesis

If technological development continues, then at some point we’ll develop a technology that makes the destruction of civilization extremely likely, unless the world exits the semi-anarchic default condition (defined below)

semi-anarchic default condition - a world with

  1. insufficient preventative policing
  2. insufficient global governance
  3. diverse motivations

existential risk vulnerable world hypothesis - a variant where the technology would by default destroy life on Earth or destroy our potential for future value

  • at what level of risk would preventing the catastrophe justify government surveillance?

Vulnerable periods can open and close. A period closes when we develop a protective technology that can counter the dangerous one. A world can also be stabilized through policy, e.g., stronger preventive policing or global governance.

Types of Vulnerabilities

Type-1: easy nukes

Tech that’s destructive and easy for individuals to use. Under the semi-anarchic default condition, it will probably be used to destroy civilization.

We should consider both the ease of use and the destructiveness of different technologies.

Another factor is the diversity of motivations within a population. Most people don’t want to nuke anyone, but it only takes one who does.
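Rough arithmetic for the “you only need one” point (my numbers, purely illustrative): even a tiny destructive fraction in a large population makes at least one attacker near-certain.

  import math

  # Illustrative assumptions, not figures from the paper:
  f = 1e-6             # assumed fraction who would use an easy nuke
  N = 8_000_000_000    # world population, roughly

  # Expected attackers is f*N; P(zero attackers) ~ e^(-f*N) for small f.
  print(f"expected attackers: {f * N:.0f}")             # about 8000
  print(f"P(zero attackers):  {math.exp(-f * N):.2e}")  # effectively zero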

Type-2a: safe first strike

Tech that incentivizes powerful actors to cause mass destruction. (One of the things that saved us during the Cold War was second-strike capability: each superpower could retaliate after an initial attack, so a first strike was never safe.)

  • If nuclear aggression ever carried a net benefit, nuclear war would become much more likely

Type-2b: worse global warming

Tech that incentivizes many actors (e.g., half the people on Earth) to each take a slightly damaging action. The combined effect of all these actions could devastate civilization.

  • what if, for instance, we didn’t discover that global warming was happening until it was too late, or the climate had multiple strong positive feedback loops?
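A toy model of the Type-2b dynamic (my illustration; every number is made up): each actor’s contribution is negligible on its own, but the aggregate crosses a catastrophic threshold.

  # Toy Type-2b model: many actors, tiny individual harm, catastrophic sum.
  # All values are illustrative assumptions.
  n_actors = 4_000_000_000   # e.g., half the people on Earth
  harm_each = 1e-9           # per-actor damage, individually negligible
  threshold = 1.0            # aggregate damage that devastates civilization

  total = n_actors * harm_each
  print(f"per-actor harm: {harm_each:.0e}")  # individually invisible
  print(f"total harm:     {total:.1f}  catastrophic: {total >= threshold}")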

Type-0: surprising strangelets

Tech that carries a hidden risk, so we destroy civilization by accident.

  • what if a nuke had set the atmosphere on fire and we all died? What if a high-energy physics experiment created a black hole (or strangelet) that swallowed the planet?