Nick Bostrom on Artificial Intelligence and Existential Risks

Existential risks ('x-risks') are those risks which either threaten our species with extinction or with a drastic and permanent curtailment of its potential. Risks that threaten 99% of the human species, or similar, are not existential risks. Only risks that threaten to wipe out humankind entirely (or certain narrow scenarios involving our permanent crippling) qualify as 'existential risks'.

List of Risks

  • artificial intelligence
  • nanotech arms race
  • whole brain emulation
  • human intelligence enhancement
  • asteroid impact
  • supervolcano eruption

Organizations Concerned with X-Risks
