
Nick Bostrom on Artificial Intelligence and Existential Risks

Existential risks ('x-risks') are those risks that either threaten our species with extinction or with a drastic and permanent curtailment of our potential. Risks that threaten 99% of the human species, or similar, are not existential risks. Only risks that threaten to wipe out humankind entirely (or certain narrow scenarios involving our permanent crippling) qualify as 'existential risks'.

List of Risks

  • artificial intelligence
  • nanotech arms race
  • whole brain emulation
  • human intelligence enhancement
  • asteroid impact
  • supervolcano eruption

Organizations Concerned with X-Risks