This article on Astral Codex Ten discusses the "extinction tournament," which
brought together AI domain experts and superforecasters to estimate the
risk of human extinction from AI and other causes. They, along with Metaculus,
estimate roughly a 1% to 5% chance of extinction by 2100, with AI as the
largest risk, followed by nuclear war and genetically engineered pathogens.
The estimated risk of catastrophic but non-extinction events is higher. The
forecasts also suggest we are nearly certain to have AGI by 2100.

https://astralcodexten.substack.com/p/the-extinction-tournament

I mostly agree with the risk estimates. What about you?
