FLI announcement. https://futureoflife.org/ai/six-month-letter-expires/
Eliezer Yudkowsky is conspicuously absent from the list of signatories.

After I reluctantly signed the letter calling for a six-month pause in AI
development, I did some calculations and realized that at the current rate
of Moore's law, we still have about a century before exponentially growing
computing power surpasses DNA-based life.
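
A rough sketch of that arithmetic (the figures below are illustrative
assumptions on my part, not measured values): if the biosphere stores on
the order of 10^37 bits in DNA while global digital storage holds on the
order of 10^22 bits, and capacity doubles every two years, then:

  # Back-of-the-envelope estimate; dna_bits, digital_bits, and the
  # doubling time are assumed order-of-magnitude figures.
  import math

  dna_bits       = 1e37  # assumed total DNA storage of the biosphere, in bits
  digital_bits   = 1e22  # assumed current global digital storage, in bits
  doubling_years = 2.0   # assumed Moore's-law doubling time

  doublings = math.log2(dna_bits / digital_bits)
  years = doublings * doubling_years
  print(f"{doublings:.0f} doublings, about {years:.0f} years to parity")
  # prints roughly 50 doublings, about 100 years

Under those assumptions the gap closes in about 50 doublings, which is
where the century figure comes from.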

Meanwhile, I think the greatest threat is social isolation, as we come to
prefer AI to human interaction. But that should be a personal choice, not a
reason for a ban. Evolution will solve the problem, though maybe not in a
way we would prefer.
