On 2019-11-18 12:45 PM, Matt Mahoney wrote:
The premise of the Singularity is that if humans can create smarter than human intelligence (meaning faster or more successful at achieving goals), then so can it, only faster. That will lead to an intelligence explosion because each iteration will be faster. [...] The future may be fantastic and unimaginable. But we already know that physics doesn't allow a singularity.

Yes. I drew similar conclusions long ago in:

https://alife.co.uk/essays/the_singularity_is_nonsense/

--
__________
 |im |yler http://timtyler.org/


------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T94ee05730c7d4074-M8d6ed6b7bde03c415fb805d7
Delivery options: https://agi.topicbox.com/groups/agi/subscription