I think all this "singularity" nonsense can be traced back to Heinz von
Foerster's impish sense of humor
<https://ui.adsabs.harvard.edu/abs/1960Sci...132.1291V/abstract> -- at
least, that's how he struck me when I took a 1974 "second order
cybernetics" summer class from him, around the time it was becoming
obvious that world population was departing from the asymptotic curve.

On Tue, Nov 19, 2019 at 7:21 PM TimTyler <[email protected]> wrote:

> On 2019-11-18 12:45 PM, Matt Mahoney wrote:
> > The premise of the Singularity is that if humans can create smarter
> > than human intelligence (meaning faster or more successful at
> > achieving goals), then so can it, only faster. That will lead to an
> > intelligence explosion because each iteration will be faster. [...]
> > The future may be fantastic and unimaginable. But we already know that
> > physics doesn't allow a singularity.
> 
> Yes. I drew similar conclusions long ago in:
> 
> https://alife.co.uk/essays/the_singularity_is_nonsense/
> 
> --
> __________
> |im |yler http://timtyler.org/
> 

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T94ee05730c7d4074-Mbc61761ba5ec69906217caf7
Delivery options: https://agi.topicbox.com/groups/agi/subscription