PM: One should take a serious look at Apple's Siri, since a system like that
may evolve into an AGI if it is equipped with sufficient back-end services
(i.e., actions).
This is an absolutely fantastic (but probably commonplace) delusion. It's like
saying that a book (and a book can be organized to function like a program) can
evolve into a human being (or an animal).
An infant can see the massive differences between a book and a human being, but 
an awful lot of AGI-ers can’t.


-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/21088071-c97d2393