Ben G wrote:

>>>
Well, for the purpose of creating the first human-level AGI, it seems
important **to** wire in humanlike bias about space and time ... this will
greatly ease the task of teaching the system to use our language and
communicate with us effectively...

But I agree that not **all** AGIs should have this inbuilt biasing ... for
instance an AGI hooked directly to quantum microworld sensors could become a
kind of "quantum mind" with a totally different intuition for the physical
world than we have...
<<<

OK. But then I again have a different understanding of the G in AGI: the
"quantum mind" would be more general than the human-level AGI.

But since human-level AGI is difficult enough, we should build it first.


After that, for AGI 2.0, I propose building a quantum mind as the goal. ;-)

-------------------------------------------
agi
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/
Powered by Listbox: http://www.listbox.com