Vladimir Nesov wrote:
> It's too fuzzy an argument.

You're right, of course. I'm not being precise, and though I'll try to improve on that here, I probably still won't be. But here's my attempt:

There are essentially three senses of grounding: embodiment, hierarchy base nodes, and pattern/data sources.

Embodiment: I don't think AIs need a connection to a real or simulated environment. Yes, we get a lot of our information that way, and yes, human meaning/understanding probably evolved out of that connection initially. But no, an AI doesn't require it to do useful thinking.

Hierarchy base nodes: I don't think a hierarchy of concepts or a semantic network needs a set of base nodes that connect to something outside the system ("primitives"). Meaning arises out of the network of connections, and doesn't require a base layer of intrinsically meaningful nodes.

Data sources: But there is one sense in which a system must be grounded to produce useful results about the real world. The connections between concepts, and the statistics regarding those connections, need to come from the real world. A dictionary is circular, but the connections between its nodes are set based on corresponding connections in the external world. Or to put that another way: you can't reason about something you have no data about.
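To make that concrete, here's a toy sketch (my own illustration, not anything from the thread): a network of word-to-word links is circular taken on its own, but the statistics on those links can be derived from external data, such as co-occurrence counts in a small corpus. The relational structure stays internal; the weights are what tie it to the world.

```python
from collections import Counter
from itertools import combinations

# A tiny stand-in for "real-world data" (hypothetical example corpus).
corpus = [
    "the cat chased the mouse",
    "the dog chased the cat",
    "the mouse ate the cheese",
]

# Count how often each pair of words co-occurs in the same sentence.
# These counts play the role of the externally grounded connection
# statistics between otherwise circularly defined nodes.
cooccurrence = Counter()
for sentence in corpus:
    words = sorted(set(sentence.split()))
    for a, b in combinations(words, 2):
        cooccurrence[(a, b)] += 1

print(cooccurrence[("cat", "chased")])  # 2: linked in two sentences
print(cooccurrence[("ate", "cat")])     # 0: never co-occur
```

Nothing inside the graph says what "cat" means, yet the edge weights reflect structure in the data; that's the only sense of grounding the argument requires.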

It was the first two senses that I meant when I said an AI doesn't need to be grounded.

P.S. You can think of embodiment as an example of hierarchy base nodes; or you can think of it as a data source. In the latter case, it can be useful (as others have pointed out on the list), but isn't necessary.



-------------------------------------------
agi
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/
