Mike Tintner wrote:
Brad: We don't need no stinkin' grounding.

Your intention, I take it, is partly humorous. You are self-consciously assuming the persona of an angry child/adolescent. How do I know that without being grounded in real world conversations? How can you understand the prosody of language generally without being grounded in conversations? How can I otherwise know that you do not literally mean that grounding "stinks" like a carcase? What language does not have prosody?

Mike,

Partly humorous, partly to make the point clear (an effort humor sometimes helps). I don't know what you mean by "self-consciously," but you really have no idea what I was thinking when I wrote that. You can guess. Which you did. And you can guess wrong. Which you did. So even though your interpretation of what I said comes from a "grounded" human, it didn't help you much in this case. Oh, and the willful-child stuff was used in a clearly demarcated simile in a previous paragraph. That simile had to do with "learning by asking." It clearly does not apply (except as part of the backing argument) to the statement above.

As for prosody being a factor in understanding language: we're communicating using language right now, via e-mail, where prosodic cues are practically nonexistent. We're not agreeing, but that doesn't mean we're not both understanding. Even if you did manage to build an AGI that could use prosody like a human, it would run into the same prosody-related issues that attend all written communication.

Mostly, my post was not about your (or any human's) symbol grounding capabilities or lack thereof. I was simply pointing out in that post that a human-compatible AGI can become grounded enough through description. Again and again, in your reply you refer to yourself (e.g., "How do I know that without being grounded in real world conversations?"). My response to that is "Why should I care?" Unless, that is, you are, implicitly, suggesting an AGI must be Turing-indistinguishable from a human (like you) when it comes to being "grounded." If that's a correct inference on my part, then I disagree.

I think uncritical acceptance of Turing's test is one of the worst mistakes made in modern science by really smart people. And, in the remainder of your reply, you do more than I ever could have alone to make my point for me.

How am I able to proceed to the following analysis of your sentence without grounding? "Brad's sentence reveals a fascinating example of the workings of the unconscious mind. He has assumed in one sentence the persona of a wilful child. In effect, his unconscious mind is commenting on his conscious position: 'I know that I am being wilful in demanding that AGI be conducted purely in language/symbols - demanding like a child that the world conform to my wishes and convenience (because, frankly, I only know how to do AI that is language- and symbol-based, and having to learn new sign systems would be jolly inconvenient, and I'm too lazy, so there).'"

Congratulations. You're a human whose understanding of the world is supposed to be the "gold standard" for AGI - grounded-by-experience knowledge - and you totally misinterpreted what I wrote. I never "assumed the persona of a willful child." The reference to the child was a simile, and that should have been plain from my use of the word "like" in the description. My human-compatible AGI would have very quickly understood the remainder of that sentence to be a simile and interpreted it (unlike yourself) correctly. Moreover, that simile extended only to the way an AGI might learn by description. By definition, I can't say it had nothing to do with my subconscious - but then, neither (especially) can you.

There's a lot more to language than meets the eye - or could ever meet the eye of a non-grounded AGI.

Maybe, but you have yet to convince me a human-compatible AGI would need it.

P.S. I would suggest as a matter of practice here that anyone who wants to argue a position should ALWAYS PROVIDE AN EXAMPLE OR TWO - of say a sentence or even a single word that they think can be understood with or without grounding. (Sorry Bob M., I think that's worth shouting about). Argument without examples here should be regarded as shoddy, inferior intellectual practice.

Suggest all you want. I think I am a very clear and concise writer. So do other people. I've won awards for it. I do use examples when I feel they will help others better understand what I'm writing about. I think I probably want people to understand what I write more than you do. I confess to assuming a certain level of knowledge about AI/AGI on the part of my audience when I write for this list. I also expect them to be facile with Google and Wikipedia should I fall short. If that's not good enough for you, just refrain from replying to my posts. Or, now here's a radical idea: ask me for clarification *before* launching into a critique based on your misunderstanding.

P.P.S. A possible further example of the workings of the unconscious mind. Is it possible that your sentence has an echo - *in your mind* - of Pink Floyd's Another Brick in the Wall with "We don't need no education" ?

Nope. Wrong again. At least you're consistent. That line actually comes from a Cheech and Chong skit (or a movie -- can't remember which at the moment) where the guys are trying to get information by posing as cops. At least I think that's the setup. When the person they're attempting to question asks to see their badges, Cheech replies, "Badges? We don't need no stinking badges!"

Having been a young adult in the 1960s and 1970s, I am, of course, a long-time Pink Floyd fan. In fact, one of my Pandora (http://www.pandora.com) stations is set up so that I hear something by PF at least once a week.


Brad

-------------------------------------------
agi
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/
Modify Your Subscription: https://www.listbox.com/member/?&;
Powered by Listbox: http://www.listbox.com


