This topic has been discussed on this list several times.

A previous post of mine can be found at
http://www.listbox.com/member/archive/303/2007/10/sort/time_rev/page/13/entry/22

Pei

On Mon, Aug 4, 2008 at 2:55 PM, Harry Chesley <[EMAIL PROTECTED]> wrote:
> As I've come out of the closet over the list tone issues, I guess I should
> post something AI-related as well -- at least that will make me net neutral
> between relevant and irrelevant postings. :-)
>
> One of the classic current AI issues is grounding (the "symbol grounding
> problem"): the argument is that a dictionary can never be complete because
> it is purely self-referential -- every word is defined only in terms of
> other words -- and so it *has* to be grounded at some point to be truly
> meaningful. This argument is used to claim that purely abstract AI can
> never succeed, and that an AI must have a physical component connecting it
> to reality.
>
> I have never bought this line of reasoning. It seems to me that meaning is
> layered, and that you can do perfectly good reasoning at one (or two, or
> three) levels of the layering without having to go "all the way down." And
> if that layering turns out to be circular (as it is in a dictionary taken
> purely on its own terms), that in no way invalidates the reasoning done at
> any given level.
>
> My own AI work makes no attempt at grounding, so I'm really hoping I'm right
> here.
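
To make the circularity point concrete, here is a minimal sketch in
Python. The toy dictionary and the overlap measure are illustrative
assumptions of mine, not anything from Harry's post: every entry is
defined only in terms of other entries in the same dictionary (fully
circular, no grounding), yet purely structural comparison over it
still gives useful answers.

  # word -> set of words used in its "definition"; deliberately circular
  dictionary = {
      "big":    {"large", "huge"},
      "large":  {"big", "huge"},
      "huge":   {"big", "large"},
      "cold":   {"chilly", "icy"},
      "chilly": {"cold", "icy"},
      "icy":    {"cold", "chilly"},
  }

  def relatedness(a, b):
      # Jaccard overlap of definitional neighborhoods. No word here is
      # ever tied to a sensor or to the physical world; the score falls
      # out of the circular structure alone.
      na = dictionary[a] | {a}
      nb = dictionary[b] | {b}
      return len(na & nb) / len(na | nb)

  print(relatedness("big", "large"))  # 1.0 -- same definitional cluster
  print(relatedness("big", "cold"))   # 0.0 -- disjoint clusters

Distributional semantics makes essentially the same bet at scale:
meanings induced purely from word-word co-occurrence statistics, with
no physical grounding, still support useful similarity judgments.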

