Marc Walser wrote

Try to get the name right.  It's just common competence and courtesy.

Before you ask for counter examples you should *first* give some arguments which supports your hypothesis. This was my point.

And I believe that I did. And I note that you didn't even address the fact that I did so again in the e-mail you are quoting. You seem to want to address trivia rather than the meat of the argument. Why don't you address the core instead of throwing up a smokescreen?

Regarding your example with Darwin:

What example with Darwin?

First, I'd appreciate it if you'd drop the strawman. You are the only one who keeps insisting that anything is "easy".
Is this a scientific discussion from you? No. You use rhetoric and nothing else.

And baseless statements like "You use rhetoric and nothing else" are a scientific discussion? Again with the smokescreen.

I don't say that anything is easy.

Direct quote cut and paste from *your* e-mail . . . .
------------------------------------------------------
From: Dr. Matthias Heger
To: [email protected]
Sent: Sunday, October 19, 2008 2:19 PM
Subject: AW: AW: [agi] Re: Defining AGI


The process of translating patterns into language should be easier than the process of creating patterns or manipulating patterns. Therefore I say that language understanding is easy.

------------------------------------------------------------------

Clearly you DO say that language understanding is easy.

This is the first time you speak about pre-requisites.

Direct quote cut and paste from *my* e-mail . . . . .
--------------------------------------------------------
----- Original Message ----- From: "Mark Waser" <[EMAIL PROTECTED]>
To: <[email protected]>
Sent: Sunday, October 19, 2008 4:01 PM
Subject: Re: AW: AW: [agi] Re: Defining AGI


I don't think that learning of language is the entire point. If I have only learned language I still cannot create anything. A human who can understand language is by far still no good scientist. Intelligence means the ability to solve problems. Which problems can a system solve if it can nothing else than language understanding?

Many or most people on this list believe that learning language is an
AGI-complete task.  What this means is that the skills necessary for
learning a language are necessary and sufficient for learning any other
task.  It is not that language understanding gives general intelligence
capabilities, but that the pre-requisites for language understanding are
general intelligence (or, that language understanding is isomorphic to
general intelligence in the same fashion that all NP-complete problems are
isomorphic).  Thus, the argument actually is that a system that "can do
nothing else than language understanding" is an oxymoron.

-----------------------------------------------------------------------------

Clearly I DO talk about the pre-requisites for language understanding.

====================================================

Dude.  Seriously.

First you deny your own statements and then claim that I didn't previously mention something that I demonstrably did mention (at the top of an e-mail). Check the archives. It's all there in bits and bytes.

Then you end with a funky pseudo-definition that "Understanding does not imply the ability to create something new or to apply knowledge." What *does* understanding mean if you can't apply it? What value does it have?

-------------------------------------------
agi
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/
Modify Your Subscription: 
https://www.listbox.com/member/?member_id=8660244&id_secret=117534816-b15a34
Powered by Listbox: http://www.listbox.com