----- Original Message ----- 
From: "Matt Mahoney" <[EMAIL PROTECTED]>
To: <agi@v2.listbox.com>
Sent: Tuesday, April 17, 2007 7:15 PM
Subject: Goals of AGI (was Re: [agi] AGI interests)


> In http://cs.fit.edu/~mmahoney/compression/rationale.html I argue the
> equivalence of text compression with AI.

I read your article and I am even more convinced that the above conclusion is
wrong.  You seem to assume that humans maintain some huge database of
common-sense facts and that this database creates their intelligence.  I think
information is important and necessary for intelligence, but it is definitely
not sufficient.  If I understand a concept, I can generate new facts from that
understanding; if I merely have a lot of facts about a domain but don't
*understand* the relationships among those facts, I won't be able to produce
any fact I haven't memorized.  I fail to see how a mass of accumulated
information can, by itself, be called intelligent.  Google is a perfect
example of huge information and no intelligence.

If information by itself can't create intelligence, then compression of that
information won't either.  Citing the Turing test, the Loebner Prize, etc.
does nothing to prove that information alone creates intelligence; those tests
and prizes show me nothing about intelligence.  A chatbot called Alice matches
input phrases with canned responses.  If its database were big enough, Alice
would look as if it had some intelligence, whereas if you look at the program,
you can see this is not true.  In Alice's case, if any intelligence is seen,
it is just the stored intelligence of the human who entered the text and has
nothing to do with Alice.  If it walks like a duck and talks like a duck, it
might not be a duck after all, but just a charlatan's trick!

To prove that compression is equivalent to AGI you would have to prove that
information alone can create intelligence.

You state, "Given a probability distribution P over strings s, the Shannon
capacity theorem states that the optimal (shortest average) code for s has
length log2 1/P(s) bits."  My words are not just some probability
distribution; they actually mean something.  They have purpose and reason.
Optimal code doesn't mean intelligence.  I might solve a problem in a very
roundabout way, but the fact that I got a workable solution is sufficient to
call that process intelligent.  If the problem is a one-off, then any further
compression or optimization is a waste of time.  All programmers know this
simple fact: optimization is seldom worthwhile, depending on how frequently
the code will be executed.  The quickest and best "code" I ever wrote were the
programs my customers requested that were deemed unnecessary once we looked at
the problem and decided the existing code would do!  Can any compression be
better than that?

I don't deny that text compression is hard, but what is its connection to
AGI?  Estimating the total information capacity of a human is also
irrelevant to intelligence.  The capacity of humans to store information is
approximately equal across the population, but the intelligence of humans is
hugely *unequal*.  Without some quantity of knowledge a person wouldn't be
considered intelligent, but more importantly, it is the *quality* of that
knowledge and understanding that predicts the resulting intelligence.

If a huge statistical database of valid English text could be parsed
(compressed or otherwise), it might be possible to predict with some
accuracy whether a given sentence is likely to be grammatically correct.
That capability seems far removed from an AGI, IMHO.
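To make that point concrete, here is a crude bigram scorer in Python (a sketch; the tiny corpus and the scoring rule are my own invention).  It ranks a sentence in normal word order above a scrambled one purely from co-occurrence counts, with no understanding anywhere in sight:

```python
from collections import Counter

# Toy "database" of valid English (invented for illustration).
corpus = "the cat sat on the mat . the dog sat on the rug .".split()

# Count which adjacent word pairs (bigrams) appear in the corpus.
bigrams = Counter(zip(corpus, corpus[1:]))

def score(sentence):
    """Count how many adjacent word pairs of the sentence were seen
    in the corpus -- a purely statistical 'grammaticality' signal."""
    words = sentence.split()
    return sum(1 for pair in zip(words, words[1:]) if pair in bigrams)

print(score("the cat sat on the mat"))   # 5 -- every bigram was seen
print(score("mat the on sat cat the"))   # 0 -- no bigram was seen
```

A big enough version of this trick can sort plausible English from word salad, which is exactly the kind of capability that seems far removed from an AGI.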

If a book is stored in a computer and I then refer to that book by its title
1M times, what is my percentage of compression?  If you think it is high,
then show me where the intelligence lies in that reference.  By using simple
references, humans compress huge amounts of data that would otherwise consume
more storage than our brains could physically handle.  The problem is that
this kind of compression is accomplished through understanding.  If you can
crack the *understanding* part of compression, then you might have an AGI,
but I fail to see how merely compressing data will result in understanding.
Compression and understanding are not reciprocal concepts.  If humans had
unlimited storage, so that compressing information weren't necessary,
wouldn't a human's *understanding* still confer intelligence on that human?
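The back-of-the-envelope arithmetic here can be made explicit.  A Python sketch, with sizes I have assumed for illustration (the book length, title length, and reference count are not from the original question):

```python
# Assumed sizes -- illustrative only.
book_bytes = 500_000        # a ~500 KB text
title_bytes = 20            # a short title used as the reference
references = 1_000_000      # refer to the book 1M times

# Naive storage: the full book copied out at every reference.
naive = book_bytes * references

# Reference-based storage: one copy of the book plus 1M short titles.
by_reference = book_bytes + title_bytes * references

ratio = by_reference / naive
print(f"reference-based storage is {ratio:.4%} of the naive size")
```

Under these assumptions the title-as-reference scheme needs only a few thousandths of a percent of the naive storage, which is the "high" compression the paragraph above alludes to; the intelligence, if any, lies in knowing that the title stands for the book, not in the ratio itself.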

-- David Clark


-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?member_id=231415&user_secret=fabd7936