How is Keogh's measure going to help you with that? The problem is that Keogh's measure is intended for data mining, where you have separate instances, not one big entwined Gordian knot.
>> Now if only we had some test to tell which compressors have the best language models...

...misspell a word. Who is smart and who is dumb?

-- Matt Mahoney, [EMAIL PROTECTED]

- Original Message -
From: Mark Waser <[EMAIL PROTECTED]>
To: agi@v2.listbox.com
Sent: Wednesday, August 16, 2006 9:17:52 AM
Subject: Re: Mahoney/Sampo: [agi] Marcus Hutter's lossless compression of human knowledge prize
...essary for AI (and adds a lot of needless complexity) . . . .

So why are we combining the two?
- Original Message -
From: Matt Mahoney
To: agi@v2.listbox.com
Sent: Tuesday, August 15, 2006 6:02 PM
Subject: Re: Mahoney/Sampo: [agi] Marcus Hutter's lossless compression of human knowledge prize

>> Now if only we had some test to tell which compressors have the best language models...
Huh? By definition, the compressor with the best
language model is the one with the highest compression ratio.
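The identification of language modeling with compression rests on cross-entropy: a model that assigns probability p to the text can code it in -log2(p) bits, so better prediction means fewer output bits. A minimal sketch of measuring a compressor's bits per byte on structured versus random input, using zlib as a crude stand-in for a real language model (the sample strings are made-up examples):

```python
import os
import zlib

def bits_per_byte(data: bytes) -> float:
    """Size of the zlib-compressed data, in bits per input byte.
    A compressor with a better model of the data's regularities
    needs fewer bits per byte."""
    return 8 * len(zlib.compress(data, 9)) / len(data)

# Structured text: repetitive, so a model can predict it well.
english = b"the quick brown fox jumps over the lazy dog. " * 200
# Random bytes: nothing to predict, so close to 8 bits/byte is needed.
random_data = os.urandom(len(english))

print(bits_per_byte(english))      # far below 8
print(bits_per_byte(random_data))  # around 8, plus a little overhead
```

The same comparison ranks two compressors on the same corpus: whichever achieves fewer bits per byte has, by this definition, the better model of the text.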
- Original Message -
From: Matt Mahoney
To: agi@v2.listbox.com
Sent: Tuesday, August 15, 2006 3:54 PM
Subject: Re: Mahoney/Sampo: [agi] Marcus Hutter's lossless compression of human knowledge prize
You could use Keogh's compression dissimilarity measure to test for inconsistency. http://www.cs.ucr.edu/~eam
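Keogh's measure is parameter-free: CDM(x, y) = C(xy) / (C(x) + C(y)), where C(·) is the compressed size and xy is the concatenation. A minimal sketch using zlib as the off-the-shelf compressor (any real compressor could be substituted; the sample strings are made-up):

```python
import zlib

def compressed_size(data: bytes) -> int:
    """C(data): length of data after zlib compression at maximum level."""
    return len(zlib.compress(data, 9))

def cdm(x: bytes, y: bytes) -> float:
    """Keogh's compression dissimilarity measure:
    CDM(x, y) = C(x + y) / (C(x) + C(y)).
    Close to 0.5 when x and y are near-identical (compressing them
    together is almost free); close to 1.0 when they share no structure."""
    return compressed_size(x + y) / (compressed_size(x) + compressed_size(y))

text_a = b"the quick brown fox jumps over the lazy dog " * 20
text_b = b"the quick brown fox jumps over the lazy dog " * 20
unrelated = bytes(range(256)) * 4  # shares no structure with the text

print(cdm(text_a, text_b))     # near 0.5: duplicates share everything
print(cdm(text_a, unrelated))  # near 1.0: nothing shared
```

For an inconsistency test along these lines, one would compare CDM of a statement against a corpus it should be redundant with: a low score suggests the statement restates what the corpus already encodes.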
On 8/15/06, Matt Mahoney <[EMAIL PROTECTED]> wrote:

I realize it is tempting to use lossy text compression as a test for AI because that is what the human brain does when we read text and recall it in paraphrased fashion. We remember the ideas and discard details about the expression of those ideas.

-- Matt Mahoney, [EMAIL PROTECTED]

- Original Message -
From: Mark Waser <[EMAIL PROTECTED]>
To: agi@v2.listbox.com
Sent: Tuesday, August 15, 2006 3:22:10 PM
Subject: Re: Mahoney/Sampo: [agi] Marcus Hutter's lossless compression of human knowledge prize
>> Could y...

...p(s1), p(s2), so it will be distinguishable from human even if the compression ratio is ideal.

-- Matt Mahoney, [EMAIL PROTECTED]

- Original Message -
From: Mark Waser <[EMAIL PROTECTED]>
To: agi@v2.listbox.com
Sent: Tuesday, August 15, 2006 9:28:26 AM
Subject: Re: Mahoney/Sampo: [agi] Marcus Hutter's lossless compression of human knowledge prize

...implying are different things.

-- Matt Mahoney, [EMAIL PROTECTED]

- Original Message -
From: Mark Waser <[EMAIL PROTECTED]>
To: agi@v2.listbox.com
Sent: Tuesday, August 15, 2006 12:55:24 PM
Subject: Re: Mahoney/Sampo: [agi] Marcus Hutter's lossless compression of human knowledge prize
- Original Message -
From: Matt Mahoney
To: agi@v2.listbox.com
Sent: Tuesday, August 15, 2006 12:27 PM
Subject: Re: Mahoney/Sampo: [agi] Marcus Hutter's lossless compression of human knowledge prize
>> I don't see any point in this debate over lossless vs. lossy compression

Let's see if I can simplify it.

The stated goal is compressing human knowledge.

The exact same knowledge can always be expressed in a *VERY* large number of different bit strings. Not being able to reprod...
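The many-bit-strings point is exactly where lossless and lossy scoring diverge: a lossless compressor is judged on reproducing the one exact bit string it was given, not the knowledge it expresses. A small illustration with zlib (the sentences are made-up examples):

```python
import zlib

# Two different bit strings expressing the same trivial fact.
s1 = b"The cat sat on the mat."
s2 = b"On the mat sat the cat."

c1 = zlib.compress(s1)
c2 = zlib.compress(s2)

# Lossless round-trips must be bit-exact: each paraphrase decompresses
# to itself, and the two compressed forms are different bit strings.
assert zlib.decompress(c1) == s1
assert zlib.decompress(c2) == s2
assert c1 != c2
```

Under the lossless rules, recalling s2 when you were given s1 counts as total failure, even though a human reader would say the knowledge was preserved.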