Sent: Thursday, October 04, 2007 4:42 PM
Subject: Re: [agi] Language and compression
--- Mark Waser <[EMAIL PROTECTED]> wrote:
> Matt Mahoney pontificated:
> > The probability distribution of language coming out through the mouth
> > is the same as the distribution coming in through the ears.
>
> Wrong.

Could you explain how they differ and why it would matter? Rememb
Matt Mahoney pontificated:
> The probability distribution of language coming out through the mouth is
> the same as the distribution coming in through the ears.

Wrong.

> My goal is not to compress text but to be able to compute its probability
> distribution. That problem is AI-hard.

Wrong again.
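The equivalence Mahoney leans on here is the arithmetic-coding bound: a model that assigns probability p to a string can, in principle, encode it in about -log2 p bits, so better prediction means better compression. A minimal sketch with a hypothetical unigram model (real language models condition on context; the text and model here are illustrative only):

```python
import math
from collections import Counter

def ideal_code_length_bits(text, model_counts, total):
    """Bits needed to encode `text` under a unigram model, via the
    arithmetic-coding bound: sum of -log2 p(char) over the text."""
    return sum(-math.log2(model_counts[c] / total) for c in text)

# Build a unigram model from the text itself -- a stand-in for a
# learned language model, not a serious one.
text = "the quick brown fox jumps over the lazy dog"
counts = Counter(text)
bits = ideal_code_length_bits(text, counts, len(text))

# A uniform model over the observed alphabet codes every character
# at log2(alphabet size) bits; the skewed unigram model beats it.
uniform_bits = len(text) * math.log2(len(counts))
assert bits < uniform_bits
```

The point of the sketch is only the direction of the inequality: any improvement in the probability estimates shows up directly as a shorter ideal code length.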
--- Russell Wallace <[EMAIL PROTECTED]> wrote:
> On 10/4/07, Matt Mahoney <[EMAIL PROTECTED]> wrote:
> > And text is the only data type with this property. Images, audio,
> > executable code, and seismic data can all be compressed with very
> > little memory.
>
> How sure are we of that? Of course all those things _can_ be
> compressed with very little memor
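For a concrete sense of "very little memory": run-length encoding scans its input with O(1) working state, in contrast to text models that keep large context statistics. A minimal sketch (the helper names are made up for illustration):

```python
def rle_encode(data: bytes) -> list:
    """Run-length encode `data` using O(1) working memory: emit
    (byte_value, run_length) pairs, one pass, no tables."""
    runs = []
    prev, count = None, 0
    for b in data:
        if b == prev:
            count += 1
        else:
            if prev is not None:
                runs.append((prev, count))
            prev, count = b, 1
    if prev is not None:
        runs.append((prev, count))
    return runs

def rle_decode(runs) -> bytes:
    """Invert rle_encode: expand each (value, length) pair."""
    return b"".join(bytes([v]) * n for v, n in runs)

# Two long runs collapse to two pairs, and the round trip is lossless.
sample = b"\x00" * 1000 + b"\xff" * 24
assert rle_decode(rle_encode(sample)) == sample
assert len(rle_encode(sample)) == 2
```

The memory the encoder holds at any moment is one byte value and one counter, independent of input size; that is the property being claimed for image, audio, and seismic compressors.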
--- Vladimir Nesov <[EMAIL PROTECTED]> wrote:
> The same probably goes for text
> compression: clever (but not intelligent) statistics-gathering
> algorithm on texts can probably do a much better job for compressing
> than human-like intelligence which just chunks this information
> according to i
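Nesov's point can be illustrated numerically: even an order-1 (bigram) character model, pure statistics-gathering with no understanding, pays far fewer bits per character than an order-0 model on repetitive text. A rough sketch, assuming the model's counts are gathered from the text itself:

```python
import math
from collections import Counter, defaultdict

def bits_per_char_order0(text):
    """Order-0 entropy: cost per character ignoring all context."""
    n = len(text)
    return sum(-c / n * math.log2(c / n) for c in Counter(text).values())

def bits_per_char_order1(text):
    """Average -log2 p(next | prev): what an order-1 statistical
    model pays per character, with counts taken from the text."""
    ctx = defaultdict(Counter)
    for a, b in zip(text, text[1:]):
        ctx[a][b] += 1
    total_bits = 0.0
    for a, b in zip(text, text[1:]):
        total_bits += -math.log2(ctx[a][b] / sum(ctx[a].values()))
    return total_bits / (len(text) - 1)

# On patterned text, one character of context already cuts the bill.
text = "the cat sat on the mat and the rat sat on the hat " * 20
assert bits_per_char_order1(text) < bits_per_char_order0(text)
```

Raising the context order continues this trend, which is exactly why "clever but not intelligent" context models dominate text-compression benchmarks.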
On 10/4/07, Vladimir Nesov <[EMAIL PROTECTED]> wrote:
> On 10/4/07, Russell Wallace <[EMAIL PROTECTED]> wrote:
> > Suppose 50% is the absolute max you can get - that's still worth
> > having, in cases where you don't want to throw away data.
>
> But why is it going to correlate with intelligence?
On 10/4/07, Matt Mahoney <[EMAIL PROTECTED]> wrote:
> Yes, but it has nothing to do with AI. You are modeling physics, a much
> harder problem.
Well, I think compression in general doesn't have much to do with AI,
like I said before :) But I'm surprised you call physics modeling a
harder problem,
On 10/4/07, Matt Mahoney <[EMAIL PROTECTED]> wrote:
> Lossless video compression would not get far. The brightness of a pixel
> depends on the number of photons striking the corresponding CCD sensor. The
> randomness due to quantum mechanics is absolutely incompressible and makes up
> a significa
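The incompressibility claim is easy to check empirically with a general-purpose compressor: uniform random bytes (a stand-in for sensor noise, not actual CCD data) do not shrink, while structured bytes of the same length collapse. A small sketch using Python's zlib:

```python
import os
import zlib

# Random bytes are essentially incompressible: DEFLATE falls back to
# stored blocks and the output is no smaller than the input.
noise = os.urandom(100_000)
assert len(zlib.compress(noise, 9)) >= len(noise) * 0.99

# Equally long but highly structured data compresses dramatically.
structured = bytes(range(256)) * (100_000 // 256)
assert len(zlib.compress(structured, 9)) < len(structured) * 0.05
```

Any lossless video coder must spend bits on the noise component at full cost, which is the force of the argument above; lossy coders sidestep it by discarding that component.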
Lossless compression can be far from what intelligence does, because the
structure of categorization that intelligence performs on the world
probably doesn't correspond to its probabilistic structure.
As I see it, an intelligent system can't infer many universal laws that
will hold in the distant future a
On 9/23/07, Matt Mahoney <[EMAIL PROTECTED]> wrote:
> I realize that a language model must encode both the meaning of a text string
> and its representation. This makes lossless compression an inappropriate test
> for evaluating models of visual or auditory perception. The tiny amount of
> releva