> From: Matt Mahoney [mailto:[EMAIL PROTECTED]
> 
> --- On Sun, 9/7/08, John G. Rose <[EMAIL PROTECTED]> wrote:
> 
> > From: John G. Rose <[EMAIL PROTECTED]>
> > Subject: RE: Language modeling (was Re: [agi] draft for comment)
> > To: agi@v2.listbox.com
> > Date: Sunday, September 7, 2008, 9:15 AM
> > > From: Matt Mahoney [mailto:[EMAIL PROTECTED]
> > >
> > > --- On Sat, 9/6/08, John G. Rose
> > <[EMAIL PROTECTED]> wrote:
> > >
> > > > Compression in itself has the overriding goal of reducing
> > > > storage bits.
> > >
> > > Not the way I use it. The goal is to predict what the environment
> > > will do next. Lossless compression is a way of measuring how well
> > > we are doing.
> > >
> >
> > Predicting the environment in order to determine which data to
> > pack where, thus achieving a higher compression ratio. Or is
> > compression an integral part of prediction? Some types of
> > prediction are inherently compressed, I suppose.
> 
> Predicting the environment to maximize reward. Hutter proved that
> universal intelligence is a compression problem. The optimal behavior
> of an AIXI agent is to guess the shortest program consistent with the
> observations so far. That's algorithmic compression.
> 
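
On the measuring part: an ideal entropy coder spends -log2(p) bits on
a symbol the model assigned probability p, so the compressed size is
literally the accumulated prediction cost. Quick toy sketch (mine, with
a made-up adaptive bigram model, not anything from Matt or Hutter):

import math
from collections import defaultdict

def code_length_bits(text):
    """Bits an ideal arithmetic coder would need for text under an
    adaptive, add-one-smoothed bigram character model (a toy stand-in
    for a real predictor)."""
    alphabet = sorted(set(text))           # toy: alphabet known up front
    counts = defaultdict(lambda: defaultdict(int))
    totals = defaultdict(int)
    bits, context = 0.0, ""
    for ch in text:
        # probability the model gives this character BEFORE seeing it
        p = (counts[context][ch] + 1) / (totals[context] + len(alphabet))
        bits += -math.log2(p)              # ideal code cost of the symbol
        counts[context][ch] += 1           # then update the model
        totals[context] += 1
        context = ch
    return bits

print(code_length_bits("abababababababababab"))  # few bits: predictable
print(code_length_bits("qwmzpkvjxhgnbrtcydsu"))  # many bits: it isn't

A better predictor means a smaller number, with no separate notion of
"packing" needed.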

And on the AIXI part: oh, I see. Guessing the shortest program =
compression. OK, right. But yeah, like Pei said, the word "compression"
is misleading: it implies a reduction, when what you're actually doing
is increasing understanding :)
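
And a toy version of the shortest-program part too (again mine, over a
deliberately tiny program space of repeating patterns, since real
Solomonoff induction over all programs is incomputable):

def shortest_consistent_program(observed):
    """Shortest repeating pattern that regenerates observed exactly:
    the 'shortest program' in this tiny toy program space."""
    for length in range(1, len(observed) + 1):   # try shortest first
        pattern = observed[:length]
        stretched = (pattern * (len(observed) // length + 1))[:len(observed)]
        if stretched == observed:
            return pattern

def predict_next(observed):
    """Run the shortest consistent program one step past the data:
    compression doing the predicting."""
    pattern = shortest_consistent_program(observed)
    return pattern[len(observed) % len(pattern)]

print(shortest_consistent_program("abcabcabc"))  # 'abc': 3 symbols for 9
print(predict_next("abcabcab"))                  # 'c'

The same object does both jobs: the short pattern *is* the compressed
form of the data, and running it forward *is* the prediction.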

John



