[agi] Marcus Hutter's lossless compression of human knowledge prize

2006-08-12 Thread Ben Goertzel
Hi. About the Hutter Prize (see the end of this email for a quote of the post I'm responding to, which was posted a week or two ago)... While I have the utmost respect for Marcus Hutter's theoretical work on AGI, and I do think this prize is an interesting one, I also want to state that I don't

Re: [agi] Marcus Hutter's lossless compression of human knowledge prize

2006-08-12 Thread Shane Legg
Ben, So you think that Powerful AGI == good Hutter test result, but you have a problem with the reverse implication, good Hutter test result =/= Powerful AGI. Is this correct? Shane

Re: [agi] Marcus Hutter's lossless compression of human knowledge prize

2006-08-12 Thread Matt Mahoney
A common objection to compression as a test for AI is that humans can't do compression, so it has nothing to do with AI. The reason people can't compress is that compression requires both AI and deterministic computation. The human brain is not deterministic because it is made of neurons,
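
A minimal sketch of the determinism point, in Python (the toy models are invented for illustration, not taken from the post): a lossless decompressor must recompute exactly the probabilities the compressor used, which a noisy, brain-like predictor cannot guarantee.

    import random

    def det_model(context):
        # Pure function of its input: a decompressor can rerun it and
        # obtain the identical probabilities the compressor used.
        n_e = context.count('e')
        p_e = (n_e + 1) / (len(context) + 2)   # Laplace-smoothed P('e')
        return {'e': p_e, 'other': 1.0 - p_e}

    def noisy_model(context):
        # "Brain-like" predictor with internal noise: two calls on the
        # same context almost surely disagree, so an entropy coder
        # driven by it would lose sync between encoder and decoder.
        p_e = det_model(context)['e'] + random.gauss(0, 0.05)
        p_e = min(max(p_e, 0.01), 0.99)
        return {'e': p_e, 'other': 1.0 - p_e}

    ctx = "the quick brown fox jumps over the lazy dog"
    assert det_model(ctx) == det_model(ctx)    # reproducible
    print(noisy_model(ctx), noisy_model(ctx))  # almost surely differ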

Re: [agi] Marcus Hutter's lossless compression of human knowledge prize

2006-08-12 Thread Ben Goertzel
Howdy Shane, I'll try to put my views in your format. I think that Extremely powerful, vastly superhuman AGI == outstanding Hutter test result, whereas Human-level AGI =/= Good Hutter test result, just as Human =/= Good Hutter test result, and for this reason I consider the Hutter test a

[agi] FYI: The Human Speechome Project

2006-08-12 Thread Pei Wang
See the paper at http://www.cogsci.rpi.edu/CSJarchive/Proceedings/2006/docs/p2059.pdf ABSTRACT: The Human Speechome Project is an effort to observe and computationally model the longitudinal course of language development of a single child at an unprecedented scale. The idea is this: Instrument

Re: [agi] Marcus Hutter's lossless compression of human knowledge prize

2006-08-12 Thread Shane Legg
That seems clear: Human-level AGI =/= Good Hutter test result, just as Human =/= Good Hutter test result. My suggestion then is to very slightly modify the test as follows: instead of just getting the raw characters, what you get is the sequence of characters and the probability distribution over the
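
A sketch of how such a modified test might be scored (one reading of the truncated proposal; the function names are hypothetical): judge the entrant's per-character distributions directly by log-loss, which equals the size an ideal arithmetic coder would achieve.

    import math

    def log_loss_bits(text, predict):
        # `predict(prefix)` returns a dict: next character -> probability.
        # The total -log2 p is the ideal compressed size in bits.
        bits = 0.0
        for i, ch in enumerate(text):
            p = predict(text[:i]).get(ch, 1e-12)  # floor avoids log(0)
            bits -= math.log2(p)
        return bits

    # Baseline: uniform over 256 byte values costs 8 bits per character.
    uniform = lambda prefix: {chr(b): 1 / 256 for b in range(256)}
    print(log_loss_bits("hello", uniform))  # 40.0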

Re: [agi] Marcus Hutter's lossless compression of human knowledge prize

2006-08-12 Thread Pei Wang
Matt, To summarize and generalize data, and to use the summary to predict the future, is no doubt at the core of intelligence. However, I do not call this process compression, because the result is not faultless; that is, there is information loss. It is not only because human brains are

Re: [agi] Marcus Hutter's lossless compression of human knowledge prize

2006-08-12 Thread Shane Legg
Yes, I think a hybridized AGI and compression algorithm could do better than either one on its own. However, this might result in an incredibly slow compression process, depending on how fast the AGI thinks. (It would take ME a long time to carry out this process over the whole Hutter

Re: [agi] Marcus Hutter's lossless compression of human knowledge prize

2006-08-12 Thread Ben Goertzel
I don't think it's anywhere near that much. I read at about 2 KB per minute, and I listen to speech (if written down as plain text) at a roughly similar speed. If you then work it out, by the time I was 20 I'd read/heard not more than 2 or 3 GB of raw text. If you could compress/predict
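
Ben's back-of-envelope figure checks out under plausible assumptions (hours of exposure per day is a guess; only the 2 KB/minute rate is stated in the post):

    kb_per_min = 2        # stated reading/listening rate
    hours_per_day = 4     # assumed daily exposure to text and speech
    years = 20
    total_kb = kb_per_min * 60 * hours_per_day * 365 * years
    print(total_kb / 1e6, "GB")   # ~3.5 GB, consistent with "2 or 3 GB"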

Re: [agi] Marcus Hutter's lossless compression of human knowledge prize

2006-08-12 Thread Russell Wallace
On 8/12/06, Matt Mahoney [EMAIL PROTECTED] wrote: "In order to compress text well, the compressor must be able to estimate probabilities over text strings, i.e. predict text." Um, no, the compressor doesn't need to predict anything - it has the entire file already at hand. The _de_compressor would

Re: [agi] Marcus Hutter's lossless compression of human knowledge prize

2006-08-12 Thread Matt Mahoney
First, the compression problem is not in NP. The general problem of encoding strings as the smallest programs that output them is undecidable. Second, given a model, compression is the same as prediction. A model is a function that maps any string s to an estimated probability p(s). A compressor
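
The model-to-compressor step being described here is the standard arithmetic-coding argument (a sketch, not code from the thread): a compressor driven by model p can encode s in about -log2 p(s) bits, so a better predictor means shorter output.

    import math

    def ideal_code_length_bits(p_of_s):
        # Shannon / arithmetic coding: a model assigning probability
        # p(s) yields a code of about -log2 p(s) bits (+ ~2 bits overhead).
        return -math.log2(p_of_s)

    print(ideal_code_length_bits(2 ** -8000))  # weaker model: 8000.0 bits
    print(ideal_code_length_bits(2 ** -6000))  # better model: 6000.0 bits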

Re: [agi] Marcus Hutter's lossless compression of human knowledge prize

2006-08-12 Thread Russell Wallace
On 8/12/06, Matt Mahoney [EMAIL PROTECTED] wrote: "First, the compression problem is not in NP. The general problem of encoding strings as the smallest programs that output them is undecidable." But as I said, it becomes NP when there's an upper limit on decompression time. Second, given a model,

Re: [agi] Marcus Hutter's lossless compression of human knowledge prize

2006-08-12 Thread Shane Legg
"But Shane, your 19-year-old self had a much larger and more diverse volume of data to go on than just the text or speech that you ingested..." I would claim that a blind and deaf person at 19 could pass a Turing test if they had been exposed to enough information over the years. Especially if they had

Re: [agi] Marcus Hutter's lossless compression of human knowledge prize

2006-08-12 Thread Russell Wallace
On 8/13/06, Matt Mahoney [EMAIL PROTECTED] wrote: Whether or not a compressor implements a model as a predictor is irrelevant. Modeling the entire input at once is mathematically equivalent to predicting successive symbols. Even if you think you are not modeling, you are. If you design a
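
The equivalence asserted here is the chain rule of probability, p(s) = prod_i p(s_i | s_1..s_{i-1}): a whole-string model and a successive-symbol predictor assign the same total probability, hence the same ideal code length. A toy illustration (the conditional model below is invented for the example):

    import math

    def p_whole(s, cond):
        # Multiply successive-symbol predictions to recover p(s).
        p = 1.0
        for i, ch in enumerate(s):
            p *= cond(s[:i], ch)
        return p

    def cond(prefix, ch):
        # Toy predictor over 26 letters: repeats of the last letter
        # get 0.9; the other 25 letters share the remaining 0.1.
        if not prefix:
            return 1 / 26
        return 0.9 if ch == prefix[-1] else 0.1 / 25

    p = p_whole("aabb", cond)
    print(p, -math.log2(p), "bits")  # same size either way you model it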

Re: [agi] Marcus Hutter's lossless compression of human knowledge prize

2006-08-12 Thread Pei Wang
Matt, So you mean we should leave forgetting out of the picture just because we don't know how to objectively measure it? Though objectivity is indeed desired for almost all measurements, it is not the only requirement for a good measure of intelligence. Someone can objectively measure a

Re: [agi] Marcus Hutter's lossless compression of human knowledge prize

2006-08-12 Thread Yan King Yin
I think compression is essential to intelligence, but the difference between lossy and lossless may make the algorithms quite different. But why not let competitors compress lossily? As far as prediction goes, the testing part is still the same! If you guys have a lossy version of the prize I will

Re: [agi] Marcus Hutter's lossless compression of human knowledge prize

2006-08-12 Thread Matt Mahoney
Hutter's only assumption about AIXI is that the environment can be simulated by a Turing machine. With regard to forgetting, I think it plays a minor role in language modeling compared to vision and hearing. To model those, you need to understand what the brain filters out. Lossy compression

Re: [agi] Marcus Hutter's lossless compression of human knowledge prize

2006-08-12 Thread Russell Wallace
On 8/13/06, Matt Mahoney [EMAIL PROTECTED] wrote: "There is no knowledge that you can demonstrate verbally that cannot also be learned verbally." An unusual claim... do you mean all knowledge can be learned verbally, or do you think there are some kinds of knowledge that cannot be demonstrated

Re: [agi] Marcus Hutter's lossless compression of human knowledge prize

2006-08-12 Thread J. Andrew Rogers
On Aug 12, 2006, at 6:27 PM, Yan King Yin wrote: "I think compression is essential to intelligence, but the difference between lossy and lossless may make the algorithms quite different." For general algorithms (e.g. ones that do not play to the sensory biases of humans) there should be

Re: [agi] Marcus Hutter's lossless compression of human knowledge prize

2006-08-12 Thread Eliezer S. Yudkowsky
As long as we're talking about fantasy applications that require superhuman AGI, I'd be impressed by a lossy compression of Wikipedia that decompressed to a non-identical version carrying the same semantic information. -- Eliezer S. Yudkowsky http://singinst.org/