As Loosemore has argued, compression is a poor AGI test in general, as shown by the fact that humans are generally intelligent but are poor compressors! Some AGIs may be great compressors, others not.
Novamente, as it happens, could be turned into a great compressor pretty easily once it became highly generally intelligent. But that needn't be true of all AGI systems.

Also, compression is a poor incremental test for progress toward AGI, because some systems that constituted considerable progress toward AGI might be terrible compressors. For instance, if someone built a robotic dog that was as good as a real dog at perception, cognition and action, I would consider that a big step toward powerful AGI. But dogs really suck at compression. (Yeah, their brains may carry out compression operations internally. But, if you give a dog an explicit compression problem to solve, it will not give a very useful or impressive answer...)

-- Ben G

On 4/24/07, Matt Mahoney <[EMAIL PROTECTED]> wrote:
> --- Benjamin Goertzel <[EMAIL PROTECTED]> wrote:
> > I don't think there are any good, general incremental tests for progress
> > toward AGI.
>
> Compression?
>
> -- Matt Mahoney, [EMAIL PROTECTED]
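[Editor's note: for readers unfamiliar with the kind of test Mahoney is proposing, a minimal sketch of scoring a system on an "explicit compression problem" follows. It uses Python's zlib purely as a stand-in compressor; the corpus filename and the bits-per-character metric are illustrative assumptions, not details from this thread.]

```python
# Minimal sketch of a compression-based benchmark harness.
# Assumption: "corpus.txt" is any fixed text corpus; zlib stands in
# for whatever compressor (or AGI system) is being scored.
import zlib


def bits_per_character(path: str, level: int = 9) -> float:
    """Score a compressor by compressed bits per input byte (lower is better)."""
    with open(path, "rb") as f:
        data = f.read()
    compressed = zlib.compress(data, level)
    return 8.0 * len(compressed) / len(data)


if __name__ == "__main__":
    path = "corpus.txt"  # hypothetical corpus file
    print(f"{bits_per_character(path):.3f} bits/char on {path}")
```

On this kind of metric a fixed general-purpose compressor gets a definite score, which is what makes it attractive as an incremental test; Goertzel's point above is that a dog (or a dog-level robot) would score terribly on it while still representing real progress toward AGI.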
