--- On Sun, 12/28/08, Philip Hunt <[email protected]> wrote:

> Now, consider if I build a program that can predict how
> some sequences will continue. For example, given
> 
>    ABACADAEA
> 
> it'll predict the next letter is "F", or
> given:
> 
>   1 2 4 8 16 32
> 
> it'll predict the next number is 64.

Please remember that I am not proposing compression as a solution to the AGI 
problem. I am proposing it as a measure of progress in an important component 
(prediction). Neither zip nor any entry in the Loebner contest will predict the 
next item in these sequences because they aren't very intelligent. The 
challenge for you is to solve problems like this. If you write a Loebner prize 
entry that does, it has a greater chance of winning. If you write a compressor 
that does, it will compress smaller because it will be able to assign smaller 
codes to the predicted symbols.
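To make the "smaller codes" point concrete, here is a minimal sketch (not part of the original mail, and the 90% figure is an arbitrary assumption) of the standard information-theoretic fact that an arithmetic coder spends about -log2 p bits on a symbol the model assigns probability p, so a model that predicts the next letter well pays far fewer bits for it:

```python
import math

def code_length_bits(probs, symbol):
    # Ideal arithmetic-coding cost of `symbol` under model `probs`:
    # approximately -log2 p(symbol) bits.
    return -math.log2(probs[symbol])

# Uniform model over 26 letters: no prediction at all.
uniform = {chr(ord('A') + i): 1 / 26 for i in range(26)}

# A hypothetical pattern-aware model that, after seeing ABACADAEA,
# puts 90% of its probability mass on the predicted letter "F".
aware = {c: 0.1 / 25 for c in uniform}
aware['F'] = 0.9

print(code_length_bits(uniform, 'F'))  # about 4.70 bits
print(code_length_bits(aware, 'F'))    # about 0.15 bits
```

The better predictor codes the same symbol in roughly a thirtieth of the bits, which is exactly why a compressor built on it "compresses smaller".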

Random Turing machines are more likely to generate sequences with recognizable 
patterns than sequences without. That is my justification for testing with such 
data. There are many machine learning algorithms that recognize these patterns 
faster than AIXI^tl (which, in effect, guesses machines at random). Obviously we 
must use some of them, or we could never solve such problems in practice. The 
challenge for you is to discover these algorithms.
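As one illustration of an algorithm that is far cheaper than enumerating Turing machines, here is a sketch (my own toy example, not anything Matt proposed) that checks two simple hypothesis classes, constant difference and constant ratio, and thereby gets the 1 2 4 8 16 32 sequence right:

```python
def predict_next(seq):
    # Try two cheap hypothesis classes instead of enumerating programs:
    # first constant difference (arithmetic), then constant ratio (geometric).
    diffs = [b - a for a, b in zip(seq, seq[1:])]
    if len(set(diffs)) == 1:
        return seq[-1] + diffs[0]
    ratios = [b / a for a, b in zip(seq, seq[1:]) if a != 0]
    if len(ratios) == len(seq) - 1 and len(set(ratios)) == 1:
        return seq[-1] * ratios[0]
    return None  # pattern is outside this tiny hypothesis class

print(predict_next([1, 2, 4, 8, 16, 32]))  # 64.0
print(predict_next([3, 5, 7, 9]))          # 11
```

Of course this only covers two pattern families; the point is that a direct test of a restricted hypothesis class runs in linear time, where a brute-force search over machines would not.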

-- Matt Mahoney, [email protected]



-------------------------------------------
agi
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/