Text compression may well be AGI-complete, but I think the problem is still
too big. The difficulty is the source of knowledge. If you restrict the
domain to mathematical expressions, the amount of data needed to teach the
AGI is probably much smaller. In fact, an AGI could teach itself using a
current theorem prover.
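To make that concrete, a minimal sketch of such a bootstrapping loop might
look like the following; everything here is a hypothetical stand-in, with
the Prover interface and the conjecture generator as placeholders for
whatever real prover gets plugged in:

    import random

    class Prover:
        """Hypothetical stand-in for a real theorem prover's interface."""
        def try_prove(self, conjecture, timeout_s=1.0):
            # A real implementation would call an actual prover here;
            # this placeholder simply reports failure on everything.
            return None  # None = no proof found within the timeout

    def generate_conjectures(known, n=100):
        # Hypothetical generator: combine known statements into candidates.
        return ["(%s) -> (%s)" % (random.choice(known), random.choice(known))
                for _ in range(n)]

    def self_teach(axioms, rounds=10):
        # Bootstrapping loop: conjecture, attempt a proof, keep successes.
        # Proved statements become new lemmas (i.e. training data), so the
        # system generates its own supervision without an external corpus.
        prover = Prover()
        known = list(axioms)
        for _ in range(rounds):
            for c in generate_conjectures(known):
                if prover.try_prove(c) is not None:
                    known.append(c)
        return known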

-Matthias


Matt Mahoney wrote:


I have argued that text compression is just such a problem. Compressing
natural language dialogs implies passing the Turing test. Compressing text
containing mathematical expressions implies solving those expressions.
Compression also allows for precise measurements of progress.
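For example, progress can be measured as compressed bits per character on a
fixed benchmark corpus; a better model always scores lower. A minimal sketch
using Python's standard lzma module (the file name is a placeholder):

    import lzma

    def bits_per_character(path):
        # Compressed size of a fixed text file, in bits per character.
        # A better language model yields a smaller compressed size, so
        # this one number is a precise, repeatable measure of progress.
        with open(path, "rb") as f:
            data = f.read()
        compressed = lzma.compress(data, preset=9)
        return 8.0 * len(compressed) / len(data)

    # Placeholder file name; any fixed benchmark corpus works.
    print("%.3f bpc" % bits_per_character("benchmark.txt"))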

Text compression is not completely general. It tests language, but not
vision or embodiment. Image compression is a poor test for vision because
any progress in modeling high level visual features is overwhelmed by
incompressible low level noise. Text does not have this problem.
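A back-of-envelope calculation shows why. The numbers below are illustrative
assumptions, not measurements, but they show how the noise swamps the
signal:

    # Assumed numbers: a 1-megapixel grayscale photo with roughly
    # 2 bits/pixel of incompressible sensor noise, versus perhaps 10 KB
    # of genuinely compressible high-level structure (objects, layout).
    pixels = 1000000
    noise_bits = 2.0 * pixels        # irreducible low-level noise
    structure_bits = 10000 * 8       # content a vision model could capture

    total = noise_bits + structure_bits
    print("noise share of codelength: %.1f%%" % (100.0 * noise_bits / total))
    # ~96% of the compressed size is noise, so even a perfect high-level
    # model would move the benchmark number by only a few percent.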

Hutter proved that compression is a completely general solution in the AIXI
model. (The best predictor of the environment is the shortest model that is
consistent with the observations so far.) However, this may not be very
useful as a test, because it would require testing over a random
distribution of environmental models rather than problems of interest to
people, such as language.
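The "shortest consistent model" criterion can be illustrated with a two-part
minimum description length code: among candidate models, prefer the one
minimizing model bits plus data bits given the model. A toy sketch, where
the 32-bits-per-parameter cost is a simplifying assumption:

    import numpy as np

    rng = np.random.default_rng(0)
    x = np.linspace(-1, 1, 50)
    y = 2.0 * x - 1.0 + rng.normal(scale=0.1, size=x.size)  # truth: degree 1

    def codelength(degree, bits_per_param=32.0, sigma=0.1):
        # Two-part code: model bits plus data bits given the model.
        # Data bits are the Gaussian negative log-likelihood of the
        # residuals, converted from nats to bits.
        coeffs = np.polyfit(x, y, degree)
        resid = y - np.polyval(coeffs, x)
        nll_nats = 0.5 * np.sum((resid / sigma) ** 2
                                + np.log(2 * np.pi * sigma ** 2))
        return bits_per_param * (degree + 1) + nll_nats / np.log(2)

    # The shortest total description selects the true (degree-1) model,
    # even though higher degrees fit this particular sample a bit better.
    print("selected degree:", min(range(6), key=codelength))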

-- Matt Mahoney, [EMAIL PROTECTED]


