Eliezer S. Yudkowsky pointed out in a 2003 post to the AGI mailing list, titled
"Breaking Solomonoff induction... well, not really" [1], that
Solomonoff Induction is flawed because it fails to incorporate anthropic
reasoning. But apparently he thought this doesn't really matter, because in
the long run Solomonoff
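
(For context, a standard sketch of what is being criticized, not part of Yudkowsky's post: Solomonoff induction predicts by weighting every program for a fixed universal prefix machine U by its length, so the prior probability of an observation string x is

    M(x) = \sum_{p : U(p) = x*} 2^{-|p|}

where the sum is over programs p on which U outputs a string beginning with x. Note that the prior is defined purely over programs and output strings; nothing in it locates the observer within the computed world, which is the gap the anthropic-reasoning objection points at.)
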
--- On Fri, 6/20/08, Wei Dai [EMAIL PROTECTED] wrote: