Matt Mahoney wrote:
--- On Tue, 11/25/08, Eliezer Yudkowsky <[EMAIL PROTECTED]> wrote:

Shane Legg, I don't mean to be harsh, but your attempt to link
Kolmogorov complexity to intelligence is causing brain damage among
impressionable youths.

( Link debunked here:
  http://www.overcomingbias.com/2008/11/complexity-and.html
)

Perhaps this is the wrong argument to support my intuition that knowing more 
makes you smarter, as in greater expected utility over a given time period. How 
do we explain that humans are smarter than calculators, and calculators are 
smarter than rocks?

...

-- Matt Mahoney, [EMAIL PROTECTED]
Each particular instantiation of computing has a certain maximal intelligence that it can express (noting that intelligence is ill-defined). More capacious stores can store more information. Faster processors can process information more quickly.

However, information is not, in and of itself, intelligence. Information is the database on which intelligence operates. Information isn't a measure of intelligence, and intelligence isn't a measure of information. We have decent definitions of information. We lack any corresponding definition of intelligence. It's certainly not complexity, though intelligence appears to require a certain amount of complexity. And it's not a relationship between information and complexity.

I still suspect that intelligence will turn out to relate to "what we think of as intelligence" rather as a symptom relates to a syndrome. (N.B., not as a symptom relates to a disease!) That "INTELLIGENCE" will turn out to be composed of many, many small tricks that enable one to solve a certain class of problems quickly... or even at all. But the tricks will have no necessary relationship to each other. One will be something like alpha-beta pruning, another hill-climbing, another quicksort, another a heuristic for classifying a problem as to what tools might help solve it... and another.... As such, I don't think that any AGI can exist. Something more general than people, certainly something that thinks more quickly than people and that knows more than any person can... but not a truly general AI.
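To make the "small tricks" point concrete, here is a minimal sketch of one such trick, hill-climbing, in Python (the function names and the toy objective are my own illustration, not anything from the thread). The trick solves one narrow class of problems, smooth local optimization, and is useless outside it, which is exactly the point being made:

```python
def hill_climb(score, neighbors, start, steps=1000):
    """Greedy local search: repeatedly move to the best-scoring neighbor.

    Works well on smooth landscapes, gets stuck on anything with
    local optima -- a narrow trick, not general intelligence.
    """
    current = start
    for _ in range(steps):
        best = max(neighbors(current), key=score, default=current)
        if score(best) <= score(current):
            break  # no neighbor improves: local optimum reached
        current = best
    return current

# Toy example: maximize f(x) = -(x - 7)^2 over the integers.
f = lambda x: -(x - 7) ** 2
result = hill_climb(f, lambda x: [x - 1, x + 1], start=0)
print(result)  # climbs step by step from 0 to the peak at 7
```

The same interface could never express alpha-beta pruning or quicksort; each trick has its own shape, which is the claimed obstacle to a single general mechanism.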

E.g., where would you put a map colorer for 4-color maps? Certainly an AGI should be able to do it, but would you really expect it to do it more readily (compared to the speed of its other processes) than people can? If it could, would that really bump your estimate of its intelligence that much? And yet there are probably an indefinitely large number of such problems. And from what we currently know, it's quite likely that each one would either need n^k or better steps to solve, or a specialized algorithm. Or both.
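For illustration, here is a minimal Python sketch of the map-coloring example: a greedy coloring of a region-adjacency graph (my own toy example, not from the thread). Note the hedge built into the example itself: greedy coloring is the cheap general-purpose trick, and it is *not* guaranteed to stay within four colors on every planar map; guaranteeing four requires a much more specialized algorithm, which is the asymmetry the paragraph is pointing at:

```python
def greedy_color(adjacency):
    """Assign each vertex the smallest color index unused by its neighbors.

    Cheap and general, but may exceed four colors on some planar maps;
    a guaranteed 4-coloring needs a far more specialized algorithm.
    """
    colors = {}
    for v in adjacency:
        used = {colors[n] for n in adjacency[v] if n in colors}
        colors[v] = next(c for c in range(len(adjacency)) if c not in used)
    return colors

# A tiny "map": four mutually adjacent regions, so four colors are forced.
k4 = {0: [1, 2, 3], 1: [0, 2, 3], 2: [0, 1, 3], 3: [0, 1, 2]}
print(greedy_color(k4))  # each region gets a distinct color: 0, 1, 2, 3
```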



-------------------------------------------
agi
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/
Powered by Listbox: http://www.listbox.com
