Re: [agi] Re: JAGI submission

2008-11-29 Thread Charles Hixson

Matt Mahoney wrote:

--- On Tue, 11/25/08, Eliezer Yudkowsky [EMAIL PROTECTED] wrote:

Shane Legg, I don't mean to be harsh, but your attempt to link
Kolmogorov complexity to intelligence is causing brain damage among
impressionable youths.

( Link debunked here:
  http://www.overcomingbias.com/2008/11/complexity-and.html
)



Perhaps this is the wrong argument to support my intuition that knowing more 
makes you smarter, as in greater expected utility over a given time period. How 
do we explain that humans are smarter than calculators, and calculators are 
smarter than rocks?

...

-- Matt Mahoney, [EMAIL PROTECTED]
  
Each particular instantiation of computing has a certain maximal 
intelligence that it can express (noting that intelligence is 
ill-defined).  More capacious stores can store more information.  Faster 
processors can process information more quickly.


However, information is not, in and of itself, intelligence.  
Information is the database on which intelligence operates.  Information 
isn't a measure of intelligence, and intelligence isn't a measure of 
information.  We have decent definitions of information.  We lack 
anything corresponding for intelligence.  It's certainly not complexity, 
though intelligence appears to require a certain amount of complexity.  
And it's not a relationship between information and complexity.


I still suspect that intelligence will turn out to relate to what we 
think of as intelligence rather as a symptom relates to a syndrome.  
(N.B., not as a symptom relates to a disease!)  That INTELLIGENCE will 
turn out to be composed of many, many small tricks, each of which 
enables one to solve a certain class of problems quickly...or even at 
all.  But the tricks will have no necessary relationship to each 
other.  One will be something like alpha-beta pruning, another will be 
hill-climbing, another quick-sort, and another a heuristic for 
classifying a problem as to what tools might help solve it...and so 
on.  As such, I don't think that any true AGI can exist.  Something 
more general than people, and certainly something that thinks more 
quickly than people and that knows more than any person can...but not 
a truly general AI.
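
A minimal sketch of this "bag of tricks" picture, in Python (the 
problem representation and the trick registry here are purely 
illustrative assumptions, not a proposed design):

    # Intelligence as a dispatcher over many unrelated special-purpose
    # tricks, plus a heuristic that classifies the incoming problem.
    TRICKS = [
        (lambda p: p.get("kind") == "adversarial-game", "alpha-beta pruning"),
        (lambda p: p.get("kind") == "optimization", "hill-climbing"),
        (lambda p: p.get("kind") == "ordering", "quick-sort"),
    ]

    def dispatch(problem):
        """Return the first trick whose classifier says it applies."""
        for applies, trick in TRICKS:
            if applies(problem):
                return trick
        return None  # no trick applies: the problem is out of reach

    print(dispatch({"kind": "optimization"}))      # hill-climbing
    print(dispatch({"kind": "protein-folding"}))   # None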


E.g., where would you put a map colorer for 4-color maps?  Certainly an 
AGI should be able to do it, but would you really expect it to do it 
more readily (compared to the speed of its other processes) than people 
can?  If it could, would that really bump your estimate of its 
intelligence that much?  And yet there are probably an indefinitely 
large number of such problems.  And from what we currently know, it's 
quite likely that each one would either need on the order of n^k steps 
to solve, or a specialized algorithm.  Or both.
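
For concreteness, the general-purpose route to this example might look 
like the following naive backtracking 4-colorer (a minimal sketch in 
Python, assuming the map arrives as an adjacency list; the naive search 
is worst-case exponential, which is exactly why a specialized algorithm 
ends up mattering):

    def four_color(adjacency):
        """Assign one of 4 colors per region, no two neighbors alike."""
        regions = list(adjacency)
        colors = {}

        def assign(i):
            if i == len(regions):
                return True
            region = regions[i]
            for color in range(4):
                if all(colors.get(n) != color for n in adjacency[region]):
                    colors[region] = color
                    if assign(i + 1):
                        return True
                    del colors[region]
            return False

        return colors if assign(0) else None

    # Four mutually adjacent regions force all four colors.
    print(four_color({"A": ["B", "C", "D"], "B": ["A", "C", "D"],
                      "C": ["A", "B", "D"], "D": ["A", "B", "C"]}))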






Re: [agi] Re: JAGI submission

2008-11-26 Thread Matt Mahoney
--- On Tue, 11/25/08, Eliezer Yudkowsky [EMAIL PROTECTED] wrote:

 Shane Legg, I don't mean to be harsh, but your attempt to link
 Kolmogorov complexity to intelligence is causing brain damage among
 impressionable youths.
 
 ( Link debunked here:
   http://www.overcomingbias.com/2008/11/complexity-and.html
 )

Perhaps this is the wrong argument to support my intuition that knowing more 
makes you smarter, as in greater expected utility over a given time period. How 
do we explain that humans are smarter than calculators, and calculators are 
smarter than rocks?

Obviously that is not true with unlimited computing power. With a very simple 
program I could answer any question whose answer could be proven, just by 
enumerating all proofs. In that world, if a problem requires around 3^^^3 
computation steps, then you would need log 3^^^3 bits to specify the number of 
steps, which is essentially the same number. With real computers, I think the 
difference between O(t) and O(log t) complexity is important.
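
To put a number on the O(log t) point: naming a runtime of t steps 
costs only about log2(t) bits. A small worked example in Python (the 
value of t is an arbitrary stand-in; for something like 3^^^3, even 
log2(t) remains astronomically large):

    import math

    t = 2 ** 400           # an enormous, but representable, step count
    print(math.log2(t))    # 400.0 -- bits needed to specify the runtime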

I realize that approximating real computers with Turing machines is not always 
justified.

-- Matt Mahoney, [EMAIL PROTECTED]





[agi] Re: JAGI submission

2008-11-25 Thread Eliezer Yudkowsky
On Mon, Nov 24, 2008 at 4:20 PM, Matt Mahoney [EMAIL PROTECTED] wrote:
 I submitted my paper, "A Model for Recursively Self Improving Programs," to 
 JAGI and it is ready for open review. For those who have already read it, it 
 is essentially the same paper except that I have expanded the abstract. The 
 paper describes a mathematical model of RSI in closed environments (e.g. 
 boxed AI) and shows that such programs exist in a certain sense. It can be 
 found here:

 http://journal.agi-network.org/Submissions/tabid/99/ctrl/ViewOneU/ID/9/Default.aspx

*Thud.*

This was an interesting attempt to define RSI and I really thought you
were going to prove something interesting from it.

And then, at the last minute, on the last page - *thud*.

Shane Legg, I don't mean to be harsh, but your attempt to link
Kolmogorov complexity to intelligence is causing brain damage among
impressionable youths.

( Link debunked here:

  http://www.overcomingbias.com/2008/11/complexity-and.html )

-- 
Eliezer Yudkowsky
Research Fellow, Singularity Institute for Artificial Intelligence

