YKY,

I agree with your side of the debate that the whole KB will not fit into RAM.  As a 
solution, I propose partitioning the whole KB into the smallest possible cached 
chunks, each suitable for a single agent running on a host computer with at 
least one GB of RAM.  And I propose that AGI will consist not of one program 
running on one computer, but of a vast multitude of separately hosted agents 
working in concert.
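To make the idea concrete, here is a minimal sketch (hypothetical, not Texai code) of partitioning a KB into agent-sized chunks under a RAM budget.  The fact size and the `partition_kb` helper are illustrative assumptions, and facts are grouped greedily by concept so each agent's working set stays local:

```python
# Hypothetical sketch: partition a KB into chunks small enough for a
# single agent's RAM budget.  The per-fact size is an assumed average.

def partition_kb(facts, ram_budget_bytes, fact_size_bytes=256):
    """Greedily bin (concept, fact) pairs into agent-sized chunks.

    facts: iterable of (concept, fact) tuples.
    Returns a list of chunks, each a dict mapping concept -> list of facts.
    """
    max_facts = ram_budget_bytes // fact_size_bytes
    chunks, current, count = [], {}, 0
    for concept, fact in facts:
        if count >= max_facts and current:
            chunks.append(current)       # current chunk is full; start a new one
            current, count = {}, 0
        current.setdefault(concept, []).append(fact)
        count += 1
    if current:
        chunks.append(current)
    return chunks

# Example: 200,000 facts spread across agents with one GB of RAM each.
facts = [("concept-%d" % (i % 1000), "fact-%d" % i) for i in range(200000)]
chunks = partition_kb(facts, ram_budget_bytes=1 << 30)
```

At these sizes a single one-GB agent easily holds 200,000 facts; the partitioning only starts to matter once grounding data multiplies the KB.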

But my opinion of OpenCyc's concept coverage, with respect to that of a 
five-year-old human, differs greatly from yours.  I concede that 200,000 OpenCyc 
facts are about the number a child might know, but in order to properly ground 
these concepts, I believe a much larger number of feature vectors will have to 
be stored, or be available in abstracted form.  For example, consider the concept 
of the child's mother.  Properly grounding that one concept might require 
abstracting features from thousands of observations:
  - mother with wet hair
  - mother far away
  - angry mother
  - mother hidden from view
  - mother in a crowd
  - mother's voice
  - mother in dim light
  - mother from below
  - and so on
Of course, you can ignore full grounding of concepts, as Cycorp currently does 
for its applications, and as I will with Texai until it is past the bootstrap 
stage.
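The kind of abstraction I have in mind could be as simple as reducing many observation feature vectors to one stored prototype per concept.  A minimal sketch, assuming fixed-length numeric feature vectors (the vectors and the `abstract_concept` helper are illustrative, not Texai's actual representation):

```python
# Hypothetical sketch: ground a concept by abstracting thousands of raw
# observation feature vectors into a single prototype (here, the mean).

def abstract_concept(observations):
    """Average equal-length feature vectors into one prototype vector."""
    n = len(observations)
    dim = len(observations[0])
    prototype = [0.0] * dim
    for vec in observations:
        for i, value in enumerate(vec):
            prototype[i] += value / n
    return prototype

# E.g. "mother" observed under many conditions (wet hair, dim light, ...),
# each observation reduced to a toy two-feature vector:
mother_observations = [[0.9, 0.1], [0.7, 0.3], [0.8, 0.2]]
mother_prototype = abstract_concept(mother_observations)
```

Only the prototype (and perhaps a few variance statistics) needs to stay in the agent's RAM chunk; the raw observations can live in slower storage.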

-Steve


Stephen L. Reed

Artificial Intelligence Researcher
http://texai.org/blog
http://texai.org
3008 Oak Crest Ave.
Austin, Texas, USA 78704
512.791.7860

----- Original Message ----
From: YKY (Yan King Yin) <[EMAIL PROTECTED]>
To: [email protected]
Sent: Thursday, April 17, 2008 3:58:43 PM
Subject: Re: [agi] database access fast enough?

 On 4/18/08, Mark Waser <[EMAIL PROTECTED]> wrote:
> > > Yes.  RAM is *HUGE*.  Intelligence is *NOT*.
> >
> > Really?  I will believe that if I see more evidence... right now I'm
> skeptical.
>
> And your *opinion* has what basis?  Are you arguing that RAM isn't huge?
> That's easily disprovable.  Or are you arguing that intelligence is huge?
> That too is easily disprovable.  Which one do I need to knock down?

The current OpenCyc KB is ~200 Mbs (correct me if I'm wrong).

The RAM size of current high-end PCs is ~10 Gbs.

My intuition estimates that the current OpenCyc is only about 10%-40%
of a 5 year-old human intelligence.

Plus, learning requires that we store a lot of hypotheses.  Let's say
1000-10000 times the real KB.

That comes to 500Gb - 20Tb.

It seems that if we allow several years for RAM size to double a few
times, RAM may have a chance to catch up to the low end.  Obviously
not now.

YKY

-------------------------------------------
agi
Archives: http://www.listbox.com/member/archive/303/=now
RSS Feed: http://www.listbox.com/member/archive/rss/303/
Modify Your Subscription: http://www.listbox.com/member/?&;
Powered by Listbox: http://www.listbox.com


