On Tue, 2002-12-03 at 14:25, Ben Goertzel wrote:
> 
> To emulate the massively parallel "information update rate" of the brain on
> N bits of memory, how many commodity PC processors are required per GB of
> RAM?


It would take a lot of processors.  Not particularly fast ones, but a
lot of them.

The problem is that while you can buy computers with gigabytes of RAM,
only extremely expensive supercomputers have a memory architecture
designed to be accessed constantly across the whole address space.  PC
memory architectures are heavily optimized around the assumption that
any given function will only be working with a small section of the
address space at a time.  Cache thrashing and pipeline stalls will give
real-world performance substantially below the theoretical peak for
this kind of memory access pattern.  You would probably have a hard
time achieving even one update pass per second over an n-GB RAM
structure, simply because you can't feed RAM structures that large to a
PC processor fast enough, even if the processor is theoretically
capable of doing the necessary computation on a structure that large in
under a second.

You might get more real throughput for the money using massive DSP
clusters, which actually are designed to be used like this. 
Alternatively, a PC cluster could work as well if you used very
low-latency switch fabrics (Myrinet or similar), but that would cost a
bit of money.
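As a sketch of what the cluster version looks like (generic MPI, not
tied to any particular fabric; the slice size and the toy reduction are
made up for illustration), each box keeps its own slice of the
structure in local RAM, does its passes locally, and only small
aggregate messages ever touch the interconnect -- which is why the
fabric's latency, not its bandwidth, is what you end up paying for:

/* slice.c -- toy MPI version: one slice of the big structure per node.
 * Build/run (assuming an MPI install such as MPICH or LAM):
 *   mpicc -O2 -o slice slice.c && mpirun -np 8 ./slice
 */
#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

#define SLICE_WORDS (64u * 1024u * 1024u)  /* 256 MB per node, adjust to RAM */

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank, nprocs;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

    /* Each node holds only its own slice, resident in local RAM. */
    unsigned *slice = malloc((size_t)SLICE_WORDS * sizeof *slice);
    if (!slice) MPI_Abort(MPI_COMM_WORLD, 1);
    for (unsigned i = 0; i < SLICE_WORDS; i++)
        slice[i] = (unsigned)rank;

    /* The heavy memory traffic stays on the node... */
    unsigned long local = 0;
    for (unsigned i = 0; i < SLICE_WORDS; i++)
        local += slice[i];

    /* ...and only a few bytes per node cross the switch fabric. */
    unsigned long total = 0;
    MPI_Reduce(&local, &total, 1, MPI_UNSIGNED_LONG, MPI_SUM,
               0, MPI_COMM_WORLD);

    if (rank == 0)
        printf("aggregate across %d nodes: %lu\n", nprocs, total);

    free(slice);
    MPI_Finalize();
    return 0;
}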

-James Rogers
 [EMAIL PROTECTED]
