The advertisement claims that an OS can be installed on each core, including
Linux, so I don't know why you say it's no good for general computing.
I think you mean that each core is far less powerful than one core of a Core 2 Duo?

I have not looked at the overview paper yet, but it seems like there is no
point having 64 cores on a chip if each is severely crippled, unless of course
there is something they can do much better than a general-purpose CPU.
Would you estimate that applications that are pure computation, and not
limited by memory bandwidth, would run significantly faster on one of these?

I fear we are going to get to the point where we cannot utilize more
processors very effectively without making huge compromises.  We will
always be able to utilize more to some degree, but we will reach the point
where doubling the number of cores adds only 10% to the speed of the
computation.  We will change our algorithms and adapt to this, but we will
always be working around the problems that require some degree of
serialization.  Tree search can never be fully parallel and remain as
efficient as a serial algorithm.
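
This is essentially Amdahl's law.  A minimal sketch in Python, assuming
(purely for illustration) that 5% of the work is inherently serial:

```python
# Amdahl's law: if a fraction s of the work is inherently serial,
# the best possible speedup on n cores is 1 / (s + (1 - s) / n).
def amdahl_speedup(s: float, n: int) -> float:
    return 1.0 / (s + (1.0 - s) / n)

s = 0.05  # assumed serial fraction, illustrative only
speedup_64 = amdahl_speedup(s, 64)     # ~15.4x
speedup_128 = amdahl_speedup(s, 128)   # ~17.4x
gain = speedup_128 / speedup_64 - 1.0  # doubling the cores gains only ~13%
print(f"64 cores: {speedup_64:.1f}x, 128 cores: {speedup_128:.1f}x, gain {gain:.0%}")
```

Even with 95% of the work parallel, the ceiling is 1/s = 20x no matter how
many cores we add.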

- Don





On Sat, Jun 13, 2009 at 4:21 AM, David Fotland <[email protected]> wrote:

> Lots of simpler cores is possible, but only for running specialized code
> that doesn’t need much memory or memory bandwidth.  If I have thousands of
> cores with small caches, the total demand for off-chip memory bandwidth
> will be far too high, and performance will be limited by external memory
> throughput.
>
> Look at Tilera http://www.tilera.com/products/TILEPro64.php  64 cores on a
> chip in the same technology as Intel used to get two cores on a chip.  But
> local memory is small, so it's no good for general computing.  Someone
> might try it for computer go though.
>
> David
>
> > -----Original Message-----
> > From: [email protected] [mailto:computer-go-
> > [email protected]] On Behalf Of Mark Boon
> > Sent: Friday, June 12, 2009 1:59 PM
> > To: computer-go
> > Subject: Re: [computer-go] MCTS, 19x19, hitting a wall? moore's law limits
> >
> > 2009/6/10 David Fotland <[email protected]>:
> > > I think we will get another 64x to 256x density then it will stop, for
> > > single chips.  We should eventually get desktop machines with thousands
> > > of cores, but probably never with millions of cores.  There really are
> > > limits built into physics.
> > >
> >
> > How about the cores becoming much smaller and simpler?
> >
> > Intel's CPUs are approaching a billion transistors on a chip. But you
> > can probably make a very decent and fast CPU with just a million
> > transistors. Maybe double that number to give each a bit of cache
> > memory. If you can see computers with thousands of cores, does that
> > already assume they'll be simpler? Or could we have a few (hundred)
> > heavy-duty CPUs like today's for multi-purpose use and, next to them, a
> > card with a million simpler CPUs for tasks suitable for parallel
> > processing? A hybrid system if you will.
> >
> > Just thinking out loud, I'm obviously a layman when it comes to
> > semiconductors.
> >
> > Mark
> > _______________________________________________
> > computer-go mailing list
> > [email protected]
> > http://www.computer-go.org/mailman/listinfo/computer-go/
>