Hi,
>> There isn't much that an MIMD machine can do better than a similar-sized
>> SIMD machine.
> Hey, that's just not true.
> There are loads of math theorems disproving this assertion...
Oops, I left out the presumed adjective "real-world". Of course there are
countless Diophantine equations
Ben,
On 12/12/08, Ben Goertzel b...@goertzel.org wrote:
>> There isn't much that an MIMD machine can do better than a similar-sized
>> SIMD machine.
> Hey, that's just not true.
> There are loads of math theorems disproving this assertion...
Oops, I left out the presumed adjective
Andi and Ben,
On 12/12/08, wann...@ababian.com wrote:
I don't remember what references there were earlier in this thread, but I
just saw a link on reddit to some guys in Israel using a GPU to greatly
accelerate a Bayesian net. That's certainly an AI application:
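For context on why a GPU helps here: belief updating in a Bayesian net reduces to dense array arithmetic, which maps directly onto SIMD-style hardware. A minimal sketch in Python/NumPy (illustrative only, not the code from that link; the toy model and all names are mine) of vectorizing a naive-Bayes-style posterior update over many evidence variables at once:

```python
import numpy as np

# Hypothetical toy model: a binary class C and N conditionally
# independent binary evidence variables E_1..E_N.
rng = np.random.default_rng(0)
N = 1000

prior = np.array([0.4, 0.6])               # P(C)
cpt = rng.uniform(0.1, 0.9, size=(2, N))   # P(E_i = 1 | C), one column per E_i
evidence = rng.integers(0, 2, size=N)      # observed values of the E_i

# P(E_i = e_i | C) for all i in one array operation -- no Python loop.
like = np.where(evidence == 1, cpt, 1.0 - cpt)   # shape (2, N)

# Work in log space to avoid underflow, then normalize.
log_post = np.log(prior) + np.log(like).sum(axis=1)
post = np.exp(log_post - log_post.max())
post /= post.sum()

print(post)   # posterior P(C | E_1..E_N); the two entries sum to 1
```

The whole update is a handful of elementwise array operations, which is exactly the shape of computation a GPU (or any wide SIMD unit) executes well.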
Steve wrote:
> Bit#3: Did Ben realize that the prospective emergence of array processors
> (e.g. as I have been promoting) would obsolete much of his present
> work, because its structure isn't vectorizable, so he is in effect betting
> on continued stagnation in processor architecture, and may in
Ben,
Before I comment on your reply, note that my former posting was about my
PERCEPTION rather than the REALITY of your understanding, with the
difference between the two being resolvable by an answer carrying less
than 1.00 bit of information.
Anyway, that said, on with a VERY interesting (to me) subject.
Hi,
>> There isn't much that an MIMD machine can do better than a similar-sized
>> SIMD machine.
> Hey, that's just not true.
> There are loads of math theorems disproving this assertion...
> OO and generic design patterns do buy you *something* ...
OO is often impossible to vectorize.
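One concrete reason OO code resists vectorization is memory layout: objects scatter each field across the heap (array-of-structs), while SIMD hardware wants each field contiguous (struct-of-arrays). A hedged sketch in Python/NumPy contrasting the two layouts (the `Particle` example is mine, not from the thread):

```python
import numpy as np

# OO / array-of-structs style: each particle is a separate object, so
# updating positions means chasing one object at a time -- nothing here
# maps onto SIMD lanes.
class Particle:
    def __init__(self, x, v):
        self.x = x
        self.v = v

particles = [Particle(float(i), 0.5) for i in range(1000)]
for p in particles:          # inherently serial, one object per step
    p.x += p.v * 0.1

# Struct-of-arrays style: each field is one contiguous array, so the
# same update becomes a single vectorized operation.
xs = np.arange(1000, dtype=np.float64)
vs = np.full(1000, 0.5)
xs += vs * 0.1               # maps directly onto SIMD hardware

# Both layouts compute identical results; only the memory layout differs.
print(np.allclose([p.x for p in particles], xs))   # True
```

The transformation from the first form to the second is exactly what a vectorizing compiler cannot do automatically once the fields are hidden behind object indirection, which is the sense in which OO designs block vectorization.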
The point
Ben,
On 12/11/08, Ben Goertzel b...@goertzel.org wrote:
>> There isn't much that an MIMD machine can do better than a similar-sized
>> SIMD machine.
> Hey, that's just not true.
> There are loads of math theorems disproving this assertion...
Oops, I left out the presumed adjective real-world. Of