James Bowery <[email protected]> wrote:

> Architectures that attempt to hide this problem with lots of processors
> accessing local stores in parallel are drunks looking for their keys under
> the lamp post.
>

I disagree. The purpose of a computer is to solve problems. To process data.
Not to crunch numbers as quickly as possible. The human brain is many
orders of magnitude slower than any computer, and yet we can recognize
faces faster than just about any computer, because the brain is a massively
parallel processor (MPP). Many neurons compare the image to stored images
simultaneously, and the neurons that find the closest match "come to mind."
Many data processing functions can be performed in parallel. Sorting and
searching arrays have been done in parallel since the 1950s. Polyphase sort
methods with multiple processors and mag tape decks were wonderfully fast.
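
To make the sorting point concrete, here is a minimal sketch in Python of the same divide-and-merge idea behind those polyphase tape sorts: split the data into chunks, sort each chunk in its own process, then merge the sorted runs. The chunk count and data below are arbitrary choices for illustration, not anything from the original post.

```python
# Minimal parallel sort sketch: sort chunks in separate processes,
# then merge the sorted runs (divide-and-merge, as in polyphase sorts).
import heapq
from multiprocessing import Pool

def parallel_sort(data, workers=4):
    # Split into roughly equal chunks, one per worker.
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with Pool(workers) as pool:
        runs = pool.map(sorted, chunks)   # each process sorts one chunk
    return list(heapq.merge(*runs))       # merge the sorted runs serially

if __name__ == "__main__":
    print(parallel_sort([5, 3, 8, 1, 9, 2, 7, 4]))  # [1, 2, 3, 4, 5, 7, 8, 9]
```

Note that the final merge is still serial; that serial tail is exactly the kind of bottleneck that limits parallel speedup.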

It is difficult to write MPP software, but once we master the techniques
the job will be done, and it will be much easier to update. Already,
Microsoft Windows works better on multi-processor computers than single
processor models. Multiprocessor machines also run voice input programs much
faster than single-processor machines.

A generation from now we may have personal computers with millions of
processors. Even if every processor were much slower than today's
processors, the overall speed for many classes of problems will be similar
to today's supercomputers -- which can solve problems hundreds of thousands
to millions of times faster than a PC or Mac. They will have the power of
today's Watson computer, which is to say, they will be able to play
Jeopardy or diagnose disease far better than any person. I expect they will
also recognize faces and do voice input better than any person.

There may be a few esoteric problems that are inherently serial in nature
and that can only be solved by a single processor, but I expect most real-world
problems can be broken down into procedures run in parallel. Of course the
breaking down will be done automatically. It is already.
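
The cost of any remaining serial part is captured by Amdahl's law: if a fraction s of a job cannot be parallelized, then no number of processors can speed it up by more than 1/s. A quick sketch (the fractions below are made-up examples, not measurements):

```python
# Amdahl's law: speedup = 1 / (s + (1 - s) / p), where s is the serial
# fraction of the work and p is the number of processors. As p grows,
# the speedup approaches the ceiling 1/s.
def amdahl_speedup(serial_fraction, processors):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / processors)

# With only 1% serial work, even a million processors give just under 100x:
print(amdahl_speedup(0.01, 10**6))   # ~99.99
# With 50% serial work, a million processors give barely 2x:
print(amdahl_speedup(0.50, 10**6))   # ~2.0
```

This is why "a few esoteric problems that are inherently serial" matters: even a small serial fraction caps what millions of processors can buy.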

Before computers were invented, all large real world problems were broken
down and solved in parallel by large groups of people, usually organized in
a hierarchy. I mean, for example, the design of large buildings or the
management of corporations, nations or armies.

The fastest data processing in the known universe, by a wide margin, is
biological cell reproduction. The entire genome is copied by every cell
that splits. This is a parallel process. The moment a strand of DNA is
exposed to solution, the new bases begin to match up simultaneously. DNA is
also by far the most compact form of data storage in the known universe,
and I predict it is the most compact that will ever be found. I do not think
subatomic data storage will ever be possible. All the human data now
existing can be stored in about 7 ml of DNA.
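
A back-of-envelope check of that volume, using standard B-DNA geometry (2 bits per base pair, a helix roughly 2 nm in diameter, and about 0.34 nm of length added per base pair). This ignores all packing overhead and error correction, so it is only an order-of-magnitude sketch:

```python
# Estimate raw DNA storage density from the physical size of a base pair.
import math

RISE_NM = 0.34           # helix rise per base pair, nm (B-DNA)
RADIUS_NM = 1.0          # helix radius, nm (~2 nm diameter)
BP_VOLUME_NM3 = math.pi * RADIUS_NM**2 * RISE_NM   # ~1.07 nm^3 per base pair

NM3_PER_ML = 1e21        # 1 ml = 1 cm^3 = 10^21 nm^3
bytes_per_ml = (NM3_PER_ML / BP_VOLUME_NM3) * 2 / 8   # 2 bits per bp, 8 bits/byte

print(f"{bytes_per_ml:.2e} bytes per ml")   # ~2.3e+20, a few hundred exabytes
```

At a few hundred exabytes per milliliter, 7 ml holds on the order of a zettabyte or two, so the figure is the right order of magnitude if "all the human data now existing" is taken to be a few zettabytes.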

- Jed
