On Tue, 29 Jul 2008 11:40:39 PDT "Roman V. Shaposhnik" <[EMAIL PROTECTED]>  
wrote:
> [EMAIL PROTECTED] wrote:
> > On the same subject, this quote from Donald E. Knuth, Volume 4
> > Fascicle 0 (a new addition to The Art of Computer Programming, published
> > in May 2008)---Preface:
> >
> >     "Furthermore, as in earlier volumes of this series, I'm
> >     intentionally concentrating almost entirely on _sequential_
> >     algorithms, even though computers are increasingly able to carry out
> >     activities in parallel. I'm unable to judge what ideas about
> >     parallelism are likely to be useful five or ten years from now, let
> >     alone fifty, so I happily leave such questions to others who are
> >     wiser than I. Sequential methods, by themselves, already test the
> >     limits of my own ability to discern what the artful programmers of
> >     tomorrow will want to know."
> >   
> I believe this is the biggest point in all of the hype around
> concurrency as the next programming paradigm: it is very hard to
> approach the algorithmic side of it. And no, I'm not talking about
> locking hygiene; I'm talking about the design and implementation of
> basic (and not so basic) algorithms.
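A toy sketch may make that point concrete (Go here is my choice for illustration, not anything from the thread): even summing an array, which is trivially correct sequentially, already forces the parallel version to deal with partitioning and synchronization before one can reason about its correctness at all.

```go
package main

import (
	"fmt"
	"sync"
)

// sumSeq is the sequential sum: one loop, one accumulator,
// and its correctness is obvious by inspection.
func sumSeq(xs []int) int {
	total := 0
	for _, x := range xs {
		total += x
	}
	return total
}

// sumPar splits the slice into contiguous chunks, one per worker.
// Each goroutine writes only to its own slot of the partial array,
// so there is no data race; the WaitGroup provides the needed
// synchronization before the partial sums are combined.
func sumPar(xs []int, workers int) int {
	var wg sync.WaitGroup
	partial := make([]int, workers)
	chunk := (len(xs) + workers - 1) / workers
	for w := 0; w < workers; w++ {
		lo := w * chunk
		hi := lo + chunk
		if hi > len(xs) {
			hi = len(xs)
		}
		if lo >= hi {
			continue
		}
		wg.Add(1)
		go func(w, lo, hi int) {
			defer wg.Done()
			for _, x := range xs[lo:hi] {
				partial[w] += x
			}
		}(w, lo, hi)
	}
	wg.Wait()
	total := 0
	for _, p := range partial {
		total += p
	}
	return total
}

func main() {
	xs := make([]int, 1000)
	for i := range xs {
		xs[i] = i + 1
	}
	fmt.Println(sumSeq(xs))    // 500500
	fmt.Println(sumPar(xs, 4)) // 500500
}
```

Note how much machinery (chunk arithmetic, per-worker state, a barrier) is needed just to match a three-line loop; for algorithms with real data dependencies the gap is far wider.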

Recently I stumbled upon something that seems appropriate here:
http://www.cs.utexas.edu/users/EWD/transcriptions/EWD05xx/EWD506.html

    Bill Wulf raised the question: "If there were a Nobel prize for
    computing science, what would be the next achievement in our field
    worthy of it?"        ...

    The next achievement Bill Wulf was asking for might very well take
    the form of successfully challenging one of our common "tacit
    assumptions". Von Neumann's "instruction counter" and the notion of
    "a sequential process" seem the most likely victims: any workable
    conceptual framework in which "parallel programming" becomes as
    meaningless a term as "sequential programming" could be a worthy
    candidate for computing science's Nobel prize!

It is slightly depressing to think that the situation has not really
changed since EWD wrote this in 1975.  It will take some young
whippersnapper of a Dijkstra or Hoare or Strachey or Iverson or Backus
to find the critical insight that will make reasoning about parallel
algorithms no more difficult than reasoning about sequential ones.
