On Tue, Apr 3, 2012 at 8:25 AM, Eugen Leitl <[email protected]> wrote:

> It's not just imperative programming. The superficial mode of human
> cognition is sequential. This is the problem with all of mathematics
> and computer science as well.
>

Perhaps human attention is basically sequential, as we're only able to
focus our eyes on one point and use two hands.  But I think humans
understand parallel behavior well enough - maintaining multiple
relationships, for example, and predicting the behaviors of multiple people.


>
> If you look at MPI debuggers, they put people into a whole other
> universe of pain than just multithreading.
>

I can think of a lot of single-threaded interfaces that put people in a
universe of pain. It isn't clear to me that distribution is at fault there.
;)

In any case, message passing and event models are still clinging tightly to
their imperative heritage.


>
> > Dataflows and pipelines can be parallelized without issue and remain
> > deterministic. If we push more of the parallelism to the code, the
> hardware
> > can also be less complex - i.e. less architecture to hide latency for
> > memory access.
>
> Global memory doesn't scale in a relativistic universe. Ditto cache
> coherence for an already reasonably small number of caches.
>
> So, we don't really have a choice other than to stop worrying and
> learn to love parallelism.
>

True, but we can also make it a lot less complex.
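The quoted point about deterministic dataflow can be made concrete. A minimal sketch (my illustration, not from the thread): a two-stage pipeline of pure functions. Because no stage mutates shared state and `Executor.map` preserves input order, the result is the same no matter how many workers run the stages or how the scheduler interleaves them.

```python
from concurrent.futures import ThreadPoolExecutor

def parse(line):
    # Stage 1: a pure function -- no shared mutable state.
    return int(line)

def square(n):
    # Stage 2: also pure, so it can run in parallel with stage 1 safely.
    return n * n

def pipeline(lines, workers=4):
    # Executor.map preserves input order, so the output is deterministic
    # regardless of worker count or scheduling.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        parsed = pool.map(parse, lines)
        return list(pool.map(square, parsed))

print(pipeline(["1", "2", "3"]))  # always [1, 4, 9], with any number of workers
```

The imperative/message-passing equivalent would have each worker pulling from a shared queue and appending to a shared result list, where ordering and correctness depend on explicit synchronization; here determinism falls out of the structure.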

Regards,

Dave

-- 
bringing s-words to a pen fight
_______________________________________________
fonc mailing list
[email protected]
http://vpri.org/mailman/listinfo/fonc
