Can be built upon in theory, but in practice they are with high probability thrown away and rebuilt from scratch.

Because they are complex linguistic objects which, like philosophers' arguments, are often harder to figure out than to do over from first principles.

-- rec --
Well said, Roger.  I think we have all had the experience of choosing to:
  1. rederive an equation
  2. redesign an algorithm
  3. rewrite a piece of code
  4. rehash a philosophical argument
when there was already a perfectly good (maybe) equation/algorithm/program/argument laid at our feet.

For me, this is a mixture of:

  1. Pride
  2. Trust
  3. Understanding

This was a major challenge to code re-use at one time...  about the only published algorithms many of us trusted were those published in Knuth!  And even then, we had to rewrite the actual code in whatever context we were operating in, and were generally proud to do it.  But we couldn't help noodling on the algorithms, trying to think of a new, more elegant, more general, or more efficient way of solving the problem at hand.

Often, by the time one has worked through an algorithm forward to backward, backward to forward, one might as well have designed it.  The existing algorithm provides a few important things, however:
  1. Existence proof.  We *know* there is at least one algorithm that achieves the desired result.
  2. Hints.  Even if we don't understand an algorithm on first blush, we usually get a sense of its arc.
  3. Reference.  When we think we've got ours right, we can go back and test it against the original.  In fact, we probably now understand the original as well as our own and, if we have any humility, may throw *ours* away and use the one made available to us in the first place.

For proceduralists, C's concise method of creating complex objects (data structures and pointers to them) made it possible to build good/interesting/understandable libraries from tried and true algorithms.  With the introduction of ObjC/C++/Java and a huge new influx of programmers (the Internet may not have *created* new programmers, but it connected them to the rest of us in a way that was unprecedented), the number of libraries went up dramatically.   With the Open Source movement, these accreted and evolved into some pretty interesting, trusted and useful libraries.   Today, only a hardcore group like this crowd here are likely to rederive/redesign/rewrite/rehash.

It is a dying art, which I am nostalgic about.  It is one of the entertainments I find on this list.  I liken what happens here (sometimes) to the WPA era, when the very few remaining craftsmen in many building arts were found and encouraged/supported to build some last monuments to an old era of craftsmanship that no longer exists.

Maybe huge systems built of hand-cut C (assembly?) code, implementing custom algorithms, are not as obviously beautiful or elegant as some of the grand WPA-era National Park resorts (Mt Hood, Grand Canyon Lodge, Yosemite, ...), but there is a similarity in the values and the processes.

I respect Nick and others here for wanting to apply the same principles to the context of our various constructions as well.  I don't always (even try to) follow the arguments but I appreciate the desire to (re)hash the hash, even if I don't always want to participate in it.

- Steve
============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
lectures, archives, unsubscribe, maps at http://www.friam.org
