Re: bridges and stuff.

I'm tempted to argue (though not with certainty) that the bridge analogy is flawed in another way -- that of the environment. While many programming languages have similarities, and many principles apply to all programming, there are many things which do not translate (or at least not readily). Isn't this like trying to engineer a bridge with a brand-new material, or when the gravitational constant changes? And even the physical disciplines collide with the unexpected -- corrosion, resonance, metal fatigue, etc. To their credit, they appear far better than the software world at dispersing and applying the knowledge gained from past failures.

Let's use an example someone else already brought up -- cross-site scripting. How many people feel that, before it was ever known or had occurred for the first time, good programming practices should have prevented any such vulnerability from ever happening? I actually think that would have been possible for the extremely skilled and extremely paranoid. However, we're asking people to protect against the unknown.
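
(For the curious: in hindsight the defensive habit is simple -- never echo user input into a page without escaping it. Here's a minimal sketch in Python, where render_comment is purely an illustrative name, not anyone's real code:

    import html

    def render_comment(user_input):
        # Echoing raw input would let <script> tags execute -- the XSS hole.
        # Escaping turns the markup into inert text before it reaches the page.
        return "<p>" + html.escape(user_input) + "</p>"

    print(render_comment('<script>alert("gotcha")</script>'))
    # prints: <p>&lt;script&gt;alert(&quot;gotcha&quot;)&lt;/script&gt;</p>

My point stands, though: this only looks obvious now that we know the attack.)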

I don't have experience with formal methods, but I can see that, supposing this were NASA or the like, formal approaches might lead to perfect protection. However, all of that paranoia, formality or whatever takes a lot of time and effort, and therefore has a huge economic impact. I guess my personal opinion is that unit testing, etc. are great shortcuts (compared to perfection) which help reduce flaws, but at far lesser expense.
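
To make that tradeoff concrete, here's roughly what the shortcut looks like against the sketch above (again just an illustration, using Python's standard unittest module):

    import unittest

    # (assumes the render_comment sketch from earlier in this message)

    class RenderCommentTest(unittest.TestCase):
        def test_script_tag_is_neutralized(self):
            # Guards against the one attack we already know about -- no more.
            out = render_comment('<script>alert(1)</script>')
            self.assertNotIn('<script>', out)

    if __name__ == '__main__':
        unittest.main()

Note what it buys you: cheap insurance against known failure modes, but no claim at all about the unknown ones -- which is exactly the gap a formal proof would close, at formal-proof prices.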

All of this places me in the camp that thinks there isn't enough yet to standardize. Perhaps a new programming environment (language, VM, automation of various sorts, direct neural interfaces) is required before the art of software is able to match the reliability and predictability of other fields?

Is software more subject to unintended consequences than physical engineering?

Joel
