Joel Kamentz wrote:
Re: bridges and stuff.
> I'm tempted to argue (though not with certainty) that the bridge
> analogy is flawed in another way -- that of the environment. While
> many programming languages have similarities and many things apply to
> all programming, there are many things which do not translate (or at
> least not readily). Isn't this like trying to engineer a bridge with a
> brand new substance, or when the gravitational constant changes? And
> even the physical disciplines contend with the unexpected --
> corrosion, resonance, metal fatigue, etc. To their credit, they appear
> far better at dispersing and applying the knowledge from past
> failures than the software world does.
Corrosion, resonance, and metal fatigue all have counterparts in the
software world: glibc flaws, kernel flaws, compiler flaws. Each of
these is an outside influence on the application - just as environmental
stressors are on a physical structure.

Engineering lessons disperse faster because of the lawsuits that happen
when a bridge fails. I'm still waiting for a certain firm located in
Redmond to be hauled into court - and until that happens, nobody is
going to make security an absolute top priority.
> Let's use an example someone else already brought up -- cross-site
> scripting. How many people feel that, before it was ever known or had
> ever occurred the first time, good programming practices prevented any
> such vulnerability from ever happening? I actually think that would
> have been possible for the extremely skilled and extremely paranoid.
> However, we're asking people to protect against the unknown.
Hardly unknowns. Not every possibility has been enumerated, but then
again, not every physical phenomenon has been experienced w/r/t
bridge building either.
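For readers who haven't seen it spelled out, the cross-site scripting
flaw above comes down to echoing user input into HTML without escaping
it. A minimal sketch (function names are mine, purely illustrative):

```python
import html

def render_comment_unsafe(comment):
    # Vulnerable: attacker-supplied markup reaches the browser verbatim,
    # so a <script> payload executes in the victim's session.
    return "<p>" + comment + "</p>"

def render_comment_safe(comment):
    # html.escape turns <, >, &, and quotes into entities,
    # so the payload renders as inert text instead of running.
    return "<p>" + html.escape(comment) + "</p>"

payload = "<script>alert('xss')</script>"
print(render_comment_unsafe(payload))  # script tag survives intact
print(render_comment_safe(payload))    # entities only - nothing executes
```

The fix is mechanical once you know the attack exists, which is exactly
the point: the "good programming practice" only became a practice after
the failure was known.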
> I don't have experience with the formal methods, but I can see that,
> supposing this were NASA, etc., formal approaches might lead to
> perfect protection. However, all of that paranoia, formality or
> whatever takes a lot of time and effort, and therefore has a huge
> economic impact. I guess my personal opinion is that unit testing,
> etc. are imperfect measures (compared to perfect) which help reduce
> flaws, but with lesser expense.
Unit testing is fine, but it tests "inside the box" and doesn't view
your system through the eyes of an attacker.
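To make that concrete, here is a hedged sketch (hypothetical function
and test, and the injection flaw is mine as the illustration) of a unit
test that passes while the code stays exploitable:

```python
def build_query(username):
    # Naive string interpolation - fine for the inputs the author imagined.
    return "SELECT * FROM users WHERE name = '%s'" % username

def test_build_query():
    # The in-the-box test exercises only friendly input, so it passes.
    assert build_query("alice") == "SELECT * FROM users WHERE name = 'alice'"

test_build_query()  # green bar - and yet:

# An attacker supplies input the test suite never considered; the quote
# in the payload breaks out of the string literal (SQL injection).
print(build_query("x' OR '1'='1"))
```

The test verifies what the author expected, not what an adversary will
try - which is why testing reduces flaws but doesn't bound them.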
> All of this places me in the camp that thinks there isn't enough yet
> to standardize. Perhaps a different environment (language, VM,
> automation of various sorts, direct neural interfaces) is required
> before the art of software is able to match the reliability and
> predictability of other fields?
You're tossing tools at the problem. The problem is inherently human
and economically driven. A hammer doesn't cause a building to be
well or poorly built.
> Is software more subject to unintended consequences than physical
> engineering?
Not "more subject", just "subject differently".