It's all in the implicit and explicit assumptions.

For a structural engineer, there is a very well-defined (usually legally
defined) set of criteria for what a bridge must support and tolerate. Is
this a public roadway? Is it a private roadway for government-only use? How
much of what kind of traffic is expected to use it? Of course, there are a
whole slew of environmental parameters as well.

For the software engineer, none of this is commonly defined, much less
legally defined. We have no common understanding of what constitutes safe
operation of that button.

We *think* we know what it should do, but in reality we don't.
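
To make that concrete, here is a toy C sketch (purely illustrative; the
function names and the 32-byte limit are my own invention, not anything from a
real product). The handler satisfies every criterion anyone bothered to write
down, yet its safe operation quietly depends on an assumption -- that the label
fits in its fixed buffer -- which is stated nowhere and enforced by nothing.
That unstated assumption is exactly the buffer overflow Jacob mentions below.

    #include <stdio.h>
    #include <string.h>

    /* Called when the user clicks the button.  The implicit, unstated
     * assumption: the label passed in is shorter than 32 bytes. */
    static void on_button_click(const char *label)
    {
        char buf[32];
        /* strcpy() happily writes past the end of buf if label is longer
         * than 31 characters -- the classic buffer overflow. */
        strcpy(buf, label);
        printf("clicked: %s\n", buf);
    }

    /* A bounds-checked version makes the assumption explicit and enforced. */
    static void on_button_click_safe(const char *label)
    {
        char buf[32];
        snprintf(buf, sizeof buf, "%s", label); /* truncates, never overflows */
        printf("clicked: %s\n", buf);
    }

    int main(void)
    {
        on_button_click_safe("OK");  /* within the assumption */
        /* on_button_click(<a 500-byte string>) would corrupt the stack. */
        return 0;
    }

Both versions "allow a user to click on a button"; only one of them is built to
a stated, checkable criterion.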


On Tue, Apr 29, 2014 at 4:22 PM, Jacob Torrey <ja...@jacobtorrey.com> wrote:

> Perhaps to play Devil's Advocate a bit (I agree that the engineering
> process is mostly non-existent in this field), I am curious where the
> comparison between a structural engineer building a bridge to specification
> to allow traffic to safely drive across it, and a software engineer
> building a widget to specification to allow a user to click on a button
> breaks down. With software, there is a much more aggressive adversary
> model, since for some reason attacking "cyber" systems is considered more
> acceptable than attacking physical systems. An adversary can overflow a
> buffer in software, or overload a bridge, both with negative consequences.
> Are we talking about
> building software to address being under near-constant attack? Would the
> engineering firm be held liable for a bridge that failed when hit with an
> RPG?
>
> Jacob
>
>
> On Tue, Apr 29, 2014 at 2:15 PM, Alex Gantman <agant...@qti.qualcomm.com> wrote:
>
>> At 07:27 AM 4/29/2014, d...@geer.org wrote:
>>
>>   | Mechanical Engineering ... Electrical Engineering ... Civil
>>> Engineering ...
>>>  | these all are formal disciplines where one may obtain a license
>>> stating
>>>  | that they know how to apply the principles of the domain of study. If
>>> you
>>>  | screw up and something fails, it comes back to you in full legal
>>> regalia.
>>>  |
>>>  | Software "engineering" has never been such, and while I do recall
>>> studying
>>>  | formulas, performing experiments, etc. back in school for things like
>>>  | database or graphics performance, there was no formal study of the
>>> way code
>>>  | is assembled. Anyone who could make a program that satisfied the
>>> criteria
>>>  | of the assignment got credit.
>>>
>>> Would you go so far as to call for product liability?
>>>
>>> --dan
>>>
>>
>> Few physical goods vendors are liable for damage inflicted on their
>> products by acts of vandalism.  There's a reason why the vending machine at
>> the train station looks and is priced so differently from a home espresso
>> maker, or a bus seat from a leather recliner.
>>
>> I do agree that we are a far cry from engineering.  I found the following
>> story interesting and quite relevant to security work.
>>
>> http://www.science.smith.edu/~jcardell/Courses/EGR100/protect/reading/59StoryCrisis.pdf
>>
>> One thing it made me reconsider is the adage that "security is
>> everyone's responsibility."  Realizing that metaphors have limits, it
>> occurred to me that in construction, structural integrity is not everyone's
>> responsibility.  It is the responsibility of the structural engineer.
>> Carpenters and metalworkers have to stay faithful to the plans, but do not
>> need to fully understand the calculations behind them.  Architects need to
>> be conversant in the field, but still need a real structural engineer to
>> perform the design work.
>>
>> -Alex
>>
>
>
_______________________________________________
langsec-discuss mailing list
langsec-discuss@mail.langsec.org
https://mail.langsec.org/cgi-bin/mailman/listinfo/langsec-discuss
