I agree that ONE end goal of software security is to safeguard data - but it is 
not the only goal...and may not even be the primary goal, depending on the type 
of system the software is part of. In a safety-critical system, "safeguard the 
data" takes on a very different meaning from what one thinks of in a typical 
information system. Yes, I may in fact be trying to safeguard "input" sent from 
logical or physical sensors so that the data can't be tampered with in a way 
that can threaten the safe operation of the system. But safeguarding the data 
in that case is only a means to an end - the main goal is to prevent someone 
from intentionally exploiting a flaw in the software in order to instigate a 
physical failure that could threaten health, lives, the environment, etc. 
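
As a toy sketch of what that means in practice (the frame layout, shared key,
and "plausible range" below are hypothetical stand-ins, not drawn from any
real system), the idea is that the safety logic simply refuses to act on input
that is unauthenticated or physically implausible:

    import hmac, hashlib, struct

    SHARED_KEY = b"example-key-material"   # hypothetical key provisioning
    TEMP_RANGE = (-40.0, 125.0)            # hypothetical plausible range, deg C

    def accept_sensor_frame(frame: bytes) -> float:
        """Return the reading only if the frame is authentic and plausible."""
        payload, tag = frame[:-32], frame[-32:]
        expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).digest()
        if not hmac.compare_digest(tag, expected):
            raise ValueError("sensor frame failed integrity check")
        (reading,) = struct.unpack("!d", payload)  # 8-byte big-endian float
        if not TEMP_RANGE[0] <= reading <= TEMP_RANGE[1]:
            raise ValueError("sensor reading outside plausible range")
        return reading

The check on the data is the means; keeping the physical process in a safe
state is the end.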

===
Karen Mercedes Goertzel, CISSP
Lead Associate
Booz Allen Hamilton
703.698.7454
goertzel_ka...@bah.com

"If you're not failing every now and again,
it's a sign you're not doing anything very innovative."
- Woody Allen

________________________________________
From: sc-l-boun...@securecoding.org [sc-l-boun...@securecoding.org] on behalf 
of Jeffrey Walton [noloa...@gmail.com]
Sent: 21 September 2013 00:24
To: Rafal Los
Cc: Secure Coding List; Bobby G. Miller
Subject: [External]  Re: [SC-L] Sad state of affairs

On Fri, Sep 20, 2013 at 11:34 PM, Rafal Los <ra...@ishackingyou.com> wrote:
>
> Wait a minute, this relationship is a bit confused I think. Prasad said it 
> well: often the result of a maturing software security program is that the 
> simple and easy bugs disappear, and the ones that are left are difficult to 
> find and complex to exploit.
>
> This is known as eliminating the "low-hanging fruit". While this doesn't 
> eliminate ALL bugs, I ultimately believe eliminating them all is a fool's errand anyway. 
> Making the software as free of bugs as possible necessarily makes the ones 
> left in the system difficult to find and exploit. Then you work in good 
> anomaly detection mechanisms and have a great case for *reasonably* secure 
> software.
>
Well, the end goal of software security is to safeguard the data. All
a bad guy wants to do is collect, egress and monetize the data (sans
National Security concerns). If the data is not safe, then the
definition of "reasonable" has problems.

Consider: I was part of two breaches. The one in the 1990s cost me
about $10,000 to fix (I found out after I was sued). The second, in New
York last summer, cost me $75 to fix (having a card re-issued and
shipped next-day).

If you ask the companies involved if their processes were reasonable,
they would probably say YES. After all, the companies "followed best
practices", minimized their losses and maximized their profits. If you
ask me, I would say NO.

Picking low-hanging fruit is not enough. Ironically, we're not even
doing that very well (as BM noted). If you don't agree, take some time
to cruise ftp.gnu.org and look at the state of those projects (and it's
not just free software). But I consider it a failure of security
professionals, since it's our job to educate developers and improve
their processes.*

> Of course, this is all predicated on you knowing and being able to define the 
> word reasonable.
:)

> Just my opinion.
And my jaded opinion :)

Jeff

* There's some hand waving here, since some (many?) argue it's a waste
of time and money to teach developers, and that the money is better spent
on building tools that make it hard to do things incorrectly in the
first place. I kind of think it's a mixture of both.
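
A familiar sketch of the "tools that make it hard to do things incorrectly"
idea (Python's sqlite3 is only a stand-in here, and the table and function are
hypothetical): an API that accepts user input solely as bound parameters
removes the injection-prone path instead of trusting the developer to remember
to escape.

    import sqlite3

    def find_user(conn: sqlite3.Connection, username: str):
        # The "?" placeholder makes the driver bind the value; the query
        # text is fixed, so spliced-together SQL never gets written.
        cur = conn.execute("SELECT id, name FROM users WHERE name = ?",
                           (username,))
        return cur.fetchone()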

> ----- Reply message -----
> From: "Jeffrey Walton" <noloa...@gmail.com>
> To: "Bobby G. Miller" <b.g.mil...@gmail.com>
> Cc: "Secure Coding List" <sc-l@securecoding.org>
> Subject: [SC-L] Sad state of affairs
> Date: Fri, Sep 20, 2013 10:01 PM
>
>
> On Fri, Sep 20, 2013 at 7:47 PM, Bobby G. Miller <b.g.mil...@gmail.com> wrote:
>> I was just listening to a podcast interviewing a security executive from a
>> prominent vendor.  The response to vulnerabilities was to raise the
>> cost/complexity of exploiting bugs rather than actually employing secure
>> coding practices.  What saddened me most was that the approach was
>> apparently effective enough.
> +1. Software security is in a sad state. What I've observed: let the
> developers deliver something, then have it pen tested, and finally fix
> what the pen testers find. I call it "catch me if you can" security.
>
> I think the underlying problem is the risk analysis equations. It's
> still cost-effective to do little or nothing. Those risk analysis
> equations need to be unbalanced.
>
> And I don't believe this is the solution:
> http://searchsecurity.techtarget.com/opinion/Congress-should-encourage-bug-fixes-reward-secure-systems.
> Too many carrots and too few sticks mean it remains more profitable
> to continue business as usual.
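
To put some entirely made-up numbers on those risk analysis equations: as long
as the expected annual loss pencils out below the cost of prevention, "do
little or nothing" wins on paper.

    # Illustrative arithmetic only; every figure here is invented.
    single_loss_expectancy = 200_000      # cost of one breach, USD
    annual_rate_of_occurrence = 0.1       # expected breaches per year
    annualized_loss_expectancy = single_loss_expectancy * annual_rate_of_occurrence

    cost_of_prevention = 150_000          # extra annual spend on secure development

    # 20,000 vs. 150,000: the spreadsheet says "accept the risk".
    print(annualized_loss_expectancy, cost_of_prevention)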

_______________________________________________
Secure Coding mailing list (SC-L) SC-L@securecoding.org
List information, subscriptions, etc - http://krvw.com/mailman/listinfo/sc-l
List charter available at - http://www.securecoding.org/list/charter.php
SC-L is hosted and moderated by KRvW Associates, LLC (http://www.KRvW.com)
as a free, non-commercial service to the software security community.
Follow KRvW Associates on Twitter at: http://twitter.com/KRvW_Associates
_______________________________________________
