I would argue that the "security 'bugs'" you've described are in fact
functional deficiencies in the implemented design. That is, exploiting
them has a direct impact on the functional behavior of the application,
even if the root cause is nothing more than a gap in error handling
("input validation").

I would further argue that treating security as a special case ends up
doing us more harm than good. Doing so allows developers, designers, and
the business to shrug it off as Somebody Else's Problem (SEP), instead
of owning it themselves. The same goes for the requirements, design, etc.

As an industry, we've developed segments of specialized knowledge, and
then have the audacity to complain that this knowledge isn't mainstream.
It's time we picked one approach or the other, and I would argue that
mainstreaming these concepts will be far more effective than continuing
to treat security as a specialized bolt-on discipline (which is not to
say that specialized research should not occur, just that in "real life"
the application of this knowledge should not be specialized, per se).

*shrug* The only way I see to win the game is to change the rules and/or
the gameplay itself. We must never forget that the security industry
still relies (heavily) on many of the same concepts that protected us 15
years ago (i.e., signature-based scans and ACLs - AV + firewall).

-ben

Goertzel, Karen [USA] wrote:
> No - that isn't really what I meant. There CAN be security "bugs" -
> i.e., implementation errors with direct security implications, such
> as a divide-by-zero error that allows a denial of service in a
> security-critical component, thus exposing what is supposed to be
> protected data.
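
For concreteness, a minimal sketch of that kind of bug (Python,
hypothetical names; note that it is just an ordinary arithmetic error
sitting in a security-critical path):

    def average_interval_seconds(total_seconds, request_count):
        # An unguarded "total_seconds / request_count" raises
        # ZeroDivisionError when request_count is 0 (say, a brand-new
        # client); unhandled, that crashes the throttling component
        # guarding the data - a plain implementation bug whose security
        # impact is denial of service.
        if request_count == 0:
            return 0.0  # treat an empty history as "throttle hard"
        return total_seconds / request_count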
> 
> But there are also bad security decisions - these can be at the
> requirements spec or design spec level. If they're at the
> requirements spec level, they aren't "bugs" - they are either
> omissions of good security or commissions of bad security. An
> omission of good security is not encrypting a password. That isn't a
> "bug" per se - unless it's a violation of policy. But if there's no
> password encryption policy, then the failure to include a requirement
> to encrypt passwords would not be a "bug" or a violation of any sort
> (except a violation of common sense). It would still, however, result
> in poor security.
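
In practice that omitted requirement is usually met with a salted hash
rather than literal "encryption"; a minimal sketch (Python standard
library only, parameters illustrative rather than a recommendation):

    import hashlib
    import os

    def hash_password(password):
        # Store (salt, digest), never the password itself; verification
        # recomputes the digest with the stored salt and compares.
        salt = os.urandom(16)
        digest = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"),
                                     salt, 100000)
        return salt, digest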
> 
> --
> Karen Mercedes Goertzel, CISSP
> Booz Allen Hamilton
> 703.698.7454
> goertzel_ka...@bah.com
> 
> 
> 
> 
> -----Original Message-----
> From: Benjamin Tomhave [mailto:list-s...@secureconsulting.net]
> Sent: Fri 20-Mar-09 11:04
> To: Goertzel, Karen [USA]
> Cc: Secure Code Mailing List
> Subject: Re: [SC-L] BSIMM: Confessions of a Software Security Alchemist (informIT)
> 
> So, what you're saying is that "security bugs" are really design
> flaws, assuming a perfect implementation of the design. Ergo,
> "security bug" is at best a misnomer, and at worst a fatal deficiency
> in design acumen.
> 
> :)
> 
> -ben
> 
> Goertzel, Karen [USA] wrote:
>> Except when they're hardware bugs. :)
>> 
>> I think the differentiation is also meaningful in this regard: I
>> can specify software that does non-secure things. I can implement
>> that software 100% correctly. Ipso facto - no software bugs. But
>> the fact remains that the software doesn't validate input because I
>> didn't specify it to validate input, or it doesn't encrypt
>> passwords because I didn't specify it to do so. I built to spec; it
>> just happened to be a stupid spec. So the spec is flawed - but the
>> implemented software conforms to that stupid spec 100%, so by
>> definition it is not flawed. It is, however, non-secure.
>> 
>> --
>> Karen Mercedes Goertzel, CISSP
>> Booz Allen Hamilton
>> 703.698.7454
>> goertzel_ka...@bah.com
>> 
>> 
>> 
>> 
>> -----Original Message-----
>> From: sc-l-boun...@securecoding.org on behalf of Benjamin Tomhave
>> Sent: Thu 19-Mar-09 19:28
>> To: Secure Code Mailing List
>> Subject: Re: [SC-L] BSIMM: Confessions of a Software Security Alchemist (informIT)
>> 
>> Why are we differentiating between "software" and "security" bugs?
>> It seems to me that all bugs are software bugs, ...
>> 
> 

-- 
Benjamin Tomhave, MS, CISSP
fal...@secureconsulting.net
LI: http://www.linkedin.com/in/btomhave
Blog: http://www.secureconsulting.net/
Photos: http://photos.secureconsulting.net/
Web: http://falcon.secureconsulting.net/

[ Random Quote: ]
Hofstadter's Law: "A task always takes longer than you expect, even when
you take into account Hofstadter's Law."
http://globalnerdy.com/2007/07/18/laws-of-software-development/
_______________________________________________
Secure Coding mailing list (SC-L) SC-L@securecoding.org
List information, subscriptions, etc - http://krvw.com/mailman/listinfo/sc-l
List charter available at - http://www.securecoding.org/list/charter.php
SC-L is hosted and moderated by KRvW Associates, LLC (http://www.KRvW.com)
as a free, non-commercial service to the software security community.