[SC-L] Economics of Software Vulnerabilities

2007-03-27 Thread McGovern, James F (HTSC, IT)
May I share another perspective?

1. The debate between open source and closed source in terms of security 
doesn't matter. Does anyone have any metrics that quantify the economics of 
writing better corporate software that isn't intended for public consumption?

2. If you can't make the economic case, then you can possibly make the case for 
indexing yourself against others. I know folks' opinions here in terms of keeping up 
with the Joneses, but unless someone brainstorms a way for folks to do this, the 
economic case may never be made.

3. When one looks at metrics and, more importantly, maturity models, they almost 
always measure process and tend to avoid measuring people and/or 
technology. If security folks figure out how to measure people, process and 
technology, then additional opportunities for secure coding practices may present 
themselves.






Re: [SC-L] Economics of Software Vulnerabilities

2007-03-23 Thread security curmudgeon

On Wed, 21 Mar 2007, Steven M. Christey wrote:

:  With rare exceptions, in general, I do not find that the
:  open source community is that much more security conscious
:  than those producing closed source. Certainly this seems true
:  if measured in terms of vulnerabilities and we measure across
:  the board (e.g., take a random sampling from SourceForge) and
:  not just our favorite security-related applications.
: 
: Indeed, CVE and any other refined vulnerability information source is 
: chock full of open source products on SourceForge that have the most 
: obvious security holes possible, and let's not forget the open source 
: products that have gotten a bad reputation such as PHP-Nuke and 
: Sendmail. Insecure programming is universal.

Belated, but I'd like to echo Mr. Christey's comments here. For almost 
two decades, we've all heard or believed in the idea that open source is 
better than closed, because anyone can look at it. In theory, this is 
outstanding. In reality, this is a joke told at security conventions.

Just because people can look at a project in detail, doesn't mean they 
will. More to the point, just because people can, doesn't mean code 
auditing gurus will look at it.

If you consider projects like the Linux kernel, there are definitely a 
*lot* of coding ninjas involved. Despite that, we see a never-ending 
stream of vulnerabilities (mostly local DoS attacks) being published. Does 
this mean the Linux kernel developers are 
irresponsible/incompetent/lazy/whatever? Absolutely not. It only means 
that the notion that open source will be viewed by thousands of eyes was a 
nice pipe dream and talking point years back, not reality.


Re: [SC-L] Economics of Software Vulnerabilities

2007-03-23 Thread David A. Wheeler
 On Wed, 21 Mar 2007, Steven M. Christey wrote:
 :  With rare exceptions, in general, I do not find that the
 :  open source community is that much more security conscious
 :  than those producing closed source. Certainly this seems true
 :  if measured in terms of vulnerabilities and we measure across
 :  the board (e.g., take a random sampling from SourceForge) and
 :  not just our favorite security-related applications.

A random sampling from SourceForge will typically find the worst ones. 
Most OSS projects, like most proprietary projects, die due to lack of 
attention from _anyone_.

 : Indeed, CVE and any other refined vulnerability information source is 
 : chock full of open source products on SourceForge that have the most 
 : obvious security holes possible, and let's not forget the open source 
 : products that have gotten a bad reputation such as PHP-Nuke and 
 : Sendmail.

A well-deserved bad reputation, I might add, though I've been told that 
the latest versions of Sendmail are better.

 : Insecure programming is universal.
Absolutely.


security curmudgeon [EMAIL PROTECTED] piped in:
 Belated, but I'd like to echo Mr. Christey's comments here. For almost 
 two decades, we've all heard or believed in the idea that open source is 
 better than closed, because anyone can look at it. In theory, this is 
 outstanding. In reality, this is a joke told at security conventions.
 
 Just because people can look at a project in detail, doesn't mean they 
 will. More to the point, just because people can, doesn't mean code 
 auditing gurus will look at it.
 
 ... the notion that open source will be viewed by thousands of eyes was a 
 nice pipe dream and talking point years back, not reality.

Nonsense.  Widespread review of _some_ OSS programs, by many eyes, _IS_ 
reality.   Just look at the evidence. There are a number of OSS projects 
where it's quite clear just by looking at the SCM records that many 
people _do_ review the code, both manually and by automated means.  The 
OpenBSD developers have been doing manual review for a long, long time, 
and their record of only 2 remote holes in 10 years is quite impressive. 
  Debian has a similar audit project as well.  (Both OpenBSD and Debian 
focus their efforts though... only SPECIFIC programs get reviewed, not 
stuff like chess games.)  There's a $500 bounty for finding 
vulnerabilities in Mozilla, and it's clear that many people are 
reviewing Mozilla Firefox's code specifically for security issues. 
There are now several projects that download OSS programs, review them 
through automated tools, and send back their results to the developers 
(DHS and Fortify back two such projects).  The claim that no OSS 
program gets lots of review is absolutely untrue.

On the other hand, it's nonsense that just because something is OSS 
means that (1) it's automatically secure or (2) it'll always be 
reviewed.  If _that_ is what you mean, then I completely agree with you. 
  Sendmail has had a terrible record - but Exchange is no saint either. 
I'd rather put my money on Postfix, which was specifically DESIGNED to 
be secure, as well as having review, than either of them.

I believe that you need to evaluate the security of OSS programs - or 
proprietary programs - on a case by case basis.   On that, I hope, we 
agree.  Any OSS program can in theory be reviewed, but only some get 
real review.  There are a number of specific OSS programs that do 
markedly better than their proprietary competition in terms of security 
- unsurprisingly, those tend to be the ones that HAVE received lots of 
review. Conversely, there are many OSS programs (and proprietary 
programs) that are absolute junk.  So look before you leap.

--- David A. Wheeler




Re: [SC-L] Economics of Software Vulnerabilities

2007-03-23 Thread Gunnar Peterson
 Just because people can look at a project in detail, doesn't mean they
 will. More to the point, just because people can, doesn't mean code
 auditing gurus will look at it.
 

And sometimes, when they do look they get booted out of the project

http://www.heise-security.co.uk/news/82500

-gp




Re: [SC-L] Economics of Software Vulnerabilities

2007-03-21 Thread McGovern, James F (HTSC, IT)
Kevin, I would love to see open source communities embrace secure coding 
practices with stronger assistance from software vendors in this space. This of 
course requires going beyond audit capability and figuring out ways to get 
the tools into developers' hands.

As a contributor to open source projects, I struggle with introducing security: 
I already contribute my time with the support/blessing of my significant 
other, but she wouldn't let me spend hard cash on tools for contributing to open 
source. I wish there were a better answer for everyone in this seat.

Generally speaking, many of my peers outside of work contribute to open source 
with the rationale that it is a safer place, from a political perspective, to try 
things out, kinda like a POC where the outcome doesn't have to be successful 
and won't show up on your annual review. Lately, though, I haven't figured out how 
to reduce my own exposure...

-Original Message-
From: Wall, Kevin [mailto:[EMAIL PROTECTED]
Sent: Tuesday, March 20, 2007 9:16 PM
To: McGovern, James F (HTSC, IT)
Cc: sc-l@securecoding.org
Subject: RE: [SC-L] Economics of Software Vulnerabilities


James McGovern apparently wrote...

 The uprising from customers may already be starting. It is 
 called open source. The real question is what is the duty of 
 others on this forum to make sure that newly created software 
 doesn't suffer from the same problems as the commercial 
 closed source stuff...

While I agree that the FOSS movement is an uprising:
1) it's not being pushed by customers so much as by IT developers, and
2) the uprising isn't so much an outcry about security as it is
   about not being able to have the desired features implemented
   in the manner desired.

At least that's how I see it.

With rare exceptions, in general, I do not find that the
open source community is that much more security conscious
than those producing closed source. Certainly this seems true
if measured in terms of vulnerabilities and we measure across
the board (e.g., take a random sampling from SourceForge) and
not just our favorite security-related applications.

Where I _do_ see a remarkable difference is that the open source
community seems to be in general much faster in getting security
patches out once they are informed of a vulnerability. I suspect
that this has to do as much with the lack of bureaucracy in open
source projects as it does with the fear of loss of reputation among their
open source colleagues.

However, this is just my gut feeling, so your gut feeling may differ.
(But my 'gut' is probably bigger than yours, so my feeling prevails. ;-)
Does anyone have any hard evidence to back up this intuition? I
thought that Ross Anderson had done some research along those lines.

-kevin
---
Kevin W. Wall   Qwest Information Technology, Inc.
[EMAIL PROTECTED]   Phone: 614.215.4788
It is practically impossible to teach good programming to students
 that have had a prior exposure to BASIC: as potential programmers
 they are mentally mutilated beyond hope of regeneration
- Edsger Dijkstra, How do we tell truths that matter?
  http://www.cs.utexas.edu/~EWD/transcriptions/EWD04xx/EWD498.html 






Re: [SC-L] Economics of Software Vulnerabilities

2007-03-21 Thread Arian J. Evans

Spot on thread, Ed:

On 3/20/07, Ed Reed [EMAIL PROTECTED] wrote:


Not all of these are consumer uprisings - some are, some aren't - but I
think they're all examples of the kinds of economic adjustments that occur
in mature markets.

   - Unsafe at any speed (the triumph of consumer safety over
     industrial laziness)
   - Underwriter Laboratories (the triumph of the fire insurance
     industry over shoddy electrical manufacturers)
   - VHS (vs BetaMax - the triumph of content over technology)

This is ironic to me: I wrote a paper for management types, an upper-tactical
to strategic-level view of the software security problem. In its current
incarnation it is called Unsafe at Any Speed. It gives a layman's breakdown
of the fundamental issues: (a) implementation issues, almost entirely falling
under the inability to enforce data/function boundaries in modern
implementation-level languages or platforms, and (b) functional issues, which
are design/workflow or emergent-behavior related.

The important point I stress is that there really hasn't been a
Whistle-Blower Phase in the software industry concerning security. Today,
vague arguments about plane crashes aside, there is little to no hard
evidence tying software defects with security implications to loss of human
life. And that's the kicker: dollars to DNA, it's death that sells.

I also argue that we are killing the Canaries in the Coal Mine. The script
kiddies, the guys writing the little payload-less worms, the kid who wrote
the Sammy virus, they are scared to touch systems now. These were the
Canaries down there in our software coal mines. SQL Slammer and the Witty worm,
though they carried no payload, caused negative impact, but there were no charges
for these.

The charges always land on some token young guy for some relatively benign worm.
MySpace slows down and we prosecute a young kid with above-average problem-solving
skills. I used to call these worms that slowed things down "free pen tests",
later "canaries". They had a real (positive) value to us, and we've
killed that value without replacing it with something better.

I experienced rising vendor animosity and threats over the last two years, a
reversal of the trend back to the good old days, coupled with work
constraints restricting full-disclosure options. What made this worse (to
me, ethically) is that many of these vendors were advertising security to
their clients, from an image of a safe on the website with a list of
security features, to announcements proclaiming the security of the system
displayed to users after they log in. None of these systems was measurably
secure in any fashion I could detect, not even against the usual suspects (SQLi,
XSS, insufficient authorization/workflow bypass, etc.). I got the
feeling things were getting worse. That, or I hit some weird biased sample of
ISVs.

I think you are on to something here in how to think about this subject.
Perhaps I should float my little paper out there and we could shape up
something worthwhile describing how the industry is evolving today.

I have been peacefully quiet since I quit my old job, ignoring the security
lists and the industry, and haven't poked the bear, err, trolled any of the usual
suspects lately. Looks like I've been missing out on some good dialogue.
Thank you, this was very helpful,

Arian J. Evans
Solipsistic Software Security Sophist at Large


Re: [SC-L] Economics of Software Vulnerabilities

2007-03-21 Thread Steven M. Christey

On Wed, 21 Mar 2007, mudge wrote:

 Sorry, but I couldn't help but be reminded of an old L0pht topic that
 we brought up in January of 1999. Having just re-read it I found it
 still relatively poignant: Cyberspace Underwriters Laboratories[1].

I was thinking about this, too; I should have remembered it in earlier
comments.  The fact that such a thing has NOT come to fruition seems to be
symptomatic of the industry, although there have been some partnerships
between commercial and non-commercial entities (e.g. Fortify and the Java
Open Review).

- Steve


Re: [SC-L] Economics of Software Vulnerabilities

2007-03-21 Thread Steven M. Christey

I was originally going to say this off-list, but it's not that big a deal.

Arian J. Evans said:

 I think you are on to something here in how to think about this subject.
 Perhaps I should float my little paper out there and we could shape up
 something worthwhile describing how the industry is evolving today.

I've been wanting to do something along these lines but don't have much
time.  I'll gladly review it or provide suggestions.  I have a draft on
current disclosure practices that includes the diversity of researchers
and the role of vulnerability information providers.

- Steve


Re: [SC-L] Economics of Software Vulnerabilities

2007-03-21 Thread mudge


On Mar 21, 2007, at 3:57 PM, Arian J. Evans wrote:

 Spot on thread, Ed:

 On 3/20/07, Ed Reed [EMAIL PROTECTED] wrote:
 Not all of these are consumer uprisings - some are, some aren't -  
 but I think they're all examples of the kinds of economic  
 adjustments that occur in mature markets.
 - Unsafe at any speed (the triumph of consumer safety over
   industrial laziness)
 - Underwriter Laboratories (the triumph of the fire insurance
   industry over shoddy electrical manufacturers)
 - VHS (vs BetaMax - the triumph of content over technology)

Sorry, but I couldn't help but be reminded of an old L0pht topic that  
we brought up in January of 1999. Having just re-read it I found it  
still relatively poignant: Cyberspace Underwriters Laboratories[1].

It seems to me that a lot of what was of concern then is still of
concern now, without great headway having been made over these last 8
years.

Some notable items (warning, these are subjective and broad-stroked)
have been the commercial world eschewing TCSEC / Common
Criteria[2], FIPS 140 being useful for some relatively niche areas  
and focusing on only portions of a device/component/code, and Trusted  
Computing really veering away from trusted computing platforms and  
codebases for classical security compartmentalization and instead  
focusing on DRM[3].

Just thinking out loud.

cheers,

.mudge

[1] http://packetstormsecurity.org/docs/infosec/cyberul.html
[2] oftentimes due to requiring frameworks and configuration
capabilities that end up not being used or are too complicated for many
people to customize.
[3] back to the thread topic somewhat... being economics based.



Re: [SC-L] Economics of Software Vulnerabilities

2007-03-20 Thread ljknews
At 8:55 AM -0400 3/20/07, Michael S Hines wrote:
 I'm not sure what your sources are but from what I'm hearing and reading the
 problem is that there are many missing drivers for what have become standard
 peripherals that people are used to - and some of the vendors are reluctant
 to develop new drivers (the driver technology changed in Vista - so all
 drivers have to be reworked).
 
 MP3 players, ePhones, PDA's, etc. have become standard components in many
 places...  and they don't work with Vista - yet (if ever).

That is because the features provided by many add-on products depended on
the longstanding loose state of security on Microsoft Windows.

 It's the feature thing not that users are shunning security.
 
 And, at least to me, it is an indication that M$ did not understand the
 marketplace or rushed the (incomplete) product to market.  There's more than
 one way to foul up a new product launch.

The previous Microsoft mode had been to favor anything that would ease
feature implementation over anything that would provide security.
-- 
Larry Kilgallen


Re: [SC-L] Economics of Software Vulnerabilities

2007-03-20 Thread Wall, Kevin
James McGovern apparently wrote...

 The uprising from customers may already be starting. It is 
 called open source. The real question is what is the duty of 
 others on this forum to make sure that newly created software 
 doesn't suffer from the same problems as the commercial 
 closed source stuff...

While I agree that the FOSS movement is an uprising:
1) it's not being pushed by customers so much as by IT developers, and
2) the uprising isn't so much an outcry about security as it is
   about not being able to have the desired features implemented
   in the manner desired.

At least that's how I see it.

With rare exceptions, in general, I do not find that the
open source community is that much more security conscious
than those producing closed source. Certainly this seems true
if measured in terms of vulnerabilities and we measure across
the board (e.g., take a random sampling from SourceForge) and
not just our favorite security-related applications.

Where I _do_ see a remarkable difference is that the open source
community seems to be in general much faster in getting security
patches out once they are informed of a vulnerability. I suspect
that this has to do as much with the lack of bureaucracy in open
source projects as it does with the fear of loss of reputation among their
open source colleagues.

However, this is just my gut feeling, so your gut feeling may differ.
(But my 'gut' is probably bigger than yours, so my feeling prevails. ;-)
Does anyone have any hard evidence to back up this intuition? I
thought that Ross Anderson had done some research along those lines.

-kevin
---
Kevin W. Wall   Qwest Information Technology, Inc.
[EMAIL PROTECTED]   Phone: 614.215.4788
It is practically impossible to teach good programming to students
 that have had a prior exposure to BASIC: as potential programmers
 they are mentally mutilated beyond hope of regeneration
- Edsger Dijkstra, How do we tell truths that matter?
  http://www.cs.utexas.edu/~EWD/transcriptions/EWD04xx/EWD498.html 





Re: [SC-L] Economics of Software Vulnerabilities

2007-03-19 Thread Gary McGraw
Very interesting.  Crispin is in the throes of big software.  Anybody want to 
help me mount a rescue campaign from Jamaica?

gem

company www.cigital.com
podcast www.cigital.com/silverbullet
blog www.cigital.com/justiceleague
book www.swsec.com


 -Original Message-
From:   Crispin Cowan [mailto:[EMAIL PROTECTED]
Sent:   Mon Mar 19 16:00:48 2007
To: Gary McGraw
Cc: Ed Reed; sc-l@securecoding.org
Subject: Re: [SC-L] Economics of Software Vulnerabilities

Gary McGraw wrote:
 I'm not sure vista is bombing because of good quality.   That certainly would 
 be ironic.   

 Word on the "way down in the guts" street is that vista is too many things 
 cobbled together into one big kinda-functioning mess.
I.e., it is mis-featured and lacks some integration. This is a
variation on not having the desired features. And there certainly are big
features in Vista that were supposed to be there but aren't (most of
user-land being managed code, a relational file system).

It is also infamously late.

So if the resources that were put into the code quality in Vista had
instead been put into features and ship-date, would it do better in the
marketplace?

Sure, that's heretical :) but it just might be true :(

Crispin, who now believes that users are fundamentally what holds back security

-- 
Crispin Cowan, Ph.D.   http://crispincowan.com/~crispin/
Director of Software Engineering   http://novell.com
AppArmor Training at CanSec West   http://cansecwest.com/dojoapparmor.html










Re: [SC-L] Economics of Software Vulnerabilities

2007-03-19 Thread Crispin Cowan
Gary McGraw wrote:
 I'm not sure vista is bombing because of good quality.   That certainly would 
 be ironic.   

 Word on the "way down in the guts" street is that vista is too many things 
 cobbled together into one big kinda-functioning mess.
I.e., it is mis-featured and lacks some integration. This is a
variation on not having the desired features. And there certainly are big
features in Vista that were supposed to be there but aren't (most of
user-land being managed code, a relational file system).

It is also infamously late.

So if the resources that were put into the code quality in Vista had
instead been put into features and ship-date, would it do better in the
marketplace?

Sure, that's heretical :) but it just might be true :(

Crispin, who now believes that users are fundamentally what holds back security

-- 
Crispin Cowan, Ph.D.   http://crispincowan.com/~crispin/
Director of Software Engineering   http://novell.com
AppArmor Training at CanSec West   http://cansecwest.com/dojoapparmor.html



Re: [SC-L] Economics of Software Vulnerabilities

2007-03-19 Thread Ed Reed
Crispin Cowan wrote:
 Crispin, who now believes that users are fundamentally what holds back security

   
I was once berated on stage by Jamie Lewis for sounding like I was
placing the blame for poor security on customers themselves.

I have moved on, and believe, instead, that it is the economic
inequities - the mis-allocation of true costs - that are really to blame.

Vendors are getting better, because they're being shamed by publicity -
not because they're bearing more of the costs that users incur due to
shoddy software.

But as bad as the costs are that are borne by users of shoddy software
(patch costs, loss of utility, denial of service, licenses for
anti-virus software to make up for the egregiously bad code that leaves
buffer overflow exploits available that anyone can leverage to take over
a system) - as bad as those costs are, they're still swamped by the value
- increased productivity and adrenaline rush - that commercial
featurism delivers.

Add the slowly-warmed pot phenomenon (apocryphal as it may be) -
customers don't jump out of the boiling pot because they're too invested
to walk away.

Eventually I think they'll get fed up and there'll be a consumer uprising.

Until then let's encourage better coding practices and secure designs
and deep thought about "what policy do I want enforced?"

(obligatory plug for high assurance)

But, let's not confuse code quality with code security, either.  It
isn't secure (against hostile code) until you can verify that it (a)
does what the policy says it should do (functional testing) and (b)
doesn't do what the security policy says it shouldn't do (fuzzing is
just a way of performing boundary tests on inputs - it tells you nothing
about hidden behaviors of the system, and you can't tell anything about
those without formal analysis and good life cycle configuration management).

Ed


Re: [SC-L] Economics of Software Vulnerabilities

2007-03-19 Thread Crispin Cowan
Ed Reed wrote:
 Crispin Cowan wrote:
   
  Crispin, who now believes that users are fundamentally what holds back security
   
 
 I was once berated on stage by Jamie Lewis for sounding like I was
 placing the blame for poor security on customers themselves.
   
Fight back harder. Jamie is wrong. The free market is full of product
offerings of every description. If users cared about security, they
would buy different products than they do, and deploy them differently
than they do. QED, the lack of security is the users' fault.

 I have moved on, and believe, instead, that it is the economic
 inequities - the mis-allocation of true costs - that is really to blame.
   
Since many users are economically motivated, this may explain why users
don't care much about security :)

A competitive free-market economy is really a large optimization engine
for finding the most efficient way to do things, because the more
efficient enterprises crush the less efficient. As such, I have a fair
degree of faith that senior management is applying approximately the
right amount of security to mitigate the threat that they face. If they
are not doing so, they are at risk from competitors who do apply the
right amount of security.

What has made the security industry grow for the last decade has been
the huge growth in connectivity. That has grown the attack surface, and
hence the threat, that enterprises face. And that has caused enterprises
to grow the amount of security they deploy.

 Add the slowly-warmed pot phenomenon (apocryphal as it may be) -
 customers don't jump out of the boiling pot because they're too invested
 to walk away.

 Eventually I think they'll get fed up and there'll be a consumer uprising.
   
Why do you think it will be an uprising? Why not a gradual shift where the
vendors just get better, exactly as fast as the users need them to?

 Until then let's encourage better coding practices and secure designs
 and deep thought about what policy do I want enforced. 
   
Technologists figure out how to do stuff. Economists and strategists
figure out what to do. We can encourage all we want, but we are just
shouting into the wind until enterprise users demand better security.

Crispin


Re: [SC-L] Economics of Software Vulnerabilities

2007-03-19 Thread Steven M. Christey

On Mon, 19 Mar 2007, Crispin Cowan wrote:

 Since many users are economically motivated, this may explain why users
 don't care much about security :)

But... but... but...

I understand the sentiment, but there's something missing in it.  Namely,
that the costs related to security are not really quantifiable yet, so
consumers are not working with the best information.  Then there's simple
lack of understanding, such as that exemplified by an individual consumer -
their computer gets really bogged down and slow, and they don't know
what's happening, so they go buy a new computer, when it was just a ton
of spyware from surfing habits that they didn't know were unsafe, or they
were running some zombie that was sucking up all their bandwidth for warez
distribution.

  Eventually I think they'll get fed up and there'll be a consumer uprising.
 
 Why do you think it will be an uprising? Why not a gradual shift of the
 vendors just get better, exactly as fast as the users need them to?

I really really wish for an uprising, but unfortunately I'm not too
optimistic right now.  Off the top of my head, I can't think of any
consumer uprisings in other industries, although the US's recent shift away
from fuel-inefficient vehicles is sort of close.  Didn't some large
brick-and-mortar companies heavily criticize the software industry a
couple years ago?  I don't know how that played out.

- Steve


Re: [SC-L] Economics of Software Vulnerabilities

2007-03-13 Thread Gary McGraw
In my opinion, though fuzz testing is certainly a useful technique (we've used 
it in hardware verification for years), any certification based solely on fuzz 
testing for security would be ludicrous.  Fuzz testing is not a silver bullet.

The biggest stumbling block for software certification is variability in final 
environment.

gem

company www.cigital.com
podcast www.cigital.com/silverbullet
blog www.cigital.com/justiceleague
book www.swsec.com

 -Original Message-
From:   Gadi Evron [mailto:[EMAIL PROTECTED]
Sent:   Mon Mar 12 21:48:25 2007
To: Crispin Cowan
Cc: [EMAIL PROTECTED]; Ed Reed; sc-l@securecoding.org
Subject: Re: [SC-L] Economics of Software Vulnerabilities

On Mon, 12 Mar 2007, Crispin Cowan wrote:
 Ed Reed wrote:
  For a long time I thought that software product liability would
  eventually be forced onto developers in response to their long-term
  failure to take responsibility for their shoddy code.  I was mistaken. 
  The pool of producers (i.e., the software industry) is probably too
  small for such blunt economic policy to work.

 I'm not sure about the size of the pool. I think it is more about the
 amount of leverage that can be put on software:
 
 * It is trivial for some guy in a basement to produce a popular
   piece of open source software, which ends up being used as a
   controlling piece of a nuclear reactor, jet airplane, or
   automobile, and when it fails, $millions or $billions of damages
    result. The software author has nowhere near the resources to pay
   the damage, or even the insurance premiums on the potential damage.
 * In contrast, with physical stuff it is usually the case that the
   ability to cause huge damage requires huge capital in the first
   place, such as building nuclear reactors, jet planes, and cars.
 
 With this kind of leverage, the software producers don't have the
 resources to take responsibility, and so strict liability applied to
 authors reduces to don't produce software unless, possibly, you work
 for a very large corporation with deep pockets. Even then, corporate
 bean counters would likely prevent you from writing any software because
 the potential liability is so large.
 
  It appears, now, that producers will not be regulated, but rather users
  and consumers.  SOX, HIPAA, BASEL II, etc. are all about regulating
  already well-established business practices that just happen to be
  incorporating more software into their operations. 

 Much like the gun industry. Powerful, deadly tools that, if used
 inappropriately, can cause huge damage.

Indeed, and I found your posts enlightening.

Still, today an alternative presents itself in the now more likely realm of
software security certification and testing. It has become easier and
potentially regulated now that fuzzers have become:

1. Good enough.
2. Measurable.
3. Widely accessible.

Gadi.










Re: [SC-L] Economics of Software Vulnerabilities

2007-03-13 Thread Gary McGraw
Hi crispy,

I'm not sure vista is bombing because of good quality.   That certainly would 
be ironic.   

Word on the "way down in the guts" street is that vista is too many things 
cobbled together into one big kinda-functioning mess.  My bet is that Vista SP2 
will be a completely different beast.

gem

company www.cigital.com
podcast www.cigital.com/silverbullet
blog www.cigital.com/justiceleague
book www.swsec.com
 

 -Original Message-
From:   Crispin Cowan [mailto:[EMAIL PROTECTED]
Sent:   Mon Mar 12 20:45:43 2007
To: Ed Reed
Cc: sc-l@securecoding.org
Subject: Re: [SC-L] Economics of Software Vulnerabilities

Ed Reed wrote:
 For a long time I thought that software product liability would
 eventually be forced onto developers in response to their long-term
 failure to take responsibility for their shoddy code.  I was mistaken. 
 The pool of producers (i.e., the software industry) is probably too
 small for such blunt economic policy to work.
   
I'm not sure about the size of the pool. I think it is more about the
amount of leverage that can be put on software:

* It is trivial for some guy in a basement to produce a popular
  piece of open source software, which ends up being used as a
  controlling piece of a nuclear reactor, jet airplane, or
  automobile, and when it fails, $millions or $billions of damages
  result. The software author has nowhere near the resources to pay
  the damage, or even the insurance premiums on the potential damage.
* In contrast, with physical stuff it is usually the case that the
  ability to cause huge damage requires huge capital in the first
  place, such as building nuclear reactors, jet planes, and cars.

With this kind of leverage, the software producers don't have the
resources to take responsibility, and so strict liability applied to
authors reduces to don't produce software unless, possibly, you work
for a very large corporation with deep pockets. Even then, corporate
bean counters would likely prevent you from writing any software because
the potential liability is so large.

 It appears, now, that producers will not be regulated, but rather users
 and consumers.  SOX, HIPAA, BASEL II, etc. are all about regulating
 already well-established business practices that just happen to be
 incorporating more software into their operations. 
   
Much like the gun industry. Powerful, deadly tools that, if used
inappropriately, can cause huge damage.

"Used appropriately" may be part of the key here. If you use your car
improperly and kill people as a result of e.g. your drunk driving, then
the car maker is not responsible. OTOH, if the design of your top-heavy
SUV combined with crappy tires results in rollovers, then courts do hold
the vendors responsible.

The problem with software: what is appropriate? Conceptually, that the
software in question has been sufficiently vetted for quality to justify
the risk involved. Efforts to do that kind of thing are used in select
industries (nukes and planes) but not widely, because the cost of
vetting is huge, so it only is used when the liabilities are huge.

Why? Because software metrics suck. 30 years of software engineering
research, and LOC is still arguably one of the best metrics of software
complexity, and there is almost nothing usable as a metric for software
quality.

It is not that no one has tried; lots of R&D goes into software
engineering. It's not that there are no new ideas; lots of those abound.
It's not that there have been no advances in understanding; we know a lot
more about the problem than we used to.

I think it is just that it is a hard problem.

Software, by its nature, is vastly more complex per pound :) ^W^W per
unit person effort than any other artifact mankind has ever produced.
One developer in one month can produce a functional software artifact
that it would take a hundred people 10 years to verify as safe. With
those ratios, this problem will not fall easily.

 But as with other serious security policy formulations - the
 technology is irrelevant.  The policies, whether SOX or Multi-level
 Security, are intended to protect information of vital importance to the
 organization.  If technical controls are adequate to enforce them -
 fine.  If not, that in no way absolves the enterprise of the need to
 provide adequate controls.
   
Sure it does :) Just show that your organization performed due
diligence that is up to industry standards and the fact that you
failed pretty much does absolve you, in the eyes of the likes of SOX and
Basel.

It is a very interesting transition from trying to hold software vendors
liable to trying to hold deploying organizations liable, but this first
round of regulation looks like a sinecure for compliance consultants and
a few specialty vendors, and not much else.

 The computer software industry has lost its way.  It appears to be
 satisfied with prodding and encouraging software developers to develop
 some modicum of shame for the shoddy quality of their output.

Re: [SC-L] Economics of Software Vulnerabilities

2007-03-13 Thread Gadi Evron
On Tue, 13 Mar 2007, Gary McGraw wrote:
 In my opinion, though fuzz testing is certainly a useful technique (we've 
 used it in hardware verification for years), any certification based solely 
 on fuzz testing for security would be ludicrous.  Fuzz testing is not a 
 silver bullet.

Fuzzing is indeed, most definitely, not a (or the) silver bullet, nor should
testing be based on it solely. What it does provide us with is a measurable
fashion by which we can reliably test the:
1. Stability
2. Programming quality
3. Robustness

Of software, to a level which is much higher than employing several
reverse engineers and test engineers (not to say just examining
vulnerability history on the bugtraq archive).
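
A minimal sketch of what such a measurable test could look like, a dumb
random fuzz harness in C, with parse_record() as a hypothetical stand-in
for whatever input-handling code is under test:

    /* Dumb fuzz harness sketch: feed random buffers to the target in a
     * child process and count crashes.  parse_record() is a placeholder;
     * swap in the real parser under test. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <sys/types.h>
    #include <sys/wait.h>
    #include <unistd.h>

    static void parse_record(const unsigned char *buf, size_t len)
    {
        (void)buf;
        (void)len;   /* replace with the real input-handling routine */
    }

    int main(void)
    {
        enum { ITERATIONS = 10000, MAX_LEN = 4096 };
        static unsigned char buf[MAX_LEN];
        int crashes = 0;

        srand(1);    /* fixed seed: the run is repeatable, hence measurable */

        for (int i = 0; i < ITERATIONS; i++) {
            size_t len = (size_t)(rand() % MAX_LEN);
            for (size_t j = 0; j < len; j++)
                buf[j] = (unsigned char)(rand() & 0xff);

            pid_t pid = fork();
            if (pid == 0) {              /* child: run the target once */
                parse_record(buf, len);
                _exit(0);
            }
            int status = 0;
            waitpid(pid, &status, 0);
            if (WIFSIGNALED(status)) {   /* a signal (SIGSEGV etc.) = crash */
                crashes++;
                fprintf(stderr, "input %d (len %zu) crashed with signal %d\n",
                        i, len, WTERMSIG(status));
            }
        }
        printf("%d crashes in %d random inputs\n", crashes, ITERATIONS);
        return 0;
    }

The crash count over a fixed number of seeded inputs is crude, but it is
repeatable, and repeatable numbers can be compared across builds or vendors.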

Further, if not by certification, fuzzing has already shown it can
pressure companies to use software development lifecycle methodologies and
in that way enforce (encourage?) better security with partners (read:
Microsoft).

Fuzzing has also shown that it can be used to force vendors who sell to
you to indeed be tested by certain products (read: large telcos), although
I am unsure if this approach holds water.

The re-emergence of this field, beyond rubber-stamp certifications or very
costly certifications, is something I see as very positive.

That, of course, is not a (or the) silver bullet in any way either, but
maybe we will see fewer input validation bugs around and will start facing
logical flaws that will boggle our minds.

Personal opinion: enough with buffer overflows already, no? :)

 The biggest stumbling block for software certification is variability in 
 final environment.

That makes sense, but I figure if we can eliminate some more of that
variability in our testing environment(s), all the better.

 gem

Gadi.

--
beepbeep it, i leave work, stop reading sec lists and im still hearing
gadi
- HD Moore to Gadi Evron on IM, on Gadi's interview on npr, March 2007.



Re: [SC-L] Economics of Software Vulnerabilities

2007-03-12 Thread Crispin Cowan
Ed Reed wrote:
 For a long time I thought that software product liability would
 eventually be forced onto developers in response to their long-term
 failure to take responsibility for their shoddy code.  I was mistaken. 
 The pool of producers (i.e., the software industry) is probably too
 small for such blunt economic policy to work.
   
I'm not sure about the size of the pool. I think it is more about the
amount of leverage that can be put on software:

* It is trivial for some guy in a basement to produce a popular
  piece of open source software, which ends up being used as a
  controlling piece of a nuclear reactor, jet airplane, or
  automobile, and when it fails, $millions or $billions of damages
  result. The software author has nowhere near the resources to pay
  the damage, or even the insurance premiums on the potential damage.
* In contrast, with physical stuff it is usually the case that the
  ability to cause huge damage requires huge capital in the first
  place, such as building nuclear reactors, jet planes, and cars.

With this kind of leverage, the software producers don't have the
resources to take responsibility, and so strict liability applied to
authors reduces to don't produce software unless, possibly, you work
for a very large corporation with deep pockets. Even then, corporate
bean counters would likely prevent you from writing any software because
the potential liability is so large.

 It appears, now, that producers will not be regulated, but rather users
 and consumers.  SOX, HIPAA, BASEL II, etc. are all about regulating
 already well-established business practices that just happen to be
 incorporating more software into their operations. 
   
Much like the gun industry. Powerful, deadly tools that, if used
inappropriately, can cause huge damage.

"Used appropriately" may be part of the key here. If you use your car
improperly and kill people as a result of e.g. your drunk driving, then
the car maker is not responsible. OTOH, if the design of your top-heavy
SUV combined with crappy tires results in rollovers, then courts do hold
the vendors responsible.

The problem with software: what is appropriate? Conceptually, that the
software in question has been sufficiently vetted for quality to justify
the risk involved. Efforts to do that kind of thing are used in select
industries (nukes and planes) but not widely, because the cost of
vetting is huge, so it only is used when the liabilities are huge.

Why? Because software metrics suck. 30 years of software engineering
research, and LOC is still arguably one of the best metrics of software
complexity, and there is almost nothing usable as a metric for software
quality.
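
For a sense of how crude that yardstick is, here is a toy sketch of the
metric itself; it counts bulk and nothing else:

    /* Toy LOC "metric": count non-blank lines in a source file.  It says
     * nothing about quality or real complexity -- only bulk. */
    #include <stdio.h>
    #include <string.h>

    int main(int argc, char **argv)
    {
        if (argc != 2) {
            fprintf(stderr, "usage: %s file.c\n", argv[0]);
            return 1;
        }
        FILE *fp = fopen(argv[1], "r");
        if (!fp) {
            perror(argv[1]);
            return 1;
        }

        char line[4096];
        unsigned long loc = 0;
        while (fgets(line, sizeof(line), fp)) {
            if (strspn(line, " \t\r\n") != strlen(line))  /* any non-whitespace? */
                loc++;
        }
        fclose(fp);

        printf("%lu non-blank lines\n", loc);
        return 0;
    }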

It is not that no one has tried; lots of R&D goes into software
engineering. It's not that there are no new ideas; lots of those abound.
It's not that there have been no advances in understanding; we know a lot
more about the problem than we used to.

I think it is just that it is a hard problem.

Software, by its nature, is vastly more complex per pound :) ^W^W per
unit person effort than any other artifact mankind has ever produced.
One developer in one month can produce a functional software artifact
that it would take a hundred people 10 years to verify as safe. With
those ratios, this problem will not fall easily.

 But as with other serious security policy formulations - the
 technology is irrelevant.  The policies, whether SOX or Multi-level
 Security, are intended to protect information of vital importance to the
 organization.  If technical controls are adequate to enforce them -
 fine.  If not, that in no way absolves the enterprise of the need to
 provide adequate controls.
   
Sure it does :) Just show that your organization performed due
diligence that is up to industry standards and the fact that you
failed pretty much does absolve you, in the eyes of the likes of SOX and
Basel.

It is a very interesting transition from trying to hold software vendors
liable to trying to hold deploying organizations liable, but this first
round of regulation looks like a sinecure for compliance consultants and
a few specialty vendors, and not much else.

 The computer software industry has lost its way.  It appears to be
 satisfied with prodding and encouraging software developers to develop
 some modicum of shame for the shoddy quality of their output.  Feed the
 beast, and support rampant featurism - it's what's made so many people
 rich, after all.
   
The consumers who chose feature-rich over high-quality did that, not the
software industry.

 In the long run, though, featurism without quality is not sustainable. 
 That is certainly true, and I applaud efforts to encourage developers to
 rise up from their primordial ooze and embrace the next steps in sane
 programming (we HAVE largely stamped out self-modifying code, but
 strcpy() is still a problem...)
   
I beg to differ. There is no evidence at all that the good enough
modality is 

Re: [SC-L] Economics of Software Vulnerabilities

2007-03-12 Thread Gadi Evron
On Mon, 12 Mar 2007, Crispin Cowan wrote:
 Ed Reed wrote:
  For a long time I thought that software product liability would
  eventually be forced onto developers in response to their long-term
  failure to take responsibility for their shoddy code.  I was mistaken. 
  The pool of producers (i.e., the software industry) is probably too
  small for such blunt economic policy to work.

 I'm not sure about the size of the pool. I think it is more about the
 amount of leverage that can be put on software:
 
 * It is trivial for some guy in a basement to produce a popular
   piece of open source software, which ends up being used as a
   controlling piece of a nuclear reactor, jet airplane, or
   automobile, and when it fails, $millions or $billions of damages
    result. The software author has nowhere near the resources to pay
   the damage, or even the insurance premiums on the potential damage.
 * In contrast, with physical stuff it is usually the case that the
   ability to cause huge damage requires huge capital in the first
   place, such as building nuclear reactors, jet planes, and cars.
 
 With this kind of leverage, the software producers don't have the
 resources to take responsibility, and so strict liability applied to
 authors reduces to don't produce software unless, possibly, you work
 for a very large corporation with deep pockets. Even then, corporate
 bean counters would likely prevent you from writing any software because
 the potential liability is so large.
 
  It appears, now, that producers will not be regulated, but rather users
  and consumers.  SOX, HIPAA, BASEL II, etc. are all about regulating
  already well-established business practices that just happen to be
  incorporating more software into their operations. 

 Much like the gun industry. Powerful, deadly tools that, if used
 inappropriately, can cause huge damage.

Indeed, and I found your posts enlightening.

Still, today an alternative presents itself in the now more likely realm of
software security certification and testing. It has become easier and
potentially regulated now that fuzzers have become:

1. Good enough.
2. Measurable.
3. Widely accessible.

Gadi.



[SC-L] Economics of Software Vulnerabilities

2007-03-06 Thread Ed Reed
For a long time I thought that software product liability would
eventually be forced onto developers in response to their long-term
failure to take responsibility for their shoddy code.  I was mistaken. 
The pool of producers (i.e., the software industry) is probably too
small for such blunt economic policy to work.

Keep in mind that economics does have a tendency to balance out risk and
reward, and to fairly allocate liability.  But it takes time.  We're
only about 50 years into the life of the software industry, and we're
just starting to see regulatory notice that computers even exist.

It appears, now, that producers will not be regulated, but rather users
and consumers.  SOX, HIPAA, BASEL II, etc. are all about regulating
already well-established business practices that just happen to be
incorporating more software into their operations. 

But as with other serious security policy formulations - the
technology is irrelevant.  The policies, whether SOX or Multi-level
Security, are intended to protect information of vital importance to the
organization.  If technical controls are adequate to enforce them -
fine.  If not, that in no way absolves the enterprise of the need to
provide adequate controls.

The computer software industry has lost its way.  It appears to be
satisfied with prodding and encouraging software developers to develop
some modicum of shame for the shoddy quality of their output.  Feed the
beast, and support rampant featurism - it's what's made so many people
rich, after all.

In the long run, though, featurism without quality is not sustainable. 
That is certainly true, and I applaud efforts to encourage developers to
rise up from their primordial ooze and embrace the next steps in sane
programming (we HAVE largely stamped out self-modifying code, but
strcpy() is still a problem...)
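
To make that concrete, a minimal sketch of the strcpy() problem, using a
hypothetical 16-byte field: the unbounded copy scribbles past the buffer
for any longer input, while the bounded variant cannot.

    #include <stdio.h>
    #include <string.h>

    void greet_unsafe(const char *input)
    {
        char name[16];
        strcpy(name, input);               /* overflows when input >= 16 chars */
        printf("hello, %s\n", name);
    }

    void greet_safer(const char *input)
    {
        char name[16];
        strncpy(name, input, sizeof(name) - 1);
        name[sizeof(name) - 1] = '\0';     /* strncpy may not terminate */
        printf("hello, %s\n", name);
    }

    int main(void)
    {
        const char *input = "a string comfortably longer than sixteen bytes";
        greet_safer(input);                /* truncated but contained */
        /* greet_unsafe(input) would smash the stack here */
        return 0;
    }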

But that's not security.  It's just reducing irresponsible defects. 

For computer security to have any meaning, someone, some where, has to
say what is supposed to happen and what is not supposed to happen with
regard to access to information and resources of the system.  In other
words, there has to be a security policy.

If there's no way to articulate how the security policy can be enforced
by a system, i.e., no security model, then there's no real way to even
have a discussion about whether a system, much less individual
components of the system, contribute to or get in the way of enforcing
the security policy.

What's most disappointing to me is the near-total lack of discussion
about security policies and models in the whole computer security field,
today.

We're at about the 19th century level of sophistication of the practice
of medicine - we have a germ theory (bugs make you sick), but we're
still trying to get the doctors and nurses to wash their hands between
surgeries ("Doctor! It HURTS when I do that!" "Then stop DOING that!").
Better languages, better language skills, and better transparency
(disclosure) are all areas of important improvement.

The question I raise is this - will we return to a serious discussion
about whether and how computers can be used to secure the vital
information of enterprises before our industry reaches its first century
(say, by 2055)?  Ought a computer be expected to apply
controls adequate to bet your life on, or not?  If so, when will that
discussion get started again?  It seems like the analytical approaches
that brought us Bell-LaPadula and similar models are considered
off-topic, today, but I haven't seen anything replace them as the basis
for a rational computer security discussion.
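
The core of such a model is small.  A toy sketch of the two Bell-LaPadula
rules, levels only, no categories:

    /* Bell-LaPadula in miniature: "no read up, no write down".
     * Purely illustrative -- a real model also handles categories,
     * trusted subjects, and discretionary controls. */
    typedef enum { UNCLASSIFIED, CONFIDENTIAL, SECRET, TOP_SECRET } level_t;

    /* Simple security property: a subject may read an object only at or
     * below its own level. */
    int may_read(level_t subject, level_t object)
    {
        return subject >= object;
    }

    /* *-property: a subject may write an object only at or above its own
     * level, so high data cannot leak downward. */
    int may_write(level_t subject, level_t object)
    {
        return subject <= object;
    }

The hard part is not the model itself but stating which policy the system
is supposed to enforce.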

If engineering is the practice of applying the logic and proofs provided
by science to real world situations, software engineering and computer
science seem simply to have closed their eyes to the question of system
security and internal controls.

Perhaps economics will reinvigorate the discussion in the coming decades.

Ed Reed