Fascinating and heartening development. Raises a couple of questions in my mind.

1. Why now? Many worthies, myself included during my years at Sun, have been crying for years/decades *from within the software industry* for just such a shift. So what has changed? Ken and I outlined in "Secure Coding" the economic forces that have militated against security quality. These forces still operate, I feel sure. The "Tragedy of the Commons", for example, is never going to be repealed. So what accounts for this shift, which I agree is happening, without (as I have so often predicted) the dramatic airliner-trapped-in-the-sky/girl-trapped-in-the-well TV moment catalyzing Congress critters into knee-jerk legislation?

The significant enabling event coming over the horizon seems to be the development of commercial-quality, well-marketed tools.

Can it be that capitalism, which (more or less) created the problem, will also lead to its resolution? Perhaps. I have argued elsewhere that "unsecure" behavior like writing bad code is analogous to polluting the Internet. (I have proposed that "unsecurity credits", operating like pollution credits, be used by enterprises to cause departments to budget risk as they today budget other resources.) So maybe we are seeing the birth of entrepreneurial cyber-environmentalism. Has it passed from being the concern of "cranks" (us, I mean, esteemed fellow travelers) to a "niche" concern, to be followed by "trendy", then mainstream, and so on? Can we hope to live long enough to be condescended to for being passionate about something "everybody knows" is dangerous?

2. What is the proper role of government in encouraging/fostering/exploiting such a development? I take it for granted that, as the world's largest (I think) software customer, the U.S. federal government ought to show preference for products built using such tools, and that, as the primary overseer of publicly traded North American companies, it ought to require, via SEC rules, their internal use by such companies. I (with others) testified to this effect years ago. But let's take another look now at the question of security quality *metrics* and *standards*. As this group has often discussed, they are tough to envision. No more than 1 bug per thousand lines? Must withstand attacks from four high school students for three hours? Able to protect for 24 hours an encrypted Swiss Bank Account worth a million dollars on a site accessible from the World Wide Web? Beats the heck out of me. But my question: what can and should government do, now that tools are emerging, to help us move toward measurement and standards? It happens that NIST (that's the U.S. National Institute of Standards and Technology) has a modest effort starting up to look into the state of the art of static checkers and so forth (a small sketch of the kind of defect such checkers flag appears below). I'm not competent to state here what the goals of NIST's current and future efforts are, or should be. So I ask the group: does the advent (as it appears) of effective and easy-to-use tools mean that Now is The Time to push for Standards? If so, who but we "cyber-environmentalists" can lead the effort? And what's the next step?
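
By way of a concrete illustration (my own toy sketch, in C, not drawn from the article or from any particular tool's output), here is the sort of defect those static checkers reliably flag, along with the bounded alternative they typically recommend:

/* Toy example: copying untrusted input into a fixed-size buffer. */
#include <stdio.h>
#include <string.h>

#define NAME_MAX_LEN 32

/* The flagged pattern: strcpy() performs no bounds check, so input longer
   than NAME_MAX_LEN overruns the stack buffer -- a classic finding. */
void greet_unsafe(const char *input)
{
    char name[NAME_MAX_LEN];
    strcpy(name, input);
    printf("Hello, %s\n", name);
}

/* The usual remediation: bound the copy and guarantee termination. */
void greet_safe(const char *input)
{
    char name[NAME_MAX_LEN];
    strncpy(name, input, sizeof(name) - 1);
    name[sizeof(name) - 1] = '\0';   /* strncpy may not NUL-terminate */
    printf("Hello, %s\n", name);
}

int main(void)
{
    greet_safe("world");
    return 0;
}

Counting how many such findings remain per thousand lines is the easy part; agreeing on what count, if any, deserves to be called a "standard" is where I get stuck.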

-mg-

p.s. I apologize, btw, if my meanderings above annoyingly recapitulate threads here that I missed while attending to Other Concerns.

----- Original Message -----
From: <[EMAIL PROTECTED]>
To: <sc-l@securecoding.org>
Sent: Saturday, May 06, 2006 9:00 AM
Subject: SC-L Digest, Vol 2, Issue 69


Message: 1
Date: Fri, 5 May 2006 13:15:52 -0400
From: "Kenneth R. van Wyk" <[EMAIL PROTECTED]>
Subject: [SC-L] WSJ.com - Tech Companies Check Software Earlier for Flaws
To: Secure Coding <SC-L@securecoding.org>

I saw an interesting Wall Street Journal article today that talks about
companies adopting software security practices. Complete story can be found
at:

http://online.wsj.com/public/article/SB114670277515443282-B59kll7qXrkxOXId1uF0txp8NFs_20070504.html?

The article cites a couple of companies that are starting to seriously use
some static code analysis tools (Coverity and Fortify) to scan their src
trees for security defects. Although it doesn't address much in the way of
design-time security activities, it's a good start and it's encouraging to
see this sort of coverage in mainstream media.

I really liked this quote - "In effect, software makers are now admitting that
their previous development process was faulty. While banks and other
companies that deal with sensitive customer data began to build security into
software development in the late 1990s, Microsoft Corp. and other software
makers are only now in the middle of revamping their software-writing
processes."

Cheers,

Ken van Wyk
--
KRvW Associates, LLC
http://www.KRvW.com

