My thinking was a little different. I would also expect criteria that show how 
a tool integrates into the entire lifecycle. For example, scanning source code 
that has already been extracted is somewhat different from scanning a PVCS 
repository directly. Likewise, it would be powerful to take a list of 
vulnerabilities, understand who introduced them by connecting each finding to 
the identity stored in version control, and then feed that information into 
tools such as JIRA, PVCS Tracker, etc.
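
As a rough illustration of the kind of integration I mean (not any particular
vendor's feature set), a sketch in Python follows. The finding format, the use
of a git-style "blame" command, and the ticket fields are all assumptions made
for the example; PVCS annotation and the actual JIRA/Tracker submission step
would differ in practice.

import json
import subprocess

# Hypothetical scanner output: one finding per entry (names are illustrative).
findings = [
    {"file": "src/auth.c", "line": 120, "cwe": "CWE-120",
     "summary": "Unbounded strcpy"},
]

def author_of(path, line):
    """Ask the VCS who last touched this line (git blame shown; PVCS differs)."""
    out = subprocess.run(
        ["git", "blame", "--porcelain", "-L", f"{line},{line}", path],
        capture_output=True, text=True, check=True,
    ).stdout
    for row in out.splitlines():
        if row.startswith("author "):
            return row[len("author "):]
    return "unknown"

tickets = []
for f in findings:
    tickets.append({
        "assignee": author_of(f["file"], f["line"]),
        "summary": f"{f['cwe']}: {f['summary']} ({f['file']}:{f['line']})",
        "labels": ["static-analysis", f["cwe"]],
    })

# Hand the payload off to the issue tracker of choice (JIRA, PVCS Tracker, ...).
print(json.dumps(tickets, indent=2))

The interesting evaluation question is how much of that pipeline a tool gives
you out of the box versus how much you have to script yourself.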

Good to see that folks are expanding the criteria in terms of what tools scan 
for, but criteria for how they integrate are equally useful.

-----Original Message-----
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED]] On Behalf Of Steven M. Christey
Sent: Tuesday, May 22, 2007 12:53 PM
To: McGovern, James F (HTSC, IT)
Cc: SC-L@securecoding.org
Subject: Re: [SC-L] Tools: Evaluation Criteria



On Tue, 22 May 2007, McGovern, James F (HTSC, IT) wrote:

> We will shortly be starting an evaluation of tools to assist in the
> secure coding practices initiative and have been wildly successful in
> finding lots of consultants who can assist us in evaluating but
> absolutely zero in terms of finding RFI/RFPs of others who have
> travelled this path before us. Would especially love to understand
> stretch goals that we should be looking for beyond simple stuff like
> finding buffer overflows in C, OWASP checklists, etc.

semi-spam: With over 600 nodes in draft 6, the Common Weakness Enumeration
(CWE) at http://cwe.mitre.org is the most comprehensive list of
vulnerability issues out there, and it's not just implementation bugs.
That might help you find other areas you want to test.  In addition, many
code analysis tool vendors are participating in CWE.

> In my travels, it "feels" as if folks are simply choosing tools in this
> space because they are the market leader, incumbent vendor or simply
> asking an industry analyst but none seem to have any "deep" criteria. I
> guess at some level, choosing any tool will move the needle, but
> investments really should be longer term.

Preliminary CWE analyses have shown a lot less overlap across the tools
than expected, so even based on the vulnerabilities tested, this is an
important consideration.

You might also want to check out the SAMATE project (samate.nist.gov),
which is working towards evaluation and understanding of tools, although
it's a multi-year program.

Finally, Network Computing did a tool comparison:


http://www.networkcomputing.com/article/printFullArticle.jhtml?articleID=198900460

- Steve

_______________________________________________
Secure Coding mailing list (SC-L) SC-L@securecoding.org
List information, subscriptions, etc - http://krvw.com/mailman/listinfo/sc-l
List charter available at - http://www.securecoding.org/list/charter.php
SC-L is hosted and moderated by KRvW Associates, LLC (http://www.KRvW.com)
as a free, non-commercial service to the software security community.
_______________________________________________