Re: [SC-L] Re: Comparing Scanning Tools (false positives)

2006-06-13 Thread David A. Wheeler

Crispin Cowan wrote:
>> I would like to introduce you to my new kick-ass scanning tool. You run
>> it over your source code, and it only produces a single false-positive
>> for you to check out. That false positive just happens to be the
>> complete source code listing for your entire program :)
>
> If you can guarantee it is a false positive, this is a very useful tool
> indeed :-)


Indeed.  Unfortunately, there seems to be a distinct shortage of software
that will trigger the false positive :-) :-).

--- David A. Wheeler




___
Secure Coding mailing list (SC-L)
SC-L@securecoding.org
List information, subscriptions, etc - http://krvw.com/mailman/listinfo/sc-l
List charter available at - http://www.securecoding.org/list/charter.php


Re: [SC-L] Re: Comparing Scanning Tools (false positives)

2006-06-13 Thread Johan Peeters



Crispin Cowan wrote:
> David A. Wheeler wrote:
>> Brian Chess (brian at fortifysoftware dot com) said:
>>> False positives:
>>> Nobody likes dealing with a pile of false positives, and we work hard to
>>> reduce false positives without giving up potentially exploitable
>>> vulnerabilities.
>>
>> I think everyone agrees that there are "way too many false positives"
>> in the sense that "there are so many it's annoying and it costs money
>> to check them out" in most of today's tools.
>>
>> But before you say "tools are useless" you have to ask, "compared to
>> what?"
>> Manual review can find all sorts of things, but manual review is likely
>> to miss many serious problems too.  ESPECIALLY if there are only a
>> few manual reviewers for a large codebase, an all-too-common situation.
>
> I would like to introduce you to my new kick-ass scanning tool. You run
> it over your source code, and it only produces a single false-positive
> for you to check out. That false positive just happens to be the
> complete source code listing for your entire program :)

If you can guarantee it is a false positive, this is a very useful tool
indeed :-)


kr,

Yo

--
Johan Peeters
program director
http://www.secappdev.org
+32 16 649000
___
Secure Coding mailing list (SC-L)
SC-L@securecoding.org
List information, subscriptions, etc - http://krvw.com/mailman/listinfo/sc-l
List charter available at - http://www.securecoding.org/list/charter.php


Re: [SC-L] Re: Comparing Scanning Tools (false positives)

2006-06-13 Thread David A. Wheeler

Gary McGraw wrote:
> Hi all (especially david),
>
> The story you repeated about ITS4 finding a vulnerability
> "that can't happen" is wrong.
>
> The tool FIST (a fault injection tool for security) which we described
> in an Oakland paper from 1998 was what you were thinking of.
> (FIST was also produced at cigital...the paper was by anup ghosh,
> tom o'connor, and myself.)  FIST found a vulnerability that we could not
> figure out how to exploit.  Some 6 months later, a security researcher
> figured out how and published the sploit.

Ah! That explains why I couldn't find it.  Right basic story, and right
company... but wrong tool.  Thanks for the correction.

I think it's a very good cautionary tale, and not everyone's
heard it.  Could you post a little more information about that
here, with citations (URLs where possible)?  I believe a preprint
of the FIST paper you mean is here, correct?:
 http://www.cigital.com/papers/download/ieees_p98_2col.pdf


--- David A. Wheeler


___
Secure Coding mailing list (SC-L)
SC-L@securecoding.org
List information, subscriptions, etc - http://krvw.com/mailman/listinfo/sc-l
List charter available at - http://www.securecoding.org/list/charter.php


Re: [SC-L] RE: Comparing Scanning Tools

2006-06-13 Thread Michael Mucha

I've been pushing contractual requirements for ISVs at work (academic medical 
center with a $1B+ revenue hospital), particularly in the lengthy negotiations 
last winter with our new clinical information system vendor (the software 
license alone will cost us about $100M).  

In a nutshell:
-  "What secure coding practices do you use in your development process, 
e.g. source control, code reviews, use of static analysis tools, preferred 
libraries, training, a/v scanning on the gold master, etc?"
-  "huh?"
- After about 5 hours of this spread over 3 negotiating sessions, as part of 
months of overall negotiations, I eventually had to give up on the issue 
because the $100M train was leaving the barn with or without my requirements, 
and the vendor wasn't willing to concede more than "our software is compatible 
with your Symantec A/V". 

The good news is that coworkers now regularly come to me during vendor 
selection to ask about security requirements for contract negotiations, and 
we've succeeded in getting security provisions added to more recent contracts, 
but they haven't been in the code assurance area (e.g., "vendor agrees to add 
AD auth support" and "vendor agrees their software meets HIPAA regulations 
regarding electronic signatures").  Next time I'll start beating the drum 
earlier with my coworkers so that the issue can be placed at a higher priority, 
with more people pushing on the vendor. Things creep forward...

I see from the previously-posted http://news.com.com/2100-1002_3-5220488.html 
that Ounce Labs is trying to push it along:
"announced on Tuesday that it had created a boilerplate contract addendum that 
holds software makers responsible for guaranteeing the security of their 
software." 


On Fri, Jun 09, 2006 at 02:32:16PM -0400, Jeremy Epstein wrote:
> panel session where representatives from a couple of companies not in the
> software/technology business claimed that they're making contractual
> requirements in this area (i.e., that vendors are required to assert as part
> of the contract what measures they use to assure their code).  So I guess
> there's proof by construction that companies other than Microsoft & Oracle
> care.
>  
___
Secure Coding mailing list (SC-L)
SC-L@securecoding.org
List information, subscriptions, etc - http://krvw.com/mailman/listinfo/sc-l
List charter available at - http://www.securecoding.org/list/charter.php


RE: [SC-L] Re: Comparing Scanning Tools (false positives)

2006-06-13 Thread Gary McGraw
Hi all (especially david),

The story you repeated about ITS4 finding a vulnerability "that can't happen" 
is wrong.  

The tool FIST (a fault injection tool for security) which we described in an 
Oakland paper from 1998 was what you were thinking of.  (FIST was also produced 
at cigital...the paper was by anup ghosh, tom o'connor, and myself.)  FIST 
found a vulnerability that we could not figure out how to exploit.  Some 6 
months later, a security researcher figured out how and published the sploit.

So david's point stands...but FIST was way less stupid than ITS4.

In my opinion, the first-generation tools ITS4, RATS, and flawfinder should all 
be abandoned immediately in favor of the new generation of commercial tools.

Not using tools today borders on negligence...and I will be happy to say that 
in court when the time comes.  As I have said before...the lawyers are on the 
beach.

gem
www.cigital.com
www.swsec.com

-----Original Message-----
From: David A. Wheeler [mailto:[EMAIL PROTECTED]
Sent: Mon Jun 12 19:33:52 2006
To: sc-l@securecoding.org
Subject: [SC-L] Re: Comparing Scanning Tools (false positives)

I'd like to follow up on Brian Chess' comments...


Brian Chess (brian at fortifysoftware dot com) said:
> False positives:
> Nobody likes dealing with a pile of false positives, and we work hard to
> reduce false positives without giving up potentially exploitable
> vulnerabilities.

I think everyone agrees that there are "way too many false positives"
in the sense that "there are so many it's annoying and it costs money
to check them out" in most of today's tools.

But before you say "tools are useless" you have to ask, "compared to what?"
Manual review can find all sorts of things, but manual review is likely
to miss many serious problems too.  ESPECIALLY if there are only a
few manual reviewers for a large codebase, an all-too-common situation.

Today's tools have a very large set of problems.  But if you look
at the trendlines of the amount of software that people are using,
you'll notice that it is increasing exponentially.   That is unsustainable
for purely manual review approaches, at least as the ONLY approach.
We can either drastically cut the amount of software
(easing review) or use tools -- those are really our only choices.
Reducing the amount of software that needs review is MUCH better
security-wise; if you can do that, DO THAT.  But I think that's
unlikely to occur (or be enough) in many circumstances,
so we need an alternative to crossing our fingers.

I think a sense of perspective is important.  Yes, tools aren't perfect,
but are they better than your alternatives?  Also, tools will
get better with use: I expect they'll be refined in the field
(or lose out to better tools).

> In some sense, this is where security tools get the raw end of the deal.  If
> you're performing static analysis in order to find general quality problems,
> you can get away with dropping a potential issue on the floor as soon as you
> get a hint that your analysis might be off.  You can't do that if you are
> really focused on security

To compensate, many tools use "risk levels" to try to give an
approximate sense of what to look at first.  But the problem is still
the same: tools often cannot be CERTAIN that a construct is a vulnerability,
yet if you throw it away, you might have thrown away the report of
the most important vulnerability.
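
To make the triage idea concrete, here's a minimal sketch (in C, with
hypothetical findings; this is no real tool's output format): sort reports
so the highest-risk ones get reviewed first.  Note that nothing is DROPPED
here -- a cutoff that silently discards low-risk reports could discard the
one that matters.

    #include <stdio.h>
    #include <stdlib.h>

    /* One scanner report: where it is, what it is, and a coarse risk level. */
    struct finding {
        const char *file;
        int line;
        const char *what;
        int risk;                     /* 1 = low ... 5 = most urgent */
    };

    /* Highest risk first, so reviewers see it before fatigue sets in. */
    static int by_risk_desc(const void *a, const void *b)
    {
        const struct finding *fa = a, *fb = b;
        return fb->risk - fa->risk;
    }

    int main(void)
    {
        struct finding findings[] = {          /* hypothetical results */
            { "util.c", 120, "sprintf: unbounded format", 3 },
            { "net.c",   42, "gets: always overflowable",  5 },
            { "cfg.c",  310, "strcpy: looks bounded",      2 },
        };
        size_t n = sizeof(findings) / sizeof(findings[0]);

        qsort(findings, n, sizeof(findings[0]), by_risk_desc);
        for (size_t i = 0; i < n; i++)
            printf("[risk %d] %s:%d: %s\n", findings[i].risk,
                   findings[i].file, findings[i].line, findings[i].what);
        return 0;
    }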

> Compounding the problem is that, when the static analysis tool does point
> you at an exploitable vulnerability, it's often not a very memorable
> occasion.  It's just a little goof-up in the code...


Yes. I'll add that often people aren't even certain it IS a
security vulnerability; the analysis to determine if something is a
vulnerability or not may take longer than simply "cleaning up" the code.

Although it's old, the paper on ITS4 is still interesting
(it won the "best paper" award at the time):
  http://www.acsac.org/2000/papers/78.pdf
ITS4 is about as simple/naive a tool as it's possible to usefully implement
(the same is true for RATS and flawfinder, which use the same technique).
But I think the following statements about tools are still true, even
for the more sophisticated tools:
* it still takes time to do analysis (though tools reduce it)
* tools still require expertise to use (particularly in understanding
   the answers and determining if it indicates a real problem)
* tools CAN be helpful in finding real security vulnerabilities.
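
For a sense of how simple that lexical technique really is, here's a toy
version (a sketch only -- the real tools add a vulnerability database,
risk rankings, and enough tokenizing to cut down on noise from comments
and identifiers like "my_strcpy"):

    #include <stdio.h>
    #include <string.h>

    /* The entire technique: grep the source text for the names of
     * historically dangerous calls.  No parsing, no data flow -- which is
     * why it's fast, noisy, and easy to fool (a comment mentioning
     * "strcpy" gets flagged just like a real call). */
    static const char *risky[] = { "strcpy", "strcat", "sprintf", "gets", 0 };

    int main(int argc, char **argv)
    {
        if (argc != 2) {
            fprintf(stderr, "usage: %s file.c\n", argv[0]);
            return 1;
        }
        FILE *f = fopen(argv[1], "r");
        if (!f) { perror(argv[1]); return 1; }

        char line[1024];
        int lineno = 0;
        while (fgets(line, sizeof line, f)) {
            lineno++;
            for (int i = 0; risky[i]; i++)
                if (strstr(line, risky[i]))
                    printf("%s:%d: possible misuse of %s\n",
                           argv[1], lineno, risky[i]);
        }
        fclose(f);
        return 0;
    }

Run it over any .c file and you'll see both the appeal (instant results)
and the false-positive problem this thread is about.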

IIRC, ITS4 once found a vulnerability; the researchers said
"that can't happen" and later they discovered it COULD happen.
I don't remember where I saw that.  The OpenBSD folks have this right,
I think: it is often better to change code to be CERTAIN that it
doesn't have a vulnerability, instead of wasting lengthy efforts
to determine if there's a code path that can be exploited.
It's easy to miss a surprising code path, and even if it's impossible
today, a "trivial" maintenance change tomorrow may make it exploitable.
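
As a concrete (hypothetical) illustration of that OpenBSD-style fix:
rather than spending hours proving an unbounded strcpy() can never
overflow, rewrite it so the question doesn't arise.  (OpenBSD's strlcpy()
is the usual idiom; snprintf() below is the portable equivalent.)

    #include <stdio.h>

    #define BUF_SIZE 64

    int main(void)
    {
        const char *input = "untrusted input, possibly very long";  /* stand-in */
        char buf[BUF_SIZE];

        /* The pattern a scanner flags:  strcpy(buf, input);
         * Whether THAT is exploitable depends on every caller, forever.
         * The OpenBSD-style fix makes the copy provably bounded instead: */
        snprintf(buf, sizeof buf, "%s", input);   /* truncates, never overflows */

        printf("%s\n", buf);
        return 0;
    }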

Re: [SC-L] Re: Comparing Scanning Tools (false positives)

2006-06-13 Thread Crispin Cowan
David A. Wheeler wrote:
> Brian Chess (brian at fortifysoftware dot com) said:
>> False positives:
>> Nobody likes dealing with a pile of false positives, and we work hard to
>> reduce false positives without giving up potentially exploitable
>> vulnerabilities.
> I think everyone agrees that there are "way too many false positives"
> in the sense that "there are so many it's annoying and it costs money
> to check them out" in most of today's tools.
>
> But before you say "tools are useless" you have to ask, "compared to
> what?"
> Manual review can find all sorts of things, but manual review is likely
> to miss many serious problems too.  ESPECIALLY if there are only a
> few manual reviewers for a large codebase, an all-too-common situation.
I would like to introduce you to my new kick-ass scanning tool. You run
it over your source code, and it only produces a single false-positive
for you to check out. That false positive just happens to be the
complete source code listing for your entire program :)
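
A faithful sketch of the tool, for anyone who wants to ship it (zero
false negatives guaranteed):

    #include <stdio.h>

    /* Zero false negatives, one false positive: the whole program. */
    int main(int argc, char **argv)
    {
        if (argc != 2) {
            fprintf(stderr, "usage: %s program.c\n", argv[0]);
            return 1;
        }
        FILE *f = fopen(argv[1], "r");
        if (!f) { perror(argv[1]); return 1; }

        printf("FINDING 1 of 1 -- possible vulnerability in:\n");
        int c;
        while ((c = getc(f)) != EOF)
            putchar(c);
        fclose(f);
        return 0;
    }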

Crispin

-- 
Crispin Cowan, Ph.D.  http://crispincowan.com/~crispin/
Director of Software Engineering, Novell  http://novell.com


___
Secure Coding mailing list (SC-L)
SC-L@securecoding.org
List information, subscriptions, etc - http://krvw.com/mailman/listinfo/sc-l
List charter available at - http://www.securecoding.org/list/charter.php