[SC-L] Application Security Debt and Application Interest Rates

2011-03-06 Thread Chris Wysopal

I have a couple of blog posts modeling application vulnerabilities the way you 
might think of technical debt.

Part I: Application Security Debt and Application Interest Rates
http://www.veracode.com/blog/2011/02/application-security-debt-and-application-interest-rates/

Part II: A Financial Model for Application Security Debt
http://www.veracode.com/blog/2011/03/a-financial-model-for-application-security-debt/

-Chris

___
Secure Coding mailing list (SC-L) SC-L@securecoding.org
List information, subscriptions, etc - http://krvw.com/mailman/listinfo/sc-l
List charter available at - http://www.securecoding.org/list/charter.php
SC-L is hosted and moderated by KRvW Associates, LLC (http://www.KRvW.com)
as a free, non-commercial service to the software security community.
Follow KRvW Associates on Twitter at: http://twitter.com/KRvW_Associates
___


Re: [SC-L] InformIT: comparing static analysis tools

2011-02-04 Thread Chris Wysopal

Uploading code isn't an issue for software vendors because we are analyzing the 
artifact they ship to their customers anyway: the executable version of their 
software, not the source code.  Unless, of course, the executable is the source 
code, which is the case for JSP, PHP, and other scripting languages; but they 
are shipping that to their customers anyway, so why not send it to us?

If it is an enterprise app that never leaves the four walls of the business, 
then the business has to look at our independent SysTrust certification from 
EY, our independent penetration test results, our employee background checks, 
and our NDAs, and decide whether it is worth the risk.  For 11 of the top 25 
banks in the world we have passed this test.  We have had due diligence teams 
from three-letter agencies and Fortune 50 companies come and kick our tires, and 
we have never failed to pass this test.  Our environment is designed so our 
customers' IP, their executables, is only decrypted on an engine analysis 
machine for the duration of the analysis.

Veracode was founded by security people.  We are a security company.  I think 
this shows through in everything we do.

-Chris 

-Original Message-
From: Jim Manico [mailto:jim.man...@owasp.org] 
Sent: Thursday, February 03, 2011 7:02 PM
To: Chris Wysopal
Cc: Gary McGraw; Secure Code Mailing List
Subject: Re: [SC-L] InformIT: comparing static analysis tools

Chris,

I've tried to leverage Veracode in recent engagements. Here is how the 
conversation went:

Jim:
Boss, can I upload all of your code to this cool SaaS service for analysis?

Client:
Uh no, and next time you ask, I'm having you committed.

I'm sure you have faced these objections before. How do you work around them?

-Jim Manico
http://manico.net

On Feb 3, 2011, at 1:54 PM, Chris Wysopal cwyso...@veracode.com wrote:

 
 Nice article.  In the 5 years Veracode has been selling static analysis 
 services we have seen the market mature.  In the beginning, organizations 
 were down in the weeds: what false positive rate or false negative rate does 
 the tool/service have over a test suite such as SAMATE?  Then we saw a move 
 up to looking at the trees: did the tool/service support the Java 
 frameworks I am using?  Now we are seeing organizations look at the forest: 
 can I scale static analysis effectively over all my development sites, my 
 outsourcers, and vendors?  This is a good sign of a maturing market.
 
 It is my firm belief that software security has a consumption problem.  
 We know what the defects are.  We know how to fix them.  We even have 
 automation for detecting a lot of them.  The problem is getting the 
 information and technology to the right person at the right time 
 effectively and managing an organization-wide program.  This is the 
 next challenge for static analysis. <bias-alert>I think SaaS based 
 software is more easily consumed and this isn't any different for 
 software security</bias-alert>
 
 -Chris
 
 -Original Message-
 From: sc-l-boun...@securecoding.org 
 [mailto:sc-l-boun...@securecoding.org] On Behalf Of Gary McGraw
 Sent: Wednesday, February 02, 2011 9:49 AM
 To: Secure Code Mailing List
 Subject: [SC-L] InformIT: comparing static analysis tools
 
 hi sc-l,
 
 John Steven and I recently collaborated on an article for informIT.  The 
 article is called Software [In]security: Comparing Apples, Oranges, and 
 Aardvarks (or, All Static Analysis Tools Are Not Created Equal) and is 
 available here:
 http://www.informit.com/articles/article.aspx?p=1680863
 
 Now that static analysis tools like Fortify and Ounce are hitting the 
 mainstream there are many potential customers who want to compare them and 
 pick the best one.  We explain why that's more difficult than it sounds at 
 first and what to watch out for as you begin to compare tools.  We did this 
 in order to get out in front of test suites that purport to work for tool 
 comparison.  If you wonder why such suites may not work as advertised, read 
 the article.
 
 Your feedback is welcome.
 
 gem
 
 company www.cigital.com
 podcast www.cigital.com/silverbullet
 blog www.cigital.com/justiceleague
 book www.swsec.com
 

Re: [SC-L] InformIT: comparing static analysis tools

2011-02-04 Thread Chris Wysopal

Many of the traditional benefits of SaaS (no software to install, scaling from 
group to enterprise, and ease of central management) make it easier to roll out 
and manage software security programs enterprise-wide.  The bigger and more 
diverse an organization is, the more these “consumption” benefits kick in.

-Chris

From: Prasad N Shenoy [mailto:prasad.she...@gmail.com]
Sent: Thursday, February 03, 2011 9:02 PM
To: Chris Wysopal
Cc: Gary McGraw; Secure Code Mailing List
Subject: Re: [SC-L] InformIT: comparing static analysis tools

Very well said, Chris. Can you explain what you mean by: <bias-alert>I think 
SaaS based software is more easily consumed and this isn't any different for 
software security</bias-alert>?

Sent from my iPhone

On Feb 3, 2011, at 2:54 PM, Chris Wysopal <cwyso...@veracode.com> wrote:
<bias-alert>I think SaaS based software is more easily consumed and this 
isn't any different for software security</bias-alert>


Re: [SC-L] [WEB SECURITY] Re: Backdoors in custom software applications

2010-12-17 Thread Chris Wysopal

Here is a paper that I wrote with Chris Eng that covers major categories of 
backdoors with examples.

http://www.veracode.com/images/stories/static-detection-of-backdoors-1.0.pdf

Our Black Hat presentation:

http://www.veracode.com/images/stories/static-detection-of-backdoors-1.0-blackhat2007-slides.pdf

-Chris

-Original Message-
From: Jeremy Epstein [mailto:jeremy.j.epst...@gmail.com] 
Sent: Thursday, December 16, 2010 6:10 PM
To: Sebastian Schinzel
Cc: Secure Coding; websecurity
Subject: [WEB SECURITY] Re: [SC-L] Backdoors in custom software applications

There was an interesting example in an NPS thesis about a decade ago that 
introduced a back door into a device driver.  I can't remember the student's 
name, unfortunately.  Phil something-or-other.

On Thu, Dec 16, 2010 at 3:18 PM, Sebastian Schinzel s...@seecurity.org wrote:
 Hi all,

 I am looking for ideas about what intentional backdoors in real software 
 applications might look like.

 Wikipedia already provides a good list of backdoors that were found in 
 software applications: 
 http://en.wikipedia.org/wiki/Backdoor_(computing)

 Has anyone encountered backdoors during code audits, penetration tests, or data 
 breaches?
 Could you share some details of what the backdoor looked like? I am 
 really interested in a technical and abstract description of the backdoor 
 (e.g. informal descriptions or pseudo-code).
 Anonymized and off-list replies are also very welcome.

 Thanks,
 Sebastian


Re: [SC-L] informIT: Technology transfer

2010-10-29 Thread Chris Wysopal

I didn't realize you credited SLINT in the ITS4 paper.  Very cool.  It isn't 
often that the academic world credits non-academic research and vice versa.  It 
is one of my pet peeves about the security research community [1].

SLINT scanned source code. It was born out of watching black hats do automated 
code reviews with scripts that drove grep in interesting ways.  The black hat is 
scanning lots of software source code looking for *any* exploitable bug, a 
different problem from the white hat needing to find *all* exploitable bugs in a 
particular piece of code.
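
For anyone who never saw those scripts, here is a toy sketch in that spirit.  It 
is not SLINT or ITS4, just a grep-style pass over C source for a handful of 
historically risky library calls; the function list and file extensions are 
purely illustrative.

<?php
// Toy grep-style scanner (illustrative only): flag calls to a few
// historically risky C library functions so a reviewer can triage them.
$risky   = array('strcpy', 'strcat', 'sprintf', 'gets', 'system', 'popen');
$pattern = '/\b(' . implode('|', $risky) . ')\s*\(/';

$root  = isset($argv[1]) ? $argv[1] : '.';
$files = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($root));

foreach ($files as $file) {
    // Only look at C/C++ source and header files.
    if (!preg_match('/\.(c|h|cc|cpp)$/', $file->getFilename())) {
        continue;
    }
    $lines = file($file->getPathname());
    if ($lines === false) {
        continue;
    }
    foreach ($lines as $num => $line) {
        if (preg_match($pattern, $line)) {
            printf("%s:%d: %s\n", $file->getPathname(), $num + 1, trim($line));
        }
    }
}

Even something that crude turns up *a* bug quickly, which is all the attacker 
needs; finding *all* of them is the harder problem.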

I find it interesting that the first static analysis came out of the desire to 
find bugs in order to exploit software.  The same goes for fuzzing if you look 
at that technology's history.  White hats have had to vastly improve and 
productize these techniques lest black hats run rampant using even inferior 
tools, due to the "all bugs" vs. "any bug" effect.

-Chris


1. Standing on Other's Shoulders, https://www.securityfocus.com/columnists/486

-Original Message-
From: Gary McGraw [mailto:g...@cigital.com] 
Sent: Thursday, October 28, 2010 5:08 PM
To: Secure Code Mailing List
Cc: Jeremy Epstein; Chris Wysopal
Subject: Re: [SC-L] informIT: Technology transfer

Weld is correct about SLINT which did predate ITS4.  We also created a tool 
called Jslint which even borrowed the slint name from what was then the l0pht 
http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?isNumber=19003&arNumber=877869&isnumber=19003&arnumber=877869
 (sorry, I don't seem to have a free link to that ancient paper even on the 
Cigital complete list at http://www.cigital.com/papers/).

Back then from what I recall, slint did a basic binary scan.  ITS4 and Jslint, 
on the other hand, were scanning source code.  But the notion of looking for 
vulnerabilities statically was the important bit.  Sorry for the oversight 
Weld, it was not intentional!  (In fact, look through the ITS4 paper's refs...)

gem

company www.cigital.com
podcast www.cigital.com/silverbullet
blog www.cigital.com/justiceleague
book www.swsec.com


On 10/28/10 3:26 PM, Jeremy Epstein jeremy.j.epst...@gmail.com wrote:

The ITS4 article can be found at
http://www.acsac.org/2000/abstracts/78.html - it won the best paper
award when it was presented in 2000.  (I don't think SLINT was ever
presented at a professional conference.)

And since I'm mentioning ACSAC, the deadline for early registration is
coming up on Nov 11 - some really fascinating papers this year, that
maybe you'll be discussing 10 years from now ;-).  It's at the Four
Seasons in Austin Dec 6-10 (and hotel rooms are only $104!)

--Jeremy


Re: [SC-L] informIT: Technology transfer

2010-10-28 Thread Chris Wysopal

Nice article.  There is a piece of this history that predated ITS4: L0pht's 
SLINT, which existed in 1998 and was demoed to you and John Viega.

Here was our original description:

http://web.archive.org/web/19990209122838/http://www.l0pht.com/slint.html

From the February 1999 web page:

<excerpt>

Source code security analyzers are publicly available in the black hat 
community and are being used to scan for exploitable code. SLINT will help you 
render the PD wares obsolete.

What is it?
SLINT is a core product to be sold into an existing GUI development package.
 - Helps people be proactive while writing secure code by highlighting 
positional hot spots of exploitable routines and poor memory allocations.
 - Identifies suspect blocks of code.
 - Makes the task of security review more palatable so you don't need a 
team of high-level experts to go through megabytes of code.
 - Supplies solutions and/or alternatives to problem areas.
 - Most security problems could have been fixed at the beginning of 
development. Secure applications must start with a secure base. The Best *BANG* 
for the buck is to be proactive at the start of program creation
 - Easy to implement into existing Y2K code review packages

  What will it examine and on what platforms?

 - Unix/NT
 - C, C++ (JAVA in the future)
 - elf-32 binaries
 - a.out files
 - buffer overflows
 - improper SetUID of files
 - randomness code faults
 - race conditions
 - incorrect access of memory
 - improper flags on critical system calls
 - more?

</excerpt>

Sounds very familiar. It is almost hard to believe that was 12 years ago.

SLINT in turn grew out of the black hat community, so I won't claim that L0pht 
had this idea first, just that we took it to the "consultingware" level.  I 
like that term because I lived it with SLINT at L0pht and then UnDeveloper 
Studio at @stake, which has become the commercial static code analysis service 
at Veracode.  Our technology at Veracode followed a track similar to the one the 
Cigital-to-Fortify-to-HP technology has.

-Chris

-Original Message-
From: sc-l-boun...@securecoding.org [mailto:sc-l-boun...@securecoding.org] On 
Behalf Of Gary McGraw
Sent: Tuesday, October 26, 2010 10:14 AM
To: Secure Code Mailing List
Subject: [SC-L] informIT: Technology transfer

hi sc-l,

From time to time a thread or two has popped up on this list discussing how we 
get software security into the mainstream.  One obvious way to do this is 
through technology transfer.  I am particularly proud of the role that Cigital 
has played getting security-focused static analysis out into the mainstream.  
Now that IBM owns Ounce and HP owns Fortify we should see significant uptake of 
the technology worldwide.

My informIT column this month is a case study that follows a technology from 
Cigital Labs, through Kleiner Perkins and Fortify, to the mainstream.  As you 
will see, technology transfer is hard and it takes serious time and effort.  In 
the case of code scanning technology, the effort took two companies, millions 
of dollars, serious Silicon Valley engineering, and ten years.

Read all about it here: 
http://www.informit.com/articles/article.aspx?p=1648912

Your comments and feedback are welcome.

gem

company www.cigital.com
podcast www.cigital.com/silverbullet
blog www.cigital.com/justiceleague
book www.swsec.com



Re: [SC-L] [WEB SECURITY] Re: What do you like better Web penetration testing or static code analysis?

2010-04-28 Thread Chris Wysopal

There is no reason the php.ini and other framework or app server configuration 
files can't be taken into account in a static analysis.  Veracode performs 
static analysis of an application in its final executable form.  So for 
compiled languages that is a binary executable, for managed languages it is the 
compiled bytecode, and for interpreted languages it is the source code.  If 
there are standard configuration files that are part of the runtime 
environment for frameworks or app servers, they are considered part of the 
final executable version of the app.  If the configuration files are missing, 
then a worst-case analysis is performed.

Of course the php.ini submitted to the static analyzer might not match the one 
running in production, but you can have the same issue doing dynamic testing on 
a staging environment. 

It can be a goal for both static and dynamic testing to have the analysis come 
as close as possible to what will be in production.  

-Chris


-Original Message-
From: SneakySimian [mailto:sneaky.sim...@gmail.com] 
Sent: Wednesday, April 28, 2010 1:10 AM
To: Andre Gironda
Cc: websecurity; Secure Coding; Adam Muntner; Arian J. Evans
Subject: Re: [WEB SECURITY] Re: [SC-L] What do you like better Web penetration 
testing or static code analysis?

I couldn't let this one go.

Having done both source code analysis and blackbox testing, I see
merits in both. The failure that was the Debian SSL bug is a prime
example of why I prefer blackbox testing. That's not to say things
can't go wrong in blackbox testing, because they do, but not all code
behaves the same way in the same environment, so if you actually test
it in the environment it is running in, you can then understand why
the code behaves the way it does. Oversimplified example:

<?php
$file = $_GET['file'];

if (file_exists($file))
{
    echo $file;
}
else
{
    echo 'File not found. :(';
}

Ignoring the other blatant issues with that code snippet, is that
vulnerable to XSS? No? Are you sure? Yes? Can you prove it? As it
turns out, it depends on a configuration setting in php.ini. The only
real way to know if it is an issue is to run it in the environment it
is meant to be run in. Now, that's not to say that the developer who
wrote that code shouldn't be told to fix it in a source code analysis,
but the point is, some issues are wholly dependent on the environment
and may or may not get caught during code analysis. Other issues such
as code branches that don't execute or do execute in certain
environments can be problematic to spot during normal source code
analysis.
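
For what it's worth, here is a minimal sketch of the same handler with explicit 
output encoding (htmlspecialchars is standard PHP; the rest just mirrors the 
snippet above).  It takes php.ini out of the XSS question, though the broader 
point about environment-dependent behavior still stands:

<?php
// Sketch of the handler with explicit output encoding, so the XSS question
// no longer hinges on a php.ini setting.  (The other issues remain.)
$file = isset($_GET['file']) ? $_GET['file'] : '';

if ($file !== '' && file_exists($file))
{
    echo htmlspecialchars($file, ENT_QUOTES, 'UTF-8');
}
else
{
    echo 'File not found. :(';
}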

That all said, I do enjoy reading code, especially comment coding from
other developers. :P



On Tue, Apr 27, 2010 at 2:29 PM, Andre Gironda and...@gmail.com wrote:
 On Tue, Apr 27, 2010 at 4:08 PM, Arian J. Evans
 arian.ev...@anachronic.com wrote:
 I think everyone would agree that you definitely want to apply
 additional (deeper?) degrees of analysis and defensive
 compensating-control to high-value and high-risk assets. The tough
 question is what ruler you use to justify degree of security
 investment to degree of potential Risk/Loss.

 That requires information sharing and trend analysis, something that
 our classic vulnerability management programs have also not solved

 


Re: [SC-L] What do you like better Web penetration testing or static code analysis?

2010-04-23 Thread Chris Wysopal

Most software security people I talk to who advocate static analysis and pen 
testing see them as one part of the overall solution.  They are a part of the 
solution that software producers can get started on rather easily, and one that 
opens their eyes to the need for secure architectures and better development 
practices.  

The biggest problem I face when dealing with our customers is that the 
developers already think they have written secure code.  It is only after you 
demonstrate, on their own code, that they have exploitable vulnerabilities that 
anything will be done to remedy the situation.  This is why static analysis and 
pen testing are an important part of driving software security to the masses of 
developers.

-Chris


-Original Message-
From: sc-l-boun...@securecoding.org [mailto:sc-l-boun...@securecoding.org] On 
Behalf Of Gary McGraw
Sent: Thursday, April 22, 2010 3:15 PM
To: Peter Neumann; Secure Code Mailing List
Subject: Re: [SC-L] What do you like better Web penetration testing or static 
code analysis?

I hereby resonate with my esteemed colleague and mentor pgn.  But no puns from 
me.

gem


On 4/22/10 1:57 PM, Peter Neumann neum...@csl.sri.com wrote:



Matt Parsons wrote:
 What do you like doing better as application security professionals, web
 penetration testing or static code analysis?

McGovern, James F. (P+C Technology) wrote:
 Should a security professional have a preference when both have
 different value propositions? While there is overlap, a static analysis
 tool can find things that pen testing tools cannot. Likewise, a pen test
 can report on secure applications deployed insecurely which is not
 visible to static analysis.

 So, the best answer is I prefer both...

Both is better than either one by itself, but I think Gary McGraw
would resonate with my seemingly contrary answer:

  BOTH penetration testing AND static code analysis are still looking
  at the WRONG END of the horse AFTER it has left the DEVELOPMENT BARN.
   Gary and I and many others have for a very long time been advocating
  security architectures and development practices that greatly enhance
  INHERENT TRUSTWORTHINESS, long before anyone has to even think about
  penetration testing and static code analysis.

  This discussion is somewhat akin to arguments about who has the best
  malware detection.  If system developers (past-Multics) had paid any
  attention to system architectures and sound system development
  practices, viruses and worms would be mostly a nonproblem!

  Please pardon my soapbox.

The past survives.
The archives
have lives,
not knives.
High fives!

(I strive
to thrive
with jive.)

PGN


Re: [SC-L] web apps are homogenous?

2010-02-26 Thread Chris Wysopal

A large part of the cost of fixing a bug, especially late in the dev cycle 
after testing is complete, is the cost of regression testing.  The cost of 
regression testing a patch for commercial software is much higher than the cost 
for a custom web application.  Think of an Oracle bug that spans 5 supported 
product revisions over 5 platforms.  That is 25 separate builds that need to be 
regression tested.  Plus, regression testing for commercial software needs to be 
more extensive because many different deployment scenarios need to be 
incorporated.  Mary Ann Davidson told me this could cost up to $1M in a 
worst-case scenario.

A bug in a custom enterprise web application may need to be fixed more quickly 
due to exposure, which may raise the cost slightly, but this is nothing compared 
to the testing effort to validate that the fix works and did not break anything. 
The cost of fixing a bug late in the dev cycle, or once the software is 
deployed, is much higher for commercial software than it is for a 
single-instance web application.  The cost scales with the number of supported 
revisions affected and the size and complexity of the installed base.

-Chris

-Original Message-
From: sc-l-boun...@securecoding.org [mailto:sc-l-boun...@securecoding.org] On 
Behalf Of Benjamin Tomhave
Sent: Thursday, February 25, 2010 6:43 AM
To: Jon McClintock
Cc: SC-L@securecoding.org
Subject: Re: [SC-L] web apps are homogenous?

Jon,

I think you're getting out of the scope of the costing exercise. The
research and estimates around time to fix are based on the cost
associated with developing the patch, not with deploying it. One could
argue that the cost of fixing bugs - particularly major ones - is much
higher for web applications given that they are more likely to be
rapidly deployed and that the discovery of the bug is more likely to be
widely publicized (especially if it leads to a breach). Everybody has a
reasonable expectation that widely deployed commercial software is going
to have various bugs over its life (e.g. Windows, Adobe products), while
people seem to still be generally surprised when holes pop up in web apps.

Now, that being said, it is still a valid question whether there is a
cost differential between fixing classic compiled code and modern web code.
Toward that end, I would recommend looking into Laurie Williams' work at
NCSU. She has inherited John Musa's Software Reliability Engineering
legacy, is active in the field, and has published a number of articles
and papers potentially relevant to this field. See:
http://collaboration.csc.ncsu.edu/laurie/

fwiw.

-ben

On 2/25/10 1:56 AM, Jon McClintock wrote:
 On Wed, Feb 24, 2010 at 10:46:56AM -0500, Paco Hope wrote:
 I don't think webness conveys any more homogeneity than, say
 windowsness or linuxness.
 
 What part of being a web application provides homogeneity in a way
 that makes patching cheaper?
 
 In a word, control. Let's compare two different organizations: a 
 commercial software development company, and a web commerce company. 
 They both develop software, but how the software is deployed and
 managed is widely different.
 
 Commercial software is created by one party, and consumed by
 multiple other parties. Those parties may run it in widely different
 operating environments, with different network, software and hardware 
 configurations. They may be running old versions of the software, or 
 using it in novel ways.
 
 If the commercial software development company has to patch a 
 vulnerability, they need to first determine which releases of the 
 software need to be patched, develop and test a patch for each
 supported version, test it across the plethora different
 configurations their customers may be running, develop release notes
 and a security advisory, make the patch available, and support their
 customers while they are patching.
 
 For a web commerce company, however, the picture is entirely
 different. While their production fleet may comprise hundreds, or
 even thousands, of servers, they're likely all running the exact same
 software and configuration, using a configuration management system
 to deploy the website software and keep it in sync.
 
 If the web commerce company identifies a vulnerability in their
 website, they can debug the running stack, create a fix, test it
 against an exact replica of the production stack, and use automated
 tools to deploy the patch to their entire fleet in one operation.
 
 -Jon
 
 
 

-- 
Benjamin Tomhave, MS, CISSP
tomh...@secureconsulting.net

Re: [SC-L] Genotypes and Phenotypes (Gunnar Peterson)

2009-10-15 Thread Chris Wysopal

This seems to boil down to an economics problem.  Notice how quickly the bean 
counters showed up after the thread began with a discussion of bugs and 
complexity.  It is just too inexpensive to create new code, and there isn't 
enough economic pain when it fails, for anything to change for most software.  
In certain cases, like aircraft, where the economic pain of failure is high, you 
get DO-178B, "Software Considerations in Airborne Systems and Equipment 
Certification."  For that type of software you might see the purchase of highly 
reliable libraries that have also met that certification.

-Chris

From: sc-l-boun...@securecoding.org [mailto:sc-l-boun...@securecoding.org] On 
Behalf Of Andreas Saurwein Franci Gonçalves
Sent: Wednesday, October 14, 2009 9:49 AM
To: Secure Coding List
Subject: Re: [SC-L] Genotypes and Phenotypes (Gunnar Peterson)

2009/10/14 SC-L Reader Dave Aronson <securecoding2d...@davearonson.com>
Andreas Saurwein Franci Gonçalves <saurw...@gmail.com> wrote
(rearranged into correct order):

 2009/10/13 Bobby Miller <b.g.mil...@gmail.com>

 The obvious difference is parts.  In manufacturing, things are assembled
 from well-known, well-specified, tested parts.  Hmmm

 That's the idea of libraries: well-known, well-specified, well-tested parts.
 Well, whatever.
Ideally, yes.  However, programmers love to reinvent the wheel.  It's
MUCH easier, both to do and to get away with, in software than in
hardware... and often necessary.

Need a bolt of at least a given length and strength, less than a given
diameter?  There are standard thread sizes, and people make bolts of
most common threadings and lengths, for purchase at reasonable prices,
at places easily found, and you can be fairly certain that any given
one of them will do the job quite well.

Need a function for your program?  If it's as common as a bolt, it's
probably already built into the very language.  If it's nearly as
common, maybe there's a fairly standard library for it... and if
you're very lucky, it's not too buggy or brittle.  Otherwise, it's
probably going to be much cheaper (which is all your management
probably cares about) to just code the damn thing yourself, than to
research who makes such a thing, which ones there are, who says which
one is how reliable, which ones have licensing terms your company
finds palatable, and justifying your choice to management.  Lord help
you if it requires money, because then you have to justify it to a
higher degree, get the beancounters involved, budgetary authority from
possibly multiple layers of manglement, and spend the rest of your
days filling out purchase orders.

If you do wind up coding it yourself, is the company then going to
make that piece of functionality available to the world separately,
whether for profit or open source?  N times out of N+1, for very large
values of N, no way!

Will they at least make it available *internally*, so that *they*
don't have to reinvent the wheel *next* time?  Again, N times out of
N+1, for almost as large values of N, no.

-Dave

Exactly, that's the point. Going a bit further, for every piece of hardware 
engineering there is almost always a legal, worldwide or at least national 
standard to follow. This is nonexistent in software.

As long as anybody with at least one healthy finger is allowed to write and 
sell software, the current situation will not change.

Make software development an engineering discipline with all the rights and 
obligations of other engineering sciences.

No more coding without a license. Period. This would change the landscape of 
bits and bytes in a dramatic way. But it requires the support of governments 
worldwide.

My 2 cents (I too would have to go back to college and study some more, 
despite having 25+ years of software development experience).



Re: [SC-L] Provably correct microkernel (seL4)

2009-10-03 Thread Chris Wysopal

And presumably, before they spent many man-years proving implementation 
correctness, they could have spent a fraction of that on design review and 
subsequent design corrections.

-Chris

-Original Message-
From: sc-l-boun...@securecoding.org [mailto:sc-l-boun...@securecoding.org] On 
Behalf Of Gunnar Peterson
Sent: Friday, October 02, 2009 3:21 PM
To: Cassidy, Colin (GE Infra, Energy)
Cc: Secure Code Mailing List
Subject: Re: [SC-L] Provably correct microkernel (seL4)


 design flaws.  So we have only removed 50% of the problem.

for my part there have been many, many days when I would settle for  
solving 50% of a problem

-gunnar


Re: [SC-L] InternetNews Realtime IT News - Merchants Cope With PCI Compliance

2008-06-30 Thread Chris Wysopal

Ken,

Customers not wanting to part with source code is one of the reasons we, at
Veracode, decided to take our static binary analysis technology to
market as SaaS. You get the benefit of both automation, as with static
source code analysis, and an external assessment, yet you don't have to
part with your source code.  So that we can deliver the same analysis
accuracy as source code static analysis (among other reasons), we require
our customers to submit symbols along with the compiled binaries.  It is
true that there is some intellectual property included in the symbols,
but it doesn't elicit the same level of protective response that has
people opting for the root canal over sending source code externally.
Our solution allows organizations to meet the external code review
requirements without having external parties inspect their source code.
-Chris

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Kenneth Van Wyk
Sent: Monday, June 30, 2008 9:44 AM
To: Secure Coding
Subject: [SC-L] InternetNews Realtime IT News - Merchants Cope With
PCI Compliance

Happy PCI-DSS 6.6 day, everyone.  (Wow, that's a sentence you don't hear
often.)

http://www.internetnews.com/ec-news/article.php/3755916

In talking with my customers over the past several months, I always find
it interesting that the vast majority would sooner have root  
canal than submit their source code to anyone for external review.   
I'm betting PCI 6.6 has been a boon for the web application firewall
(WAF) world.


Cheers,

Ken

-
Kenneth R. van Wyk
SC-L Moderator
KRvW Associates, LLC
http://www.KRvW.com






Re: [SC-L] Security Testing track: Software Testing Conference: Washington DC

2007-09-06 Thread Chris Wysopal

There has been some movement in this direction, and I think you are
correct that we need to educate the mainstream QA audience just as
we must educate the mainstream developer audience.  I am giving a
keynote on software security testing at Practical Quality and Software
Testing in Minneapolis next week: http://www.psqtconference.com/. I am
also speaking at STPCon on prioritizing security testing.  There are
also speakers from SPI Dynamics and Ounce Labs at that conference.  If
you know of other QA conferences, please post them here, as I am
interested in speaking to this audience and I have found them very
receptive to security testing topics.

Another educational approach is to target this community when we write
books and magazine articles on software security. One of the goals of my
book, "The Art of Software Security Testing," was to bring the concepts
of security testing to a traditional QA audience.  To that end I teamed
up with Elfriede Dustin, an author of several QA books and an organizer
of the Verify conference, to make sure the book spoke to the right
audience.

I know Joseph Feiman at Gartner has software security testing as a focus
area.  He has written a few research notes on the topic.

-Chris

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of McGovern, James F
(HTSC, IT)
Sent: Tuesday, August 28, 2007 10:39 AM
To: sc-l@securecoding.org
Subject: Re: [SC-L] Security Testing track: Software Testing
Conference: Washington DC

 Upon reading this, several thoughts came to mind:

1. If we are to truly solve the last mile, we need to also choose more
mainstream conferences such as STPCon (http://www.stpcon.com), since they
also have an associated magazine (Software Test and Performance), which
may stimulate more magazine articles on the topic. I did a quick run
upstairs to our QA folks and asked them what magazines they read, as
well as about their awareness of certain conferences.

2. What do you think we can do as a unified group of individuals in
terms of a listserv to encourage various industry analyst firms such as
Gartner, Forrester and The Burton Group to talk about Secure Software
Testing as a research area? Many CIOs and other IT executives put lots
of value into what they say. We need more top down.

3. What would it take to get more speaker diversity? We have to figure
out how to get more end customers telling their own stories vs. vendors
and consulting firms.

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Paco Hope
Sent: Thursday, August 16, 2007 1:41 PM
To: Secure Coding
Subject: [SC-L] Security Testing track: Software Testing
Conference: Washington DC


Hey folks,

One of my strong beliefs is that we're never going to close the loop on
Building Security In until we get the QA side of the house involved in
security. To that end, I'm co-chairing VERIFY 2007, a software testing
conference where we have a security testing track (in addition to more
typical QA issues like test automation). I thought some folks on this
list might be interested in attending, or in passing it on to your
colleagues in QA organizations.

The conference web site is http://verifyconference.com/ and you can get a
2-page "Conference in a Nutshell" PDF here:
http://verifyconference.com/images/verify/verify2007.pdf

Please help me spread the word.

Thanks,
Paco
--
Paco Hope, CISSP
Co-Chair, VERIFY 2007
http://verifyconference.com/ * +1.703.606.1905







Re: [SC-L] Adapting Penetration Testing for Software Development Purposes

2007-01-23 Thread Chris Wysopal

Ken,

I enjoyed reading this article.  My book "The Art of Software Security
Testing" is based on the concept of using penetration techniques as part of
the development lifecycle and is specifically targeted at QA professionals.
One of my co-authors, Elfriede Dustin, has written 5 QA books and ensured
that the book was accessible to that audience.

There are some free chapters of the book available:


Chapter 3: The Secure Software Development Lifecycle
http://www.devsource.com/article2/0,1895,2055988,00.asp

Chapter 4: Risk-Based Security Testing: Prioritizing Security Testing with
Threat Modeling
http://www.prnewswire.com/mnr/veracode/26386/docs/Wysopal_Rev-Chapter%2004.pdf

Chapter 5: Shades of Analysis: White, Gray, and Black Box Testing
http://computerworld.com/action/article.do?command=viewArticleBasic&taxonomyName=security&articleId=9006870&taxonomyId=17&intsrc=kc_feat

Cheers,

Chris


On Mon, 22 Jan 2007, Kenneth Van Wyk wrote:

 Greetings SC-L folk,

 FYI, there's been a wave of new content added to the DHS-funded
 software security portal, Build Security In (home URL is
 http://BuildSecurityIn.us-cert.gov).  Most recently, a couple of articles
 about penetration testing and tools were added (see
 https://buildsecurityin.us-cert.gov/daisy/bsi/articles/best-practices/penetration/655.html?branch=1&language=1).

 (Full disclosure: I'm the author of the pen testing articles, but
 don't let that stop you from grabbing them.  ;-)

 All of the articles on the BSI portal are free.

 Cheers,

 Ken
 -
 Kenneth R. van Wyk
 SC-L Moderator
 KRvW Associates, LLC
 http://www.KRvW.com







Re: [SC-L] darkreading: voting machines

2006-10-13 Thread Chris Wysopal


On Mon, 9 Oct 2006, Gary McGraw wrote:

 The most interesting thing from an sc-l perspective about this column is
 that it emphasizes a client need we're often forced to address---the
 need for a demo exploit.  Sometimes those on the receiving end of a
 software security vulnerability don't believe that findings are real.
 An often-repeated excuse for doing nothing is "well, that's just a
 theoretical attack and it's too academic to matter."  I can't tell you
 how many times I've heard that refrain.

In 1998 we put a slogan on the L0pht.com web page.

   That vulnerability is theoretical. -Microsoft

   L0pht - making the theoretical practical since 1992.

Microsoft doesn't say that line any more.  I guess a few worms can change
your tune.  It seems that you need to get bitten a few times before you
automatically put on the bug spray before heading down to the swamp.

-Chris


Re: [SC-L] darkreading: voting machines

2006-10-13 Thread Chris Wysopal

I think there is an easy solution to this.  It is called a 3rd party
audit.  This is done all the time in the financial community.  Software
vendors fork over their latest product version and sometimes source code
and a credible 3rd party looks for holes.  It is sometimes paid for by the
customer and sometimes the customer makes it a requirement of purchase so
it is paid for by the vendor.  We did many of these at @stake.

Why do banks do this?  Some say it is because they don't like to go to
jail or be fined for not following the nebulous "reasonable care"
provisions of the regulations they are required to follow.  It is more
likely they don't want to tarnish their trusted brand image.

A voting commission has neither of these motivations.  No one gets fined
or fired or put in jail for fielding a system that ends up being used for
voter fraud.  There is no trusted image to protect.

The "pretend stuff in the lab" comment is especially discouraging because
what Rubin and Felten are doing in the lab is EXACTLY what Diebold should
be doing in their testing lab.  A product that has security requirements
is not fit to be fielded until security testing has been performed.

-Chris

-= The Art of Software Security Testing: Identifying Software Security
-= Flaws by Chris Wysopal, Lucas Nelson, Dino Dai Zovi, and Elfriede
-= Dustin, Available Nov 27, 2006

On Tue, 10 Oct 2006, Jeremy Epstein wrote:

 Gary,

 Interesting point.  I'm on the Virginia state commission charged with making
 recommendations around voting systems, and we watched the Princeton video as
 part of our most recent meeting.  The reaction from the election officials
 was amusing and scary: if this is so real, why don't you hack a real
 election instead of this pretend stuff in the lab.  Pointing out that it
 would (most likely) be a felony, and people like Rubin, Felten, and others
 are trying to help security not go to jail didn't seem to impress them.
 Also pointing out that the Rubin  Felten examples used out-of-date code
 because vendors won't share anything up-to-date doesn't seem to impress
 them.  [This in response to Diebold's claim that they were looking at old
 code, and the problems are all fixed.]

 I frankly don't think anything is going to impress the election officials
 (and some of the elected officials) short of incontrovertible evidence of a
 DRE meltdown - and of course, we know that there could well be a failure
 (and may have been failures) that are unprovable thanks to the nature of
 software.

 --Jeremy

 P.S. One of the elected officials on the commission insisted that Felten
 couldn't possibly have done his demo exploit without source code, because
 "everyone knows you can't do an exploit without the source."  Unfortunately,
 the level of education that needs to be provided to someone like that is
 more than I can provide in a Q&A format.  I tried giving as an example that
 around 50% of the Microsoft updates are due to flaws found by people without
 source, but he wouldn't buy it (he was using a Windows laptop, but
 doesn't seem to understand where the fixes come from).

  -Original Message-
  From: [EMAIL PROTECTED]
  [mailto:[EMAIL PROTECTED] On Behalf Of Gary McGraw
  Sent: Monday, October 09, 2006 12:19 PM
  To: SC-L@securecoding.org
  Cc: [EMAIL PROTECTED]; [EMAIL PROTECTED]; [EMAIL PROTECTED]
  Subject: [SC-L] darkreading: voting machines
 
  Hi all,
 
  I'm sure that many of you saw the Ed Felten and friends
  break Diebold machines story a couple of weeks ago...maybe
  in DDJ or on /..  I wrote a piece about the crack for
  darkreading, which you can find here:
 
   http://www.darkreading.com/document.asp?doc_id=105188&WT.svl=column1_1
 
  The most interesting thing from an sc-l perspective about
  this column is that it emphasizes a client need we're often
  forced to address---the need for a demo exploit.  Sometimes
  those on the receiving end of a software security
  vulnerability don't believe that findings are real.
   An often-repeated excuse for doing nothing is "well, that's
   just a theoretical attack and it's too academic to matter."
  I can't tell you how many times I've heard that refrain.
 
  When that happens, building an exploit is often the only
  clear next step.  And yet we all know how expensive and hard
  exploit development is.
 
  In this case, Diebold consistently downplay'ed Avi Rubin's
  results as academic or theoretical.  Ed upped the ante.
  Think it'll work??
 
  gem
 
  company www.cigital.com
  podcast www.cigital.com/silverbullet
  book www.swsec.com
 
 

RE: [SC-L] Bugs and flaws

2006-02-02 Thread Chris Wysopal

In the manufacturing world, which is far more mature than the software
development world, they use the terminology of "design defect" and
"manufacturing defect."  So this distinction is useful and has stood the
test of time.

"Flaw" and "defect" are synonymous. We should just pick one. You could say
that the term for "manufacturing" software is "implementation."

So why do we need to change the terms for the software world?  Wouldn't
"design defect" and "implementation defect" be clearer and more in line
with the manufacturing quality discipline, which the software quality
discipline should be working towards emulating?  (When do we get to Six
Sigma?)

I just don't see the usefulness of calling a design defect a "flaw."
"Flaw" by itself is overloaded.  And in the software world, "bug" can mean
an implementation or design problem, so "bug" alone is overloaded for
describing an implementation defect.

At @stake, the Application Center of Excellence used the terminology
"design flaw" and "implementation flaw."  It was well understood by our
customers.

As Crispin said in an earlier post on the subject, the line is sometimes
blurry.  I am sure this is the case in manufacturing too.  Architecture
flaws can be folded into the design flaw category for simplicity.

My vote is for a less overloaded and clearer terminology.

-Chris

P.S. My father managed a non-destructive test lab at a jet engine
manufacturer. They had about the highest quality requirements in the
world. So for many hours I was regaled with tales about the benefits of
performing static analysis on individual components early in the
manufacturing cycle.

They would dip cast parts in a fluorescent liquid and look at them under
ultraviolet light to illuminate cracks caused during the casting process.
For critical parts which would receive more stress, such as the fan
blades, they would x-ray each part to inspect for internal cracks. A more
expensive process but warranted due to the increased risk of total system
failure for a defect in those parts.

The static testing was obviously much cheaper and delivered better quality
than just bolting the parts together and doing dynamic testing in a test
cell.  It's a wonder that it has taken the software security world so long
to catch on to the benefits of static testing of implementation.  I think
we can learn a lot more from the manufacturing world.

On Thu, 2 Feb 2006, Gary McGraw wrote:

 Hi all,

 When I introduced the "bugs and flaws" nomenclature into the
 literature, I did so in an article about the software security workshop
 I chaired in 2003 (see http://www.cigital.com/ssw/).  This was
 ultimately written up in an "On the Horizon" paper published by IEEE
 Security & Privacy.

 Nancy Mead and I queried the SWEBOK and looked around to see if the new
 usage caused collision.  It did not.  The reason I think it is important
 to distinguish the two ends of the rather slippery range (crispy is
 right about that) is that software security as a field is not paying
 enough attention to architecture.  By identifying flaws as a subcategory
 of defects (according to the SWEBOK), we can focus some attention on
 the problem.

 From the small glossary in Software Security (my new book out
 tomorrow):

 Bug-A bug is an implementation-level software problem. Bugs may exist in
 code but never be executed. Though the term bug is applied quite generally
 by many software practitioners, I reserve use of the term to encompass
 fairly simple implementation errors. Bugs are implementation-level
 problems that can be easily discovered and remedied. See Chapter 1.

 Flaw-A design-level or architectural software defect. High-level defects
 cause 50% of software security problems. See Chapter 1.
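
A minimal sketch of that distinction in C (hypothetical code; the struct and
function names are made up for illustration, not taken from the book):

    #include <string.h>

    struct record {
        char user[16];
    };

    /* Bug: an implementation-level problem.  The design says "store the
       user name"; the code copies it with no bounds check, a local mistake
       that code review or a static analysis tool can spot by looking at
       this one function. */
    void set_user(struct record *r, const char *name)
    {
        strcpy(r->user, name);              /* overflows if name is 16 bytes or more */
    }

    /* Flaw: a design-level defect.  Even a flawless implementation of this
       interface is insecure, because the design never asks who is making
       the request; there is no authorization decision in the code to get
       right or wrong. */
    void delete_record(struct record *r);   /* no caller identity, no access check */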

 In any case, I intend to still use these terms like this, and I would be
 very pleased if you would all join me.

 gem



 

 ___
 Secure Coding mailing list (SC-L)
 SC-L@securecoding.org
 List information, subscriptions, etc - http://krvw.com/mailman/listinfo/sc-l
 List charter available at - http://www.securecoding.org/list/charter.php


RE: [SC-L] Bugs and flaws

2006-02-02 Thread Chris Wysopal

In the manufacturing world, manufacturing defects are defects that were
not intended by the design. With software, an implementation defect is a
defect that is not intended by the design.  That is where I see the
analogy. A factory worker forgetting to put on a washer or installing a
polarized capacitor backwards is similar to a programmer neglecting to
check a return code or being off by one in a length calculation.

In both disciplines, to increase quality you could say "don't do that",
you could add a quality process that tests for the correct implementation,
or, best of all, you could make it impossible for the mistake to happen. So I
guess I see a lot of similarities between the manufacturing process and
the software implementation process.
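
For concreteness, those two mistakes might look like the following
hypothetical C (the function names are invented for illustration):

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* Off by one in a length calculation: the allocation forgets the
       terminating NUL, so the copy writes one byte past the buffer. */
    char *copy_name(const char *src)
    {
        char *dst = malloc(strlen(src));     /* should be strlen(src) + 1 */
        if (dst != NULL)
            strcpy(dst, src);                /* writes strlen(src) + 1 bytes */
        return dst;
    }

    /* Neglecting to check a return code: fopen() can fail, and the NULL
       result is then used as if it were a valid stream. */
    void read_first_line(const char *path, char *line, int len)
    {
        FILE *f = fopen(path, "r");          /* return value never checked */
        fgets(line, len, f);                 /* undefined behavior if f is NULL */
        fclose(f);
    }

The third option above, making the mistake impossible, is the analog of an
API or language where the length travels with the data and failures cannot
be silently ignored, so neither slip can be written down in the first place.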

Sure, it's not a perfect analogy.  Nothing seems to be between the physical
and digital worlds.  As you say, many of the flaws created during what is
traditionally known as implementation are low-level design errors, but at
the very end of the continuum they are simply mistakes.

-Chris


On Thu, 2 Feb 2006, David Crocker wrote:

 I don't think this analogy between software development and
 manufacturing holds. There are no manufacturing defects in software
 construction, unless one counts a buggy chip (e.g. Pentium FPU or
 similar) or perhaps a buggy compiler. Software instructions execute
 predictably and are not subject to the problems of defective materials,
 difficulties in keeping dimensions within a precise tolerance, or wear
 and tear.

 If some small bolt in my car fails because the bolt met its
 manufacturer's specification but was not strong enough to withstand the
 loads it was subjected to, that is a low-level design error, not a
 manufacturing error. Similarly, I view coding errors as low-level design
 errors.

 David Crocker, Escher Technologies Ltd.
 Consultancy, contracting and tools for dependable software development
 www.eschertech.com



RE: [SC-L] Intel turning to hardware for rootkit detection

2005-12-14 Thread Chris Wysopal


On Wed, 14 Dec 2005, ljknews wrote:

 At 9:14 AM -0500 12/14/05, Gary McGraw wrote:
  No, that's a copy of stackguard.  The real problem with all of these
  approaches is that they don't fix the root problem.  Finding and removing
  buffer overflow conditions with a static analysis tool is far superior.

 But still better is using a programming language that does not allow
 buffer overflows.

This isn't a solution for code that is already deployed, which is the
problem that stackguard and static analysis address.
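
For concreteness, the kind of already-deployed code all three approaches are
aimed at might look like this hypothetical C fragment (illustrative only):

    #include <string.h>

    /* Classic stack buffer overflow in fielded code: a StackGuard-style
       canary detects the overwrite at run time, a static analysis tool
       flags the unbounded copy before the code ever ships, and a language
       with bounds checking would not permit the overflow at all. */
    void parse_name(const char *input)
    {
        char name[32];
        strcpy(name, input);    /* no length check: overflows for inputs of 32 bytes or more */
    }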

-Chris
___
Secure Coding mailing list (SC-L)
SC-L@securecoding.org
List information, subscriptions, etc - http://krvw.com/mailman/listinfo/sc-l
List charter available at - http://www.securecoding.org/list/charter.php