Re: [SC-L] BSIMM: Confessions of a Software Security Alchemist (informIT)
hi all,

my preference is to lead with an Architectural Risk Analysis (and has been since 1997).

gem
http://www.cigital.com/~gem

On 3/20/09 3:07 PM, Jim Manico j...@manico.net wrote:

This is why I'm not fond of leading with a tool. I prefer to lead with architectural/design analysis and targeted manual review of high-risk applications.

Jim Manico
j...@manico.net

On Mar 20, 2009, at 4:06 AM, Goertzel, Karen [USA] goertzel_ka...@bah.com wrote:

Except when they're hardware bugs. :) I think the differentiation is also meaningful in this regard: I can specify software that does non-secure things. I can implement that software 100% correctly. Ipso facto - no software bugs. But the fact remains that the software doesn't validate input because I didn't specify it to validate input, or it doesn't encrypt passwords because I didn't specify it to do so. I built to spec; it just happened to be a stupid spec. So the spec is flawed - but the implemented software conforms to that stupid spec 100%, so by definition it is not flawed. It is, however, non-secure.

--
Karen Mercedes Goertzel, CISSP
Booz Allen Hamilton
703.698.7454
goertzel_ka...@bah.com

-----Original Message-----
From: sc-l-boun...@securecoding.org on behalf of Benjamin Tomhave
Sent: Thu 19-Mar-09 19:28
To: Secure Code Mailing List
Subject: Re: [SC-L] BSIMM: Confessions of a Software Security Alchemist (informIT)

Why are we differentiating between software and security bugs? It seems to me that all bugs are software bugs, ...

___
Secure Coding mailing list (SC-L) SC-L@securecoding.org
List information, subscriptions, etc - http://krvw.com/mailman/listinfo/sc-l
List charter available at - http://www.securecoding.org/list/charter.php
SC-L is hosted and moderated by KRvW Associates, LLC (http://www.KRvW.com) as a free, non-commercial service to the software security community.
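[Editor's note] Karen's "built to spec, stupid spec" distinction can be made concrete with a small sketch (hypothetical code, not from the thread; names are made up). Both functions below implement their specs 100% correctly; only the second spec asked for anything secure, here using Python's standard salted PBKDF2 hashing.

```python
import hashlib
import hmac
import os

# Spec v1 says only: "store the user's password". Implemented perfectly --
# no bugs -- yet the stored secret sits in the clear. The flaw is in the spec.
users_plaintext = {}

def register_to_spec(username, password):
    users_plaintext[username] = password  # conforms exactly to the (stupid) spec

# Spec v2 adds the security requirement: "store only a salted, iterated hash".
users_hashed = {}

def register_securely(username, password):
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    users_hashed[username] = (salt, digest)

def verify(username, password):
    salt, digest = users_hashed[username]
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison

register_to_spec("alice", "s3cret")
register_securely("bob", "s3cret")
print(users_plaintext["alice"])  # prints s3cret -- a database leak reveals it
print(verify("bob", "s3cret"))   # True
print(verify("bob", "wrong"))    # False
```

Both are "correct"; only one is secure, which is exactly the spec-level omission Karen is describing.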
Re: [SC-L] BSIMM: Confessions of a Software Security Alchemist (informIT)
I would argue that the security 'bugs' you've described are in fact functional deficiencies in the implemented design. That is, exploiting them has a direct impact on the functional performance of the application, even if it's just a problem with error handling (input validation).

I would further argue that treating security as a special case ends up doing us more harm than good. Doing so allows developers, designers, and the business to shrug it off as Somebody Else's Problem (SEP), instead of owning it themselves. The same goes for the requirements, design, etc. As an industry, we've developed segments of specialized knowledge, and then have the audacity to complain about it not being mainstream. It's time we picked one, and I would argue that mainstreaming these concepts will be far more effective than continuing as a specialized bolt-on discipline (which is not to say that specialized research should not occur, just that in real life the application of this knowledge should not be specialized, per se).

*shrug* The only way I see to win the game is to change the rules and/or the game play itself. We must never forget that the security industry still relies (heavily) on many of the same concepts that protected us 15 years ago (i.e. signature-based scans and ACLs - AV+firewall).

-ben

Goertzel, Karen [USA] wrote:

No - that isn't really what I meant. There CAN be security bugs - i.e., implementation errors with direct security implications, such as a divide-by-zero error that allows a denial of service in a security-critical component, thus exposing what is supposed to be protected data. But there are also bad security decisions - these can be at the requirements spec or design spec level. If they're at the requirements spec level, they aren't bugs - they are either omissions of good security or commissions of bad security. An omission of good security is not encrypting a password. That isn't a bug per se - unless it's a violation of policy.
But if there's no password encryption policy, then the failure to include a requirement to encrypt passwords would not be a bug or a violation of any sort (except a violation of common sense). It would still, however, result in poor security.

--
Karen Mercedes Goertzel, CISSP
Booz Allen Hamilton
703.698.7454
goertzel_ka...@bah.com

-----Original Message-----
From: Benjamin Tomhave [mailto:list-s...@secureconsulting.net]
Sent: Fri 20-Mar-09 11:04
To: Goertzel, Karen [USA]
Cc: Secure Code Mailing List
Subject: Re: [SC-L] BSIMM: Confessions of a Software Security Alchemist (informIT)

So, what you're saying is that security bugs are really design flaws, assuming a perfect implementation of the design. Ergo, "security bug" is at best a misnomer, and at worst a fatal deficiency in design acumen. :)

-ben

Goertzel, Karen [USA] wrote:

Except when they're hardware bugs. :) I think the differentiation is also meaningful in this regard: I can specify software that does non-secure things. I can implement that software 100% correctly. Ipso facto - no software bugs. But the fact remains that the software doesn't validate input because I didn't specify it to validate input, or it doesn't encrypt passwords because I didn't specify it to do so. I built to spec; it just happened to be a stupid spec. So the spec is flawed - but the implemented software conforms to that stupid spec 100%, so by definition it is not flawed. It is, however, non-secure.

--
Karen Mercedes Goertzel, CISSP
Booz Allen Hamilton
703.698.7454
goertzel_ka...@bah.com

-----Original Message-----
From: sc-l-boun...@securecoding.org on behalf of Benjamin Tomhave
Sent: Thu 19-Mar-09 19:28
To: Secure Code Mailing List
Subject: Re: [SC-L] BSIMM: Confessions of a Software Security Alchemist (informIT)

Why are we differentiating between software and security bugs? It seems to me that all bugs are software bugs, ...
--
Benjamin Tomhave, MS, CISSP
fal...@secureconsulting.net
LI: http://www.linkedin.com/in/btomhave
Blog: http://www.secureconsulting.net/
Photos: http://photos.secureconsulting.net/
Web: http://falcon.secureconsulting.net/

[ Random Quote: ]
Hofstadter's Law: "A task always takes longer than you expect, even when you take into account Hofstadter's Law."
http://globalnerdy.com/2007/07/18/laws-of-software-development/
Re: [SC-L] BSIMM: Confessions of a Software Security Alchemist (informIT)
I have to post this blog in response.

http://labs.mudynamics.com/2008/07/14/zen-and-the-art-of-fixing-p1-bugs

Love the "security testing IS functional testing", BTW.

K.
---
http://www.pcapr.net

On Thu, Mar 19, 2009 at 4:28 PM, Benjamin Tomhave list-s...@secureconsulting.net wrote:

Why are we differentiating between software and security bugs? It seems to me that all bugs are software bugs, and how quickly they're tackled is a matter of prioritizing the work based on severity, impact, and ease of resolution. It seems to me that, while it is problematic that security testing has been excluded historically, our goal should not be to establish yet another security-as-bolt-on state, but rather leapfrog to the desired end-state where QA testing includes security testing as well as functional testing. In fact, one could even argue that security testing IS functional testing, but anyway... If you're going to innovate, you might as well jump the curve*.

-ben

* see Kawasaki, "Art of Innovation": http://blog.guykawasaki.com/2007/06/art_of_innovati.html

Gary McGraw wrote:

Aloha Jim,

I agree that security bugs should not necessarily take precedence over other bugs. Most of the initiatives that we observed cycled ALL security bugs into the standard bug tracking system (most of which rank bugs by some kind of severity rating). Many initiatives put more weight on security bugs...note the term "weight", not "drop everything and run around only working on security". See the CMVM practice activities for more.

The BSIMM helps to measure and then evolve a software security initiative. The "top N security bugs" activity is one of an arsenal of tools built and used by the SSG to strategically guide the rest of their software security initiative. Making this a "top N bugs of any kind" list might make sense for some organizations, but is not something we would likely observe by studying the SSG and the software security initiative.
Perhaps we suffer from the "looking for the keys under the streetlight" problem.

gem

On 3/19/09 2:31 PM, Jim Manico j...@manico.net wrote:

"The top N lists we observed among the 9 were BUG lists only. So that means that in general at least half of the defects were not being identified on the most wanted list using that BSIMM set of activities."

This sounds very problematic to me. There are many standard software bugs that are much more critical to the enterprise than just security bugs. It seems foolhardy to do risk assessment on security bugs only - I think we need to bring the worlds of software development and security analysis together more. Divided we (continue to) fail. And Gary, this is not a critique of just your comment, but of WebAppSec at large.

- Jim

----- Original Message -----
From: Gary McGraw g...@cigital.com
To: Steven M. Christey co...@linus.mitre.org
Cc: Sammy Migues smig...@cigital.com; Michael Cohen mco...@cigital.com; Dustin Sullivan dustin.sulli...@informit.com; Secure Code Mailing List SC-L@securecoding.org
Sent: Thursday, March 19, 2009 2:50 AM
Subject: Re: [SC-L] BSIMM: Confessions of a Software Security Alchemist (informIT)

Hi Steve,

Sorry for falling off the thread last night. Waiting for the posts to clear was not a great idea.

The top N lists we observed among the 9 were BUG lists only. So that means that in general at least half of the defects were not being identified on the most wanted list using that BSIMM set of activities. You are correct to point out that the Architecture Analysis practice has other activities meant to ferret out those sorts of flaws. I asked my guys to work on a flaws collection a while ago, but I have not seen anything yet. Canuck?

There is an important difference between your CVE data which is based on externally observed bugs (imposed on vendors by security types mostly) and internal bug data determined using static analysis or internal testing.
I would be very interested to know whether Microsoft and the CVE consider the same bug #1 on internal versus external rating systems. The difference is in the term "reported for" versus "discovered internally during SDL activity".

gem
http://www.cigital.com/~gem

On 3/18/09 6:14 PM, Steven M. Christey co...@linus.mitre.org wrote:

On Wed, 18 Mar 2009, Gary McGraw wrote:

"Many of the top N lists we encountered were developed through the consistent use of static analysis tools."

Interesting. Does this mean that their top N lists are less likely to include design flaws? (though they would be covered under various other BSIMM activities).

"After looking at millions of lines of code (sometimes constantly), a ***real*** top N list of bugs emerges for an organization. Eradicating number one is an obvious priority. Training can help. New number one...lather, rinse, repeat."

I believe this is reflected in public CVE data. Take a look at the bugs that are being reported for, say, Microsoft or major Linux
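[Editor's note] Ben's claim above, that security testing IS functional testing, can be illustrated with a toy sketch (hypothetical code, not from the thread): once "reject bad input" is written into the spec, the security test and the functional test of the same routine are indistinguishable.

```python
def parse_age(raw):
    """Per spec: accept only integer ages in [0, 130]; reject everything else."""
    try:
        age = int(raw)
    except (TypeError, ValueError):
        raise ValueError("age must be an integer")
    if not 0 <= age <= 130:
        raise ValueError("age out of range")
    return age

# The "functional" test and the "security" test exercise the same contract:
assert parse_age("42") == 42  # happy path

for hostile in ["-1", "1e9", "42; DROP TABLE users", None]:
    try:
        parse_age(hostile)
        raise AssertionError("should have rejected %r" % (hostile,))
    except ValueError:
        pass  # rejected, exactly as the spec requires
```

Whether "-1" comes from a confused user or an attacker, the same assertion covers it; the distinction lives in the spec, not in the test harness.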
Re: [SC-L] BSIMM: Confessions of a Software Security Alchemist (informIT)
So, what you're saying is that security bugs are really design flaws, assuming a perfect implementation of the design. Ergo, "security bug" is at best a misnomer, and at worst a fatal deficiency in design acumen. :)

-ben

Goertzel, Karen [USA] wrote:

Except when they're hardware bugs. :) I think the differentiation is also meaningful in this regard: I can specify software that does non-secure things. I can implement that software 100% correctly. Ipso facto - no software bugs. But the fact remains that the software doesn't validate input because I didn't specify it to validate input, or it doesn't encrypt passwords because I didn't specify it to do so. I built to spec; it just happened to be a stupid spec. So the spec is flawed - but the implemented software conforms to that stupid spec 100%, so by definition it is not flawed. It is, however, non-secure.

--
Karen Mercedes Goertzel, CISSP
Booz Allen Hamilton
703.698.7454
goertzel_ka...@bah.com

-----Original Message-----
From: sc-l-boun...@securecoding.org on behalf of Benjamin Tomhave
Sent: Thu 19-Mar-09 19:28
To: Secure Code Mailing List
Subject: Re: [SC-L] BSIMM: Confessions of a Software Security Alchemist (informIT)

Why are we differentiating between software and security bugs? It seems to me that all bugs are software bugs, ...

--
Benjamin Tomhave, MS, CISSP
fal...@secureconsulting.net
LI: http://www.linkedin.com/in/btomhave
Blog: http://www.secureconsulting.net/
Photos: http://photos.secureconsulting.net/
Web: http://falcon.secureconsulting.net/

[ Random Quote: ]
Hartree's Law: "Whatever the state of a project, the time a project-leader will estimate for completion is constant."
http://globalnerdy.com/2007/07/18/laws-of-software-development/
Re: [SC-L] BSIMM: Confessions of a Software Security Alchemist (informIT)
On Mar 18, 2009, at 23:14, Steven M. Christey wrote:

"I believe this is reflected in public CVE data. Take a look at the bugs that are being reported for, say, Microsoft or major Linux vendors or most any product with a long history, and their current number 1's are not the same as the number 1's of the past."

I am trying to get funding for a study that would address precisely this issue. Here is a write-up that I made for the Master students here at the University of Trento that explains in more detail what I'm trying to do; perhaps someone on this list is interested in collaborating:

http://www.disi.unitn.it/~neuhaus/proposals/Security-Trends.pdf

Best,
Stephan
Re: [SC-L] BSIMM: Confessions of a Software Security Alchemist (informIT)
Steve,

You saw my talk at the OWASP assurance day. There was a brief diversion about the number of business logic problems and design flaws (coarsely lumped together in my chart). That 'weight' should indicate that - at least in the subset of clients I deal with - flaws aren't getting short shrift.

http://www.owasp.org/images/9/9e/Maturing_Assessment_through_SA.ppt (for those who didn't see it)

You may also want to look at my OWASP NoVA chapter presentation on why we believe Top N lists are bad... It's not so much a rant as it is a set of limitations in ONLY taking a Top N approach, and a set of constructive steps forward to improve one's practices:

http://www.owasp.org/images/d/df/Moving_Beyond_Top_N_Lists.ppt.zip

I cover how one should cause their own organization-specific Top N list to emerge and how to manage it once it does.

John Steven
Senior Director; Advanced Technology Consulting
Direct: (703) 404-5726 Cell: (703) 727-4034
Key fingerprint = 4772 F7F3 1019 4668 62AD 94B0 AE7F EEF4 62D5 F908
Blog: http://www.cigital.com/justiceleague
Papers: http://www.cigital.com/papers/jsteven
http://www.cigital.com
Software Confidence. Achieved.

On 3/18/09 6:14 PM, Steven M. Christey co...@linus.mitre.org wrote:

On Wed, 18 Mar 2009, Gary McGraw wrote:

"Many of the top N lists we encountered were developed through the consistent use of static analysis tools."

Interesting. Does this mean that their top N lists are less likely to include design flaws? (though they would be covered under various other BSIMM activities).

"After looking at millions of lines of code (sometimes constantly), a ***real*** top N list of bugs emerges for an organization. Eradicating number one is an obvious priority. Training can help. New number one...lather, rinse, repeat."

I believe this is reflected in public CVE data.
Take a look at the bugs that are being reported for, say, Microsoft or major Linux vendors or most any product with a long history, and their current number 1's are not the same as the number 1's of the past.
Re: [SC-L] BSIMM: Confessions of a Software Security Alchemist (informIT)
Hi Kevin,

Any discipline with the word "science" in its name probably isn't. I have a dual PhD in two of those fields (computer science and cognitive science), so I ought to know.

I mostly agree with your assessment of many industry coders and IT people (most of whom do NOT have a background in computer science). When someone is asked to whip up a solution to an NP-Hard problem and doesn't know not to work on that (or to approach it heuristically), we have some real issues. There are a boatload of developers who have no computer science theory under their belts, and that is a real problem. And don't even get me started on security people!

Fortunately all is not lost and there are many great people sharing knowledge as widely as possible too. I can assure you that during my term as one of the Governors of the Computer Society (the largest IEEE society), we spent plenty of cycles fretting about how to reverse the trend you noted. Not much progress was made. Note that Silver Bullet often interviews scientists, and is co-sponsored by IEEE Security & Privacy magazine.

I am optimistic that we can keep things on an even scientific footing in software security if we proceed carefully and don't jump on shiny bandwagons as they careen over the cliff. The time for science is upon us.

gem
http://www.cigital.com/~gem

On 3/18/09 6:14 PM, Wall, Kevin kevin.w...@qwest.com wrote:

Gary McGraw wrote:

"We had a great time writing this one. Here is my favorite paragraph (in the science versus alchemy vein): Both early phases of software security made use of any sort of argument or 'evidence' to bolster the software security message, and that was fine given the starting point. We had lots of examples, plenty of good intuition, and the best of intentions. But now the time has come to put away the bug parade boogeyman, the top 25 tea leaves, black box web app goat sacrifice, and the occult reading of pen testing entrails. The time for science is upon us."
I might agree with your quote of "The time for science is upon us" if it were not for the fact that the rest of computer science / engineering is far ahead of computer security (IMO), and they are *still* not anywhere near real science, at least as practiced as a whole. (There probably are pockets here and there.) For the most part, based on what I see in industry, I'm not even sure we have reached the alchemy stage! (Compare where most organizations are still at with respect to SEI's CMM. The average is probably Level 2. Most organizations no longer even think of CMM as relevant.)

My observation is that very few people in the IT profession--outside of academia at least--belong to ACM or IEEE-CS or any other professional organization that might challenge them. I question, on a professional level, how much we are going to progress as an industry when most in this profession seem to think that they do not need anything beyond the "Learn X in 24 Hours" type pablum. (Those are fine as far as they go, but if you think that's all that's required to make you proficient in X, you have surely missed the boat.) Please note, however, that I do not think this mentality is limited to those in the IT / CS professions. Rather, it is a pandemic of this age.

Anyhow, I'll shut up now, since this will surely take us OT if I persist.

-kevin
---
Kevin W. Wall
Qwest Information Technology, Inc.
kevin.w...@qwest.com
Phone: 614.215.4788

"It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration."
- Edsger Dijkstra, "How do we tell truths that matter?"
http://www.cs.utexas.edu/~EWD/transcriptions/EWD04xx/EWD498.html
Re: [SC-L] BSIMM: Confessions of a Software Security Alchemist (informIT)
Hi Steve,

Sorry for falling off the thread last night. Waiting for the posts to clear was not a great idea.

The top N lists we observed among the 9 were BUG lists only. So that means that in general at least half of the defects were not being identified on the most wanted list using that BSIMM set of activities. You are correct to point out that the Architecture Analysis practice has other activities meant to ferret out those sorts of flaws. I asked my guys to work on a flaws collection a while ago, but I have not seen anything yet. Canuck?

There is an important difference between your CVE data which is based on externally observed bugs (imposed on vendors by security types mostly) and internal bug data determined using static analysis or internal testing. I would be very interested to know whether Microsoft and the CVE consider the same bug #1 on internal versus external rating systems. The difference is in the term "reported for" versus "discovered internally during SDL activity".

gem
http://www.cigital.com/~gem

On 3/18/09 6:14 PM, Steven M. Christey co...@linus.mitre.org wrote:

On Wed, 18 Mar 2009, Gary McGraw wrote:

"Many of the top N lists we encountered were developed through the consistent use of static analysis tools."

Interesting. Does this mean that their top N lists are less likely to include design flaws? (though they would be covered under various other BSIMM activities).

"After looking at millions of lines of code (sometimes constantly), a ***real*** top N list of bugs emerges for an organization. Eradicating number one is an obvious priority. Training can help. New number one...lather, rinse, repeat."

I believe this is reflected in public CVE data. Take a look at the bugs that are being reported for, say, Microsoft or major Linux vendors or most any product with a long history, and their current number 1's are not the same as the number 1's of the past.
- Steve
Re: [SC-L] BSIMM: Confessions of a Software Security Alchemist (informIT)
Hi Gary,

On Mar 19, 2009, at 16:27, Gary McGraw wrote:

"Hi Stephan, In my view, it would be even better to study the difference in external bug emphasis (as driven by full disclosure and the CVE) and internal bug emphasis (as driven by an organization's own top N list)."

That is a brilliant idea, but how do I get internal bug emphasis? The companies in question won't hand over their data just like that. Perhaps a little prodding from someone who is well known and trusted could help here, Mr. McGraw, Sir. :-) (Actually, I might get at Microsoft data, if I can make the right pitch.)

"To put a slightly finer point on it, I wonder whether the scatter you can observe outside of the black box looks completely different than the in-the-box view. In this case, an organization's codebase and dev shop is the box and the external bug reports are outside. I have a feeling that it is."

Oh, that's a very interesting question. As I said, it's a brilliant idea, and I'd love to see this carried out.

"Trento has a special place in my heart as I lived there from 8/93-8/94 and worked at IRST."

That is very cool! Also, you are lucky that you worked at IRST then, because the CS department is constructing a new building that will completely ruin the view across the valley from IRST. I don't think they like us much over there :-)

"Say hi to Cognola for me."

Will do, even though I live in Povo myself. [1]

Fun,
Stephan

[1] I was told by one of the professors that before the University came here, Povo was the place "where the weird mountain people live". That would hold double for the people who live across the Fersina, for example in Cognola :-)
Re: [SC-L] BSIMM: Confessions of a Software Security Alchemist (informIT)
"The top N lists we observed among the 9 were BUG lists only. So that means that in general at least half of the defects were not being identified on the most wanted list using that BSIMM set of activities."

This sounds very problematic to me. There are many standard software bugs that are much more critical to the enterprise than just security bugs. It seems foolhardy to do risk assessment on security bugs only - I think we need to bring the worlds of software development and security analysis together more. Divided we (continue to) fail. And Gary, this is not a critique of just your comment, but of WebAppSec at large.

- Jim

----- Original Message -----
From: Gary McGraw g...@cigital.com
To: Steven M. Christey co...@linus.mitre.org
Cc: Sammy Migues smig...@cigital.com; Michael Cohen mco...@cigital.com; Dustin Sullivan dustin.sulli...@informit.com; Secure Code Mailing List SC-L@securecoding.org
Sent: Thursday, March 19, 2009 2:50 AM
Subject: Re: [SC-L] BSIMM: Confessions of a Software Security Alchemist (informIT)

Hi Steve,

Sorry for falling off the thread last night. Waiting for the posts to clear was not a great idea.

The top N lists we observed among the 9 were BUG lists only. So that means that in general at least half of the defects were not being identified on the most wanted list using that BSIMM set of activities. You are correct to point out that the Architecture Analysis practice has other activities meant to ferret out those sorts of flaws. I asked my guys to work on a flaws collection a while ago, but I have not seen anything yet. Canuck?

There is an important difference between your CVE data which is based on externally observed bugs (imposed on vendors by security types mostly) and internal bug data determined using static analysis or internal testing. I would be very interested to know whether Microsoft and the CVE consider the same bug #1 on internal versus external rating systems.
The difference is in the term "reported for" versus "discovered internally during SDL activity".

gem
http://www.cigital.com/~gem

On 3/18/09 6:14 PM, Steven M. Christey co...@linus.mitre.org wrote:

On Wed, 18 Mar 2009, Gary McGraw wrote:

"Many of the top N lists we encountered were developed through the consistent use of static analysis tools."

Interesting. Does this mean that their top N lists are less likely to include design flaws? (though they would be covered under various other BSIMM activities).

"After looking at millions of lines of code (sometimes constantly), a ***real*** top N list of bugs emerges for an organization. Eradicating number one is an obvious priority. Training can help. New number one...lather, rinse, repeat."

I believe this is reflected in public CVE data. Take a look at the bugs that are being reported for, say, Microsoft or major Linux vendors or most any product with a long history, and their current number 1's are not the same as the number 1's of the past.

- Steve
Re: [SC-L] BSIMM: Confessions of a Software Security Alchemist (informIT)
Actually no. See: http://www.cigital.com/papers/download/j15bsi.pdf (John Steven, "State of Application Assessment", IEEE S&P)

I am not a tool guy, I am a software security guy.

gem
http://www.cigital.com/~gem

On 3/19/09 2:58 PM, Jim Manico j...@manico.net wrote:

"Many of the top N lists we encountered were developed through the consistent use of static analysis tools. After looking at millions of lines of code (sometimes constantly), a ***real*** top N list of bugs emerges for an organization."

You mean a real list of what a certain vendor's static analysis tools find. If you think that list really measures the risk of an organization's software security posture - that might be considered to be insane! =)

- Jim

----- Original Message -----
From: Gary McGraw g...@cigital.com
To: Steven M. Christey co...@linus.mitre.org
Cc: Sammy Migues smig...@cigital.com; Dustin Sullivan dustin.sulli...@informit.com; Secure Code Mailing List SC-L@securecoding.org
Sent: Wednesday, March 18, 2009 11:54 AM
Subject: Re: [SC-L] BSIMM: Confessions of a Software Security Alchemist (informIT)

Hi Steve,

Many of the top N lists we encountered were developed through the consistent use of static analysis tools. After looking at millions of lines of code (sometimes constantly), a ***real*** top N list of bugs emerges for an organization. Eradicating number one is an obvious priority. Training can help. New number one...lather, rinse, repeat. Other times (like say in the one case where the study participant did not believe in static analysis for religious reasons) things are a bit more flip (and thus suffer from the "no data" problem I like to complain about). I do not recall a case when the top N lists were driven by customers.

Sorry I missed your talk at the SWA forum. I'll chalk that one up to NoVa traffic.

gem
http://www.cigital.com/~gem

On 3/18/09 5:47 PM, Steven M.
Christey co...@linus.mitre.org wrote:

On Wed, 18 Mar 2009, Gary McGraw wrote:

"Because it is about building a top N list FOR A PARTICULAR ORGANIZATION. You and I have discussed this many times. The generic top 25 is unlikely to apply to any particular organization. The notion of using that as a driver for software purchasing is insane. On the other hand if organization X knows what THEIR top 10 bugs are, that has real value."

Got it, thanks. I guessed as much. Did you investigate whether the developers' personal top-N lists were consistent with what their customers cared about? How did the developers go about selecting them?

By the way, last week in my OWASP Software Assurance Day talk on the Top 25, I had a slide on the role of top-N lists in BSIMM, where I attempted to say basically the same thing. This was after various slides that tried to emphasize how the current Top 25 is both incomplete and not necessarily fully relevant to a particular organization's needs. So while the message may have been diluted during initial publication, it's being refined somewhat.

- Steve
Re: [SC-L] BSIMM: Confessions of a Software Security Alchemist (informIT)
Aloha Jim,

I agree that security bugs should not necessarily take precedence over other bugs. Most of the initiatives that we observed cycled ALL security bugs into the standard bug tracking system (most of which rank bugs by some kind of severity rating). Many initiatives put more weight on security bugs...note the term "weight," not "drop everything and run around only working on security." See the CMVM practice activities for more.

The BSIMM helps to measure and then evolve a software security initiative. The "top N security bugs" activity is one of an arsenal of tools built and used by the SSG to strategically guide the rest of the software security initiative. Making this a "top N bugs of any kind" list might make sense for some organizations, but is not something we would likely observe by studying the SSG and the software security initiative. Perhaps we suffer from the "looking for the keys under the streetlight" problem.

gem

On 3/19/09 2:31 PM, Jim Manico j...@manico.net wrote:
>> The top N lists we observed among the 9 were BUG lists only. So that means that in general at least half of the defects were not being identified on the most wanted list using that BSIMM set of activities.
>
> This sounds very problematic to me. There are many standard software bugs that are much more critical to the enterprise than just security bugs. It seems foolhardy to do risk assessment on security bugs only - I think we need to bring the worlds of software development and security analysis together more. Divided we (continue to) fail. And Gary, this is not a critique of just your comment, but of WebAppSec at large.
>
> - Jim

- Original Message -
From: Gary McGraw g...@cigital.com
To: Steven M. Christey co...@linus.mitre.org
Cc: Sammy Migues smig...@cigital.com; Michael Cohen mco...@cigital.com; Dustin Sullivan dustin.sulli...@informit.com; Secure Code Mailing List SC-L@securecoding.org
Sent: Thursday, March 19, 2009 2:50 AM
Subject: Re: [SC-L] BSIMM: Confessions of a Software Security Alchemist (informIT)

Hi Steve,

Sorry for falling off the thread last night. Waiting for the posts to clear was not a great idea.

The top N lists we observed among the 9 were BUG lists only. So that means that in general at least half of the defects were not being identified on the most wanted list using that BSIMM set of activities. You are correct to point out that the Architecture Analysis practice has other activities meant to ferret out those sorts of flaws. I asked my guys to work on a flaws collection a while ago, but I have not seen anything yet. Canuck?

There is an important difference between your CVE data, which is based on externally observed bugs (imposed on vendors mostly by security types), and internal bug data determined using static analysis or internal testing. I would be very interested to know whether Microsoft and the CVE consider the same bug #1 on internal versus external rating systems. The difference is in the term "reported for" versus "discovered internally during SDL activity."

gem
http://www.cigital.com/~gem

On 3/18/09 6:14 PM, Steven M. Christey co...@linus.mitre.org wrote:
> On Wed, 18 Mar 2009, Gary McGraw wrote:
>> Many of the top N lists we encountered were developed through the consistent use of static analysis tools.
>
> Interesting. Does this mean that their top N lists are less likely to include design flaws? (Though they would be covered under various other BSIMM activities.)
>
>> After looking at millions of lines of code (sometimes constantly), a ***real*** top N list of bugs emerges for an organization. Eradicating number one is an obvious priority. Training can help. New number one...lather, rinse, repeat.
> I believe this is reflected in public CVE data. Take a look at the bugs that are being reported for, say, Microsoft or major Linux vendors or most any product with a long history, and their current number 1's are not the same as the number 1's of the past.
>
> - Steve
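The "weight, not drop everything" triage scheme Gary describes — cycling security bugs into the standard tracker and letting a multiplier raise their rank — can be sketched roughly as follows. This is a minimal illustration, not anything from the BSIMM itself; the `Bug` class, the 1-5 severity scale, and the 1.5 multiplier are all invented for the example.

```python
from dataclasses import dataclass

# Assumed weighting factor; a real SSG would tune this to its own risk posture.
SECURITY_WEIGHT = 1.5

@dataclass
class Bug:
    title: str
    severity: int          # 1 (low) .. 5 (critical), as in a typical tracker
    is_security: bool = False

    @property
    def priority(self) -> float:
        # Security bugs are weighted, not absolute: a severity-4 security bug
        # (4 * 1.5 = 6.0) outranks a severity-5 functional bug (5.0), but a
        # severity-2 security bug still sits below a severity-4 crash.
        return self.severity * (SECURITY_WEIGHT if self.is_security else 1.0)

def triage(bugs):
    """Return all bugs - security and otherwise - in one worked-first order."""
    return sorted(bugs, key=lambda b: b.priority, reverse=True)
```

The point of the single `triage` queue is exactly the one debated in the thread: security defects and ordinary defects compete in the same list, rather than security findings living in a separate "drop everything" pile.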
Re: [SC-L] BSIMM: Confessions of a Software Security Alchemist (informIT)
That's a bit of dodging the question; I'd like to hear more. Your comment below implied that it was your consistent use of a vendor's static analysis tool that allowed you to figure out the top N list of bugs for a specific organization. Leading with static analysis as your primary analysis driver concerns me. Will you elaborate, please?

- Jim

- Original Message -
From: Gary McGraw g...@cigital.com
To: Jim Manico j...@manico.net; Steven M. Christey co...@linus.mitre.org
Cc: Sammy Migues smig...@cigital.com; Dustin Sullivan dustin.sulli...@informit.com; Secure Code Mailing List SC-L@securecoding.org
Sent: Thursday, March 19, 2009 9:04 AM
Subject: Re: [SC-L] BSIMM: Confessions of a Software Security Alchemist (informIT)

Actually no. See http://www.cigital.com/papers/download/j15bsi.pdf (John Steven, "State of Application Assessment," IEEE S&P). I am not a tool guy; I am a software security guy.

gem
http://www.cigital.com/~gem

On 3/19/09 2:58 PM, Jim Manico j...@manico.net wrote:
>> Many of the top N lists we encountered were developed through the consistent use of static analysis tools. After looking at millions of lines of code (sometimes constantly), a ***real*** top N list of bugs emerges for an organization.
>
> You mean a real list of what a certain vendor's static analysis tools find. If you think that list really measures the risk of an organization's software security posture - that might be considered insane! =)
>
> - Jim

- Original Message -
From: Gary McGraw g...@cigital.com
To: Steven M. Christey co...@linus.mitre.org
Cc: Sammy Migues smig...@cigital.com; Dustin Sullivan dustin.sulli...@informit.com; Secure Code Mailing List SC-L@securecoding.org
Sent: Wednesday, March 18, 2009 11:54 AM
Subject: Re: [SC-L] BSIMM: Confessions of a Software Security Alchemist (informIT)

Hi Steve,

Many of the top N lists we encountered were developed through the consistent use of static analysis tools. After looking at millions of lines of code (sometimes constantly), a ***real*** top N list of bugs emerges for an organization. Eradicating number one is an obvious priority. Training can help. New number one...lather, rinse, repeat.

Other times (like, say, in the one case where the study participant did not believe in static analysis for religious reasons) things are a bit more flip (and thus suffer from the "no data" problem I like to complain about). I do not recall a case when the top N lists were driven by customers.

Sorry I missed your talk at the SWA forum. I'll chalk that one up to NoVa traffic.

gem
http://www.cigital.com/~gem

On 3/18/09 5:47 PM, Steven M. Christey co...@linus.mitre.org wrote:
> On Wed, 18 Mar 2009, Gary McGraw wrote:
>> Because it is about building a top N list FOR A PARTICULAR ORGANIZATION. You and I have discussed this many times. The generic top 25 is unlikely to apply to any particular organization. The notion of using that as a driver for software purchasing is insane. On the other hand, if organization X knows what THEIR top 10 bugs are, that has real value.
>
> Got it, thanks. I guessed as much. Did you investigate whether the developers' personal top-N lists were consistent with what their customers cared about? How did the developers go about selecting them?
>
> By the way, last week in my OWASP Software Assurance Day talk on the Top 25, I had a slide on the role of top-N lists in BSIMM, where I attempted to say basically the same thing. This was after various slides that tried to emphasize how the current Top 25 is both incomplete and not necessarily fully relevant to a particular organization's needs. So while the message may have been diluted during initial publication, it's being refined somewhat.
>
> - Steve
[SC-L] BSIMM: Confessions of a Software Security Alchemist (informIT)
hi sc-l,

The BSIMM is a sizeable document, so digesting it all at once can be a challenge. My monthly informIT column this month explains the BSIMM in a shorter, easier-to-digest form. The article is co-authored by Brian and Sammy.

BSIMM: Confessions of an Alchemist
http://www.informit.com/articles/article.aspx?p=1332285

*Dons asbestos suit from the '80s flame wars*

We had a great time writing this one. Here is my favorite paragraph (in the "science versus alchemy" vein):

"Both early phases of software security made use of any sort of argument or 'evidence' to bolster the software security message, and that was fine given the starting point. We had lots of examples, plenty of good intuition, and the best of intentions. But now the time has come to put away the bug parade boogeyman, the top 25 tea leaves, black box web app goat sacrifice, and the occult reading of pen testing entrails. The time for science is upon us."

John Waters also wrote a nice piece on the BSIMM that appeared today:
http://visualstudiomagazine.com/news/article.aspx?editorialsid=10689

To download the complete model, see http://bsi-mm.com

gem
company www.cigital.com
podcast www.cigital.com/silverbullet
podcast www.cigital.com/realitycheck
blog www.cigital.com/justiceleague
book www.swsec.com
Re: [SC-L] BSIMM: Confessions of a Software Security Alchemist (informIT)
Hi Steve,

Because it is about building a top N list FOR A PARTICULAR ORGANIZATION. You and I have discussed this many times. The generic top 25 is unlikely to apply to any particular organization. The notion of using that as a driver for software purchasing is insane. On the other hand, if organization X knows what THEIR top 10 bugs are, that has real value. See the examples under that practice.

gem

On 3/18/09 5:21 PM, Steven M. Christey co...@linus.mitre.org wrote:
> On Wed, 18 Mar 2009, Gary McGraw wrote:
>> Both early phases of software security made use of any sort of argument or 'evidence' to bolster the software security message, and that was fine given the starting point. We had lots of examples, plenty of good intuition, and the best of intentions. But now the time has come to put away the bug parade boogeyman, the top 25 tea leaves, black box web app goat sacrifice, and the occult reading of pen testing entrails. The time for science is upon us.
>
> Given your critique of Top-N lists and bug parades in this paragraph and elsewhere, why is a "top N bugs" list explicitly identified in BSIMM CR1.1, and partially applicable in places like T1.1, T2.1, SFD2.1, SR1.4, and CR2.1?
>
> - Steve
Re: [SC-L] BSIMM: Confessions of a Software Security Alchemist (informIT)
On Wed, 18 Mar 2009, Gary McGraw wrote:
> Many of the top N lists we encountered were developed through the consistent use of static analysis tools.

Interesting. Does this mean that their top N lists are less likely to include design flaws? (Though they would be covered under various other BSIMM activities.)

> After looking at millions of lines of code (sometimes constantly), a ***real*** top N list of bugs emerges for an organization. Eradicating number one is an obvious priority. Training can help. New number one...lather, rinse, repeat.

I believe this is reflected in public CVE data. Take a look at the bugs that are being reported for, say, Microsoft or major Linux vendors or most any product with a long history, and their current number 1's are not the same as the number 1's of the past.

- Steve
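The mechanics behind "a real top N list of bugs emerges for an organization" amount to aggregating scan findings by weakness class across many codebases and ranking the counts. A minimal sketch of that aggregation, with invented finding records (a real SSG would export these from its static analysis tool; `top_n_bugs` is a hypothetical helper, not a BSIMM artifact):

```python
from collections import Counter

def top_n_bugs(findings, n=10):
    """Rank weakness classes by frequency across scan findings.

    findings: iterable of (cwe_id, description) tuples, one per finding.
    Returns the n most common CWE IDs with their counts, most frequent first.
    """
    counts = Counter(cwe_id for cwe_id, _ in findings)
    return counts.most_common(n)

# Toy finding records standing in for a tool export over many scans.
findings = [
    ("CWE-79", "reflected XSS in search page"),
    ("CWE-79", "stored XSS in user profile"),
    ("CWE-89", "SQL injection in report export"),
    ("CWE-79", "reflected XSS in login error message"),
]

print(top_n_bugs(findings, n=2))  # CWE-79 tops this toy organization's list
```

This also makes Steve's objection concrete: the list can only ever contain what the tool emits, so design flaws that no static analyzer flags will never surface in it, no matter how many scans feed the counter.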
Re: [SC-L] BSIMM: Confessions of a Software Security Alchemist (informIT)
Hi Steve,

Many of the top N lists we encountered were developed through the consistent use of static analysis tools. After looking at millions of lines of code (sometimes constantly), a ***real*** top N list of bugs emerges for an organization. Eradicating number one is an obvious priority. Training can help. New number one...lather, rinse, repeat.

Other times (like, say, in the one case where the study participant did not believe in static analysis for religious reasons) things are a bit more flip (and thus suffer from the "no data" problem I like to complain about). I do not recall a case when the top N lists were driven by customers.

Sorry I missed your talk at the SWA forum. I'll chalk that one up to NoVa traffic.

gem
http://www.cigital.com/~gem

On 3/18/09 5:47 PM, Steven M. Christey co...@linus.mitre.org wrote:
> On Wed, 18 Mar 2009, Gary McGraw wrote:
>> Because it is about building a top N list FOR A PARTICULAR ORGANIZATION. You and I have discussed this many times. The generic top 25 is unlikely to apply to any particular organization. The notion of using that as a driver for software purchasing is insane. On the other hand, if organization X knows what THEIR top 10 bugs are, that has real value.
>
> Got it, thanks. I guessed as much. Did you investigate whether the developers' personal top-N lists were consistent with what their customers cared about? How did the developers go about selecting them?
>
> By the way, last week in my OWASP Software Assurance Day talk on the Top 25, I had a slide on the role of top-N lists in BSIMM, where I attempted to say basically the same thing. This was after various slides that tried to emphasize how the current Top 25 is both incomplete and not necessarily fully relevant to a particular organization's needs. So while the message may have been diluted during initial publication, it's being refined somewhat.
>
> - Steve
Re: [SC-L] BSIMM: Confessions of a Software Security Alchemist (informIT)
On Wed, 18 Mar 2009, Gary McGraw wrote:
> Because it is about building a top N list FOR A PARTICULAR ORGANIZATION. You and I have discussed this many times. The generic top 25 is unlikely to apply to any particular organization. The notion of using that as a driver for software purchasing is insane. On the other hand, if organization X knows what THEIR top 10 bugs are, that has real value.

Got it, thanks. I guessed as much. Did you investigate whether the developers' personal top-N lists were consistent with what their customers cared about? How did the developers go about selecting them?

By the way, last week in my OWASP Software Assurance Day talk on the Top 25, I had a slide on the role of top-N lists in BSIMM, where I attempted to say basically the same thing. This was after various slides that tried to emphasize how the current Top 25 is both incomplete and not necessarily fully relevant to a particular organization's needs. So while the message may have been diluted during initial publication, it's being refined somewhat.

- Steve
Re: [SC-L] BSIMM: Confessions of a Software Security Alchemist (informIT)
On Wed, 18 Mar 2009, Gary McGraw wrote:
> Both early phases of software security made use of any sort of argument or 'evidence' to bolster the software security message, and that was fine given the starting point. We had lots of examples, plenty of good intuition, and the best of intentions. But now the time has come to put away the bug parade boogeyman, the top 25 tea leaves, black box web app goat sacrifice, and the occult reading of pen testing entrails. The time for science is upon us.

Given your critique of Top-N lists and bug parades in this paragraph and elsewhere, why is a "top N bugs" list explicitly identified in BSIMM CR1.1, and partially applicable in places like T1.1, T2.1, SFD2.1, SR1.4, and CR2.1?

- Steve