Thanks for that excellent and detailed response, Steve. A few follow-up
questions:
1) What sort of charter and executive support was/is necessary to
establish a group like an SSG, and to continue building on it? In
particular, I wonder how the mandate was established, and then
supported over the year(s).

2) How long has it taken to go from no SSG to a well-functioning SSG?
What sort of ramp-up time was needed?

3) Is there a way to capture the SSG and IR approach into some sort of
copy-n-paste model or framework that could be used to expedite
replication within other orgs? Is it even realistic to think that this
approach can be easily replicated? (somewhat ties back into
mandate/support, I suppose)

Thank you,


Benjamin Tomhave, MS, CISSP

[ Random Quote: ]
"Any sufficiently advanced technology is indistinguishable from magic."
Arthur C. Clarke

Steve Lipner wrote:
> Thanks for looping me in, Gary - this is an interesting topic and one
> that I do have an opinion about (no surprise).  (My apologies for the
> long delay responding - holidays and vacation rather than lack of
> interest).
> Microsoft created a central IR group before I joined the company, and
> we feel very strongly that the IR model we have is the right one.  As
> things have evolved today, we centralize response policy, customer
> communications, security researcher (vulnerability finder)
> communications, press outreach, emergency response, vulnerability and
> fix security analysis, and response to complex (multi-product)
> vulnerabilities.    Individual product teams are responsible for
> building secure code, and if a vulnerability is found they're
> responsible for fixing it.  But we have central policies on update
> packaging and release so that we protect all customers in the same
> way and administrators don't have to learn a separate patching
> process for each product.
> For IR, the central group lets us speak with one voice to communities
> who would be confused or bothered by having to deal with a batch of
> "different Microsofts."  It also allows us to build a team that can
> specialize in deep technical security analysis - how bad is the
> impact of this bug? Is it an instance of a broader pattern that we
> should be searching for? Does the proposed fix in fact fix the
> underlying problem?  Answering those questions is specialized work,
> and it's unreasonable to expect that any but the biggest product
> groups could build or retain the necessary capability.  I could share
> horror stories from 2000 and 2001 when really competent developers
> told me that a vulnerability wasn't exploitable to run code and then
> we found that it was.  The competence to do that sort of analysis is
> a lot scarcer than the competence to write secure code - but
> vulnerabilities still arise, and the IR team needs the scarce kind of
> competence to plan the right response.
> For secure development, as I said above, we expect product teams to
> design and build secure software.  But for a company like Microsoft
> with a central commitment to security, a central SSG allows us to
> develop and enforce common policies for what "secure" means.  That's
> part of achieving the accountability ("consequences") Ben refers to
> below.  The SSG is also the place where we update policies and
> requirements, and develop new tools and techniques.  The SDL tools
> (and guidance) that we've released over the
> last year or so have all been developed in our SSG - most for our own
> in-house use.  The SSG serves as a consulting resource on secure
> design - an area where the "right" solution will vary from product to
> product, and where it's beneficial for product teams to be able to
> appeal to folks who are focused on security and see a lot of problems
> and solutions.  The final key role of the SSG is to provide the link
> back to vulnerability research (again something that only the biggest
> product teams could afford).  Our SSG (Microsoft Security Engineering
> Center or MSEC) is part of the same organization as our IR team
> (Microsoft Security Response Center or MSRC) and we work closely as
> we're analyzing new security vulnerabilities and working to build on
> them, find patterns, and identify and remove new classes of
> vulnerabilities.  The idea (I used to think it came from Earl Boebert
> but I believe he was quoting Rick Proto) is that theories of security
> come from theories of insecurity.
> We fully understand the need to keep MSEC from getting too large.
> Our budget and planning process are part of our answer to doing that
> - we're essentially a support function, and nobody wants that to get
> too large.  We also avoid taking on tasks that should fall to product
> groups - I guess I could imagine a negative spiral where we did more
> and more of the work that product groups should do, and then had to
> get larger and larger as product teams did less and MSEC had to
> compensate.  But the culture of MSEC is to help the product groups
> with tools, training, consulting and process and hold them
> accountable - not to do their job.  And the product groups also want
> to control their own fates (and code) so they accept the
> responsibility that comes with that model.
> Steve
> -----Original Message-----
> From: DESAUSOI Alain
> Sent: Monday, January 04, 2010 7:23 AM
> To: Gary McGraw; Secure Code Mailing List; Steve Lipner
> Cc: David Ladd
> Subject: RE: [SC-L] InformIT: You need an SSG
> [now posted on sc-l]
> I agree that in an ideal world, security would be naturally built in
> by delivery and operational teams. They would transparently (?)
> report deviations/risks to a central (ERM) team that would handle and
> escalate as appropriate.
> From my own experience, SWIFT's SSG helps ensure 'right the first
> time' and 'continuous improvement' on the security front. It provides
> management transparency on risks as well as upstream assurance on
> what and how things will be/are being done. And it only brings
> value inasmuch as there is tight two-way cooperation with delivery
> and operational teams.
> With the support of senior executives, an SSG allows a more natural
> cross-fertilisation of risks, threats and solutions across the board,
> in terms of learning, need for infrastructure initiatives, ... Also,
> much as using external penetration testers provides 'wilder' and more
> up-to-date testing techniques, having a dedicated SSG helps maintain
> an outside look and keep threat/risk management knowledge up to
> date.
> Could the ideal world succeed?  Probably so, assuming security
> governance, transparency, assurance and discipline.
> I have no experience with large-scale organisations (SWIFT has only
> 500 developers - the SSG is 5 staff) and we already have the challenge
> of choosing our battles. So, even in those areas that we pamper less,
> the key values I preserve are (1) the ability to provide 'before the
> fact' security assurance on critical projects (critical in terms of
> businesses introducing new threats or raising new security challenges),
> (2) sharing findings and lessons learned across the organisation, and
> (3) some level of assurance by keeping up with penetration testing as
> a motivator and (weak) indicator of alignment.
> In favour of dissemination is the fact that, in my own experience (I
> have been 15 years on the other side of the fence), an SSG only
> really works if it is "in the field" and well accepted by
> delivery/operational teams. The flip side is the need for a strong
> (and harmonised) security culture, as it might easily drift away
> under different business unit leadership: I would think that this
> would be a real challenge for the CISO. This also reflects my own
> experience on the delivery side.
> In summary, an SSG provides down-to-earth facts and figures that help
> management strike a risk balance instead of leaving delivery teams on
> their own. It provides real meat for balancing security measures and
> learning as time goes on. I think a reasonably-sized SSG with
> 'satellites' for granularity and scalability is the most appealing
> solution.
>> -----Original Message-----
>> From: Gary McGraw
>> Sent: Wednesday, December 23, 2009 3:05 PM
>> To: Secure Code Mailing List; Steven Lipner; DESAUSOI Alain
>> Cc: David Ladd
>> Subject: Re: [SC-L] InformIT: You need an SSG
>> Hi ben,
>> I would be very much interested in Steve Lipner's opinion here,
>> because Steve ran the IR program at Microsoft a decade ago before
>> he was recruited to lead the SSG.  Steve, if you would, please take
>> a look at this thread and let us know what your thinking is RE
>> integrating an IR group and/or an SSG completely and almost
>> invisibly into the mother organization after it has matured.
>> I suspect that some core SSG will always remain even after much of
>> the heavy lifting has been distributed throughout the mother
>> organization by way of satellite.  A ratio of 3:1 (satellite:SSG)
>> is what we observe.
>> SWIFT also has a long-lived SSG (14 years).  Alain, any insight on
>> this thread you can offer?
>> gem
>> company podcast blog
>> book
>> On 12/22/09 8:56 PM, "Benjamin Tomhave" wrote:
>> Hi Gary,
>> I've worked with organizations that have taken a similar approach
>> with incident response management. You have a core IR team (within
>> the security dept) and then you designate IR contacts within
>> specific ops teams. This approach seems to work OK, but
>> coordination gets to be problematic, causing me to question
>> scalability and management. Basically, it's a lot of effort for an
>> impact that seems to follow a logarithmic growth pattern: the more
>> it grows, the more it costs to manage, with not as much
>> cost-effectiveness as when the program started.
>> What I wonder is this: how do we get from heavy use of
>> "security" SMEs to where "security" is simply SOP? Moreover, is
>> this even a good idea, or particularly realistic? From a CMM
>> perspective, it's akin to our being at Level 0 or 1 right now, with
>> Level 5 being the disappearance of dedicated security personnel
>> because there's no longer a need (achievability aside). It's kind
>> of like dissolving large sugar crystals into water... at first you
>> see the sugar and the water separately... then over time you just
>> see the solution without being able to differentiate sugar from
>> water. Anyway...
>> -ben
>> Gary McGraw wrote:
>>> hi ben,
>>> You may be right.  We have observed that the longer an initiative
>>> is underway (we have one in the study that checks in at 14 years
>>> old), the more actual activity tends to get pushed out to dev.
>>> You may recall from the BSIMM that we call this the satellite.
>>> Microsoft has an extensive satellite with 300 or so people
>>> embedded throughout their huge company (recall that their SSG is
>>> 100).  Because the notion of satellite is not as common a
>>> phenomenon in our data, we can't draw conclusions as clear as the
>>> ones we can draw regarding an SSG.
>>> Think of the SSG and the Satellite as "coaches" and "mentors" who
>>> are tasked with helping development get it right (not simply
>>> cleaning up their messes).  I agree that we have spent too much
>>> effort over the past decade simply trying to assess the mess and
>>> not enough getting dev to change their behavior.   The companies
>>> we studied in the BSIMM are trying to change dev.
>>> As a particular example of where I agree with you that we go off 
>>> track as a discipline, consider training.  Teaching developers
>>> about the OWASP top 10 in training may be exciting, but it is
>>> nowhere near as important or as effective as teaching defensive
>>> programming.  (And this from the guy who wrote "Exploiting
>>> Software.")  Developers need to know how to do it right, not just
>>> what bugs look like on TV.
>>> gem
>>> On 12/22/09 10:11 AM, "Benjamin Tomhave" wrote:
>>> I think the short-term assertion is sound (set up a group to make
>>> a push in training, awareness, and integration with SOP), but I'm
>>> not convinced the long-term assertion (that is, maintaining the
>>> group past the initial push) is in fact meritorious. I think
>>> there's a danger in setting up dedicated security groups of
>>> almost any sort as it provides a crutch to organizations that
>>> then leads to a failure to integrate security practices into
>>> general SOP.
>>> What is advocated seems to be consistent with how we've
>>> approached security as an industry for the past couple decades
>>> (or longer), and I don't see this as having the long-term benefit
>>> that was desired or intended. It seems that when you don't make
>>> people directly responsible and liable for doing the right
>>> things, they then fail at the task and let others do it instead.
>>> It's the old "lazy sysadmin" axiom that we script repeatable
>>> tasks because it's easier in the long run.
>>> The question, then, comes down to one of psychology and people 
>>> management. How do we make people responsible for their actions
>>> such that they begin to adopt better practices? The basic
>>> response should be to enact consequences, and I think that now is
>>> probably an optimal time for businesses to get very hard-nosed
>>> about these sorts of things (high unemployment means lots of
>>> people looking for work means employers have the advantage). This
>>> perhaps sounds very ugly and nasty, and obviously it will be if
>>> taken to an extreme, but we have a serious problem culturally in
>>> that non-security people still don't seem to think, on average,
>>> that security is in their job description. Solve that problem,
>>> and all this other stuff becomes a footnote.
>>> fwiw.
>>> -ben
>>> Gary McGraw wrote:
>>>> hi sc-l,
>>>> This list is made up of a bunch of practitioners (more than a 
>>>> thousand from what Ken tells me), and we collectively have many
>>>>  different ways of promoting software security in our companies
>>>> and our clients.  The BSIMM study focuses
>>>> attention on software security in large organizations and just
>>>> at the moment covers the work of 1554 full time employees
>>>> working every day in 26 software security initiatives.  One
>>>> phenomenon we observed in the BSIMM was that every large
>>>> initiative has a Software Security Group (SSG) to carry out and
>>>> lead software security activities.
>>>> I wrote about our observations around SSGs in this month's
>>>> InformIT article:
>>>> Simply put, an SSG is a critical part of a software security 
>>>> initiative in all companies with more than 100 developers.
>>>> (We're still not sure about SSGs in smaller organizations, but
>>>> the BSIMM Begin data (now hovering at 75 firms) may be
>>>> revealing.)
>>>> Cigital's SSG was formed in 1997 (with John Viega, Brad Arkin,
>>>> and me as founding members).  Since its inception, we've helped
>>>> plan, staff, and carry out ten large software security
>>>> initiatives in customer firms.  One of the most important first
>>>> tasks is establishing an SSG.
>>>> Merry New Year everybody.
>>>> gem
>>> --
>>> Benjamin Tomhave, MS, CISSP
>>> [ Random Quote: ]
>>> "The only source of knowledge is experience." - Albert Einstein
>> --
>> Benjamin Tomhave, MS, CISSP
>> [ Random Quote: ]
>> Hoare's Law of Large Programs: "Inside every large problem is a small
>> problem struggling to get out."
Secure Coding mailing list (SC-L)
List information, subscriptions, etc -
List charter available at -
SC-L is hosted and moderated by KRvW Associates, LLC as a free,
non-commercial service to the software security community.
