Re: [SC-L] OWASP Publicity

2007-11-19 Thread James Stibbards
Ben,

Good comments.  It may be true that "older" technology is what today's Sr
Managers are most familiar with, however... In my opinion, it's not that
familiarity that we (or they) should rely on in order to be well-informed,
and thus make good security-related decisions. It's no longer their job to
be into the details of that technology, unless they are the CTO (for
example), and if they are into the details... that's actually a red flag to
me that they're likely *not* doing their actual job, today, as a Sr.
Manager.  [Slight rant: It *is* the responsibility of the management team
of the organization, overall, to be sure that information which is critical
to the organization is conveyed, abstracted or not, up and down the layers
of the entire organization, management and individual contributors alike,
to accomplish whatever organizational goals exist. (See more, below.)]

If a Sr Manager was once familiar with COBOL (I chuckled at the recent COBOL
SC-L postings...), but the issues are now WinMobile and AJAX, then it's
really the responsibility of someone in the organization to have synthesized
and presented the security issues, opportunities, and costs as they relate
to WinMobile/AJAX/etc. to senior management as Business Issues. At other
layers in the organization, yes, there are Technology issues, concerns, joy
and grief... but not at the Executive levels, because that's not their
job(!).

As an aside... since security means so many things to so many people, here
is a 4-layer model that I use with a lot of my customers to help position
what we do in the "vast" landscape of security (a rough sketch in code
follows the list):

 1. Business/Mission objectives - what are "we" trying to accomplish?
 2. Systems Architecture - how is this being instantiated, in terms of
systems, communication, storage, etc.?
 3. Security Architecture - what specific technology and processes are we
using to reduce risk, introduce control mechanisms, etc.?
 4. Protection Technology - how do we lock down #3, so that it is itself
resistant to attack?
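
Purely as an illustration (the sample questions below are invented, not
taken from any customer engagement), the framing amounts to asking which
layer a given question lives at; in C it might be jotted down like this:

/* Illustrative sketch only: the four layers are from the model above;
 * the sample questions are invented for this example. */
#include <stdio.h>

enum layer {
    BUSINESS_OBJECTIVES = 1,  /* what are "we" trying to accomplish? */
    SYSTEMS_ARCHITECTURE,     /* systems, communication, storage, etc. */
    SECURITY_ARCHITECTURE,    /* technology/processes that reduce risk */
    PROTECTION_TECHNOLOGY     /* locking down #3 so it resists attack itself */
};

struct question {
    const char *text;
    enum layer layer;
};

int main(void)
{
    /* Hypothetical questions, each placed at the layer where it belongs. */
    const struct question qs[] = {
        { "Do we need to accept card payments at all?",          BUSINESS_OBJECTIVES },
        { "Does the order service cross the public Internet?",   SYSTEMS_ARCHITECTURE },
        { "Do those services authenticate each other over TLS?", SECURITY_ARCHITECTURE },
        { "How is the TLS private key itself protected at rest?", PROTECTION_TECHNOLOGY },
    };

    for (size_t i = 0; i < sizeof qs / sizeof qs[0]; i++)
        printf("Layer %d: %s\n", qs[i].layer, qs[i].text);
    return 0;
}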

I've used this over and over again in helping to frame discussions of what
"should" or "could" be done, so that they're not confusing. For example, it
keeps a question of policy from getting tangled up with a question of
technology selection.

A few days early, but Happy Thanksgiving, to all!

- James

James W. Stibbards
Sr. Director - Sales Engineering
Cloakware, Inc.
email: [EMAIL PROTECTED] 
phone: 703-752-4836
cell: 571-232-7210


-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
On Behalf Of Benjamin Tomhave
Sent: Sunday, November 18, 2007 10:08 AM
To: Secure Coding
Subject: Re: [SC-L] OWASP Publicity

I agree and disagree with these comments, as I think they possibly represent
an outmoded way of thinking when it comes to IT management.
Execs and senior mgmt _must_ have a certain understanding of security that
will at least give them a basis for making risk decisions. It seems today
that they are fine (generally) making business risk decisions, but then
believe falsely that making an IT risk decision requires following a
completely different set of rules (when, in fact, it's just another kind of
business risk decision). I'm of the belief that this directly correlates to
their lack of fundamental understanding of IT and security issues.

Where I agree is the level of detail that needs to be imparted. OWASP Top 10
is probably too much detail to communicate to the average exec or sr
manager. However, we must not overlook that these business leaders were once
individual contributors. Yes, it's true that some of these folks came up
through a strictly business route, but for the most part these days I see
these careers originating in at least a semi-technical role. We should be
seeking to leverage those backgrounds to educate them and bring them to
modern times.

On Crispin's later comments about bad vs good managers, I think he's very
much hit the nail on the head (see the quote in my sig). However, there's
one aspect that's overlooked, which is outdated prior history.
If an executive's understanding of technology is founded in their first
contributions as an individual contributor 10-20 years ago, then this means
their understanding of modern technology may be severely limited.
I'm sure all of us understand how difficult it is to stay on top of current
trends as technology evolves, and it's often our job to do so.
What if it's not your job to keep current? The times will change while your
focus is elsewhere, but only a truly savvy person will think to check that
context before making decisions that affect it. This seems to be a rarity.

So, to conclude, I think that it would be valuable, in broad brush strokes,
to educate leaders about secure coding - and security in general - but
perhaps not to the level of detail we might really desire to see. We want
execs and sr managers to drive their folks toward secure coding practices,
but that doesn't mean they themselves have to know how to code securely. As
such, in targeti

Re: [SC-L] Harvard vs. von Neumann

2007-06-14 Thread James Stibbards
Hi Gary (good to see you at Gartner, BTW), 

I recall way back in the bad old days of the Orange Book that we used to
look for both Developmental Assurance and (emphasis here) Operational
Assurance.  To that end, systems are designed and implemented with certain
limitations or "assumptions" (shudder) about how they'll be operated.  In
terms of security, that might include that "unfriendlies" are not allowed
physical access to the box itself, or other constraints. Perhaps this
"operational" aspect of things is just a part of implementation... But I've
seen several systems that were designed well, implemented well, and when
operated poorly (e.g. training didn't go over the necessary "secure boot"
operations), rendered the design and implementation moot.

Comments?

Best,
- James

-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
On Behalf Of Gary McGraw
Sent: Wednesday, June 13, 2007 8:59 PM
To: 'Crispin Cowan'
Cc: 'SC-L@securecoding.org'; 'Blue Boar'
Subject: Re: [SC-L] Harvard vs. von Neumann

I am reminded of a (bottle of wine induced) argument I once had with dan
geer over whether a buffer overflow is a bug or a flaw.   We ultimately
realized that I was sitting in the app code looking at strcpy() and dan was
thinking of language architecture on a machine with inane memory layout.
We were both right...kind of.   Thing is, when it comes to misuse of really
pathetic string functions in C, most developers make bugs...
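
A minimal invented illustration of the strcpy() class of bug, next to a
bounded call that avoids it (this is not code from any system discussed
here):

/* Invented example: classic unbounded strcpy() vs. a bounded copy. */
#include <stdio.h>
#include <string.h>

static void greet_buggy(const char *name)
{
    char buf[16];
    strcpy(buf, name);   /* no length check: a name longer than 15 bytes
                            overflows buf on the stack (the bug) */
    printf("hello, %s\n", buf);
}

static void greet_safer(const char *name)
{
    char buf[16];
    snprintf(buf, sizeof buf, "%s", name);  /* truncates instead of overflowing */
    printf("hello, %s\n", buf);
}

int main(void)
{
    greet_safer("a string far longer than sixteen bytes");
    greet_buggy("short");  /* only safe because this input happens to fit */
    return 0;
}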

Of course there is a deep relation between bugs and flaws.   Unfortunately,
most software security discourse these days is stuck in the Web app bugs
only mud.  Any acknowledgement of higher level thinking is a good thing.

gem

company www.cigital.com
podcast www.cigital.com/silverbullet
blog www.cigital.com/justiceleague
book www.swsec.com



Sent from my treo.

 -Original Message-
From:   Crispin Cowan [mailto:[EMAIL PROTECTED]
Sent:   Monday, June 11, 2007 05:50 PM Eastern Standard Time
To: Gary McGraw
Cc: Blue Boar; SC-L@securecoding.org
Subject: Re: [SC-L] Harvard vs. von Neumann

Gary McGraw wrote:
> Though I don't quite understand computer science theory in the same way
that Crispin does, I do think it is worth pointing out that there are two
major kinds of security defects in software: bugs at the implementation
level, and flaws at the design/spec level.  I think Crispin is driving at
that point.
>
Kind of. I'm saying that "specification" and "implementation" are relative
to each other: at one level, a spec can say "put an iterative loop here" and
the implementation is a bunch of x86 instructions. At another level, the
specification says "initialize this array" and the implementation says
"for (i=0; i < n; i++) a[i] = 0;".
> If we assumed perfection at the implementation level (through better
languages, say), then we would end up solving roughly 50% of the software
security problem.
>
The 50% being rather squishy, but yes this is true. It's only vaguely what I
was talking about, really, but it is true.
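
To make the relativity concrete with a made-up fragment (not from any real
code base): the same one-line spec admits more than one implementation, and
each implementation is in turn a spec for the layer below it.

/* Made-up fragment: the spec is the comment, two implementations follow. */
#include <string.h>

#define N 64

/* Spec: initialize this array to all zeros. */
static void init_loop(int a[N])
{
    for (int i = 0; i < N; i++)     /* one implementation of the spec */
        a[i] = 0;
}

static void init_memset(int a[N])
{
    memset(a, 0, sizeof(int) * N);  /* another implementation; memset is
                                       itself a spec for whatever the libc
                                       and compiler emit underneath */
}

int main(void)
{
    int a[N], b[N];
    init_loop(a);
    init_memset(b);
    return memcmp(a, b, sizeof a);  /* 0: both satisfy the same spec */
}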

Crispin

--
Crispin Cowan, Ph.D.   http://crispincowan.com/~crispin/
Director of Software Engineering   http://novell.com
AppArmor Chat: irc.oftc.net/#apparmor


___
Secure Coding mailing list (SC-L) SC-L@securecoding.org
List information, subscriptions, etc - http://krvw.com/mailman/listinfo/sc-l
List charter available at - http://www.securecoding.org/list/charter.php
SC-L is hosted and moderated by KRvW Associates, LLC (http://www.KRvW.com)
as a free, non-commercial service to the software security community.
___


RE: [SC-L] Bugs and flaws

2006-02-03 Thread James Stibbards
Hi Gary,

In one of your prior posts you mentioned documentation.  I believe that the
problem with WMF was that no one had examined WMF as a potential source of
vulnerabilities, since the embedded code was a legacy capability.

My belief is that one of the keys to finding flaws lies in the proper
capture of the requirements/contract of a software component, and then
examining and testing against that. Without the proper requirements that
speak clearly to security,  we can inspect and examine, but we won't know
what we're measuring against.  That doesn't solve the problem of knowing
when we're done, I realize.
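
To make that concrete with a toy sketch (invented code, nothing to do with
the real WMF implementation), the idea is simply to write the security
requirement into the component's contract and derive tests from it:

/* Toy, invented example of a security requirement captured as a contract
 * and tested against; none of this is real WMF code.
 *
 * Contract for parse_record():
 *   REQ-1: records of type REC_EXEC (embedded code) must be rejected and
 *          never dispatched, regardless of any legacy compatibility flag.
 */
#include <assert.h>
#include <stdio.h>

enum rec_type { REC_DRAW = 1, REC_TEXT = 2, REC_EXEC = 9 };

static int parse_record(enum rec_type t, int legacy_mode)
{
    (void)legacy_mode;        /* REQ-1: legacy mode must not re-enable it */
    if (t == REC_EXEC)
        return -1;            /* reject; do not dispatch */
    return 0;                 /* drawing/text records handled normally */
}

int main(void)
{
    /* Tests derived directly from the stated requirement. */
    assert(parse_record(REC_DRAW, 0) == 0);
    assert(parse_record(REC_EXEC, 0) == -1);
    assert(parse_record(REC_EXEC, 1) == -1);  /* even with the legacy flag */
    puts("contract tests pass");
    return 0;
}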

See you at SSS.
- James

-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
On Behalf Of Gary McGraw
Sent: Friday, February 03, 2006 11:13 AM
To: Kenneth R. van Wyk; Secure Coding Mailing List
Subject: RE: [SC-L] Bugs and flaws

To cycle this all back around to the original posting, let's talk about the
WMF flaw in particular.  Do we believe that the best way for Microsoft to
find similar design problems is to do code review?  Or should they use a
higher level approach?

Were they correct in saying (officially) that flaws such as WMF are hard to
anticipate? 

gem






___
Secure Coding mailing list (SC-L)
SC-L@securecoding.org
List information, subscriptions, etc - http://krvw.com/mailman/listinfo/sc-l
List charter available at - http://www.securecoding.org/list/charter.php


[SC-L] Managing the insider threat through code obfuscation

2005-12-15 Thread James Stibbards
Hi Jeremy (and Ken), 

Obfuscation of Java bytecode (like other "machine-level" instruction sets)
will ultimately depend on what level of hiding is being done: principally,
whether you're really just scattering the data (i.e. using a secret scatter
algorithm) or actually encrypting/decrypting it. In the latter case, the
problem then moves to one of secure key generation/storage/usage.
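
As a throwaway sketch (nothing like our actual techniques, just the
simplest possible version of "scattering"), splitting a constant into
shares that are only recombined at the point of use looks like this:

/* Throwaway illustration of "scattering": the secret never appears whole
 * as a single literal in the compiled binary.  A real tool would generate
 * the shares offline; here the source still shows everything. */
#include <stdio.h>
#include <stdint.h>

static const uint32_t share_a = 0x5A3C9F11u;
static const uint32_t share_b = 0x5A3C9F11u ^ 0xCAFEBABEu;  /* folded at compile time */

static uint32_t recover_secret(void)
{
    return share_a ^ share_b;   /* reassembled only at the point of use */
}

int main(void)
{
    /* If we encrypted the constant instead, the decryption key would have
     * to live somewhere in the program, and protecting *that* key becomes
     * the real problem (generation/storage/usage). */
    printf("secret = 0x%08X\n", (unsigned)recover_secret());
    return 0;
}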

My company - Cloakware - is firmly planted in the "software protection"
field, focusing on hardening of Digital Rights Management and other content
and IP protection systems for our customers.  We have IP in the area of
secure key storage, called "White Box Crypto", which solves the problem
well for normal needs (read: all but intel-grade needs).  We use a variety
of techniques - currently in C and C++, and we will be extending that to
include Java shortly.  The techniques have varying impacts on code size
and/or performance, with associated tradeoffs against security
level/strength of mechanism.

As you point out, if the goal is to hide the data and/or implementation at
runtime, then traditional support can be a problem.  Stack traces - if you
can get them at all from within an "anti-debug" enabled application or
driver - may tell you very little (by design).  I do know of one
"obfuscator" that also has a free downloadable "anti-obfuscator" tool
available so that customers can debug their applications.  Makes me wonder
what they were thinking... that attackers wouldn't notice?

Let me know if you'd like to be part of the discussion as our Java work
rolls out, and I'll set up a briefing.

Hoping this is an "acceptably tasteful" posting from a vendor...  :)

Regards,
- James

James W. Stibbards
Director, Systems Engineering
Cloakware Federal
"secure software ... from the inside out"
703.752.4836  office
571.232.7210  cell

Previous message: 



Jeremy Epstein jeremy.epstein at webmethods.com 
Thu Dec 15 10:00:38 EST 2005 

Ken,

I looked into this a couple of years ago to protect against intellectual
property theft (e.g., reverse engineering) and to make it harder to bypass
software licensing techniques.  My conclusion at that point was that the
obfuscation didn't actually do much good (it was still fairly easy to figure
out what was going on).  It introduced an extra risky step - our developers
want to do their debugging/QA on unobfuscated versions so they can figure
out what goes wrong, but you then have to replicate all of your QA on the
obfuscated version to make sure that the obfuscator didn't break anything.
[I hope that no one would test one version and release another!]  And if
there was a discrepancy, it was likely to be difficult to find what went
wrong.

Most importantly for us, it made support a royal pain - stack traces no
longer meant anything.  And we had to be *very* careful not to obfuscate any
published or undocumented-but-known interfaces.

My conclusion is that it's better than just marketing hooey - there is some
technical advantage - but that if you have an extensible product and/or you
have to provide support, the pain is worse than the advantage.

--Jeremy

___
Secure Coding mailing list (SC-L)
SC-L@securecoding.org
List information, subscriptions, etc - http://krvw.com/mailman/listinfo/sc-l
List charter available at - http://www.securecoding.org/list/charter.php