Re: [SC-L] SearchSecurity: Cyber Security and the Law

2012-08-02 Thread Greg Beeley
How would we recognize good engineering?

It seems to me that the very same problem faced by the idea of software
liability law -- that it is hard to define good engineering for software
security -- would also be faced by an incentive program.  If "good
engineering" is fuzzy enough to give a big corporate legal department the
upper hand against an individual, wouldn't it be similarly fuzzy enough
to undermine the fairness of a tax incentive?

Tax breaks are a big deal -- I doubt the government is going to want to
issue tax breaks to a company simply because the company claims it has
achieved level X in a CMM -- think about the economic cost of
demonstrating something like that to the point where it is fair and
worth something.  I also doubt that a metric based on vulnerability
counts will work -- that would just encourage companies to hide
vulnerabilities, fixing them silently and/or with great delay, instead
of disclosing them.

Not that I think that incentives inherently wouldn't work -- rather I'd
be interested in seeing some discussion here on some of the above issues.

One alternative that has worked well in many other areas of
manufacturing would be to encourage some kind of limited warranty, at
least in certain industries.  For consumer mobile devices, it might be
something as simple as: if your device's security is ever compromised
due to a flaw in the bundled device software, we'll repair it free of
charge.  The big challenges are 1) getting customers to care about their
device's security, and 2) making a vendor's commitment to security
recognizable to the customer.  By no means ideal, but at least a talking
point.

- Greg

Gary McGraw wrote, On 08/02/2012 08:40 AM:
 Hi Jeff,
 
 I'm afraid I disagree.  The hyperbolic way to state this is, imagine YOUR
 lawyer faced down by Microsoft's army of lawyers. You lose.
 
 Software liability is not the way to go in my opinion.  Instead, I would
 like to see the government develop incentives for good engineering.
 
 gem
 
 On 8/2/12 10:26 AM, Jeffrey Walton noloa...@gmail.com wrote:
 
 Hi Dr. McGraw,

 The Cyber Intelligence Sharing and Protection Act (CISPA), passed by
 the House in April, has very little to say about building security in.
 I'm convinced (in the US) that users/consumers need a comprehensive
 set of software liability laws. Consider the number of mobile devices
 that are vulnerable because OEMs stopped providing (or never provided)
 patches for vulnerabilities. The equation [risk analysis] needs to be
 unbalanced just a bit to get manufacturers to act (doing nothing is
 cost-effective at the moment).

 Jeff

 On Wed, Aug 1, 2012 at 10:28 AM, Gary McGraw g...@cigital.com wrote:
 hi sc-l,

 This month's [in]security article takes on Cyber Law as its topic.  The
 US Congress has been debating a cyber security bill this session and is
 close to passing something.  Sadly, the Cybersecurity and Internet
 Freedom Act currently being considered in the Senate (as an answer to
 the problematic Cyber Intelligence Sharing and Protection Act (CISPA)
 passed by the House in April) has very little to say about building
 security in.

 Though cyber law has always lagged technical reality by several years,
 ignoring the notion of building security in is a fundamental flaw.


 http://searchsecurity.techtarget.com/opinion/Congress-should-encourage-bug-fixes-reward-secure-systems

 Please read this month's article and pass it on far and wide.  Send a
 copy to your representatives in all branches of government.  It is high
 time for the government to tune in to cyber security properly.

 
 
 
___
Secure Coding mailing list (SC-L) SC-L@securecoding.org
List information, subscriptions, etc - http://krvw.com/mailman/listinfo/sc-l
List charter available at - http://www.securecoding.org/list/charter.php
SC-L is hosted and moderated by KRvW Associates, LLC (http://www.KRvW.com)
as a free, non-commercial service to the software security community.
Follow KRvW Associates on Twitter at: http://twitter.com/KRvW_Associates
___


Re: [SC-L] BSIMM3 lives

2011-10-22 Thread Greg Beeley
Gary,

Could you clarify your (and/or the BSIMM's) position on "secure by
design" vs. "designed to be secure"?  You're encouraging the adoption of
secure-by-design building blocks as part of SFD2.1, but then warning
that designed-to-be-secure != secure.  I can think of ways in which what
you've said can be true, but I'm not sure what you're actually referring
to.

Of course we all know that all systems have design and implementation
defects, though solid processes can significantly reduce the number of
those.  And we all can think of plenty of examples of security add-ons
that have actually worsened the true vulnerability of the resulting
software system.

From my perspective, there are a lot of security frameworks out there
that help software engineers do the same thing more securely, and then
there are approaches that fundamentally change the way the thing is
done.  One example might be giving someone a better strcpy() on the one
hand, versus entirely swapping out their imperative programming paradigm
for a more declarative one.
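
To make the first half of that contrast concrete, here is a minimal
sketch (in C, function name hypothetical) of what a "better strcpy()"
can look like -- a truncating, always-terminating copy in the spirit of
strlcpy() -- as opposed to swapping out the programming model entirely:

#include <stddef.h>
#include <string.h>

/* Illustrative sketch only (name hypothetical): a truncating,
 * always-NUL-terminating copy in the spirit of strlcpy().  Returns
 * strlen(src) so the caller can detect truncation: a return value
 * >= dstsize means src did not fit. */
size_t safe_copy(char *dst, const char *src, size_t dstsize)
{
    size_t srclen = strlen(src);

    if (dstsize > 0) {
        size_t n = (srclen < dstsize - 1) ? srclen : dstsize - 1;
        memcpy(dst, src, n);
        dst[n] = '\0';
    }
    return srclen;
}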

Thanks,

- Greg

Gary McGraw wrote, On 10/21/2011 11:14 AM:
 The particular BSIMM activity in question is SFD2.1 (one of the 109 BSIMM
 activities).  Here is its description from page 27 of the BSIMM:
 SFD2.1: **Build secure-by-design middleware frameworks/common libraries.**
 The SSG takes a proactive role in software design by building or providing
 pointers to secure-by-design middleware frameworks or common libraries.

...

 What is implied is
 a warning that even things designed to be secure often may not be
 (buyer...or cut-n-paster...beware).
___
Secure Coding mailing list (SC-L) SC-L@securecoding.org
List information, subscriptions, etc - http://krvw.com/mailman/listinfo/sc-l
List charter available at - http://www.securecoding.org/list/charter.php
SC-L is hosted and moderated by KRvW Associates, LLC (http://www.KRvW.com)
as a free, non-commercial service to the software security community.
Follow KRvW Associates on Twitter at: http://twitter.com/KRvW_Associates
___


Re: [SC-L] [WEB SECURITY] Re: What do you like better Web penetration testing or static code analysis?

2010-05-05 Thread Greg Beeley
Regarding the code snippet -- it does depend on the environment -- point well
taken.  But in this case (from what I can tell), unless you actually have the
file_exists() function *disabled* in php.ini, this is vulnerable to XSS.

- Greg

Sebastian Schinzel wrote, On 04/28/2010 04:03 AM:
 On Apr 28, 2010, at 7:10 AM, SneakySimian wrote:
 <?php
 $file = $_GET['file'];

 if (file_exists($file))
 {
     echo $file;              // echoes raw user input -- no output encoding
 }
 else
 {
     echo 'File not found. :(';
 }

 Ignoring the other blatant issues with that code snippet, is that
 vulnerable to XSS? No? Are you sure? Yes? Can you prove it? As it
 turns out, it depends on a configuration setting in php.ini. The only
 real way to know if it is an issue is to run it in the environment it
 is meant to be run in. Now, that's not to say that the developer who
 wrote that code shouldn't be told to fix it in a source code analysis,
 but the point is, some issues are wholly dependent on the environment
 and may or may not get caught during code analysis. Other issues such
 as code branches that don't execute or do execute in certain
 environments can be problematic to spot during normal source code
 analysis.
 
 So you suggest actually performing n black-box tests, where n is the
 number of possible permutations of all the variables in php.ini (hint:
 n will be very large)?  This is certainly not feasible.
 
 Your code shows a very simple data flow, which may or may not be
 exploitable. But this is not the point. The point of software security
 is to
 increase the reliability of the software when under attack.
 
 Reliable software performs output encoding when user input is printed
 to HTML. This code does not perform output encoding and should therefore
 be fixed.
 
 The discussion about whether or not this is exploitable on which
 platforms is a waste of time.  In many cases, you will find yourself
 spending a lot of time trying to get a working exploit, whereas the
 actual fix for the code takes a fraction of the time.
 
 For me, penetration testing is solely a method to raise awareness and to
 gather new
 security requirements FOR a customer application FROM security researchers.
 Knowledge transfer from security researchers to the business is key here.
 It helps find actual attacks but does not help the customer write
 better code.
 
 Code audits (whether automated or manual) are the way to go to improve
 reliability by pointing out dangerous coding patterns.
 
 My 0.02€...
 
 Cheers
 Sebastian
 
___
Secure Coding mailing list (SC-L) SC-L@securecoding.org
List information, subscriptions, etc - http://krvw.com/mailman/listinfo/sc-l
List charter available at - http://www.securecoding.org/list/charter.php
SC-L is hosted and moderated by KRvW Associates, LLC (http://www.KRvW.com)
as a free, non-commercial service to the software security community.
Follow KRvW Associates on Twitter at: http://twitter.com/KRvW_Associates
___


Re: [SC-L] blog post and open source vulnerabilities to blog about

2010-03-17 Thread Greg Beeley
Matt,

You can find quite a list of OSS vulnerabilities over at CVE
(cve.mitre.org) or NVD (nvd.nist.gov), but here are a few that I tend to
use for illustrative purposes when teaching.

- Apache chunked encoding vulnerability (#CVE-2002-0392), an integer
overflow.  Of particular interest because when it was first discovered it
was not believed to be exploitable for remote root, but due to a nuance in
a memcpy() / memmove() implementation, it was (I think I'm remembering
this right).  An example of how exploitability depends not just on the
program itself but also on the underlying system (libraries, compiler,
hardware, etc.).
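
Not the actual Apache code, but a minimal sketch (hypothetical function
and buffer names) of the general pattern in that class of bug: an
attacker-supplied size is parsed into a signed integer, a negative value
slips past the bounds check, and the implicit conversion to size_t hands
memcpy() an enormous length:

#include <stdlib.h>
#include <string.h>

#define BUFSIZE 8192

/* Illustrative sketch only -- not the actual Apache code.  The length
 * field arrives as hex text and is parsed into a *signed* int; a large
 * hex value (e.g. "fffffff0") typically wraps to a negative int.  The
 * negative value passes the signed bounds check, and memcpy() then
 * converts it to a huge size_t and copies far past the end of buf. */
void handle_chunk(const char *lenfield, const char *data)
{
    char buf[BUFSIZE];
    int len = (int)strtol(lenfield, NULL, 16);

    if (len <= BUFSIZE) {        /* negative len passes this check */
        memcpy(buf, data, len);  /* len converted to size_t: massive copy */
    }
}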

- OpenSSH crc32 compensation attack detector vulnerability (#CVE-2001-0144).
Of interest because this was a remote-root vulnerability in a piece of code
that was used solely to try to thwart an SSH protocol 1 cryptographic attack.
A good example of more code introducing more bugs, even when the more code
had an important security purpose.

- A Linux kernel vulnerability that never made it into any distributed
code, as it existed only in version control: an attempted backdoor
(http://kerneltrap.org/node/1584).  Of interest because it was apparently
an intentional "typo" bug meant to create a backdoor.  A good example of
something that could easily have slid by, but the way version control was
set up, along with the many eyes working on the kernel, brought it to
light quickly.
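
The pattern in question, sketched from memory with hypothetical names
(this is not the actual kernel diff): a single '=' where '==' belongs,
tucked inside a condition so the line still reads like an innocent check:

#include <stdio.h>

/* Hypothetical illustration of the backdoor pattern -- not the actual
 * kernel code.  The single '=' assigns rather than compares: a caller
 * passing the magic flag value gets its uid silently set to 0, and
 * because (caller->uid = 0) evaluates to 0 (false), the error branch is
 * never taken, so nothing looks amiss at runtime. */
struct task { int uid; };

static int check_request(struct task *caller, int flags, int magic)
{
    if ((flags == magic) && (caller->uid = 0))   /* backdoor: '=' not '==' */
        return -1;                               /* never reached */
    return 0;
}

int main(void)
{
    struct task t = { .uid = 1000 };
    check_request(&t, 42, 42);
    printf("uid after check: %d\n", t.uid);      /* prints 0: now "root" */
    return 0;
}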

- A sendmail bug publicized back in 2006 (#CVE-2006-0058).  Of interest
because the vulnerability was not a typical buffer overflow but was due to
(if I remember correctly -- the discussion of this vuln was pretty opaque
at the time, so I could be wrong on this) the intermixing of static and
automatic C function variables in a fairly complex attack scenario: a
residual static pointer was left pointing to a previous incarnation of an
automatic buffer, letting an attacker overwrite a section of the stack if
the attack was timed just right (it didn't need the nanosecond precision
that was widely publicized at first).  A good example of complex code
being more difficult to secure.
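
Not the sendmail code itself, but a minimal sketch (hypothetical names)
of the underlying C hazard being described: a static pointer that
outlives the automatic buffer it once pointed to, so a later write
through it lands in whatever now occupies that part of the stack:

#include <string.h>

/* Illustrative sketch only -- not the sendmail code.  A static pointer
 * is left aimed at an automatic (stack) buffer.  After the function
 * returns the buffer is gone, but the pointer still refers to that
 * region of the stack, so writing through it corrupts whatever lives
 * there now.  (Undefined behaviour, but this is the general shape of
 * the hazard.) */
static char *saved = NULL;

void remember(void)
{
    char scratch[64];             /* automatic: dies when we return */
    strcpy(scratch, "temporary");
    saved = scratch;              /* dangling once remember() returns */
}

void reuse(const char *input)
{
    if (saved != NULL)
        strcpy(saved, input);     /* writes into a dead stack frame */
}

int main(void)
{
    remember();
    reuse("attacker-controlled data");
    return 0;
}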

- Greg Beeley
  LightSys

Matt Parsons wrote, On 03/16/2010 10:41 AM:
  
 
 Hello,
 
 I am working on a software security blog and I am trying to find open
 source vulnerabilities to present and share.  Does anyone else have any
 open source vulnerabilities that they could share and talk about?   I
 think this could be the best way to learn in the open source community
 about security.   I have a few but I would like to blog about a
 different piece of code almost every day.  
 
  
 
 God Bless.
 Matt
 
  
 
  
 
 http://parsonsisconsulting.blogspot.com/
 
  
 
  
 
 Matt Parsons, MSM, CISSP
 
 315-559-3588 Blackberry
 
 817-294-3789 Home office
 
 Do Good and Fear No Man 
 
 Fort Worth, Texas
 
 A.K.A The Keyboard Cowboy
 
 mailto:mparsons1...@gmail.com
 
 http://www.parsonsisconsulting.com
 
 http://www.o2-ounceopen.com/o2-power-users/
 
 http://www.linkedin.com/in/parsonsconsulting
 
 http://parsonsisconsulting.blogspot.com/
 
 http://www.vimeo.com/8939668
 
  
 
___
Secure Coding mailing list (SC-L) SC-L@securecoding.org
List information, subscriptions, etc - http://krvw.com/mailman/listinfo/sc-l
List charter available at - http://www.securecoding.org/list/charter.php
SC-L is hosted and moderated by KRvW Associates, LLC (http://www.KRvW.com)
as a free, non-commercial service to the software security community.
___


Re: [SC-L] Fwd: re-writing college books - erm.. ahm...

2006-11-07 Thread Greg Beeley
Hi all,

I've been watching this discussion with interest, as I've taught an
undergrad-level course a couple of times that focuses on infosec
with a concentration in software security.  Yes, _Secure Coding_
was one of the books we used :)

A few observations from my experience so far:

   - Sure, we can teach "don't overflow the buffer" in lower-division
  undergrad courses, but many students won't understand why an overflow
  results in an exploitable condition, since the reasons require concepts
  that are not normally taught until the upper division of undergrad CS.

   - I think we need to not only give the students the right *tools*
 to code securely, but also the right *mindset*.  It is harder
 to teach the mindset in the earlier courses.

   - As for a specialized course on software security, it can be tricky
  working it into the undergrad CS curriculum.  When I've taught this
  material, I could not assume (for instance) a certain degree of student
  knowledge about computer architecture and the way the call stack works.
  I had to explain that material just to be able to explain how a buffer
  overflow works.

   - We can teach "be more secure, use Java/C#/etc. instead of C," and
  that is good, but remember that these students are going out into the
  real workforce and will use the language(s) chosen by their employers
  (or already in place on an existing product line).  I do believe that
  students still need to know how to use C/C++ responsibly.  Otherwise,
  they may very well be ill-prepared for the real world :)

   - As for vocational vs. academic, I think there's a lot of room for
  software security in each.  At the academic level, you spend more time
  explaining the underlying concepts -- for example, teaching why a call
  stack that holds both data and program-flow control information tends
  to cause trouble when the bounds of the data are not enforced (a
  minimal sketch of that hazard follows this list).  Vocational teaching
  is much more hands-on and tool-oriented.  At the academic level, you
  want your students to be able to take the knowledge and apply it in
  new and creative ways, not just learn a tool or a technique.

   - Many universities want to teach in the academic world the kind
 of knowledge that will give their students a definite edge when
 they go into private industry.  If potential employers (or
 graduate programs, etc.) look favorably on some software security
 experience, we will probably see more of it taught and/or
 integrated into existing coursework.

   - I found Corewars to be an interesting tool for starting to
 exercise that defensive coding muscle.  It gets students used
 to assuming that their program will be abused and misused,
 among other things :)
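
As promised in the vocational vs. academic item above, here is a minimal
classroom sketch (hypothetical code, in C) of why a call stack that holds
both data and control information causes trouble when bounds are not
enforced:

#include <string.h>

/* Illustrative classroom sketch (hypothetical function): the local
 * buffer (data) and the saved return address (control) share the same
 * stack frame, so an unbounded copy lets the data overwrite the control
 * on a typical stack layout. */
void greet(const char *name)
{
    char buf[16];
    strcpy(buf, name);     /* no bounds check: a long name runs past buf
                              and over the saved return address */
}

int main(int argc, char **argv)
{
    if (argc > 1)
        greet(argv[1]);    /* an argument longer than 15 characters
                              smashes the frame */
    return 0;
}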

Greg.


Greg Beeley, President & Co-Founder [EMAIL PROTECTED]
LightSys Technology Services, Inc.  http://www.LightSys.org/


___
Secure Coding mailing list (SC-L)
SC-L@securecoding.org
List information, subscriptions, etc - http://krvw.com/mailman/listinfo/sc-l
List charter available at - http://www.securecoding.org/list/charter.php