RE: [SC-L] ACM Queue article and security education

2004-07-02 Thread Peter Amey


> -Original Message-
> From: [EMAIL PROTECTED] 
> [mailto:[EMAIL PROTECTED]
> Behalf Of Blue Boar
> Sent: 01 July 2004 21:03
> To: ljknews
> Cc: [EMAIL PROTECTED]
> Subject: Re: [SC-L] ACM Queue article and security education
> 
> 
> ljknews wrote:
> > I think it will be properly considered when the most strict portion
> > of the software world is using language X.  I have used many
> > programs where the flaws in the program make it clear that I care
> > not one whit about whether the authors of that program have an
> > opinion about anything I might use. They are simply not competent,
> > either as individuals or else as an organization.
> 
> By "most strict portion", do you mean people that care most about
> correct code, proofs, and such?  I don't deny that the bulk of the
> heavy lifting will be done by people well-qualified to do so.
> However, I'm of the school of thought that certain types of people
> who like to break things, and whose chief skill is breaking things,
> will always have a decent shot at finding a problem.  There are
> people who couldn't build it, but they can sure break it.
> 
> You don't typically get their attention until something is really,
> really popular.  So yes, you can write your stuff in Language X, and
> assume it's secure.  It might not actually be until the whole world
> has had its way with Language X, but (hopefully) that's not a
> problem.  You can still do the dance of patching the last 5 problems
> in Language X, and end up better off than if you'd just used C.
> 
> Even Knuth has to write checks occasionally, and he does a lot of
> proof work, doesn't he?
> 
> So, if Language X only has 5 problems total, and they are fixable
> even if it takes years to ferret them out, please proceed with
> getting the whole world to use Language X.
> 
> 

I'm not entirely sure I follow this.  I _think_ you are saying: "since we can't be 
sure that X is perfect (because it might have 5 remaining flaws) then there is no 
point in adopting it".  You seem to be saying that it doesn't matter if X is 
_demonstrably_much_better_ than Y, if it is not perfect then don't change.  Have I got 
that right?

This is a variant on the Goedel gambit often used to attack formal verification.  It 
goes: "since Goedel's Theorem places strict limits on what we can formalize and prove, 
let's not bother at all and just do a few unit tests instead".  It also reminds me of 
what I call the aspirin analogy: "aspirin doesn't cure cancer, so there's no point in 
taking it for headaches".

The reality is that demonstrable improvements in quality, safety and security can be 
achieved by using the right tools, languages and methods.  It is for those who choose 
_not_ to use the strongest engineering available to justify their position, not the 
other way round.


Peter


**
This email and any files transmitted with it are confidential and
intended solely for the use of the individual or entity to whom they
are addressed. If you have received this email in error please notify
the system manager.  The IT Department at Praxis Critical Systems can be contacted at 
[EMAIL PROTECTED]
This footnote also confirms that this email message has been swept by
MIMEsweeper for the presence of computer viruses.
www.mimesweeper.com
**



This e-mail has been scanned for all viruses by Star Internet. The
service is powered by MessageLabs. For more information on a proactive
anti-virus service working around the clock, around the globe, visit:
http://www.star.net.uk





RE: [SC-L] ACM Queue article and security education

2004-07-02 Thread Peter Amey


> -Original Message-
> From: Blue Boar [mailto:[EMAIL PROTECTED]
> Sent: 01 July 2004 17:11
> To: Peter Amey
> Cc: [EMAIL PROTECTED]
> Subject: Re: [SC-L] ACM Queue article and security education
> 
> 
> Peter Amey wrote:
> > There are languages which are more suitable for the construction of
> > high-integrity systems and have been for years.  We could have
> > adopted Modula-2 back in the 1980s, people could take the blinkers
> > of prejudice off and look properly at Ada.  Yet we continue to use
> > C-derived languages with known weaknesses.
> 
> So we trade the known problems for a set of unknown ones?  

[snip]

A mindset that would have kept us building aircraft in wood!

In any case, we _do_ adopt new languages and methods, frequently.  In the time I have 
been using SPARK I have had people say: "that's neat, if only you could do it for X", 
where X has been C, C++, Java and C# in that order, at roughly five-year intervals.  
What we _don't_ do is make those choices based on any kind of objective assessment or 
engineering judgement.

[snip]

> Language X may very well be a much better starting point, I don't
> know.  I do believe that it will never be properly looked at until
> the whole world starts using it for everything, though.

And how will the whole world start using it if everyone waits for everyone else?

In any case, I don't expect the whole world to adopt any one method (any more than I 
build bicycles in carbon fibre even though that is the material of choice for, say, 
racing cars).   What I do expect is that principled engineers, the kind of people who 
care enough about their profession to contribute to groups like this, will seek to use 
the best and most appropriate technology for their purposes.  


Peter







Re: [SC-L] ACM Queue article and security education

2004-07-02 Thread ljknews
At 1:02 PM -0700 7/1/04, Blue Boar wrote:
>ljknews wrote:
>> I think it will be properly considered when the most strict portion
>> of the software world is using language X.   I have used many
>> programs where the flaws in the program make it clear that I care not
>> one whit about whether the authors of that program have an opinion about
>> anything I might use. They are simply not competent, either as
>> individuals or else as an organization.
>
> By "most strict portion", do you mean people that care most about correct
> code, proofs, and such?

And organizations that hire the people you describe below to test the
software they build.

> I don't deny that the bulk of the heavy lifting will be done by people
> well-qualified to do so.  However, I'm of the school of thought that
> certain types of people who like to break things, and whose chief skill
> is breaking things, will always have a decent shot at finding a problem.
> There are people who couldn't build it, but they can sure break it.

> You don't typically get their attention until something is really,
> really popular.

Lots of people bring their attention to issues they are paid to test.
-- 
Larry Kilgallen




[SC-L] Education and security -- another perspective (was "ACM Queue - Content")

2004-07-02 Thread Wall, Kevin
Kenneth R. van Wyk wrote...

> FYI, there's an ACM Queue issue out that focuses on security -- see 
> http://acmqueue.com/modules.php?name=Content&pa=list_pages_issues&issue_id=14
>
> Two articles there that should be of interest to SC-L readers include
> Marcus Ranum's "Security: The root of the problem -- Why is it we can't
> seem to produce secure, high quality code?"  ..

I've been thinking a lot about some of the statements that Marcus Ranum
made in his most recent article in the _ACM Queue_ (Vol 2, No 4),
even before Ken invited us all to comment on it.

I mostly agree with Ranum's conclusions, although perhaps for
different reasons.

Ranum states:
"It's clear to me that we're:
 + Trying to teach programmers how to write more secure code
 + Failing miserably at the task"

He goes on to say that "it [educational approach] flat out hasn't
worked".

In general, I don't think this is an issue that is unique to _secure_
programming (coding, design, etc.). I think over the past 40 years or
so, as a discipline, we've failed rather miserably at teaching
programming, period. For the past 25 years, I've worked closely with
both highly educated Ph.D. computer scientists and with those whose
formal CS education consisted of at most a course or two in something
like C or Pascal. In many of these cases, the less educated have
beaten out those with more formal education. (In fact, I'd say this
has been true in at least 50% of the cases.)

What makes the difference? Well, it goes beyond mere aptitude and
general intelligence. I think, in part at least, it comes down to
having a passion for what you do. To some, design, coding, and the
other creative aspects of the work are an artistic expression, a noble
cause, and they would do it even if they weren't paid for it--witness
the open source movement, which is largely funded by volunteer labor.
Others see it as a "job" or a "career path", but not much more. In my
25 years of observation, those with this PASSION almost always "get
it", and those without it are usually left behind after the first few
years in the profession.

I think that the same can be said for "secure coding / design".
Not only do those people have a passion for coding / design, but
the ones who seem to "get it" are the ones who have a passion for
security as well.

Okay, so probably no surprise here, right? Do what you enjoy and
you'll excel at it more often than those who do it out of other
motives (no matter how noble--such as earning a living to provide for
your family).

So I agree with Ranum in a sense--that educational approaches to
security have overall failed--but I think it is not because the
educational process / system per se has failed us (not that I'm
arguing it couldn't be improved), but because we haven't been able to
ignite a passion for security in others. (And frankly, I'm not even
sure to what degree that's possible. I'll leave that to another
discussion.)

In the past two years, I've had the good fortune to teach a computer
security course that I had the major part in organizing and developing.
I have learned two things about the students during that time:
1) All the students do well when it comes to rote
   memorization. (E.g., questions such as "What cipher mode
   doesn't require an Initialization Vector?", etc.)
2) Only the students who seem to "get it" do well on the
   questions requiring thought (i.e., those requiring
   reasoning "outside the box").

Surprisingly (at least at first), I have often discovered that the
students other faculty members consider the brightest are the ones who
do the worst on the "questions requiring thought".

But in general, by the end of the 12-week period, I can usually tell
who is going to take and try to apply what they learned, and who will
just chalk the course up as another 3 credit hours.

I see what I think is a related phenomenon in the commercial world
as well. I've worked with a lot of developers who have worked on
security-related software (e.g., firewalls, crypto, secure proxies,
authentication and access control systems, etc.). One would EXPECT
that the groups working on these projects would, as a whole, be
better at developing secure programs than the IT industry at large.
But overall, I don't think that their batting average is all that
much higher than the industry at large. We often hear excuses for
this ("security software is more complex", etc.), but I'm not buying
it. If anything, it's this observation more than anything else that
makes me think that formal education is not THE answer (although,
I do think it is part of the answer).

On a related note to security and education, I was wondering if anyone
knows of any experimental data showing that those with formal
education in security develop more secure programs than those who have
never had such formal training. If no such experimental data exists,
why not? Can 

Re: [SC-L] Education and security -- another perspective (was "ACM Queue - Content")

2004-07-02 Thread der Mouse
> In general, I don't think this is an issue that is unique to _secure_
> programming (coding, design, etc.).  I think over the past 40 years
> or so, as a discipline, we've failed rather miserably at teaching
> programming, period.

Right.  But on the other hand, that's not surprising - when did you
last see even a course, never mind a program, in academia that was
_supposed_ to teach programming (as opposed to computer science or
software engineering or any of the various other things usually taught
instead of it)?

I have a fuzzy memory that says that such courses exist now.  But only
a few of them and only very recently.

/~\ The ASCII   der Mouse
\ / Ribbon Campaign
 X  Against HTML   [EMAIL PROTECTED]
/ \ Email!   7D C8 61 52 5D E7 2D 39  4E F1 31 3E E8 B3 27 4B




Re: [SC-L] ACM Queue article and security education

2004-07-02 Thread Blue Boar
Peter Amey wrote:
> I'm not entirely sure I follow this.  I _think_ you are saying:
> "since we can't be sure that X is perfect (because it might have 5
> remaining flaws) then there is no point in adopting it".  You seem to
> be saying that it doesn't matter if X is _demonstrably_much_better_
> than Y, if it is not perfect then don't change.  Have I got that
> right?
No.  I was claiming that languages that allow for safety and
verification can't necessarily be trusted 100%.  There will always be
a last few bugs.  As I said in the note that you replied to, that
doesn't necessarily mean you don't use it.  The other part of my note
had to do with the last few bugs not coming to light until *everyone*
is using that language.  That is also not a reason to not go ahead and
use it now, since the sooner the world starts to switch, the sooner
you kill the last few bugs.

I think you were reacting to the one sarcastic part of my note, which 
essentially says "good luck getting the world to switch."

BB



[SC-L] Re: ACM Queue article and security education

2004-07-02 Thread David A. Wheeler
On 29 June 2004, James Walden said:
> I'd like to open a discussion based on this quote from Marcus Ranum's
> ACM Queue article entitled "Security: The root of the problem":
>
> "We're stuck in an endless loop on the education concept.  We've been
> trying to educate programmers about writing secure code for at least
> a decade and it flat-out hasn't worked.  While I'm the first to agree
> that beating one's head against the wall shows dedication, I am
> starting to wonder if we've chosen the wrong wall.  What's Plan B?"
>
> From my perspective, security education is only beginning to climb an
> initial upward curve.  While classes in security topics are becoming
> more common in undergraduate computer science course catalogs, their
> presence is far from universal.  I don't know of any university that
> requires such a class for an undergraduate CS degree; if any such
> programs exist, they're not common.

I agree with James Walden.  Very, very few schools teach how to
develop secure applications.  I'll be teaching a course on developing
secure programs at George Mason University (GMU) this fall, 2004; GMU
is one of the VERY FEW schools where such a course is even OFFERED,
never mind REQUIRED.  I challenge Ranum to show that most schools
require education in secure software development before graduation at
the undergraduate level.  We haven't been "beating our heads" trying
to educate programmers.  We HAVE been beating our heads against an
uncooperative academic environment that is unwilling to teach how to
develop secure programs.  I've been working for years to convince the
SWEBOK to add security as an important software engineering topic;
last I heard, they had still failed to include security as a topic
worth knowing about in software engineering.

I believe we need to get to the point where computer science and
software engineering schools LOSE ACCREDITATION for failing to teach
how to develop secure systems at the undergraduate level.  We aren't
anywhere near there today.  Few people need to write quicksorts or
operating system kernels, but that's where we spend too many of our
precious educational hours.  EVERYONE is now writing code that
connects to the Internet or an intranet, through which their programs
will be attacked.  We are ALL writing programs that must be secured,
and almost all schools fail to spend even a single minute on what that
means.  So let's make sure students get an education in what's
actually more important.  Even a few hours would be FAR MORE than what
almost all schools do today.
I completely agree with Ranum that tools are a very helpful part of
the arsenal.  Turn on warnings!  Make the defaults safe!  Build
warnings/counters into the development system!  I give away a security
scanning tool (flawfinder) so that developers can search for probable
vulnerabilities, and I've often worked to get safer functions added to
various libraries so that things are "secure by default."  But that's
not enough: a fool with a tool is still a fool.  There is no tool that
can counter all foolish decisions, and a developer can usually
override a tool, so a developer MUST understand WHY the tool is doing
what it's doing.
Not using C/C++ to write new apps isn't enough.  No tool can prevent
ALL foolish decisions; you can specify a foolish decision, and then
formally prove that you implemented it.  Failing to verify input
sufficiently can happen in any language.  And many "safe" languages
have ways to override their safety, because there are times when you
need to: C# has an "unsafe" mode that allows buffer overflows; Ada has
modes to remove its protections too; and anybody can call out to C/C++
libraries.  A developer who doesn't understand the issues may simply
choose to override the tool, because the reasons for its limitations
were never explained.
Tools are great.  But only if we educate our developers
sufficiently so they'll know how to use the tools, their
limitations, and the risks they take when overriding them.
--- David A. Wheeler




[SC-L] Best practices training

2004-07-02 Thread Gary McGraw
Hi all,

Some of you may be interested in a Tutorial on software security best
practices that I will be giving at Usenix security this year.  More
information can be found here:
http://www.usenix.org/events/sec04/training/

See you in San Diego in August.

gem



This electronic message transmission contains information that may be
confidential or privileged.  The information contained herein is intended
solely for the recipient and use by any other party is not authorized.  If
you are not the intended recipient (or otherwise authorized to receive this
message by the intended recipient), any disclosure, copying, distribution or
use of the contents of the information is prohibited.  If you have received
this electronic message transmission in error, please contact the sender by
reply email and delete all copies of this message.  Cigital, Inc. accepts no
responsibility for any loss or damage resulting directly or indirectly from
the use of this email or its contents.
Thank You.





[SC-L] Risk Analysis: Building Security In #3

2004-07-02 Thread Gary McGraw
Hi all,

The third article in my IEEE Security & Privacy magazine series called
"Building Security In" is on Risk Analysis in Software Design.  This
article was co-authored by Denis Verdon of Fidelity National.  As a
service to the community, we're making advance copies available here:

http://www.cigital.com/papers/download/risk-analysis.pdf

I am sure many of you already subscribe to S&P.  If you don't yet, you
should...check out http://www.computer.org/security/.  

gem



