Re: [SC-L] ACM Queue article and security education

2004-07-02 Thread Blue Boar
Peter Amey wrote:
I'm not entirely sure I follow this.  I _think_ you are saying:
"since we can't be sure that X is perfect (because it might have 5
remaining flaws) then there is no point in adopting it".  You seem to
be saying that it doesn't matter if X is _demonstrably_much_better_
than Y, if it is not perfect then don't change.  Have I got that
right?
No.  I was claiming that languages that allow for safety and verification 
can't necessarily be trusted 100%.  There will always be a last few 
bugs.  As I said in my note that you replied to, that doesn't 
necessarily mean you don't use it.  The other part of my note had to do 
with the last few bugs not coming to light until *everyone* is using 
that language.  That's also not a reason not to go ahead and use it now, since 
the sooner the world starts to switch, the sooner you kill the last few 
bugs.

I think you were reacting to the one sarcastic part of my note, which 
essentially says "good luck getting the world to switch."

BB



Re: [SC-L] ACM Queue article and security education

2004-07-02 Thread ljknews
At 1:02 PM -0700 7/1/04, Blue Boar wrote:
>ljknews wrote:
>> I think it will be properly considered when the most strict portion
>> of the software world is using language X.   I have used many
>> programs where the flaws in the program make it clear that I care not
one whit about whether the authors of that program have an opinion about
>> anything I might use. They are simply not competent, either as
>> individuals or else as an organization.
>
> By "most strict portion", do you mean people that care most about correct
> code, proofs, and such?

And organizations that hire the people you describe below to test the
software they build.

> I don't deny that the bulk of the heavy lifting will be done by people
> well-qualified to do so.  However, I'm of the school of thought that
> certain types of people who like to break things, and whose chief skill
> is breaking things, will always have a decent shot at finding a problem.
> There are people who couldn't build it, but they can sure break it.

> You don't typically get their attention until something is really,
> really popular.

Lots of people bring their attention to issues they are paid to test.
-- 
Larry Kilgallen




RE: [SC-L] ACM Queue article and security education

2004-07-02 Thread Peter Amey


> -----Original Message-----
> From: [EMAIL PROTECTED] 
> [mailto:[EMAIL PROTECTED]
> Behalf Of Blue Boar
> Sent: 01 July 2004 21:03
> To: ljknews
> Cc: [EMAIL PROTECTED]
> Subject: Re: [SC-L] ACM Queue article and security education
> 
> 
> ljknews wrote:
> > I think it will be properly considered when the most strict portion
> > of the software world is using language X.  I have used many
> > programs where the flaws in the program make it clear that I care
> > not one whit about whether the authors of that program have an
> > opinion about anything I might use. They are simply not competent,
> > either as individuals or else as an organization.
> 
> By "most strict portion", do you mean people that care most about
> correct code, proofs, and such?  I don't deny that the bulk of the
> heavy lifting will be done by people well-qualified to do so.
> However, I'm of the school of thought that certain types of people
> who like to break things, and whose chief skill is breaking things,
> will always have a decent shot at finding a problem.  There are
> people who couldn't build it, but they can sure break it.
> 
> You don't typically get their attention until something is really,
> really popular.  So yes, you can write your stuff in Language X, and
> assume it's secure.  It might not actually be secure until the whole
> world has had its way with Language X, but (hopefully) that's not a
> problem.  You can still do the dance of patching the last 5 problems
> in Language X, and end up better off than if you'd just used C.
> 
> Even Knuth has to write checks occasionally, and he does a lot of
> proof work, doesn't he?
> 
> So, if Language X only has 5 problems total, then even if it takes
> years to ferret them out they are fixable, so please proceed with
> getting the whole world to use Language X.
> 

I'm not entirely sure I follow this.  I _think_ you are saying: "since we can't be 
sure that X is perfect (because it might have 5 remaining flaws) then there is no 
point in adopting it".  You seem to be saying that it doesn't matter if X is 
_demonstrably_much_better_ than Y, if it is not perfect then don't change.  Have I got 
that right?

This is a variant on the Goedel gambit often used to attack formal verification.  It 
goes "since Goedel's Theorem places strict limits on what we can formalize and prove, 
let's not bother at all and just do a few unit tests instead".  It also reminds me of 
what I call the aspirin analogy: "aspirin doesn't cure cancer so there's no point in 
taking it for headaches".

The reality is that demonstrable improvements in quality, safety and security can be 
achieved by using the right tools, languages and methods.  It is for those who choose 
_not_ to use the strongest engineering to justify their position, not the other way 
round.


Peter







RE: [SC-L] ACM Queue article and security education

2004-07-02 Thread Peter Amey


> -----Original Message-----
> From: Blue Boar [mailto:[EMAIL PROTECTED]
> Sent: 01 July 2004 17:11
> To: Peter Amey
> Cc: [EMAIL PROTECTED]
> Subject: Re: [SC-L] ACM Queue article and security education
> 
> 
> Peter Amey wrote:
> > There are languages which are more suitable for the construction of
> > high-integrity systems and have been for years.  We could have
> > adopted Modula-2 back in the 1980s, people could take the blinkers of
> > prejudice off and look properly at Ada.  Yet we continue to use
> > C-derived languages with known weaknesses.
> 
> So we trade the known problems for a set of unknown ones?  

[snip]

A mindset that would have kept us building aircraft in wood!

In any case, we _do_ adopt new languages and methods, frequently.  In the time I have 
been using SPARK I have had people say: "that's neat, if only you could do it for X", 
where X has been C, C++, Java and C# in that order, at about 5-year intervals.  
What we _don't_ do is make those choices based on any kind of objective assessment or 
engineering judgement.

[snip]

> Language X may very well be a much better starting point, I don't
> know.  I do believe that it will never be properly looked at until
> the whole world starts using it for everything, though.

And how will the whole world start using it if everyone waits for everyone else?

In any case, I don't expect the whole world to adopt any one method (any more than I 
build bicycles in carbon fibre even though that is the material of choice for, say, 
racing cars).   What I do expect is that principled engineers, the kind of people who 
care enough about their profession to contribute to groups like this, will seek to use 
the best and most appropriate technology for their purposes.  


Peter







Re: [SC-L] ACM Queue article and security education

2004-07-01 Thread Blue Boar
ljknews wrote:
I think it will be properly considered when the most strict portion
of the software world is using language X.   I have used many
programs where the flaws in the program make it clear that I care not
one whit about whether the authors of that program have an opinion about
anything I might use. They are simply not competent, either as
individuals or else as an organization.
By "most strict portion", do you mean people that care most about 
correct code, proofs, and such?  I don't deny that the bulk of the heavy 
lifting will be done by people well-qualified to do so.  However, I'm of 
the school of thought that certain types of people who like to break 
things, and whose chief skill is breaking things, will always have a 
decent shot at finding a problem.  There are people who couldn't build 
it, but they can sure break it.

You don't typically get their attention until something is really, 
really popular.  So yes, you can write your stuff in Language X, and 
assume it's secure.  It might not actually be secure until the whole world 
has had its way with Language X, but (hopefully) that's not a problem.  You 
can still do the dance of patching the last 5 problems in Language X, 
and end up better off than if you'd just used C.

Even Knuth has to write checks occasionally, and he does a lot of proof 
work, doesn't he?

So, if Language X only has 5 problems total, then even if it takes years to 
ferret them out they are fixable, so please proceed with getting the 
whole world to use Language X.

BB



Re: [SC-L] ACM Queue article and security education

2004-07-01 Thread ljknews
At 9:10 AM -0700 7/1/04, Blue Boar wrote:

>Language X may very well be a much better starting point, I don't know.  I do believe 
>that it will never be properly looked at until the whole world starts using it for 
>everything, though.

I think it will be properly considered when the most strict portion of the
software world is using language X.   I have used many programs where the
flaws in the program make it clear that I care not one whit about whether
the authors of that program have an opinion about anything I might use. They
are simply not competent, either as individuals or else as an organization.
-- 
Larry Kilgallen




RE: [SC-L] ACM Queue article and security education

2004-07-01 Thread Michael S Hines
I can just see an OS go into a wait state now while the VM/.NET or whatever
does garbage collection; and the delays while the intermediate code is
turned into executable code by the loaders.   

Not!  

HLLs have given us portability (witness - *nix) but at some price in
performance.  HW development has outpaced SW development - to the point
where we hardly notice the performance hit at all.  After all, how fast can
one person type (grin)?

It's always a trade-off...   HW/SW.  

Mike Hines 
---
Michael S Hines
[EMAIL PROTECTED] 
-----Original Message-----
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On
Behalf Of Blue Boar
Sent: Thursday, July 01, 2004 11:11 AM
To: Peter Amey
Cc: [EMAIL PROTECTED]
Subject: Re: [SC-L] ACM Queue article and security education

Peter Amey wrote:
> There are languages which are more suitable for the construction of
> high-integrity systems and have been for years.  We could have
> adopted Modula-2 back in the 1980s, people could take the blinkers of
> prejudice off and look properly at Ada.  Yet we continue to use
> C-derived languages with known weaknesses.

So we trade the known problems for a set of unknown ones?  It might be 
appropriate to do so; C may be "broken" enough that it's better to go 
for an unknown with a design that allows for a possible correct 
implementation.  I keep thinking of Java, for example.  It's a good 
paper design for security purposes (I'll leave functionality alone for 
now.)  But there are still all the issues with the VM implementation and 
libraries to deal with.

Language X may very well be a much better starting point, I don't know. 
  I do believe that it will never be properly looked at until the whole 
world starts using it for everything, though.

BB





Re: [SC-L] ACM Queue article and security education

2004-07-01 Thread Blue Boar
Peter Amey wrote:
There are languages which are more suitable for the construction of
high-integrity systems and have been for years.  We could have
adopted Modula-2 back in the 1980s, people could take the blinkers of
prejudice off and look properly at Ada.  Yet we continue to use
C-derived languages with known weaknesses.
So we trade the known problems for a set of unknown ones?  It might be 
appropriate to do so; C may be "broken" enough that it's better to go 
for an unknown with a design that allows for a possible correct 
implementation.  I keep thinking of Java, for example.  It's a good 
paper design for security purposes (I'll leave functionality alone for 
now.)  But there are still all the issues with the VM implementation and 
libraries to deal with.

Language X may very well be a much better starting point, I don't know. 
 I do believe that it will never be properly looked at until the whole 
world starts using it for everything, though.

BB



RE: [SC-L] ACM Queue article and security education

2004-07-01 Thread Peter Amey


> -----Original Message-----
> From: [EMAIL PROTECTED] 
> [mailto:[EMAIL PROTECTED]
> Behalf Of Michael S Hines
> Sent: 30 June 2004 17:00
> To: [EMAIL PROTECTED]
> Subject: RE: [SC-L] ACM Queue article and security education
> 
> 
> If the state of the art in automobile design had progressed as fast
> as the state of the art of secure programming - we'd all still be
> driving Model T's.
> 
> Consider-
>   - System Development Methods have not solved the (security)
>     problem - though we've certainly gone through lots of them.
>   - Languages have not solved the (security) problem - though we've
>     certainly gone through (and continue to go through) lots of them.
>   - Module/Program/System testing has not solved the (security)
>     problem - though there has been a plethora written about system
>     testing (both white box and black box).

I agree that we have not solved the problem by the above means, but I think it should 
be said that this is due more to the refusal of our industry to make a serious attempt 
to use them than to these methods having been used and having failed.  The reality is that most 
system development still uses ad hoc, informal approaches and inherently insecure and 
ambiguous implementation languages.  Testing cannot make up for these deficiencies 
because of fundamental limitations of coverage etc.; see [1,2,3].

There are development approaches, such as the use of formal methods, which have a 
proven track record of success.  They are still being used successfully today.  Yet 
most of our industry is either unaware of them, regards formal methods as some 
academic failure from the 1970s or demands "evidence" (although they never seem to 
need evidence before adopting the latest fashionable fad).

There are languages which are more suitable for the construction of high-integrity 
systems and have been for years.  We could have adopted Modula-2 back in the 1980s, 
people could take the blinkers of prejudice off and look properly at Ada.  Yet we 
continue to use C-derived languages with known weaknesses.  All we hear are appeals to 
better training, tools to help find the stupidities that the poor development approaches 
make inevitable, and pious hopes that future developments in computer science will 
rescue us.  What we really need is to use the good stuff we already have.
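
By way of illustration, here is a minimal C sketch of the kind of known weakness meant 
above.  The function names and buffer size are invented for the example; nothing here is 
drawn from any system discussed in this thread.

#include <stdio.h>
#include <string.h>

/* The classic weakness: nothing in the language stops this copy from
 * running past the end of the 16-byte buffer. */
void greet_unsafe(const char *name)
{
    char buf[16];
    strcpy(buf, name);               /* overflows if name is 16 chars or longer */
    printf("Hello, %s\n", buf);
}

/* The defence every C caller must remember to write by hand; in a
 * bounds-checked language it is part of the language itself. */
void greet_checked(const char *name)
{
    char buf[16];
    snprintf(buf, sizeof buf, "%s", name);   /* truncates instead of overflowing */
    printf("Hello, %s\n", buf);
}

int main(void)
{
    greet_checked("an argument long enough to overflow greet_unsafe");
    return 0;
}

A bounds-checked language rejects the first version outright; in C the check in the 
second version is a per-call convention that someone, somewhere, will eventually forget.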

Just to back this random polemic up a bit: we have just delivered a secure system to 
an important customer which has been independently evaluated to a high level.  Zero 
defects were found.  It was cheaper than the system it replaces.  (Draconian NDAs 
unfortunately limit what can be said beyond that.)  The system was formally specified in 
Z, coded in SPARK, and proofs were carried out to ensure that it is wholly free from 
run-time errors (and hence from attacks such as buffer overflow) and that essential 
security invariants are maintained.  None of this is bleeding-edge technology.  Z has 
been around for ages, SPARK for 14 years.  



regards



Peter


1. Littlewood, Bev; and Strigini, Lorenzo: Validation of Ultrahigh Dependability for 
Software-Based Systems. CACM 36(11): 69-80 (1993).
2. Butler, Ricky W.; and Finelli, George B.: The Infeasibility of Quantifying the 
Reliability of Life-Critical Real-Time Software. IEEE Transactions on Software 
Engineering, vol. 19, no. 1, Jan. 1993, pp. 3-12.
3. Littlewood, B.: Limits to Evaluation of Software Dependability. In Software 
Reliability and Metrics (Proceedings of Seventh Annual CSR Conference, 
Garmisch-Partenkirchen), N. Fenton and B. Littlewood, Eds. Elsevier, London, pp. 81-110.








RE: [SC-L] ACM Queue article and security education

2004-07-01 Thread Michael Canty
I tend to wonder if I missed something along the way.
When I left the friendly confines of school back in '84 and entered the
wonderful world of "do or die" I was handed 2 sets of listings.  One was only 8
inches high, the other was slightly over 15.  Those were my 2 new systems and
they were written in 370 assembler and I had one mandate.  Don't you ever break
anything that will break the system.
From day 1 I knew that I only ran in key zero, supervisor state for the absolute
minimum time I had to.  I knew to isolate actions that were cross-address space,
I knew to be incredibly careful what I did in the link-pack area.  Be careful
w/CSA.  Understand how multi-engine machines treat storage (I'd just love to
have a CDS instruction on a server...).  I still have code running at BLS that
intercepts every single OPEN/CLOSE SVC to decide if it is for a print management
system I wrote so that it can decide whether or not to pass the open on to a
real application or change it so it goes to QMS.  Point being that nobody had a
class for me to take to do that stuff.  You learn via your experiences and no
class will ever prepare you to insert yourself into the OS/390 SVC chain (well,
at least not successfully).
The architecture of the machine was such that something like a buffer overrun or
an XSS exploit just couldn't happen because not everybody was allowed to even
consider running in anything other than a protected problem state.  Ok, it could
happen.  You had to jump thru hoops to even be able to be dumb enough to get
around to writing the code that allowed it to happen.

I consistently see languages being blamed for security problems.  Sure, 'c' has
problems.  I wrote my first few lines of it back in 1991 and wound up having to
develop a full-blown file transfer mgmt system under unix while I was actually
learning the language & OS.  (talk about fun).  I did notice something while I
was doing it though.  Bellsouth was deploying a TCP/IP based network called
BOSIP for their first distributed application (RNS) and we were doing the
middleware work.  The thing I found interesting was that security got "relaxed"
to integrate the unix boxes into the BSDN/VTAM network & mainframe environment.
Over the years I've seen that relaxation grow and I honestly don't think it is
because of a language or a network protocol.  I think it is simply that we have
come to accept the lowest level of security as the baseline.  Whether it be a
linux or softy workstation, neither is nearly as secure as a glass house
mainframe and never will be.

The security problems of today exist not because of a language.  They exist
because of the acceptance of the lack of an architectural doctrine that defined
the difference between what could be called a problem state and a supervisor
state.  Under UNIX I had to be root way too much and even when I tried to
isolate it I paid significant performance penalties.  Softy is even worse.

You can't teach around what is a fundamentally flawed architecture.  A language
can mask stupidity but what exactly do you accomplish if the programmer doesn't
wind up understanding the "why"?  I've dealt w/way too many CS grads who
couldn't spell kumputer if you spotted them the "c".  (and they got the degree
anyway).

So, my long-winded rambling pseudo point would be that instead of arguing
whether 'c' is evil or 'spark' is wonderful or whether there should be a
mandatory shock treatment to understand security that we really should be
looking at the underlying architecture.  The code isn't going to change.  (Most
source is lost anyway... trust me on that, I did some y2k stuff w/a crystal
ball).  The underlying architecture seems to be the key to me.

mjc





Re: [SC-L] ACM Queue article and security education

2004-06-30 Thread George Capehart
On Wednesday 30 June 2004 12:00, Michael S Hines allegedly wrote:



> And then a thought question - in message passing operating systems
> (those that respond to external stimuli, or internal message queues)
> - if one can inject messages into the processing queue, can't one in
> essence 'capture the flag'?

The short version of a very long answer is:  "It's certainly possible, 
but we've been securing message-based systems for a long time and 
understand the attacks and defenses.  Any well-designed message-based 
system includes controls that preserve the confidentiality, integrity 
and availability of the system.  Some even include audit trails, etc."

> Yet we see message passing systems as
> middleware (and OS core technology in some cases) to facilitate cross
> platform interfaces.  Aren't we introducing inherent security flaws
> in the process?

Yes.  See above.  Google for "CORBASec", "DCE Security Service," 
MQSecure.  Go to www.w3c.org, www.oasis-open.org, 
www.projectliberty.org, www.ws-i.org, etc. for the work that's being 
done on securing Web services.  Then go to http://citeseer.ist.psu.edu/ 
and search on terms like Kerberos, SSL, TLS, IPSec, etc.  Then, see 
_Applied_Cryptography_ and _Practical_Cryptography_ . . .

You are absolutely correct that, left unprotected, message passing 
systems are subject to *all* *sorts* of attacks.  The good news is that 
there are lots of very smart people working on securing them.
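
As a minimal sketch of one such integrity control - verifying an authentication tag 
before a queued message is processed - here is what it might look like with OpenSSL's 
HMAC API.  The message layout, key handling and function name are invented for the 
example:

#include <stddef.h>
#include <openssl/crypto.h>
#include <openssl/evp.h>
#include <openssl/hmac.h>

#define TAG_LEN 32   /* HMAC-SHA-256 tag length in bytes */

/* Reject any queued message whose authentication tag does not verify,
 * so that injected messages are never processed.  The layout and key
 * handling are placeholders for illustration only. */
int message_is_authentic(const unsigned char *key, size_t key_len,
                         const unsigned char *body, size_t body_len,
                         const unsigned char *tag /* TAG_LEN bytes */)
{
    unsigned char expected[EVP_MAX_MD_SIZE];
    unsigned int expected_len = 0;

    if (HMAC(EVP_sha256(), key, (int)key_len, body, body_len,
             expected, &expected_len) == NULL || expected_len != TAG_LEN)
        return 0;

    /* Constant-time comparison so the check itself leaks nothing. */
    return CRYPTO_memcmp(expected, tag, TAG_LEN) == 0;
}

Confidentiality and availability need further controls (encryption, redundancy, rate 
limiting); this sketch covers only message authenticity.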

Cheers,

George Capehart
-- 
George W. Capehart

Key fingerprint:  3145 104D 9579 26DA DBC7  CDD0 9AE1 8C9C DD70 34EA

"With sufficient thrust, pigs fly just fine."  -- RFC 1925




RE: [SC-L] ACM Queue article and security education

2004-06-30 Thread Michael S Hines
If the state of the art in automobile design had progressed as fast as the
state of the art of secure programming - we'd all still be driving Model
T's.  

Consider-
  - System Development Methods have not solved the (security) problem -
though we've certainly gone through lots of them.
  - Languages have not solved the (security) problem - though we've
certainly gone through (and continue to go through) lots of them.
  - Module/Program/System testing has not solved the (security) problem -
though there has been a plethora written about system testing (both white
box and black box).

And a question/comment/observation.
First the comment - As an IT Auditor we approach auditing in two stages -
first we look at general controls, and then application controls (won't go
into details here - there's information on this available elsewhere).  If
general controls are not in place, application controls are not relevant
(that is any application control can be circumvented due to weak general
controls). 
Then the question - Why do we not subject computer operating systems (which
are a general control) to the same level of testing that we subject
applications?   
And the observation - weaknesses in operating systems have been documented
(but not widely circulated) - yet we (as Sysadmins/users/auditors/security
experts - you pick) do not have a problem using faulty system software and
laying applications on top of them.  Why is that? 

And then a thought question - in message passing operating systems (those
that respond to external stimuli, or internal message queues) - if one can
inject messages into the processing queue, can't one in essence 'capture the
flag'?  Yet we see message passing systems as middleware (and OS core
technology in some cases) to facilitate cross platform interfaces.  Aren't
we introducing inherent security flaws in the process? 

Mike Hines
---
Michael S Hines
[EMAIL PROTECTED] 




Re: [SC-L] ACM Queue article and security education

2004-06-30 Thread James Walden
Kenneth R. van Wyk wrote:
Overall, I like and agree with much of what Marcus said in the article.  
I don't, however, believe that we can count on completely putting 
security "below the radar" for developers.  Having strong languages, 
compilers, and run-time environments that actively look out for and 
prevent common problems like buffer overruns is a worthy goal, to be 
sure, but counting solely on them presumes that there are no security 
problems at the design, integration, or operations stages of the 
lifecycle.  Even if the run-time environment that Marcus advocates is 
_perfect_ in its protection, these other issues are still problematic 
and require the developers and operations staff to understand the problems.
I agree that you can't solve all security problems with development tools, but 
I think security tools are a worthwhile investment because deploying tools can 
be accomplished much more quickly than educating developers, tools can help 
experienced developers, and tools can raise awareness of software security 
issues.  The article's mention of people creating patches to eliminate compiler 
security warnings may indicate that I'm too optimistic about tools raising 
awareness, but I think that some developers will learn from their tools.
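
To make the warning example concrete: with gcc, -Wformat -Wformat-security flags the 
first call below because untrusted input is used as the format string.  The function and 
scenario are invented for illustration; the point is that the tool teaches something only 
if the warning is fixed rather than silenced.

#include <stdio.h>

/* A "patch to eliminate the warning" might simply suppress the
 * diagnostic; the second call is the actual fix. */
void log_comment(const char *user_input)
{
    printf(user_input);           /* flagged: format string is not a literal */
    printf("%s", user_input);     /* correct: the input is treated as data   */
}

int main(void)
{
    log_comment("hello from an untrusted source");
    return 0;
}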

Yup, but in the "belt and suspenders" approach that I like to advocate, 
I'd like to see software security in our undergrad curricula as well as 
professional training that helps developers understand the security 
touch points throughout the development process -- not just during the 
implementation phase.
I agree.  Students should see software security in all development phases 
relevant to each software course that they take; software engineering in 
particular should address security topics in all phases of the development 
process.  I think there's an additional need for a class focused purely on 
security to put all the elements of security together.

Peter G. Neumann wrote:
> Gee, Some of us have been saying that for 40 years.
I can't deny that, even though I have only been reading your comp.risks digest for a 
little more than a third of that span.  But I think the fact that today's 
security problems are directly and indirectly impacting large segments of the 
population has increased awareness of security problems and, as a result, 
we're seeing a rise in security education.  Many of us like to think that 
computer science changes rapidly, and it does compared to older fields like 
physics, where you have to go to graduate school to study much that was 
developed after the 1930s, but I suspect most people in any field avoid change 
until it's forced upon them.

--
James Walden, Ph.D.
Visiting Assistant Professor of EECS
The University of Toledo @ LCCC
http://www.eecs.utoledo.edu/~jwalden/
[EMAIL PROTECTED]



Re: [SC-L] ACM Queue article and security education

2004-06-30 Thread Kenneth R. van Wyk
James Walden wrote:
I'd like to open a discussion based on this quote from Marcus Ranum's 
ACM Queue article entitled "Security: The root of the problem":
Thanks.  I also read Marcus's article with interest.  Caveat: clearly, I 
have a biased outlook, since software security training is one of the 
things that I do for a living.

Overall, I like and agree with much of what Marcus said in the article.  
I don't, however, believe that we can count on completely putting 
security "below the radar" for developers.  Having strong languages, 
compilers, and run-time environments that actively look out for and 
prevent common problems like buffer overruns is a worthy goal, to be 
sure, but counting solely on them presumes that there are no security 
problems at the design, integration, or operations stages of the 
lifecycle.  Even if the run-time environment that Marcus advocates is 
_perfect_ in its protection, these other issues are still problematic 
and require the developers and operations staff to understand the problems.

From my perspective, security education is only beginning to climb an 
initial upward curve.  While classes in security topics are becoming 
more common in undergraduate computer science course catalogs, their 
presence is far from universal.  I don't know of any university that 
requires such a class for an undergraduate CS degree; if any such 
programs exist, they're not common.
I agree with you on this, certainly.  My nephew is a senior in an 
undergrad CS curriculum and his university has yet to discuss security 
in any of his course work, to my knowledge. 

While there are non-university classes and workshops that teach 
software security, I doubt that a majority of developers have attended 
even one such class.  Software security has to be integrated into the 
CS curriculum before we can expect a majority of developers to have 
the appropriate skills, and then there will still be the issue of 
applying them under deadline pressure.
Yup, but in the "belt and suspenders" approach that I like to advocate, 
I'd like to see software security in our undergrad curricula as well as 
professional training that helps developers understand the security 
touch points throughout the development process -- not just during the 
implementation phase.

Cheers,
Ken van Wyk
http://www.KRvW.com


Re: [SC-L] ACM Queue article and security education

2004-06-30 Thread Peter G. Neumann
Gee, some of us have been saying that for 40 years.




Re: [SC-L] ACM Queue article and security education

2004-06-30 Thread ljknews
At 8:10 PM -0400 6/29/04, James Walden wrote:

>While there are non-university classes and workshops that teach software security, I 
>doubt that a majority of developers have attended even one such class.  Software 
>security has to be integrated into the CS curriculum before we can expect a majority 
>of developers to have the appropriate skills, and then there will still be the issue 
>of applying them under deadline pressure.
>
>That said, I agree with most of the article.  We can't wait for years for software 
>security to become a standard part of the curriculum, and most of his suggestions, 
>such as turning C compiler warnings into errors, are good ideas no matter what the 
>current status of security education.  I also second his enthusiasm for perl's taint 
>mode.

Teaching students how to avoid problems in C should be a separate (optional)
course.

Dealing with issues that have _not_ been solved in higher level languages
should be a required course not burdened by the baggage of C.

And whether something is a "warning" or an "error" is outside the scope
of the programming language itself; it is a matter for the build process,
which decides whether to allow completion in the face of warnings.
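
A small illustration of that last point, assuming gcc (the flags are standard gcc
options; the file name and program are invented):

/* Whether a diagnostic is a "warning" or an "error" is a build-process
 * decision, not a property of the language.  With gcc, for example:
 *
 *   gcc -Wall         -c warn_demo.c   # "unused variable" reported, build continues
 *   gcc -Wall -Werror -c warn_demo.c   # the same diagnostic now stops the build
 */
#include <stdio.h>

int main(void)
{
    int unused;    /* triggers -Wunused-variable under -Wall */
    printf("warning-versus-error is build policy, not language policy\n");
    return 0;
}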