These aren't just secure coding issues. They are general software quality
issues that should be - but aren't - normal everyday practice. That's partly
due to lack of knowledge among developers, partly due to the time/budget
constraints of managers, and partly due to apathy amongst stakeholders/end
users/customers... Until all parties involved take the matter of software
quality (and by extension, software security) seriously, these problems will
continue.

Unfortunately, software in general has developed such a bad reputation for
unreliability that it has become the expected norm. So long as consumers put
up with poor quality whilst still paying top dollar for "pretty icons" etc.,
there isn't much chance for change...

What is needed is something similar to what happened in the motor car industry
some years ago. Cheaper, but better quality and more reliable Japanese imports
started to make their mark. The rest of the competition soon had to fix the
reliability of their own cars to remain competitive.

At this time, there is little competitive advantage to be gained from improving
the quality of software.

ys

----- Original Message -----
From: "David Crocker" <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Subject: RE: [SC-L] Secured Coding
Date: Sat, 13 Nov 2004 23:35:13 -0000

>
> George (Greenarrow) wrote:
>
> >>I truly believe this as no matter how secured we make our programs there
> will always be someone to figure how to break it.<<
>
> That may be so; but the simple fact is, most of the software we have today
> is nothing like bulletproof.
>
> When I first learned programming (about 30 years ago), I took it as axiomatic
> that a program should be robust with respect to *all possible* external 
> inputs.
> Unfortunately, most developers don't seem to care about this - they are
> satisfied as soon as the program behaves correctly with respect to *valid*
> inputs. So the first thing to do is to educate developers into understanding
> that a
> program that is not robust with respect to all inputs is not a finished 
> program.
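[The distinction David draws here can be sketched in a few lines. This is a
minimal, hypothetical example - `parse_port` is not from his post - showing a
routine written to be robust for *all* inputs, not just the valid ones: every
malformed input is explicitly rejected rather than left to misbehave.]

```python
import re

def parse_port(text):
    """Parse a TCP port number, rejecting every malformed input
    rather than only behaving correctly on the valid ones."""
    if not isinstance(text, str):
        raise ValueError("port must be given as a string")
    s = text.strip()
    # [0-9]+ deliberately excludes signs, spaces, and non-ASCII digits,
    # so int() below can never be surprised by its input
    if not re.fullmatch(r"[0-9]+", s):
        raise ValueError(f"not a number: {text!r}")
    n = int(s)
    if not 1 <= n <= 65535:
        raise ValueError(f"port out of range: {n}")
    return n
```

[A program that only handled "8080" correctly but did something undefined on
"", "-1", or "80x" would, by the standard argued above, not be finished.]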
>
> Second, tools for mathematically specifying and verifying programs (<plug> -
> such as our own - </plug>) should be used more widely. Even if you have the 
> goal
> of making a program robust, it is easy to make mistakes. Recently I wrote a
> program which, as its input mechanism, parsed simple English sentences. I was
> aiming to make the program robust; but I missed a way in which a sentence 
> could
> be malformed so as to cause unexpected results. I would never have thought of
> testing for that sentence pattern (maybe random testing would have found it);
> but I was using automated formal verification, and the tool identified that
> pattern as a problem. It is relatively simple to use formal semantics to 
> specify
> "for all possible inputs X, property Y is true". By using such semantics 
> coupled
> with tools and methods to verify them, large classes of attacks (e.g. buffer
> overflows, SQL injection etc.) can be completely eliminated.
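[The "eliminated by construction" point about SQL injection can be illustrated
concretely. This is my own hedged sketch using Python's standard sqlite3
module, not anything from David's tool: a parameterized query keeps user input
as data for *all* possible inputs, so the "property Y holds for all X" claim is
enforced by the database driver rather than by testing.]

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_user(name):
    # The ? placeholder binds `name` as a value; no input string,
    # however hostile, can ever be parsed as SQL syntax.
    return conn.execute(
        "SELECT role FROM users WHERE name = ?", (name,)
    ).fetchall()

find_user("alice")        # a normal lookup
find_user("' OR '1'='1")  # classic injection text is just an unmatched name
```

[Contrast with string concatenation ("... WHERE name = '" + name + "'"),
where only *some* inputs are safe and testing must hunt for the rest.]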
>
> Thirdly, developers need to be aware of basic concepts of security and how to
> design architectures to facilitate security. Furthermore, even when using 
> formal
> tools, developers need to know what the important security properties are. For
> example, some phishing attacks used a vulnerability in some browsers that 
> meant
> it was possible to make a browser visit one site while displaying another in 
> the
> address bar. Such a problem is easy to guard against using tool-supported 
> formal
> techniques - but someone has to identify the requirement (i.e. address bar
> always displays the address of the current page) in the first place. I think
> that a security module should be a compulsory part of all programming and
> computer science degree courses.
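[The address-bar requirement is a good example of a property that is trivial
to state once someone thinks to state it. As a hedged illustration - this is a
toy model invented for this post, not any real browser's API - the invariant
"the address bar always displays the address of the current page" might be
expressed and preserved like so:]

```python
class Browser:
    """Toy model: the invariant is that address_bar always equals
    the URL of the page actually loaded."""

    def __init__(self):
        self.address_bar = "about:blank"
        self.current_page = "about:blank"

    def navigate(self, url):
        # Updating both fields in the same step preserves the invariant;
        # the phishing bug described above amounts to an API that lets
        # them be updated independently.
        self.current_page = url
        self.address_bar = url

    def invariant_holds(self):
        return self.address_bar == self.current_page
```

[A formal tool can then check that every operation preserves
`invariant_holds` - but only after a human has identified it as a requirement.]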
>
> Taken together, these changes would result in software that would resist the
> vast majority of attacks. It may not always be possible to make software
> bulletproof, but that is not an excuse for the appalling insecurity of much of
> today's software.
