George (Greenarrow) wrote: >>I truly believe this as no matter how secured we make our programs there will always be someone to figure how to break it.<<
That may be so; but the simple fact is, most of the problem software we have today is nothing like bulletproof.

When I first learned programming (about 30 years ago), I took it as axiomatic that a program should be robust with respect to *all possible* external inputs. Unfortunately, most developers don't seem to care about this - they are satisfied as soon as the program behaves correctly with respect to *valid* inputs. So the first thing to do is to educate developers into understanding that a program that is not robust with respect to all inputs is not a finished program.

Second, tools for mathematically specifying and verifying programs (<plug> - such as our own - </plug>) should be used more widely. Even if you have the goal of making a program robust, it is easy to make mistakes. Recently I wrote a program which, as its input mechanism, parsed simple English sentences. I was aiming to make the program robust, but I missed one way in which a sentence could be malformed so as to cause unexpected results. I would never have thought of testing for that sentence pattern (maybe random testing would have found it); but I was using automated formal verification, and the tool identified that pattern as a problem. It is relatively simple to use formal semantics to specify "for all possible inputs X, property Y is true" (a small sketch of what such a property looks like follows at the end of this message). By using such specifications, coupled with tools and methods to verify them, large classes of attacks (e.g. buffer overflows, SQL injection etc.) can be eliminated completely.

Third, developers need to be aware of the basic concepts of security and of how to design architectures that facilitate security. Even when using formal tools, developers need to know which security properties matter. For example, some phishing attacks exploited a vulnerability in certain browsers that made it possible for a browser to visit one site while displaying another in the address bar. Such a problem is easy to guard against using tool-supported formal techniques - but someone has to identify the requirement (i.e. the address bar always displays the address of the current page) in the first place. I think a security module should be a compulsory part of all programming and computer science degree courses.

Taken together, these changes would result in software that resists the vast majority of attacks. It may not always be possible to make software bulletproof, but that is not an excuse for the appalling insecurity of much of today's software.

David Crocker, Escher Technologies Ltd.
Consultancy, contracting and tools for dependable software development
www.eschertech.com
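P.S. Here is a minimal sketch of the "for all possible inputs X, property Y is true" shape, echoing my sentence-parser anecdote. It is written as a property-based test in Python using Hypothesis rather than in our own notation, and randomly testing a property is of course weaker than proving it, but the statement being checked is exactly the kind of thing a verifier would be asked to prove. The parser here is a toy stand-in, not the real one:

    from hypothesis import given, strategies as st

    class ParseError(Exception):
        pass

    def parse_sentence(text):
        # Toy stand-in for a real English-sentence parser: accept only
        # "<subject> <verb> <object>." and reject everything else cleanly.
        words = text.strip().rstrip(".").split()
        if len(words) != 3 or not text.strip().endswith("."):
            raise ParseError("not a simple sentence")
        return tuple(words)

    # The property: for ALL possible input strings, the parser either
    # returns a three-word parse or raises ParseError - it never crashes
    # and never returns something malformed.
    @given(st.text())
    def test_parser_is_robust(sentence):
        try:
            result = parse_sentence(sentence)
        except ParseError:
            pass                      # controlled rejection is acceptable
        else:
            assert isinstance(result, tuple) and len(result) == 3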
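P.P.S. On eliminating whole attack classes: verification proves a property about whatever code you have; a complementary way to remove the SQL injection class is to make the unsafe construction impossible in the first place, by always binding user input as parameters instead of splicing it into the SQL text. A small illustration using Python's standard sqlite3 module (nothing here is specific to our tools):

    import sqlite3

    def find_user(conn, name):
        # The user-supplied value is passed as a bound parameter, never
        # concatenated into the SQL text, so no input string can change
        # the structure of the query.
        cur = conn.execute("SELECT id, name FROM users WHERE name = ?", (name,))
        return cur.fetchall()

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
    conn.execute("INSERT INTO users (name) VALUES ('alice')")

    print(find_user(conn, "alice"))         # -> [(1, 'alice')]
    print(find_user(conn, "' OR '1'='1"))   # injection attempt is just an unmatched name -> []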