Again we see that a factual and independent index of security vulnerabilities is needed to objectively assess security in a complex environment.
I have proposed the Clairmont-Everhardt Index of Security Vulnerability to be that objective measurement. As published earlier, this index could give the Security Manager, CSO or other security personnel an objective measurement of their actual network security, within a margin of error. At present there are no generally known objective security measurements. A forthcoming paper will propose, as a start, the methodologies and objective metrics that could be used in creating a credible Index of Vulnerability. The higher the Index of Vulnerability, the lower the security of the overall computer network. Absolute security is never possible; just as Six Sigma is the goal in manufacturing, these methodologies and goals should be used in securing networks on the Internet. Any useful ideas and contributions would be greatly appreciated and acknowledged. I look forward to the ensuing discussions and the subsequent diversionary humor that always results when a serious security discussion is broached on this rather mendacious forum of paladin mavericks. 8->

Jan Clairmont
Paladin of Security
Firewall Administrator/Consultant

-----Original Message-----
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] Behalf Of Nick FitzGerald
Sent: Friday, June 11, 2004 9:05 PM
To: [EMAIL PROTECTED]; [EMAIL PROTECTED]; [EMAIL PROTECTED]
Subject: [Full-Disclosure] RE: SECURE SOCKETS LAYER COELACANTH: Phreak Phishing Expedition

"Drew Copley" <[EMAIL PROTECTED]> reminded us:

> As an addendum, perhaps, though I wouldn't doubt someone
> might make some nice proof-of-concept code for this...
>
> A similar issue of this kind was found in IE a few
> years ago -- remember, of course, it is IE's fault that they
> are not properly parsing this, regardless of what they need
> to parse... so this is ultimately a Microsoft bug...
> http://groups.google.com/groups?q=group:bugtraq+dns+wildcard&hl=en&lr=&ie=UTF-8&selm=bugtraq/Pine.BSF.4.20.0111142031560.527-100000%40alive.znep.com&rnum=1
>
> I am quite surprised Microsoft did not properly fix this way back
> then.

Really?  Surely you are just being polite?

This is entirely consistent with a long line of shoddy "fixes" from
Microsoft (and, to be fair, many other vendors). Instead of seeing the
"%20 bug" reported by Slemko above for what it turned out to be -- a
clear indication that something was horribly broken in multiple parts of
the codebase where (HTML) URL parsing occurs -- it is now quite clear
that it was treated as a "there is a problem if '%20' is present in
URLs" problem.

When "fixing" the %00/binary null issue recently, was _that_ seen for
what it really was -- a clear indication that something was horribly
broken in multiple parts of the codebase where (HTML) URL parsing
occurs?

_______________________________________________
Full-Disclosure - We believe in it.
Charter: http://lists.netsys.com/full-disclosure-charter.html
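[Editor's note: as an illustration of the %00 URL-parsing confusion discussed above, here is a minimal Python sketch. It is not the original exploit, and the host names (`www.trusted.example`, `evil.example`) are hypothetical; it only shows why a renderer that truncates at an embedded NUL displays a different host than a parser that handles the userinfo component correctly.]

```python
# Sketch of the classic "%00 in the userinfo" address-bar spoof.
# Host names are hypothetical examples, not real sites.
from urllib.parse import urlsplit, unquote

url = "http://www.trusted.example%00@evil.example/login"

# A correct parser: everything before the last '@' in the authority
# is userinfo, so the real host being contacted is the attacker's.
real_host = urlsplit(url).hostname
print(real_host)  # -> evil.example

# A buggy renderer that percent-decodes first and then stops at the
# embedded NUL byte shows only the decoy portion of the URL.
decoded = unquote(url[len("http://"):])
displayed = decoded.split("\x00", 1)[0]
print(displayed)  # -> www.trusted.example
```

The point is the one Nick makes: the fix is not "reject %00", but making every URL-parsing code path agree on where the host actually begins.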
