> Hi.   Someone recently passed along your essay (I don't subscribe to 
> the firewalls list).  There were a couple of comments I wanted to 
> make.


Howdy Mr. Spafford.  I appreciate your comments and acknowledge your
expertise and experience in these matters.  Thank you; you honor me, sir.


> 
> 1) You quoted Ian Goldberg's 1995 article where he stated that buffer 
> overflows were "pretty new" in 1988.   This is not true.   Buffer 
> overflows were used to compromise security on systems in the 1960s 
> and 70s.   An early paper explained how Robert Morris's father  broke 
> into an early version of Unix by overflowing the password buffer in 
> the login program many years before 1988 (I'm sure the younger Robert 
> was familiar with that paper, too).   Many earlier papers also 
> described buffer overflows.

Actually, I cited the TruSecure Publications Information Security October
2001 issue <the specific article titled "Chief Cypherpunk">, whose
interviewer led the interview and guided Mr. Ian Goldberg in citing his
paper.  I'm sorry I did not make this clear in my paper.  Yet I do not
dispute that, by and prior to 1988, buffer overflows were a fairly
well-known area of concern in host-level system security and an avenue of
privilege elevation.
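
For anyone who has not seen the bug class up close, the pattern at issue
is nothing exotic.  Here is a minimal, purely illustrative C sketch of
the kind of unchecked copy that plagued old setuid programs such as login
and fingerd -- this is not the historical code, just the general shape of
the flaw:

    #include <stdio.h>
    #include <string.h>

    static void check_password(const char *input)
    {
        char buf[16];            /* fixed-size stack buffer */

        /* The bug: no bounds check.  Input longer than the buffer runs
         * off the end and clobbers adjacent stack data, including the
         * saved return address -- the lever used for privilege
         * elevation. */
        strcpy(buf, input);

        printf("checking \"%s\"\n", buf);
    }

    int main(int argc, char *argv[])
    {
        if (argc > 1)
            check_password(argv[1]);   /* attacker controls the length */
        return 0;
    }

Feed it an argument longer than 16 bytes and the behavior is undefined;
with a carefully crafted argument it becomes an exploit rather than a
crash.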

> 
> Unfortunately, we have a lot of people who are working in security 
> with various levels of claimed expertise who have little or no 
> knowledge of the history  or underlying principles of what they are 
> doing.  (And no, that is not intended to make any suggestion about 
> Mr. Goldberg -- I do not know him, nor do I know his background. 
> I'm reacting to the quote and my knowledge of other "experts" in the 
> field.)
> 

Certainly, and I also wish to convey that my intention was not to demean
your expertise, history, or knowledge in the security realm.  If offense
was taken, I do apologize.

> 2) The comments I wrote in 1988 applied to the Internet arena of 
> 1988.    There were no significant viruses, worms, root kits, or the 
> like.   There was no WWW.   There was no history of widespread 
> computer abuse.  The majority of systems were running a Unix variant. 
> Pretty much every system administrator of the time had a college 
> degree, usually in computing or a related science.
> 
> That was the context of my comments at that time that it was not 
> appropriate to blame the administrators for what happened.   I still 
> believe that, in that context.  I don't believe it was appropriate to 
> blame the OS authors, either, although there was some responsibility 
> that they bore for their sloppy coding.
> 

Yet the fact that overflows were fairly common knowledge, as were the
attack vectors the worm used to enter systems and spread from one
system/network to another, does not genuinely allow any of the admins of
the time to hide behind stupidity or cluelessness, especially in light of
their education.  Sadly, and this was a shame, it appears it took an
event of this significance to raise eyebrows in the systems security
realm.


> Now, if we fast-forward to today's computing arena.   There are about 
> 65,000 viruses and worms (with over 95% of them for Microsoft 
> products).   There are literally hundreds of rootkits, DOS kits, and 
> break-in tools available on the net.   The WWW reaches hundreds of 
> millions of people.  We have a decade+ history of significant, public 
> break-ins.   The majority of systems in the world are running a very 
> buggy, bloated OS descended from a standalone PC monitor program. 
> Typical system administrators (and many security administrators) have 
> no training in computing, let alone security.
> 
> If the Morris worm were to occur today -- and, as you noted variants 
> have been occurring in the guise of CodeRed, et al. -- I would place 
> a large amount of blame with the vendors for doing a shoddy job of 
> producing safer software, and a significant amount of blame on the 
> administrators of the affected sites for not taking better 
> precautions in a known dangerous environment.
> 
> But in both cases, the primary blame goes to the people who produce 
> and employ malware.   There is no excuse for doing this, and they are 
> quite obviously the primary cause of the damage.


Agreed.  Marcus Ranum, on the firewall-wizards list, where this paper
generated quite a bit of discussion and a number of side threads,
acknowledged there is plenty of blame and responsibility to go around.
The point of my paper, sadly, was to highlight the fact that for the most
part, security in the IT industry has never really progressed beyond the
point of "raised eyebrows", thus the constant circular nature of attack
and re-attack of the same weaknesses and vectors that existed at least as
far back as 1988.

It's interesting to note that today, again, TruSecure Corp's weekly
SECURITY WIRE DIGEST (VOL. 4, NO. 3, JANUARY 14, 2002) noted how
Microsoft is attempting to stumble forward in some regard on the poor
coding issues prevalent in today's top desktop applications and their
OSes:

<quote>
*MICROSOFT GIVES THE "BOOT" TO 20,000 DEVELOPERS
Microsoft is sending all 20,000 of its Windows developers to a one-day
security boot camp soon, in the hope that the compulsory training will
prompt its coders to pay closer attention to security issues, according to
a published report. No other details of the program were immediately
available. The employee initiative follows a year of embarrassing attacks
on vulnerable Microsoft products, including IIS Web server holes that
helped launch the successful spread of the Code Red and Nimda worms. Most
recently, the Universal Plug and Play in the new XP OS was found to have
serious flaws.
</quote>

> 
> However, I agree with you that we need to re-evaluate the culpability 
> of the software authors, the vendors, and the administrators.    I 
> have been making exactly this point in presentations and classes for 
> at least the last half-dozen years.   It hasn't been well-received in 
> too many venues until very recently.
> 
> 3) Your example of the arson victim isn't quite right.   In most 
> cases, an arson victim is not criminally liable unless she did 
> something stupid and criminal to deserve it (e.g., she chained some 
> fire escapes shut).   Instead, the victim may not get full payment 
> from an insurance policy, and that is the penalty for not keeping 
> current with the necessary protections.   This is similar to what 
> happens when your car is stolen -- you are not charged in criminal 
> court if you left the key in the ignition, but you may not get the 
> full payment for the car from your insurance company, or your future 
> premiums could be doubled.
> 

I may be wrong, not being the lawyerly type, but haven't some "arson
victims" been held culpable, at least in civil court, for injury to
employees and clients working in buildings they owned and operated
businesses in, where chained fire escapes and the like resulted in
deaths?

And in the same light, at least in Minnesota and North Carolina, it is
criminal negligence to leave the keys in a car, especially a running car,
which is a common occurrence in Minnesota as people warm their vehicles
on those -10 to -30F days.  It's at the least a ticketable offense and
can certainly also result in the car being towed.  Issues of due
diligence spread further than merely the IT realm...

> Imagine Joe Clueless is running a Windows box with no patches and no 
> firewall, has no training in security, and still hooks his system up 
> to the network.   If his system is hacked (and it will be, perhaps in 
> a matter of hours), he is still a victim.   Whoever breaks into his 
> system, or whoever authored the virus that corrupts his disk, that is 
> the person who committed the crime and should be prosecuted.
> 
> But is Joe blameless?   Under  law in most western nations, he is 
> probably not criminally liable.   He may be stupid, but that isn't a 
> crime.   He may be naive, but that isn't a crime either.   If he has 
> insurance, he may not get a full (or any) payout.  Or if has no 
> insurance, he pays another kind of penalty -- he loses his data.   So 
> he does pay a price.    And if Joe has a good lawyer who is 
> persistent and can convince a jury that the vendor was negligent, 
> then maybe the vendor will pay, too.
> 
> A better scenario would be for "hack" insurance to begin to become a 
> standard business practice.   Once the actuarial data comes in, the 
> companies set a standard premium.   They may give a discount of 30% 
> if there is a firewall, a 15% discount if the system is based on 
> FreeBSD+Apache, and a 75% discount if the security administrator has 
> a CS degree from Purdue. :-)    Meanwhile, the same company may set a 
> 25% penalty (extra premium) if the system is Windows-based, a 200% 
> penalty if it is running IIS, and there is a clause that there is no 
> payout unless there is evidence that all patches were present and 
> timely.     Under this kind of scenario, market pressures would tend 
> to lead to better practices by the vendors *and* the users.   That 
> would be a better solution than the government regulation you 
> suggest, although I am not hopeful it will happen any time soon.
> 

This assumes we have an educated and intelligent market of consumers,
which seems not to be the case, as software of various version releases
is still consumed with known vulnerabilities never being corrected and
dealt with <e.g. IIS>.  This has often resulted in government regulation
to protect ourselves from our own stupidity in decision making and buying
strategy.  Barring that, it would require adopting international
standards on the absolute minimum requirements a system needs to meet
before being connected to the dangerous place the Internet has become.
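
As an aside, the premium arithmetic sketched above is easy enough to make
concrete.  A toy C sketch, assuming a multiplicative adjustment model and
a made-up $1000 base premium -- both are my assumptions, not anything
from your mail, and the percentages are your tongue-in-cheek examples:

    #include <stdio.h>

    int main(void)
    {
        double premium = 1000.0;        /* assumed base annual premium, USD */

        int has_firewall    = 1;        /* -30% discount */
        int freebsd_apache  = 0;        /* -15% discount */
        int purdue_cs_admin = 0;        /* -75% discount */
        int runs_windows    = 1;        /* +25% penalty  */
        int runs_iis        = 1;        /* +200% penalty */

        if (has_firewall)    premium *= 1.0 - 0.30;
        if (freebsd_apache)  premium *= 1.0 - 0.15;
        if (purdue_cs_admin) premium *= 1.0 - 0.75;
        if (runs_windows)    premium *= 1.0 + 0.25;
        if (runs_iis)        premium *= 1.0 + 2.00;

        printf("adjusted annual premium: $%.2f\n", premium);
        return 0;
    }

For the Windows + IIS + firewall box that comes out to $2625 a year,
versus $700 for a firewall-only box -- exactly the kind of market
pressure you describe.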


> 
> Your might find this of interest: 
> <http://www.cstb.org/web/pub_cybersecurity>.  And here are some 
> comments I have made before Congress about the shortage of security 
> professionals: 
> <http://www.cerias.purdue.edu/homes/spaf/misc/edu.pdf> (1996) and 
> <http://www.cerias.purdue.edu/homes/spaf/house01.pdf> (2001).
> 

Thank you sir, I will certainly review and add these to my personal and
public bookmarks!



Thanks,




Ron DuFresne
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
"Cutting the space budget really restores my faith in humanity.  It
eliminates dreams, goals, and ideals and lets us get straight to the
business of hate, debauchery, and self-annihilation." -- Johnny Hart
        ***testing, only testing, and damn good at it too!***

OK, so you're a Ph.D.  Just don't touch anything.



_______________________________________________
Firewalls mailing list
[EMAIL PROTECTED]
http://lists.gnac.net/mailman/listinfo/firewalls
