-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1

> From: "E.B. Dreger" <[EMAIL PROTECTED]>
> 
> ML> No, it isn't, as is doing buf_t[x] rather than pointer
> 
> True.  I just like having a struct so I may pass a single
> variable in function calls instead of a whole mess of them.

The problem is not pointers; it is pointers pointing at things that don't belong to 
them, or that they don't understand.
 
> Is it unreasonable to ask that programmers not assume memory is
> initialized, to check bounds as needed, and to realize that
> operations are NOT atomic without special protection[*]? 

40 years of experience says it is unreasonable to expect the programmer to get it 
right 100% of the time. 

A modern server or desktop OS is measured in hundreds of millions of lines of code; 
what is an acceptable error rate per line?

Last time I looked, the average C programmer was running at about one "error" every 10 
lines. Errors of this specific type in mature code are probably running a lot lower, but 
we are still probably talking about something on the order of 100,000 such issues in a 
typical Unix box.

- From a NANOG perspective it only really matters if the flaw is in a network service, 
but from a cracking perspective any program you can feed corrupted input into, or get 
the user to feed it into, will do fine.

> I don't think so.

Be assured it is. Programmers, even the best, make mistakes.

> Sure, it's extra work; put it in a library.

Precisely. Most of those advocating change want an automated system, ideally one that 
lets us use the many millions of lines of existing C code without too much aggravation 
or performance hit. Most people here don't actually care "that" much whether a specific 
program operates 100% correctly, only whether it can be subverted by worms or crackers 
to affect the integrity of the host system and/or network - in which case something 
like StackGuard is appropriate.

However, fixes like StackGuard don't really remove the problem; they just mitigate the 
effects. My first experience of a big project in Java was such a relief: a programming 
language where nearly all the errors either didn't compile, or were obviously and 
logically related to the bit that had just been altered.

This is way off topic for NANOG. I guess the lesson for NANOG is that, if you have the 
source code, it is quite easy to mitigate the effects of these kinds of problems, such 
that failure to patch results in nothing worse than a DoS. These issues have been less 
of a problem on systems based on free (and open source) software, though I suspect the 
principal(*) reason for that is diversity in architecture and compiler, which makes 
exploiting the weaknesses harder even when they are present.

 Simon

* A secondary reason could be the e-mails from David Wheeler saying "fix this" (in 
the nicest possible way).
-----BEGIN PGP SIGNATURE-----
Version: GnuPG v1.2.1 (GNU/Linux)
Comment: Using GnuPG with Mozilla - http://enigmail.mozdev.org

iD8DBQE+OQ5HGFXfHI9FVgYRAgENAKC4TqEqtYO9mOzC7GUFZL0yUXxkugCfd0as
cV4IA/+PHmi26hGwzPwWP5M=
=DJjK
-----END PGP SIGNATURE-----
