Darren New wrote:
Christopher Smith wrote:
Yeah, 'cause we've been able to come up with a language where the
compiler can prove that everything is correct.
No. I'm simply referring to warnings that are warnings because the
compiler knows you did something wrong and it *will* come back to bite
you, except it doesn't know how you want to fix them. Like "Warning:
You have virtual methods but no virtual destructor." There's an
error in your code: Either you shouldn't be declaring anything
virtual, or you need the virtual destructor to avoid corrupting
memory. Why is this a warning and not an error?
Actually, the real problem only arises if the derived class has a
destructor, the parent's destructor isn't virtual, *and* the object is
deleted through a pointer to the parent's type (and even then, it is
possible, though unlikely, that all is just fine). If you insist on
always having the virtual destructor there, you pay for a vtable
pointer in every object and an indirect call on every destruction, not
to mention a host of other issues. Warnings are warnings because to the
compiler they *look* wrong, but they might in fact be just fine.
I remember writing an assembler in Ada for school. I had to work so hard
just to convince the compiler that what any human could see was okay
actually was okay. Indeed, this tends to be a problem of static type
systems as well.
Which brings up a fine point: most of the dynamically typed languages
that you are promoting allow for all kinds of horrible scenarios that
they never catch until it is too late, simply by virtue of not bothering
to do type analysis at compile time. Is that really such a bad thing?
What about Java compilers that don't catch broken double-checked
locking or other concurrency problems like deadlocks or data races?
Hell, C# basically takes the gloves off for native code.
Most languages don't catch any number of problems that can easily be
expressed in the language. Interestingly, of the more popular ones, I'd
argue C++ actually does a better job of giving the programmer an idea of
when they are doing something wrong, and of letting you design
libraries/frameworks where user-defined behavior can also detect errors
(C++0X's Concept Checking is going to make this much easier too).
Yes, and a reason why most of them are implemented in C or C++. ;-)
You mean, because lots of modern CPUs have a decent fit with C's
memory model? Note that where the fit is poor, other languages aren't
(or can't be) implemented in C.
It's pretty hard for it to be impossible to implement a language in any
of what we call "Turing-complete" languages. Of course, some are a much
better fit than others.
--Chris
--
[email protected]
http://www.kernel-panic.org/cgi-bin/mailman/listinfo/kplug-lpsg