Am 10.04.2013 19:06, schrieb Manu:
On 11 April 2013 02:59, Manu <[email protected]> wrote:

    In time, programmers will learn to be cautious/paranoid, and 'final'
    will dominate your code window.


Or more realistically, most programmers will continue to be oblivious,
and we'll enjoy another eternity of the same old problem where many 3rd
party libraries written on a PC are unusable on resource-limited
machines, and people like me will waste innumerable more hours
re-inventing wheels in-house, because the programmer of a closed-source
library either didn't know, or didn't give a shit.

None of it would be a problem if he just had to type virtual when he
meant it... the action would even assist in invoking conscious thought
about whether that's actually what he wants to do, or if there's a
better design.
</okay, really end rant>


Manu, maybe something you might not be aware of:

- Smalltalk
- Eiffel
- Lisp
- Java
- Self
- Dylan
- Julia
- Objective-C
- JavaScript

These are just a few examples of languages with virtual semantics for method calls. Some of them actually offer only virtual dispatch.

Some of them were developed in an age of computer systems that would make today's embedded systems look like High Performance Computing servers.

Julia is actually a new kid on the block, hardly one year old, and it already achieves C parity in many benchmarks.

So I wonder how much of this is a problem with D's compilers rather than with the virtual-by-default concept itself.
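
For anyone following along, here is a minimal sketch of the semantics being debated (the class and method names are invented for illustration): in D, a class method is virtual unless the author marks it final, so opting out of dynamic dispatch is the step that takes conscious effort.

// Widget/BigWidget are made-up names, just to show the default.
class Widget
{
    int size() { return 1; }           // virtual by default: dispatched through the vtable
    final int fastSize() { return 1; } // final: bound statically, may be inlined
}

class BigWidget : Widget
{
    override int size() { return 2; }  // legal, because size() is virtual
    // override int fastSize() { ... } // would be an error: fastSize() is final
}

void main()
{
    Widget w = new BigWidget;
    assert(w.size() == 2);     // dynamic dispatch selects the derived override
    assert(w.fastSize() == 1); // resolved at compile time to Widget.fastSize
}

Whether the optimizer can devirtualize the first call on its own is exactly the compiler-quality question raised above.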


--
Paulo
