On Mon, 03 Jun 2013 23:39:25 -0400, Manu <[email protected]> wrote:
On 4 June 2013 12:50, Steven Schveighoffer <[email protected]> wrote:
On Mon, 03 Jun 2013 12:25:11 -0400, Manu <[email protected]> wrote:
You won't break every single method, they already went through that recently when override was made a requirement. It will only break the base declarations, which are far less numerous.
Coming off the sidelines:
1. I think in the general case, virtual by default is fine. In code that is not performance-critical, it's not a big deal to have virtual functions, and it's usually more useful to have them virtual. I've experienced plenty of times with C++ where I had to go back and 'virtualize' a function. Any time you change that, you must recompile everything, it's not a simple change. It's painful either way. To me, this is simply a matter of preference. I understand that it's difficult to go from virtual to final, but in practice, breakage happens rarely, and will be loud with the new override requirements.
I agree that in the general case it's 'fine', but I still don't see how it's a significant advantage. I'm not sure what the loss is, but I can see clear benefits to being explicit from an API point of view about what is safe to override, and implicitly, how the API is intended to be used.
Can you see my point about general correctness? How can a class be correct if everything can be overridden, but it wasn't designed for that, and was certainly never tested for it?
Since when is that on the base class author? Doctor, I overrode this class, and it doesn't work. Well, then don't override it :)
Also, there is the possibility of a class that isn't designed from the start to be overridden, but where overriding one or two methods works and has no adverse effects. Then it is a happy accident. It even enables designs that take advantage of this default, like mock objects (a minimal sketch follows). I would point out that in Objective-C, ALL methods are virtual, even class methods and properties. It seems to work fine there.
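
To make the "happy accident" concrete, here's a minimal D sketch (the FileLogger/MockLogger names are invented for illustration): a class written with no thought given to extension, where a test double can still override the one method it cares about, purely because methods are virtual by default:

import std.stdio : File;

// A class written with no thought given to extension.
class FileLogger
{
    void log(string msg)
    {
        // appends to a log file; the details don't matter here
        File("app.log", "a").writeln(msg);
    }
}

// A test double overriding the one interesting method.
// This "just works" only because D methods are virtual by default.
class MockLogger : FileLogger
{
    string[] messages;
    override void log(string msg) { messages ~= msg; }
}

unittest
{
    auto mock = new MockLogger;
    mock.log("hello");
    assert(mock.messages == ["hello"]);
}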
What I'm really trying to say is: when final is the default, and you really should have made some method virtual (but didn't), then you have to pay for it later when you update the base class. When virtual is the default, and you really wanted it to be final (but didn't do that), then you have to pay for it later when you update the base class. There is no way that is advantageous to *everyone*.
2. I think your background may bias your opinions :) We aren't all working on making lightning-fast bare-metal game code.
Of course it does. But what I'm trying to do is show the relative merits of one default vs the other. I may be biased, but I feel I've presented a fair few advantages to final-by-default, and I still don't know what the advantages to virtual-by-default are, other than that people who don't care about the matter feel it's an inconvenience to type 'virtual:'. But that inconvenience is going to be forced upon one party either way, so the choice needs to be based on relative merits.
It's advantageous to a particular style of coding. If you know everything is virtual by default, then you write code expecting that. Like mock objects. Or extending a class simply to change one method, even when you weren't expecting that to be part of the design originally.
I look at making methods final specifically for optimization. It doesn't occur to me that the fact that it's overridable is a "leak" in the API; it's at your own peril if you want to extend a class that I didn't intend to be extendable. Like changing/upgrading engine parts in a car.
3. It sucks to have to finalize all but N methods. In other words, we need a virtual *keyword* to go back to virtual-land. Then, one can put final: at the top of the class declaration, and virtualize a few methods. This shouldn't be allowed for final classes though.
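
For illustration, here is what that might look like in D. Note the 'virtual' keyword below is the hypothetical part: it does not exist in the language, which is exactly the gap being pointed out.

class Widget
{
final:                       // class-wide default: non-virtual
    int width()  { return w; }
    int height() { return h; }

    // Hypothetical 'virtual' keyword to opt a single method back
    // into virtual dispatch. Today there is no way to do this
    // after a 'final:' label.
    virtual void draw() { /* default rendering */ }

private:
    int w, h;
}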
The thing that irks me about that is that most classes aren't base classes, and most methods are trivial accessors and properties... why cater to the minority case?
I think it is unfair to say most classes are not base classes. This would mean most classes are marked as final. I don't think they are. One of the main reasons to use classes in the first place is extendability. Essentially, making virtual the default enables the *extender* to determine whether it's a good base class, when the original author doesn't care.
I think classes fall into 3 categories:

1. Declared a base class (abstract)
2. Declared NOT a base class (final)
3. Don't care.

I'd say most classes fall in category 3. For that, I think having virtual by default isn't a hindrance; it's simply giving the most flexibility to the user (see the sketch below).
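
In D syntax, the three categories look like this (class names are invented for illustration):

// 1. Declared a base class: cannot be instantiated directly.
abstract class Shape
{
    abstract double area();
}

// 2. Declared NOT a base class: cannot be subclassed at all.
final class Point
{
    double x, y;
}

// 3. Don't care: under virtual-by-default, its methods are
//    overridable, so a user may extend it without asking.
class Config
{
    string[string] values;
    string get(string key) { return values.get(key, null); }
}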
It also doesn't really address the problem where programmers just won't do that. Libraries suffer, I'm still inventing wheels 10 years from now, and I'm wasting time tracking down slip-ups.
What are the relative losses if it were geared the other way?
The losses are that if category 3 were simply always final, some other anti-Manu who wanted to extend everything has to contact all the original authors to get them to change their classes to virtual :)
BTW, did you know you can extend a base class and simply make the extension final, and now all the methods on that derived class become non-virtual calls? Much easier than the reverse, getting an original final base changed to virtual. (Note I haven't tested this to verify, but if not, it should be changed in the compiler.)
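
A sketch of that trick (with the same hedge Steve gives: whether the compiler actually devirtualizes these calls is the part he says he hasn't verified):

class Base
{
    int compute() { return 1; }   // virtual by default
}

// A final subclass that adds nothing; no further override of
// compute() can exist below this point in the hierarchy.
final class Sealed : Base { }

void main()
{
    auto s = new Sealed;
    // Through a Sealed reference, the compiler knows the exact
    // target, so it may call compute() directly and even inline it.
    auto x = s.compute();
    assert(x == 1);
}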
My one real experience on this was with dcollections. I had not declared anything final, and I realized I was paying a performance penalty for it. I then made all the classes final, and nobody complained.
The userbase of a library will grow with time. Andrei wants a million D users; that's a lot more opportunities to break people's code and gather complaints. Surely it's best to consider these sorts of changes sooner rather than later?
I think it vastly depends on the intent of the code. If your classes simply don't lend themselves to extending, then making them final is a non-issue.
And where is the most likely source of those 1 million new users to migrate from? Java?
From all over the place, I would say. D seems to be an island of misfit programmers.
-Steve