On Thu, 13 Mar 2014 21:42:43 -0000, Walter Bright <[email protected]> wrote:

On 3/13/2014 1:09 PM, Andrei Alexandrescu wrote:
Also let's not forget that a bunch of people will have not had contact with the group and will not have read the respective thread. For them -- happy campers who get work done in D day in and day out, feeling no speed impact whatsoever from a virtual vs. final decision -- we are simply exercising the brunt of a deprecation cycle with undeniable costs and questionable (in Walter's and my opinion) benefits.

Also,

     class C { final: ... }

achieves final-by-default and it breaks nothing.

Yes... but it doesn't help Manu, or any other consumer concerned with speed, if the library producer neglected to do this. That is the real issue, right? Not whether a class *can* be made final (trivial), but whether classes *actually will* be *correctly* marked final/virtual where they ought to be.
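
For reference, here is the idiom in practice next to the failure mode I'm worried about; a minimal sketch (class and member names are mine):

    // Producer who remembers the idiom: everything after the label is final.
    class Fast
    {
    final:
        int next(int x) { return x + 1; }  // direct call, free to inline
    }

    // Producer who forgets: next() is silently virtual, so every call goes
    // through the vtable and cannot be inlined across the boundary.
    class Slow
    {
        int next(int x) { return x + 1; }
    }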

Library producers range in experience and expertise and are "only human", so we want the option which makes it more likely that they will produce good code. We also want the option which means that, if they get it wrong, less consumer code will break if/when they correct it.


Final by default requires that you (the library producer) mark as virtual the functions you intend consumers to override. Let's assume the producer has a test case which does just this: it inherits from the library's classes and overrides methods as consumers would. The compiler will then flag any method not correctly marked. So, there is a decent chance that producers will get this "right" with final by default.

If they do get it wrong, making the change from final -> virtual does not break any consumer code.
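
Today's D already demonstrates both halves of this, since overriding a final method is a compile-time error: a producer test case which overrides as consumers would will flag any method left final by mistake, and removing the final to fix it breaks nothing. A minimal sketch (names are mine):

    class Producer
    {
    final:
        void hook() { }  // oops: this was meant to be an extension point
    }

    // The producer's own test case, mimicking consumer usage:
    class ConsumerLike : Producer
    {
        override void hook() { }  // compile error: cannot override a final
                                  // function; the fix is producer-side and
                                  // non-breaking for consumers
    }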


Compare that to virtual by default, where marking everything virtual means it will always work, but with a subtle performance penalty that is unlikely to be detected or tested for. There is no compiler support for detecting it, nor for identifying the methods which should be marked final. In fact, to solve this you would probably mark them all final and then mark individual functions virtual.
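
Concretely, in today's virtual-by-default D that opt-out has to be done method by method; a minimal sketch (names are mine):

    class Widget
    {
        // Intended extension point: left virtual (the default).
        void onEvent() { }

        // Hot path: marked final by hand so the call can be devirtualised
        // and inlined; forget the annotation and it silently costs an
        // indirect call on every invocation.
        final int area() { return w * h; }

        private int w, h;
    }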

If they get it wrong, making the change from virtual -> final is more likely to break consumer code.
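
Whereas the fix in the virtual-by-default world is a breaking one; a minimal sketch (names are mine):

    // Library v1 shipped log() as virtual (the default) and consumers
    // overrode it.  In v2 the producer adds final to claw the speed back:
    class Logger
    {
        final void log(string msg) { }
    }

    // ...and previously valid consumer code stops compiling:
    class MyLogger : Logger
    {
        override void log(string msg) { }  // error: cannot override final
    }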


I realise you're already aware of the arguments for final by default, and convinced it would have been the best option, but it also seems to me that the "damage" virtual by default will cause over the future lifetime of D is greater than the cost of a well controlled deprecation path from virtual -> final.

Even without a specific tool to aid deprecation, the compiler will output clear errors for methods which need to be marked virtual. Granted, this requires compiling a program which "uses" the library, but most library producers should have such a test case already, and their consumers could help out a lot by submitting those errors directly.

Regan.

--
Using Opera's revolutionary email client: http://www.opera.com/mail/
