Virtual is a pretty bad default.

To make a method *usefully* overridable takes more than just marking it
virtual. You also need to come up with answers to questions like these
(the sketch after this list shows one common way of settling the first
two):

- Should the overriding method call the original base implementation,
  and if so, before, during, or after its replacement logic?
- If there are several overloads of the method available, are most of
  them implemented in terms of one core method (a common technique)? If
  so, should the deriving class override just that one method, and
  which one is it?
- Does the base class ever call this method itself, and if so, under
  what conditions? (E.g. is there anything peculiar, like it getting
  called during construction?)
- Does the class have some associated form of state machine (e.g. does
  it have multi-phase construction?), and are there rules relating the
  overridden method's use or behaviour to that state machine?
- Are there any peculiarities or quirks of this method that other code
  might be relying on, and are these documented anywhere?
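
To illustrate the second of those, here's the usual shape of the
"overloads funnel into one override point" technique. This is a minimal
sketch - the Logger class and its members are made up for the example:

using System;

public enum LogLevel { Info, Warning, Error }

public abstract class Logger
{
    // The convenience overloads are non-virtual, and all funnel into
    // the single designated extension point below.
    public void Log(string message)
    {
        Log(LogLevel.Info, message);
    }

    public void Log(LogLevel level, string message)
    {
        WriteEntry(level, DateTime.UtcNow, message);
    }

    // Deriving classes override exactly one method, and the base class
    // gets to document that this is the intended override point.
    protected abstract void WriteEntry(
        LogLevel level, DateTime timestamp, string message);
}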

If you look at frameworks that have evolved over a long time, and which
make heavy use of inheritance, they always carry lots of scars from this
kind of decision. Windows Forms and WPF are two examples of this, WPF in
particular. (Probably because WPF has been evolving for so long.) The
documentation has loads of little notes describing considerations for
deriving classes, detailing the circumstances under which virtual
methods will be called, and what the rest of the framework or its clients will
be expecting. And this is in a framework where only a carefully selected
set of members are actually overridable. Indeed, in some cases they
split functionality in half, so you see pairs of methods, one containing
the replaceable bits of the logic, and the other containing the core
stuff you don't want to lose just because you overrode something.
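
The shape is roughly this - a simplified sketch in the spirit of WPF's
MeasureCore/MeasureOverride pairing (the Element and Size types here
are stand-ins, not the real WPF ones):

using System;

public struct Size
{
    public double Width, Height;
    public Size(double width, double height)
    {
        Width = width;
        Height = height;
    }
}

public class Element
{
    private Size lastMeasure;

    // The non-virtual half owns the invariants - argument checking,
    // caching the result - so no override can accidentally lose them.
    public void Measure(Size available)
    {
        if (available.Width < 0 || available.Height < 0)
        {
            throw new ArgumentException("Negative size", "available");
        }
        lastMeasure = MeasureOverride(available);
    }

    // The virtual half holds only the logic a derived class is
    // invited to replace.
    protected virtual Size MeasureOverride(Size available)
    {
        return new Size(0, 0);
    }
}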

It takes a lot of work and a lot of iterations to design a class
hierarchy where virtual methods work really well. So it should come as
no surprise that if you don't do any of this groundwork, overriding
methods is going to be a pretty risky affair. Overriding a method that
wasn't designed with overriding in mind always runs the risk of removing
a chunk of important logic from the class. You've just cauterized part
of the object's brain - you'd better hope you can recreate whatever that
method was doing correctly.
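
To make that concrete with a contrived sketch (the Cache class here is
made up): the base method quietly does more than its name suggests, and
a plausible-looking override loses the hidden half.

using System;
using System.Collections.Generic;

public class Cache
{
    protected readonly Dictionary<string, object> Entries =
        new Dictionary<string, object>();

    // Virtual, but never actually designed for overriding: besides
    // storing the entry, it quietly enforces the size limit that the
    // rest of the class relies on.
    public virtual void Add(string key, object value)
    {
        if (Entries.Count >= 100)
        {
            Entries.Clear();   // crude eviction, for the sketch
        }
        Entries[key] = value;
    }
}

public class LoggingCache : Cache
{
    // Recreates the visible behaviour (storing the entry), but the
    // hidden eviction step has been cauterized away - this cache now
    // grows without bound.
    public override void Add(string key, object value)
    {
        Console.WriteLine("Adding " + key);
        Entries[key] = value;   // forgot: base.Add also evicts
    }
}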

And yet there's this persistent myth that if a method is virtual, it
will automatically be OK to override it. This is naive wishful thinking.

(I think the performance thing is misleading by the way. Modern JVMs
detect when a particular call site always calls the same implementation
of a virtual function, and this enables them to optimize that method
call as though it were non-virtual. They can even inline virtual
function calls. So there's no overhead to using virtual functions in
Java unless you're actually exploiting their virtualness. The only
wrinkle in this story is that Java takes longer to get up to full
speed: monomorphic call sites are detected at runtime, so the
optimization can only kick in once the JVM has watched enough calls. This
works great for servers where amortized average cost is the main thing.
It's not so great for UIs...)


The biggest problem today that keeps driving the (IMO) misguided demands
for making everything virtual is that virtual methods also happen to
look superficially attractive for a technique that really has nothing
to do with overriding methods provided by base classes: testability.
Indeed, that argument is very much the basis of this:

 
http://blogs.objectmentor.com/ArticleS.MichaelFeathers.ItsTimeToDeprecateFinal

Here Michael Feathers argues for banning 'final' (Java's keyword for
"not virtual", equivalent of 'sealed' in C#) entirely because it makes
testing hard. But to me, that article reads like a classic case of "if
all you've got is a hammer, everything looks like a nail" thinking. Yes,
testing final methods is awkward. But making everything virtual instead
is drastic, clumsy, and dangerous.

I think there are preferable solutions. If the class you'd like to mock
for testing purposes happens to be a MarshalByRefObject, you can
generate a transparent proxy for the thing - an object which looks
exactly like the real thing (even returning the expected type from
GetType()) but which is in fact a fake under your control. That way you
get to mock the whole object without even deriving from it - no need to
own anything. This works brilliantly. The main problem is you can't
always use this technique because not everything derives from
MarshalByRefObject. But it does demonstrate pretty clearly that making
everything virtual is actually a pretty crude approach, and it's
certainly possible to envisage better alternatives.
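
For anyone who hasn't seen it, here's roughly how that looks with
RealProxy - a minimal sketch, with a made-up Clock class standing in
for the thing being faked:

using System;
using System.Runtime.Remoting.Messaging;
using System.Runtime.Remoting.Proxies;

// Hypothetical class we'd like to fake. Note that Now is NOT virtual;
// the proxy intercepts it anyway, because the class derives from
// MarshalByRefObject.
public class Clock : MarshalByRefObject
{
    public DateTime Now { get { return DateTime.UtcNow; } }
}

public class FakeClockProxy : RealProxy
{
    private readonly DateTime fixedTime;

    public FakeClockProxy(DateTime fixedTime) : base(typeof(Clock))
    {
        this.fixedTime = fixedTime;
    }

    // Every member access on the transparent proxy arrives here as a
    // message; we just hand back canned results.
    public override IMessage Invoke(IMessage msg)
    {
        IMethodCallMessage call = (IMethodCallMessage)msg;
        object result = (call.MethodName == "get_Now")
            ? (object)fixedTime : null;
        return new ReturnMessage(
            result, null, 0, call.LogicalCallContext, call);
    }
}

public class Program
{
    public static void Main()
    {
        FakeClockProxy proxy = new FakeClockProxy(new DateTime(2006, 7, 8));
        Clock fake = (Clock)proxy.GetTransparentProxy();

        Console.WriteLine(fake.Now);       // the canned time
        Console.WriteLine(fake.GetType()); // reports Clock, not a fake
    }
}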

So how could you mess with the build and test environment to create
similar solutions when dealing with non-MBROs? Well one obvious and
low-tech approach would be to build a class that looks just like the
thing you want to mock, and compile against that rather than the real
thing. (You'd want to generate this mock of course - it'd be tedious to
write by hand.) There are numerous logistical challenges here, which I
suspect is one reason you don't see this much. It's just so much easier
to do everything through interfaces, even though that's barely any more
elegant than the "everything virtual whether it makes sense or not"
style.
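
A sketch of the idea, with made-up names: say the production code
compiles against an Acme.Clock from Acme.dll. The test build references
a test-doubles assembly instead, containing a class with the identical
full name and public shape, so the code under test compiles unchanged:

using System;

namespace Acme
{
    // Lives in TestDoubles.dll, which the test build references in
    // place of the real Acme.dll. Same namespace, same name, same
    // public surface.
    public class Clock
    {
        // Tests set this to whatever the scenario needs.
        public static DateTime FixedNow = new DateTime(2006, 7, 8);

        public DateTime Now { get { return FixedNow; } }
    }
}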


-- 
Ian Griffiths - Pluralsight
http://www.interact-sw.co.uk/iangblog/


-----Original Message-----
From: David Lanouette
Sent: 08 July 2006 00:37

Anybody know why methods aren't virtual by default in .NET?  It seems
like a really bad default to have all methods non-virtual.
