On 7/8/06, David Lanouette <[EMAIL PROTECTED]> wrote:
Anybody know why methods aren't virtual by default in .NET?  It seems like a
really bad default to have all methods non-virtual.

Because, in theory, there's overhead in calling a virtual method as
opposed to a non-virtual one. In practice, if your program takes a
measurable performance hit from the fact that you're calling virtual
methods, you shouldn't be writing it for .NET anyway.
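
For illustration, here's a minimal C# sketch (the type names are mine,
not from the original thread) of where the theoretical cost comes
from - a virtual call is dispatched through the object's method table
at run time, while a non-virtual call is bound at compile time:

using System;

class Shape
{
   // Non-virtual: the call target is fixed at compile time.
   public string Describe()
   {
      return "some shape";
   }

   // Virtual: resolved through the method table at run time -
   // that indirection is the (theoretical) overhead.
   public virtual double Area()
   {
      return 0.0;
   }
}

class Circle : Shape
{
   public double Radius;

   public override double Area()
   {
      return Math.PI * Radius * Radius;
   }
}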

On a side note, all instance methods in Java are virtual by default,
and you have to explicitly mark them as non-virtual with the final
keyword. In my experience, a non-trivial application has a lot more
non-virtual instance methods than virtual ones, and I guess that's
the (practical) reason for Microsoft to choose non-virtual methods by
default and introduce the virtual/overridable/whatever keywords.
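
To make the contrast concrete, a hedged C# sketch (the class names
are hypothetical): you opt in to virtual dispatch with
virtual/override, and you can opt back out with sealed, which is
roughly the inverse of Java's final:

class Base
{
   public void Helper() { }        // non-virtual by default (Java would need final)
   public virtual void Work() { }  // explicit opt-in to virtual dispatch
}

class Derived : Base
{
   // override is mandatory; sealed stops further overriding,
   // like final on a Java method.
   public sealed override void Work() { }
}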

C# (and, I believe, VB.NET) is a different story - the compiler emits
code that performs virtual calls on both virtual and non-virtual
methods to guard you from the stupid mistake of calling a non-virtual
method on a null instance, e.g.:

class X
{
   public void Foo()
   {
   }
}

X x = null;
x.Foo(); // was OK in the early days; now throws NullReferenceException
         // because Foo is called as a virtual method (w/ callvirt).

There's essentially no overhead in calling a non-virtual method w/
callvirt (the JIT still binds the call statically; it only adds the
null check), so the compiler "hack" is a good thing.

Cheers,
Stoyan
