Hi All,
I had a quick question I was hoping someone could answer about the "always emit callvirt" pattern. I had always understood that one of the great advantages of C# over Java was, as Miguel put it on his blog about porting Android from Java, that "virtual methods were made opt-in, instead of opt-out, which made for simpler VMs." That makes a lot of sense to me: no virtual-call overhead for ordinary method calls.

However, I was surprised to learn while disassembling some code that both Mono's and Microsoft's C# compilers seem to forgo this optimization: they emit all instance method calls as callvirt instructions in IL, regardless of whether the target is actually a virtual method, which seems to defeat the whole point of not being Java. A Microsoft employee blogged about this (http://blogs.msdn.com/b/ericgu/archive/2008/07/02/why-does-c-always-use-callvirt.aspx), and it seems they made this change to ensure that instance methods could not be called on null references (a debatable decision, perhaps). I gather/suspect that Microsoft's JIT still turns a callvirt to a non-virtual method into a direct call plus a null-reference check, though the post never says how large that performance penalty is.

Does anyone know what the cost of this null-reference check is? And whether Mono is likewise able to optimize callvirt instructions that are not actually virtual calls? Someone posted earlier today about a performance issue comparing the Microsoft and Mono virtual machines on Windows, and it got me thinking about the expense of method calls, so I would be curious to know if anyone has the answer.

Cheers,
Nigel
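P.S. In case anyone wants to reproduce what I'm describing, here is a tiny self-contained sketch (the Greeter class and method names are just a toy example of mine). Compile it and run monodis (or ildasm with Microsoft's tools) on the resulting assembly: on the compilers I've looked at, the g.Hello() call shows up as callvirt even though Hello is not virtual, and the callvirt null check is what makes the n.Hello() line throw before the method body ever runs.

  using System;

  class Greeter
  {
      // An ordinary (non-virtual) instance method; no vtable
      // dispatch is semantically required to invoke it.
      public void Hello()
      {
          Console.WriteLine("Hello");
      }
  }

  class Program
  {
      static void Main()
      {
          Greeter g = new Greeter();
          g.Hello();   // emitted as: callvirt instance void Greeter::Hello()
                       // rather than: call instance void Greeter::Hello()

          Greeter n = null;
          try
          {
              // callvirt checks the receiver for null, so this throws
              // NullReferenceException before Hello ever executes.
              n.Hello();
          }
          catch (NullReferenceException)
          {
              Console.WriteLine("NullReferenceException, as expected");
          }
      }
  }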
