On Thu, 13 Nov 2003, Pete Lomax wrote:

> I'd be interested to see what sort of signature changes between
> compile and runtime you think are likely to happen, as I have to admit
> I have never encountered such a beast. Doesn't that force
> non-prototyped calls?
I've seen it with some depressing regularity over the years. It generally
takes the form of an upgrade to a library that breaks existing executables,
something we're going to have to deal with as we're looking to encourage
long-term use of bytecode-compiled programs. But there are several issues
here:

1) Vararg calls with non-PMC registers involved
2) Runtime modification of sub definitions
3) Drift in interface definitions

There are definite performance issues--there are at least four integer
stores and, for paranoid subs, four integer comparisons. The calling
conventions make it reasonably clear, though, that they're there because
definitions may change, and the engine doesn't place restrictions on how
they change. Because of that we have to pass in sufficient information to
validate things at the interface, which means at least arg counts.

It's easy to lose sight of the characteristics of our target languages
since we don't have any fully-functional compilers for them yet, so we've
got to be careful. Dynamism is fundamental to the engine, and the calling
conventions are a recognition of that fact. (It doesn't matter whether I
*like* it or not; it's the reality we have to deal with.)

If someone wants to propose an alternate, more static convention that
lends itself better to one-off static linking with link-time signature
checking for verification, which is what the complaints all seem to allude
to, well... go ahead, and if you do we'll see where we go from there.

					Dan

--------------------------------------"it's like this"-------------------
Dan Sugalski                          even samurai
[EMAIL PROTECTED]                     have teddy bears and even
                                      teddy bears get drunk
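To make the arg-count check above concrete, here is a minimal sketch in
Python of validating a call at the interface when a sub's definition can
change out from under an already-compiled caller. This is illustrative
only, not Parrot's actual mechanism; the names Sub, make_call, and
SignatureError are invented for this example:

    class SignatureError(Exception):
        pass

    class Sub:
        """A callable whose definition can be swapped out at runtime."""
        def __init__(self, name, arity, body):
            self.name = name
            self.arity = arity   # how many args the current definition takes
            self.body = body

        def redefine(self, arity, body):
            # Runtime modification of the sub's definition (issue 2 above):
            # already-compiled callers are not recompiled, so they can't
            # know about the new arity.
            self.arity = arity
            self.body = body

    def make_call(sub, *args):
        # The caller supplies an arg count; the callee validates it against
        # whatever definition is live *now*, not the one that existed when
        # the caller was compiled.
        if len(args) != sub.arity:
            raise SignatureError(
                "%s expects %d args, got %d" % (sub.name, sub.arity, len(args)))
        return sub.body(*args)

    # A library upgrade changes add()'s signature after the caller was
    # "compiled"; the check catches it at the interface instead of the
    # call silently misreading its arguments.
    add = Sub("add", 2, lambda a, b: a + b)
    print(make_call(add, 1, 2))           # 3
    add.redefine(3, lambda a, b, c: a + b + c)
    try:
        make_call(add, 1, 2)              # old-style call, now invalid
    except SignatureError as e:
        print("caught:", e)

The point of the sketch is the timing: the check runs against whatever
definition is live at call time, which is why the caller has to pass
enough information (at least counts) for the callee to validate.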