On Mon, May 31, 2010 at 11:54 AM, Jakub Jelinek <ja...@redhat.com> wrote:
> On Mon, May 31, 2010 at 09:44:08AM -0700, Mark Mitchell wrote:
>> >> I just really hope we will have strict criteria that any transition will
>> >> not make the compiler slower and will not increase compiler build time.
>> >
>> > Nor will it grow the memory footprint, at least of the important data
>> > structures, or increase maintenance costs by making the code less
>> > readable, etc.
>>
>> There is no reason that use of C++ should increase the memory footprint
>> of the compiler, make the compiler slower, or make the code less
>> readable.  Poor use of C++ might lead to those things; good use will
>> not.  That is why we need coding standards and patch review.
>
> E.g. when you start using virtual methods, suddenly you need a vtable
> pointer in the object and thus the object grows by 8 bytes.
Aren't we already doing this with the various hooks we have?  We do not
need to generate RTTI if all we are interested in is vcalls.

> In many cases it would be really addition of that pointer, dropping an 8/16
> bit code would slow things down too much (using virtual method to get you
> say enum tree_code from a tree would be way too slow, similarly for rtti).

Currently, we go through a lot of manual checking with global tables where
the vcall mechanism already ensures that.  I would suggest that when we are
worrying about this kind of thing, we also look at the alternatives we
currently have and how they compare.

> Similarly if the compiler massively starts using virtual methods everywhere,
> there will be slow downs caused by the increased number of harder to predict
> indirect calls.

That is why reviewers will use their best judgement, in particular to
decide where a virtual function is preferable to a huge switch (the
current practice), which generates more data than a vtable.

> Jakub
>