On Thursday, 31 August 2017 at 17:39:35 UTC, H. S. Teoh wrote:


This is just one example off the top of my head; I'm sure there are plenty of others that we can come up with once we break free of the C++ mold of thinking of templates as "copy-n-paste except substitute X with Y". Another example that just occurred to me is the various recursive templates in std.meta for manipulating AliasSeq's. There is actually no need for the compiler to instantiate anything at all, except perhaps for the final result. By treating the AliasSeq as an in-compiler data type with compile-time operations performed on it, the compiler ought to be able to evaluate the template without needing to instantiate anything past the first level of recursion.


T
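(For context, the std.meta templates Teoh mentions recurse over their argument sequence, creating one new template instance per step. Below is a simplified sketch along the lines of staticMap; the real Phobos implementation differs in details, but the instantiation pattern is the same kind of recursion, and the names here are illustrative.)

import std.meta : AliasSeq;

// One template instance is created per recursion step, even though
// only the final AliasSeq is ever used.
template StaticMap(alias F, Args...)
{
    static if (Args.length == 0)
        alias StaticMap = AliasSeq!();
    else
        alias StaticMap = AliasSeq!(F!(Args[0]), StaticMap!(F, Args[1 .. $]));
}

import std.traits : Unqual;

// Mapping three types instantiates the template four times (once per
// recursion level, plus the empty base case).
alias Result = StaticMap!(Unqual, const int, immutable double, shared bool);
static assert(Result.length == 3 && is(Result[0] == int));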

That is a good example, and it is one where we could maybe deduce what is going on. However, classifying a template (or rather, deducing which instantiations are unused) actually requires us to instantiate it, because we cannot predict what it will evaluate to for a given set of parameters. For this to work we would have to produce the set of all possible instantiation behaviors of a template and deduplicate it, and that set happens to be infinite in most cases.
Even predicting whether the set of instantiations is finite is essentially an instance of the halting problem.
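A contrived illustration (my own example, not from Phobos): the expansion of the template below depends on a compile-time computation over its argument, so there is no way to know which further instantiations it triggers, or how many, without actually instantiating it. Whether the chain even terminates for arbitrary n is literally the Collatz problem.

template Collatz(ulong n)
{
    static if (n == 1)
        enum Collatz = 0;               // reached 1: no further steps
    else static if (n % 2 == 0)
        enum Collatz = 1 + Collatz!(n / 2);
    else
        enum Collatz = 1 + Collatz!(3 * n + 1);
}

// 27 takes 111 steps to reach 1; nothing short of evaluating the
// whole chain of instantiations tells you that in advance.
static assert(Collatz!27 == 111);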

I worked on this for some time, until I realized I was fighting without gaining ground.

It would end up as a myriad of special cases and would be impossible to maintain.

Considering that the current template system, which does not try anything smart, is already fiendishly complicated, I'd rather not introduce an even more complicated template-instance-shape predictor that will probably only work in the simplest cases and is not guaranteed to bring wins that would justify the time it would take to implement.

TL;DR

A change to optimize away certain groups of template instantiations would require MASSIVE amounts of work and is unlikely to pay for itself.
