bearophile wrote:
Following a short trail of links I've found this page about how C# checks its
variable initialization; it's quite cute: http://en.wikipedia.org/wiki/Definite_assignment_analysis
D is less smart and rougher here, but inventing and implementing this may take
more than one developer... I don't think you can ask Walter to implement that,
even if the algorithm is known. I don't know how the Mono devs managed to implement it.

From the article:

"The second way to solve the problem is to automatically initialize all locations to some fixed, predictable value at the point at which they are defined, but this introduces new assignments that may impede performance. In this case, definite assignment analysis enables a compiler optimization where redundant assignments — assignments followed only by other assignments with no possible intervening reads — can be eliminated. In this case, no programs are rejected, but programs for which the analysis fails to recognize definite assignment may contain redundant initialization. The Common Language Infrastructure relies on this approach."

This is EXACTLY what D does, and did from day 1. The analysis to do it was implemented (by me) back in 1985 or so. It's OOOOOLLLLLDD technology, not some fabulous thing C# invented.
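
For illustration, here's a minimal sketch of what that means in D (the variable names are just examples):

import std.math : isNaN;

void main()
{
    // Every local gets its type's fixed .init value at the point
    // of definition; no garbage values, no rejected programs.
    int i;      // int.init is 0
    double d;   // double.init is NaN
    char c;     // char.init is 0xFF

    assert(i == 0);
    assert(isNaN(d));
    assert(c == char.init);
}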

The term for it is "Dead Assignment Elimination." You can find it in compiler textbooks, even old ones. I learned about it in 1982 from Hennessey and Ullman.
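
For example (my own sketch, not from any textbook), the optimizer sees the implicit initialization as just another assignment:

int f()
{
    int x;      // the compiler inserts an implicit x = 0 here
    x = 42;     // overwrites x with no intervening read of the 0
    return x;
}

The implicit x = 0 is followed only by another assignment with no possible intervening read, so dead assignment elimination deletes it, and the default initialization ends up costing nothing.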

I strongly suggest you take a tour through the comments in the source code of the dmd optimizer. The global optimizer code is in files starting with 'g'.

Optimizations not done by dmd:

1. loop unrolling (sketched after this list), loop fusion
2. auto-vectorization
3. use of SSE floating-point registers
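
For item 1, here's a hand-written sketch of what an unrolling pass would produce (the function name is mine, and it assumes the array length is a multiple of 4 purely for brevity):

double sumUnrolled(const double[] a)
{
    double sum = 0.0;
    // One loop test and branch now covers four additions.
    for (size_t i = 0; i < a.length; i += 4)
    {
        sum += a[i];
        sum += a[i + 1];
        sum += a[i + 2];
        sum += a[i + 3];
    }
    return sum;
}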

You can rag on dmd for not doing them, but please: most standard data flow analysis done by compilers is old hat. It is done in dmd, has been for decades, and is not some newfangled thing recently invented by other compilers' marketing departments.
