On Friday, 27 September 2013 at 06:40:22 UTC, Jacob Carlborg wrote:
On 2013-09-27 01:47, Jonathan M Davis wrote:

Fast enough that you'd have to have a very large project for incremental builds to gain you anything. Maybe you could gain time if the incremental builds were done in parallel, but even then, the build time is going to be dwarfed by the link time in a lot of programs. Most projects aren't going to be big enough to really gain much from incremental builds. I'd only worry about that if I were doing a large application and building it was demonstrably slow.

I don't know how others think about incremental compilation, but I don't think of it as compiling each file separately. I'm thinking of compiling only what's changed, and compiling all those files in one go.

That would mean the first time you compile a project it would compile all files at once, just as most people do today. Then, when some files are changed, it would compile only those, in one go.

This should, at least in theory, speed up compilation. But perhaps most projects are too small for it to make a difference in practice.
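
A rough sketch of that scheme, as a plain D script (assuming a flat src/ directory, .o object files written next to the sources via dmd's -c and -od switches; a real build tool would also track dependencies between modules, not just timestamps):

import std.algorithm : filter, map;
import std.array : array;
import std.file : dirEntries, exists, timeLastModified, SpanMode;
import std.path : setExtension;
import std.process : spawnProcess, wait;

void main()
{
    // Treat a module as changed when its object file is missing or
    // older than the source file.
    auto changed = dirEntries("src", "*.d", SpanMode.shallow)
        .filter!(e => !exists(setExtension(e.name, ".o"))
            || timeLastModified(e.name) > timeLastModified(setExtension(e.name, ".o")))
        .map!(e => e.name)
        .array;

    // Hand all changed modules to the compiler in a single invocation,
    // instead of one compiler run per file.
    if (changed.length)
        wait(spawnProcess(["dmd", "-c", "-odsrc"] ~ changed));
}

The full build would then be the same invocation with every source file, followed by a link step over all the object files.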


From my enterprise seat, I tend to favor compilation against binary modules.

It is not as if you always have source code available, and D has modules, so I expect it to eventually make use of binary modules, like most languages that have module support.


--
Paulo
