On 25.07.2012 23:31, Andrei Alexandrescu wrote:
On 7/25/12 4:53 PM, Rainer Schuetze wrote:


On 25.07.2012 19:24, Walter Bright wrote:
On 7/25/2012 8:13 AM, Andrei Alexandrescu wrote:
Yes, and both debug and release build times are important.

Optimized build time comparisons are less relevant - are you really
willing to trade off faster optimization times for less optimization?

I think it's more the time of the edit-compile-debug loop, which would
be the unoptimized build times.



The "edit-compile-debug loop" is a use case where the D module system
does not shine so well. Compare build times when only editing a single
source file:
With the help of incremental linking, building a large C++ project only
takes seconds.
In contrast, the D project usually recompiles everything from scratch
with every little change.

The same dependency management techniques that work for large C++
projects can be applied to large D projects. (And of course there are a
few new ones.) What am I missing?
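
For what it's worth, a minimal sketch of the classic per-module workflow in D (file names are made up; dmd's -deps= output could drive the actual dependency tracking in a build tool):

  # compile only the module that changed
  dmd -c parser.d                    # produces parser.o
  # relink the unchanged objects with the fresh one
  dmd app.o util.o parser.o -ofapp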

Incremental compilation does not work so well because:

- with combined declaration and implementation in the source, you also get the full dependencies even if you just need a short declaration
- even with di-files, imports are viral: you must be very careful if you try to remove them from di-files, because you might break the runtime initialization order
- di-file generation has other known problems, e.g. missing function bodies for CTFE (see the sketch below)
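
For illustration, a minimal sketch of the CTFE problem (module, file, and function names are made up):

  // square.d -- the full source; sq's body is available for CTFE
  module square;
  int sq(int x) { return x * x; }

  // square.di -- generated interface file; the body is stripped
  module square;
  int sq(int x);

  // user.d -- compiled against square.di
  import square;
  enum area = sq(6); // error: CTFE needs sq's body, which the di-file lacks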

I thought about implementing incremental builds for Visual D, but soon gave up when I noticed that compiling a single file in a medium-sized project (Visual D itself) takes almost as long as recompiling the whole thing.

I suspect the problem is that dmd fully analyzes all imported files and merely skips code generation for them. It could be much faster if it did the analysis lazily (though this might slightly change the evaluation order and suppress error messages in unused code blocks).
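
Templates already hint at what lazy analysis would look like: their bodies are only checked on instantiation, so an error in an unused template goes undiagnosed. A contrived sketch (hypothetical module names):

  // helper.d
  module helper;
  // the body is only semantically analyzed when broken is instantiated
  T broken(T)(T x) { return x.noSuchMember; }

  // main.d
  import helper;
  void main() {} // compiles fine; the error in broken is never reported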
