On Saturday, 4 April 2015 at 07:44:12 UTC, Atila Neves wrote:
On Friday, 3 April 2015 at 19:54:09 UTC, Dicebot wrote:
On Friday, 3 April 2015 at 19:08:58 UTC, weaselcat wrote:
I just tried compiling one of my projects. It has a makefile that does separate compilation, and a shell script I use for unit testing which compiles everything in one go. The makefile takes 5.3 seconds, not including linking since it builds a library. The shell script takes 1.3 seconds, which includes compiling the unit tests and linking as well.
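
Roughly, the two setups boil down to something like this (file names and src/ layout are made up purely for illustration):

    # separate compilation (roughly what the makefile does):
    # one dmd invocation per module, objects archived into a library
    mkdir -p build
    for f in src/*.d; do
        dmd -c "$f" -of"build/$(basename "$f" .d).o"
    done
    ar rcs build/libproj.a build/*.o

    # all-in-one (roughly what the unit test script does):
    # every module in a single dmd invocation, tests compiled in and linked
    dmd -unittest -main src/*.d -ofbuild/ut && ./build/ut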

change one file and see which one is faster with an incremental build.

I don't care if an incremental build is 10x faster if the full build still stays at ~1 second. However, I do care (and consider it unacceptable) if support for incremental builds makes the full build 10 seconds long.

I'm of the opposite opinion. I don't care if full builds take 1h as long as incremental builds are as fast as possible. Why would I keep doing full builds? That's like git cloning multiple times. What for?

What's clear is that I need to try Andrei's per-package idea, at least as an option, if not the default. Having a large D codebase to test it on would be nice as well, but I don't know of anything bigger than Phobos.
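In case it helps picture it: per-package compilation would mean one compiler invocation per package directory, rather than per module or per program. A minimal sketch, assuming a hypothetical src/<pkg>/ layout:

    # per-package compilation, sketched: each package directory is
    # compiled by a single dmd invocation into one object file
    mkdir -p build
    for pkg in src/*/; do
        name=$(basename "$pkg")
        dmd -c "$pkg"*.d -of"build/$name.o"
    done
    dmd build/*.o -ofapp

Editing one module would then rebuild only its package's object, which should land somewhere between the two extremes above for both full and incremental build times.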

At work I often switch between a dozen different projects a day, with a small chunk of changes for each. That means incremental builds are never of any value to me.

Even if you consistently work on the same project, it is incredibly rare for a changeset to be contained in a single module. And once at least 5 modules have changed (including inter-dependencies), the build already becomes long enough.

As for a test codebase - I know that Martin has been testing his GC improvements on Higgs (https://github.com/higgsjs/Higgs); it could be a suitable test subject for you too.
