On Thu, Sep 25, 2014 at 07:42:08PM +0200, Jacob Carlborg via Digitalmars-d wrote:
> On 2014-09-25 16:23, H. S. Teoh via Digitalmars-d wrote:
> >That's the hallmark of make-based projects.
>
> This was Ninja actually. But how would the build system know I've
> updated the compiler?
[...]
The compiler and compile flags are inputs to the build rules in SCons.

In my SCons projects, when I change compile flags (possibly for a subset of source files), it correctly figures out which subset (or the entire set) of files needs to be recompiled with the new flags. Make fails to notice, and you end up with an inconsistent executable.

In my SCons projects, when I upgrade the compiler, it recompiles everything with the new compiler. Make doesn't detect a difference, and if you make a change and recompile, suddenly you've got an executable 80% compiled with the old compiler and 20% compiled with the new compiler. Most of the time it doesn't make a difference... but when it does, have fun figuring out where the problem lies. (Or just make clean; make yet again... the equivalent of which is basically what SCons would have done 5 hours ago.)

In my SCons projects, when I upgrade the system C libraries, it recompiles everything that depends on the updated header files *and* library files. Make often fails to detect that the .so's have changed, so it fails to relink your program. Result: your executable behaves strangely at runtime due to the wrong .so being linked, but the problem vanishes once you do a make clean; make.

Basically, when you use make, you have to constantly do make clean; make just to be sure everything is consistent. With SCons, the build system does it for you -- and usually more efficiently than make clean; make, because it knows exactly what needs recompilation and what doesn't.

In my SCons projects, when I check out an older version of the sources to examine some old code and then switch back to the latest workspace, SCons detects that file contents haven't changed since the last build, so it doesn't rebuild anything. Make thinks the entire workspace has changed, and you have to wait another half hour while it rebuilds everything.
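The idea behind all of the above is simple enough to sketch in a few lines of Python. This is NOT SCons' actual implementation, just a toy illustration of the principle: hash everything that affects the output -- source *contents* (not timestamps), the compiler, and the flags -- and rebuild only when that signature changes.

```python
import hashlib


def build_signature(source_text, compiler, flags):
    """Hash every input that can affect the output object file:
    the source contents, the compiler identity, and the flags."""
    h = hashlib.md5()
    h.update(source_text.encode())
    h.update(compiler.encode())
    h.update(" ".join(flags).encode())
    return h.hexdigest()


def needs_rebuild(old_sig, new_sig):
    # Rebuild only when some input actually changed; a touched
    # timestamp with identical content yields the same signature.
    return old_sig != new_sig


# Changing a flag changes the signature, so the file gets rebuilt:
sig_a = build_signature("int main(){}", "/usr/bin/gcc", ["-O2"])
sig_b = build_signature("int main(){}", "/usr/bin/gcc", ["-O3"])

# Checking out identical content again yields the same signature,
# so nothing is rebuilt -- regardless of what the mtime says:
sig_c = build_signature("int main(){}", "/usr/bin/gcc", ["-O2"])
```

Make's timestamp comparison, by contrast, only answers "is the target older than the prerequisite?" -- it has no way to see that the compiler or flags changed, and no way to see that "newer" content is byte-for-byte identical.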
Make doesn't even detect that all intermediate products like .o's are identical to the last time, so it will painstakingly relink everything again. SCons detects when a .o file hasn't changed from the last build (e.g., you only changed a comment in the source file), and it won't bother relinking your binaries or triggering any downstream rebuilds.

Make-heads find the idea of the compiler being part of the input to a build rule "strange"; to me, it's common sense. What's strange is that make doesn't (and can't) guarantee anything about the build -- you don't know for sure whether an incremental build gives you the same executable as a build from a clean workspace. You don't know if recompiling after checking out a previous release of your code will actually give you the same binaries that you shipped 2 months ago. You don't know if the latest build linked with the latest system libraries. You don't know if the linked libraries are consistent with each other. You basically have no assurance of *anything*, unless you make clean; make.

But wait a minute, I thought the whole point of make was incremental building. If you have to make clean; make *every* *single* *time*, that completely defeats the purpose, and you might as well just put your compile commands in a bash script and rebuild everything from scratch every time. Yeah, it's slow, but at least you know for sure your executables are always what you think they are! (And in practice, the bash script might actually be faster -- I lost track of how much time I wasted trying to debug something, only to go back, make clean; make, and discover that the bug vanished. Had I used the shell script every single time, who knows how many hours of chasing heisenbugs I could have saved.)


T

-- 
Gone Chopin. Bach in a minuet.
