On Wed, Jun 15, 2016 at 05:04:28AM +0000, Jason White via
> On Tuesday, 14 June 2016 at 10:47:58 UTC, Fool wrote:
> > A possible use case is creating object files first and packing them
> > into a library as a second step. Then the individual object files are
> > no longer of much interest. Imagine you want to distribute a build to
> > several development machines such that their local build
> > environments are convinced that the build is up to date. If object
> > files can be treated as secondary or intermediate targets, you can
> > save a lot of unnecessary network traffic and storage.
> You're right, that is a valid use case. In my day job, we have builds
> that produce 60+ GB of object files. It would be wasteful to
> distribute all that to development machines.
> However, I can think of another scenario where it would just as well
> be incorrect behavior: Linking an executable and then running tests on
> it. The executable could then be seen by the build system as the
> "secondary" or "intermediate" output. If it gets deleted, I think we'd
> want it rebuilt.
> I'm not sure how Make or Shake implement this without doing it
> incorrectly in certain scenarios. There would need to be a way to
> differentiate between necessary and unnecessary outputs. I'll have to
> think about this more.
I don't think Make handles this at all. You'd just write rules in the
Makefile to delete the intermediate files if you really care to. Most of
the time people just ignore it and add a 'clean' rule with some
wildcards to clean up the intermediate files.
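For illustration, a conventional Makefile along those lines might look something like this (a hypothetical sketch, with made-up target names; the object files are the intermediates in question):

```make
# Hypothetical build: object files are intermediate outputs.
objs := $(patsubst %.c,%.o,$(wildcard *.c))

app: $(objs)
	$(CC) -o $@ $^

%.o: %.c
	$(CC) -c -o $@ $<

# The usual convention: a 'clean' rule that removes the
# intermediates (and often the final output) via wildcards.
.PHONY: clean
clean:
	$(RM) *.o app
```

Note that nothing here tells Make whether the `.o` files are "necessary" outputs or merely intermediates; the clean rule is just a manually maintained convention.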
(This is actually one of the major annoyances with Makefiles:
because of the unknown state of intermediate files, builds are rarely
reproducible, and `make clean; make` is a ritual that has come to be
accepted as a fact of life. Arguably, though, a *proper* build system
ought to be such that incremental builds are always correct and
reproducible, and do not depend on environmental factors.)