At 03:44 AM 6/21/2007, Philip Guenther wrote:
On 6/21/07, Divakar Venkata (divvenka) <[EMAIL PROTECTED]> wrote:
> We use a non-recursive make system with over 1000 fragmented
> makefiles spread across the source tree. We have seen make spend a
> lot of time reading these files and calculating the dependency
> graph before starting the build.
I also agree with Philip, with a couple of additional points:
1. I measure the parsing phase by adding a phony target called
"nothing". Then running

    time make nothing

should, in theory, give a good approximation of the cost of the
first (parse) phase.
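The phony-target trick above can be sketched as a minimal makefile fragment; the target name "nothing" and the no-op recipe are illustrative choices, not taken from the original post:

```make
# Illustrative fragment: a do-nothing phony target used only to time
# GNU make's read/parse phase.
.PHONY: nothing
nothing:
	@:    # no-op recipe; ":" is the shell's built-in null command
```

With this in place, `time make nothing` forces make to read every included fragment and build the full dependency graph, then exit without running any real recipes.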
2. I have a similar non-recursive, include-based build model. The
number of our makefile fragments is smaller (~250) but each is large
(lots of lines) and dense (long lines). Here is the report issued by
wc (lines, words, bytes) for my makefile set:
69530 351573 6729699 total
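For comparison against your own tree, a summary line like the one above can be produced with wc; the directory and the `*.mk` glob here are assumptions for the sketch, so adjust them to match your layout:

```shell
# Sketch: count lines, words, and bytes across a set of makefile
# fragments; the last row printed by wc is the "total" summary.
mkdir -p /tmp/mkdemo && cd /tmp/mkdemo
printf 'all:\n\techo hi\n' > a.mk
printf 'clean:\n\trm -f core\n' > b.mk
wc *.mk | tail -1    # prints the "total" row: lines, words, bytes
```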
GNU make takes only a few seconds to parse this. My experience argues
that parsing time is almost always negligible: if your build is large
enough for parsing time to even be noticeable, the build itself will
be so long as to dwarf it. Of course there's always the case of the
"null build" (where make simply reports that everything is up to
date), but really, if you have (say) an 8-hour build and it takes
(say) 60 seconds to determine that nothing needs to be done, you
should spend those 60 seconds saying "Thank you, Stuart Feldman"
(the original author of make).
David Boyce
_______________________________________________
Help-make mailing list
[email protected]
http://lists.gnu.org/mailman/listinfo/help-make