On 6/21/07, Divakar Venkata (divvenka) <[EMAIL PROTECTED]> wrote:
> We use a non-recursive make system with 1000+ fragmented makefiles all
> over the source code. We have seen make spend a lot of time reading
> these files and calculating the dependency graph before starting the
> build.
How have you measured that? Did you actually trace make's operation? I ask because I don't see any other way to find the dividing point between the reading of the files and the moment when make finishes deciding whether the makefiles are all up to date and starts building targets.

Hmm, that's a thought: with 1000+ makefiles, make has to do quite a bit of work to verify that the makefiles themselves are up to date. If the makefiles are not themselves subject to rebuilding, then you should consider telling make that none of them will ever need rebuilding by putting this:

    $(MAKEFILE_LIST): ;

in the last makefile that's read.
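As a minimal sketch of what that could look like (the fragment names here, rules.mk and module.mk, are invented for illustration):

    # Makefile (top level) -- hypothetical layout
    include rules.mk
    include $(wildcard modules/*/module.mk)

    # This line must come after every include, so that MAKEFILE_LIST
    # has its final value.  The empty recipe gives each makefile read
    # so far an explicit rule, so make skips the implicit-rule search
    # it would otherwise run to try to rebuild each of them.
    $(MAKEFILE_LIST): ;

(MAKEFILE_LIST needs GNU make 3.80 or newer.) Whether this buys you anything depends on whether the time really is going into the remake check; timing a run with and without the rule would settle that.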
> Here, is there a way to instruct gmake to read these fragmented
> makefiles in parallel and create the dependency graph?
Back up a step: what problem are you trying to solve? If it's the speed of make's operation before it starts building targets, perhaps you should check whether your makefiles are simply inefficient.

For example, using ':=' assignments for variables whose values involve functions can often speed things up by reducing the number of times make has to perform a calculation. Reducing the number of files make has to check for, by disabling unneeded pattern rules, can help too. And disabling rebuilding of the makefiles, as mentioned above, cuts out that chunk of work.
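A rough sketch of the first two suggestions (the directory and variable names are made up):

    # ':=' runs the $(shell ...) once, when this line is read; with
    # plain '=' the find command would be re-executed every time
    # SRCS or OBJS was expanded.
    SRCS := $(shell find src -name '*.c')
    OBJS := $(SRCS:.c=.o)

    # Empty the suffix list, disabling all old-style suffix rules...
    .SUFFIXES:

    # ...and cancel built-in pattern rules the build never uses, by
    # restating them with no recipe (this one cancels the built-in
    # rule for assembling .s files).
    %.o : %.s

Running make with -r (--no-builtin-rules) on the command line drops all of the built-in rules at once, which is the blunter version of the same idea.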
Philip Guenther

_______________________________________________
Help-make mailing list
[email protected]
http://lists.gnu.org/mailman/listinfo/help-make