Hi, for a project at the office I wrote a Makefile. We use ClearCase for version control there. We can get a copy of the project onto the local hard drive, and there the Makefile works quite fast and without problems.
We can also get a virtual ("faked") network drive from ClearCase with the project files on it. Both have the same directory structure. But when we try to compile the sources on that network drive, everything takes incredibly long. Previously the project used a quite simple script that unconditionally rebuilt all the sources, and people now complain that it worked better than the Makefile.

Is there a way for me to test what takes so long? Could you give me any hints on what would make sense to measure, and where (by changing the sources) to measure it? I could think of:

- the "stat"s
- the commands started from "make"
- searching for dependencies
- internal calculations

Thanks for any hints,
Torsten.
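
P.S. To make the question more concrete, here is the kind of comparison I have in mind. It is only a rough sketch and assumes GNU make on a Linux-like system with bash and strace available; the output file name is made up for the example.

#!/bin/bash
# Run this once in the local copy and once on the ClearCase drive,
# starting from the same clean tree, then compare the numbers.

# 1. make's own overhead only: reading makefiles, searching for
#    dependencies, stat()ing files. -n prints the commands instead
#    of running them, so the compilers never start.
time make -n > /dev/null

# 2. Count the system calls make itself issues. On current kernels
#    stat() may show up as newfstatat/statx in the summary.
strace -f -c -o make-syscalls.txt make -n > /dev/null

# 3. The full build, to see how much the started commands add.
time make

My idea: if step 1 already takes nearly as long as step 3 on the network drive, the time goes into make's own stats and dependency search; if step 3 is much slower, it is the started commands themselves that suffer there.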
