Hello, I am working on a project which has more than 100M lines of code, more than 100K files, and more than 1000 makefiles. The makefiles apparently have bugs (they may build too little or too much), are slow and messy, and routinely violate "Paul's rules" and common decency. I have been asked to "do something".
Well, it seems that because of the size of the problem, fixing the existing code by hand is not an option. I have an idea: get the full build output from a clean build, and write Perl scripts to process that output into well-designed makefiles. It should be possible to figure out, based on the output, which shell commands are run, what their arguments are, which arguments are options, which are files, which files are generated by other commands, and so forth... and derive the makefiles from that.

Has anybody heard of this kind of thing being tried on a large project? Is it hopeless? Am I missing some fundamental obstacle that makes it impossible in practice? Perhaps somebody has already written scripts to do this kind of thing?

Thank you for any advice,
Mark

PS. Larry Wall says good programmers are lazy. I am not good (not in Larry's intended meaning), but at least I am really lazy (in his intended meaning) :)

_______________________________________________
Help-make mailing list
[email protected]
https://lists.gnu.org/mailman/listinfo/help-make
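For what it is worth, a minimal sketch of the log-scraping idea described above (in Python rather than Perl, and assuming a very simplified log where each line is a single `gcc ... -c src -o obj` invocation; real build logs would need many more cases, e.g. linker lines, generated sources, and multi-line commands):

```python
import re

def log_to_rules(log_lines):
    """Turn 'cc ... -c src.c -o obj.o' lines into (target, prereqs, command)
    tuples. Hypothetical simplification: one compiler invocation per line."""
    rules = []
    for line in log_lines:
        tokens = line.split()
        if "-c" not in tokens or "-o" not in tokens:
            continue  # only handle the simple compile case in this sketch
        target = tokens[tokens.index("-o") + 1]
        # treat anything that looks like a C/C++ source file as a prerequisite
        sources = [t for t in tokens[1:] if re.search(r"\.(c|cc|cpp)$", t)]
        if sources:
            rules.append((target, sources, line.strip()))
    return rules

def emit_makefile(rules):
    """Render the extracted rules as plain make syntax."""
    return "\n".join(
        f"{target}: {' '.join(prereqs)}\n\t{command}\n"
        for target, prereqs, command in rules
    )

log = [
    "gcc -O2 -Wall -c foo.c -o foo.o",
    "gcc -O2 -Wall -c bar.c -o bar.o",
]
print(emit_makefile(log_to_rules(log)))
```

The hard part, of course, is not this transformation but recovering the *implicit* dependencies (headers, generated files) that never appear on the command line; tools in this space typically trace the actual file accesses of the build rather than parse its textual output.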
