On Mon, Jun 04, 2012 at 06:38:46PM +0200, Michael Stahl wrote:
> who cares how big the files are (disk is cheap), the relevant metric is:
> how many seconds does make need to parse them?
> ...
> yes, but keep in mind that variables in the dep files will need to be
> expanded by make, which will likely result in memory allocations, so
> it's an open question whether that will actually improve performance or
> slow it down.

IIRC I made some measurements back then. Though not very scientific,
they suggested it makes no difference at all.
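(For illustration, the idea would be dep file entries along these
lines -- a made-up example, not actual gbuild output, with the gbuild
$(SRCDIR)/$(WORKDIR) variables standing in for the long absolute
paths, so make has to expand a variable for every prerequisite while
parsing the included file:

    # made-up example: a shared prefix variable instead of
    # spelled-out absolute paths on every line
    $(WORKDIR)/CxxObject/sw/source/core/doc/docnew.o : \
        $(SRCDIR)/sw/inc/doc.hxx \
        $(SRCDIR)/sw/inc/swtypes.hxx

versus the same entry with all paths written out in full.)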
> >> - a significant part of .d content is the depend-on-nothing deps created
> >> by -MP; if those were merged into one dedicated .d file, that would save
> >> a lot of space as well; not sure if this is easily doable though
>
> > Aren't we doing that already when merging the .d files for one library?
>
> to some extent yes, but of course the same headers are included in many
> libraries, so there is still some amount of duplication there; however I
> don't know how to improve this without breaking separate building of
> modules, which requires the LinkTarget .d files to be self-contained.
>
> > So right now, I consider the topic premature optimization until proven
> > otherwise.
>
> > (*) which you won't unless you gzip them (which is doable and shouldn't
> > have too big of a performance impact)
>
> sadly AFAIK make cannot include compressed files...

We never include the dep-files of the objects ($(WORKDIR)/Dep/CxxObject),
only the concatenated per-library output ($(WORKDIR)/Dep/LinkTarget),
right?

Best,

Bjoern
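P.S. To make that flow concrete, a rough sketch (names made up, not
the actual gbuild rules): each object gets its own dep file, with -MP
adding an empty "depend-on-nothing" target per header so that a
deleted header doesn't break the build; those per-object files are
only ever concatenated into one per-library file, and only the merged
file is read by make:

    # per-object dep file as emitted by the compiler with -MP:
    #
    #   foo.o: foo.cxx foo.hxx bar.hxx
    #   foo.hxx:
    #   bar.hxx:
    #
    # merge the per-object dep files into one per-library file;
    # $(foolib_OBJECT_DEPFILES) is a made-up variable name
    $(WORKDIR)/Dep/LinkTarget/foolib.d : $(foolib_OBJECT_DEPFILES)
    	cat $^ > $@

    # only the merged per-library file is ever included
    -include $(WORKDIR)/Dep/LinkTarget/foolib.d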