On Sun, 6 Dec 2009 22:13:47 +0100, Thorsten Behrens <[email protected]> wrote:
> That's not what I meant. The reason to also hand over a specific
> dmake option for multi-processor builds is that sometimes only one
> or two directories have their actual dependencies satisfied & are
> thus buildable. So it would be ~trivial to only hand "-P16" over to
> build.pl, and let it distribute that across the active dmake
> instances.

No, that's not trivial: build.pl cannot notice the current bottlenecks until it is too late (unless build.pl kills running dmake processes for some directories and restarts them, which would hardly improve performance and might trigger additional subtle, hard-to-debug bugs in the build system). Bottlenecks happen when a big module with many dependents -- like svx -- is still building and all unbuilt modules depend on it, directly or indirectly. By the time build.pl notices that it cannot start another dmake because of the dependencies on svx, the dmakes in svx have already been running for a long time (with little parallelization), and the only way to give them more parallelization would be to kill and restart them. It is an inherent problem of the current architecture.

> > > - what kind of dependency tracking is missing in the current
> > > system?
> > Those that bite you on compatible builds.
>
> Ah. That's what I thought. So nothing inherently missing in
> dmake/build.pl, but "just" bugs in the makefiles.

Yes, and going to the moon is "just" going up long enough ... ;-)
And it's not only the makefiles; the build.lst and d.lst files are far more evil.

> Sorry, but
> http://hg.services.openoffice.org/hg/cws/gnu_make/raw-file/2518db232510/sw/prj/target_lib_msword.mk
> then leaves a lot to be desired.

Like, what?
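For contrast, a single GNU make instance that holds the whole dependency graph does not have this scheduling problem: with file-level dependencies it can start work across module boundaries as soon as the individual prerequisites exist. A minimal sketch of the idea (all file and library names here are hypothetical, not taken from the actual tree):

```make
# Hypothetical sketch: cross-module, file-level dependencies in one
# GNU make instance.  With 'make -j16', an object in sw/ that only
# needs a header from svx/ can be compiled while the rest of svx/ is
# still building -- no outer scheduler has to guess where to spend
# the 16 job slots.
all: sw/libmsword.so svx/libsvx.so

sw/msword.o: sw/msword.cxx svx/inc/svxids.hxx
	$(CXX) -c -o $@ $<

sw/libmsword.so: sw/msword.o svx/libsvx.so
	$(CXX) -shared -o $@ $<
```

This is exactly what per-directory dmake processes cannot do, because each one only sees its own directory's rules.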
> There's still loads of redundancy -

I cannot see a lot of redundancy there, unless one is willing to reorganize quite a bit of the files in the source tree:

- clean up the weird include paths in sw
- for each shared library, keep all files in one directory

That would allow the source files to be collected implicitly (although collecting source files dynamically might be a performance issue for complete (re-)builds). I wanted the prototype to work without major refactorings of the source itself, which is actually _harder_ to achieve than letting the new build system completely reorder the files in the source tree. That does not mean we should not do such bigger changes per se, just that we do not _have_ to.

> Yes. And? It's still significantly lowering the barrier of entry.
> Loads of other projects do it (with the same limitation).

A digression beforehand: I guess we are mostly talking about Windows here; makefiles are the "native build system" on pretty much all *nix systems. On Windows, anyone who gets all the prerequisites set up right will not even notice any difficulty living without read-only project files.</digression>

Other than that, I would really like to hear the opinions of some release and QA engineers on this, because I am only guessing at their opinion. We already have more than enough issues with (non-)reproducible builds (whether from rebuilds that differ slightly from a complete build, platform-specific quirks, or other reasons). When I talked with release engineers and QA engineers, I got the feeling they were not interested in asking back on every issue which build environment the developer had used to make a build (or worse: QA waves a cws through because the tested builds look fine, but when building the "official" way in releng, subtle issues show up on the master). Another issue with using an external project generator is versioning.
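To illustrate the "collect the source files implicitly" point: if each shared library kept all its files in one directory, a makefile could gather them with a wildcard instead of a hand-maintained list. A minimal sketch (the directory path and variable names are hypothetical):

```make
# Hypothetical sketch: with one directory per shared library, the
# source list no longer has to be spelled out by hand.  Note that
# $(wildcard) is evaluated on every make invocation, which is the
# performance concern for complete (re-)builds mentioned above.
msword_SOURCES := $(wildcard sw/source/filter/ww8/*.cxx)
msword_OBJECTS := $(msword_SOURCES:.cxx=.o)
```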
If, for example, we used cmake to generate Visual Studio project files, we might find different versions creating different project files (maybe even in best faith, through bugfixes) and thus causing subtle bugs whenever the project generator is updated. Also, an external project generator using an abstraction as high-level as cmake's might develop in a direction that diverges from what we would like to do. Of course, we could require that only one old version be used, or fork it -- but then we are no better off than we currently are with dmake. So a generated project file might produce subtly different builds, is dependent on the version of the generator, and is read-only. Since you can "generate a project from existing code" trivially in NetBeans, Visual Studio and others, and can just as well kick off a makefile-based build from all modern IDEs, what exactly is gained by generated project files? I do not see at all how this lowers the barrier to entry: it just introduces another concept for a newcomer to learn, without making life easier in any relevant way. Nothing is more confusing to a newcomer than leaky abstractions(1).

> So stepping back a bit, I should probably mention (again) that I
> find the general idea really great - the make system is truly
> arcane & leaves a lot to desire. But despite your initial request for
> input, plans & thoughts, I cannot but have the impression you're
> already quite determined to follow the outlined route - sadly the
> build system has seen quite a few attempts for change; and survived
> all of them ~unmodified, thus some extra thought & consultation is
> surely advisable ...

Well, you won't wake up tomorrow and find OOo using a new build system. The work done in gnu_make is a prototype, and a lot was certainly learned while creating it. "Extra thought and consultation" are of course still needed, and we are in no hurry, but in the end there will be a point where -- as Linus said -- it comes down to: "Talk is cheap. Show me the code."
;-).

> From my humble point of view, what has usually worked best in OOo
> land is some iterative approach to change; which in this case & my
> opinion would mean cleaning up makefiles one by one, either using a
> declarative DSL directly that could later be mapped to gnu make or
> whatever tool we see fit, or - using a meta-language like automake,
> cmake, or something homegrown, *still* use dmake for the while, and
> then, after some critical mass has been attained, switch the make
> tool wholesale (and adapt the metalang-to-makefile generator).

When discussing possible migration paths, we considered migrating on a module-by-module basis: keeping build.pl while replacing dmake in one module after the other, and finally getting rid of build.pl.

> Additionally, and since you mentioned the desire to have only one
> make instance - last time someone tried to have gnu make hold all of
> OOo's dependency tree in one process, that guy (Kai Backman) ended
> up with absolutely pathetic performance & ridiculous mem usage.

I did more than a few scalability tests with GNU make. There is a magic barrier on the number of rules it can handle (above ~10000, scalability grinds to a halt), but rarely one on the number of targets. Unfortunately, we will only really know in the end (in theory, theory and practice are the same; in practice, they are not). But the four modules in the prototype already cover a significant part of the source files and probably a major part of the header targets (via #include).

Best Regards,

Bjoern

(1) http://www.joelonsoftware.com/articles/LeakyAbstractions.html

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
