Hi!

Just a few comments.

Jens-Heiner Rechtien wrote:
> Hi Ause,
> 
> thanks for breaking down the numbers to see which change has which
> effect on build times.
> 
> For completeness I did another experiment from a slightly different
> angle, but the results seem to be in good agreement with your numbers.
> 
> ======================================================================
> 
> It's known that the cygwin/tcsh/guw/dmake combination isn't exactly the
> fastest on earth, which is why everyone who owns a 4NT shell still seems
> to prefer building with 4NT. Actually it's not 4NT itself that is
> preferred; any shell which doesn't impose the severe speed penalties of
> tcsh on cygwin will do (MSYS based bash comes to mind, or

Well, I guess ause already used bash on cygwin (it's a few percent faster
than tcsh). The biggest performance hog was guw.pl (really slow regex
replacements on *every* command), and using a compiled executable for
this task helped a lot. I'll do a comparison build with 4NT soon, but
cygwin's dmake also has an advantage: its parallel mode. The native W32
dmake version cannot use the -Px option, and I usually build with
`build -- -P2`.
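To give an idea of what guw does on every single command - and why doing
it with interpreted regexes hurts - here is a minimal sketch of the idea
in Python (not the actual guw code; the real tool handles more path forms
and quoting):

    import re, subprocess, sys

    POSIX_PATH = re.compile(r'/cygdrive/([a-z])(/\S*)?')

    def to_win(arg):
        # rewrite /cygdrive/c/foo -> c:/foo; everything else passes through
        m = POSIX_PATH.fullmatch(arg)
        if m:
            return m.group(1) + ':' + (m.group(2) or '/')
        return arg

    # translate every argument, then hand off to the native W32 tool;
    # this runs once per build command, so per-invocation cost matters
    cmd = [to_win(a) for a in sys.argv[1:]]
    sys.exit(subprocess.call(cmd))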

> the native command shell of WINXP which is much improved over the DOS
> shell of Windows98 years ago).
*shudder* :)

On the positive side we currently have a lot of changes in the pipeline
to speed up the build significantly - that's good!

  Volker

> 
> I also wanted to get a feeling for which improvements are to be expected
> from jam itself in comparison to dmake, given a more sensible tool
> combination than cygwin/tcsh/guw and without the optimizations which are
> independent of jam.
> 
> An easy way to get a first ballpark number is to wrap every actual
> execution of the Windows compiler with a timer and append the compile
> timings to a log file. Subtracting the sum of the compile times from the
> total build time gives an upper limit on the percentage of the build
> time that jam can optimize.
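For illustration, a wrapper along these lines is enough (just a sketch,
not Heiner's actual script; the log file name is made up):

    import subprocess, sys, time

    LOG = 'compile_times.log'   # hypothetical log location

    start = time.time()
    rc = subprocess.call(['cl.exe'] + sys.argv[1:])   # the real compiler call
    elapsed = time.time() - start

    # append one line per compile; summing the first column afterwards
    # gives the total compile time to subtract from the build time
    with open(LOG, 'a') as f:
        f.write('%.3f  %s\n' % (elapsed, ' '.join(sys.argv[1:])))

    sys.exit(rc)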
> 
> I did this for building module "sc" in a completely local environment
> with module-wide dependencies, which corresponds to the RE
> recommendation for doing Windows builds inside the Hamburg environment.
> In other words, this is the setup most Sun developers will use when
> doing extensive OOo builds on Windows. I did a full build inside the
> module, removing the "sc" output trees ("common.pro" and "wntmsci10.pro")
> beforehand.
> 
> The total build time on my PIV 1800 was 68.82 minutes. The added-up
> compile time of 589 files was 48.14 minutes, representing about 70% of
> the total build time. Compiling is the major part of the total build
> time (at least for module sc), which isn't that surprising. The
> remaining 30% covers all the other activities: linking, building
> resources, creating dependencies, time spent in dmake, the overhead of
> this measurement, etc. It's clear that jam can only improve on this 30%,
> and since linking and building resources will still be necessary, really
> only on a part of these remaining 30%.
> 
> Creating dependencies is a major part of the remaining 30%, no doubt,
> and there is a significant inefficiency in the current build system
> regarding the creation of this dependency information. This has been
> fixed with the upcoming CWS ause060. I was interested in the effect of
> fixing this inefficiency, so I merged the changes of ause060 and did the
> build again. This time I got: 62.49 min total build time and 48.16 min
> compile time. The compile time stayed the same (as it should) but the
> total build time was cut by 6 min, or 10%. The compile time is now 77%
> of the total time, leaving jam only 23% of the build time to improve on,
> and probably significantly less if you consider that building resources
> and linking are still in there.
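Just to spell out the arithmetic behind the two percentages:

    48.14 min / 68.82 min = 0.70  ->  ~70% compile share before ause060
    48.16 min / 62.49 min = 0.77  ->  ~77% compile share after ause060

So the roughly 6 min saved by ause060 comes entirely out of the
non-compile part of the build.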
> 
> My conclusion: Jam has at most a 10-20% advantage (maybe less) over a
> dmake based build system if the latter is used with a decent shell and
> the same compile optimizations are used (which is possible, as Ause
> showed).
> 
> ======================================================================
> 
> Tschau,
>    Heiner
> 
> 
> Hans-Joachim Lankenau wrote:
>> hi!
>>
>> i've been playing for several days now with the prototype of the jam
>> buildsystem and would like to share my findings so far.
>>
>> the main thing i've learned is to have a look at the build output
>> instead of focusing on build times only. more on this later...
>>
>> all numbers in this text are from my dedicated PIII 1800, a local OOo
>> build environment with bash (m181) and complete dependencies, doing a
>> complete build of sc, unless stated otherwise.
>> note: complete dependencies seem to be somehow broken at the moment but
>> some simple checks seem to indicate that they are broken in a way that
>> doesn't affect build times.
>>
>> a first vanilla build gave me the following numbers (don't pay too much
>> attention to the seconds...):
>>
>> 158m01s - regular dmake build
>>  55m22s - plugged in jam
>>  29m21s - pure jam
>>
>> "plugged in jam" means build all targets jam is currently able to and
>> start a regular build for the remaining. of the targets required in the
>> module sc, currently the jam buildsystem isn't able to create .lib, .dll
>> and .res files.
>>
>> although not building all targets, the numbers for a pure jam build were
>> quite impressive.
>>
>> next thing i did was fetching some already existing optimizations from
>> the CWS vq35 (guw.exe, "-spawn" dmake) and ause060 (batched makedepend).
>> this brought down the dmake build times to
>>
>>  81m08s - dmake, -spawn, guw.exe, fast deps
>>
>> ok, not enough yet.
>>
>> since the jam build system already implements two optimizations that
>> are build system independent, PCH (precompiled headers) and batched
>> compiling, i tried to hack PCH usage into dmake (just for measurement;
>> a clean implementation has to make sure that the used compiler options
>> match).
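For context: "batched compiling" means handing the compiler several
sources in one invocation instead of starting one compiler process per
file, so the startup cost is paid once per batch. A minimal sketch of the
idea (source names and batch size are made up):

    import subprocess

    # hypothetical list of sc sources, for illustration only
    sources = ['docsh.cxx', 'docsh2.cxx', 'table1.cxx', 'table2.cxx']
    BATCH = 2   # hypothetical batch size

    # cl.exe accepts several source files per invocation, so compiler
    # startup is paid once per batch instead of once per file
    for i in range(0, len(sources), BATCH):
        subprocess.check_call(['cl.exe', '/c'] + sources[i:i+BATCH])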
>>
>> while doing so, i noticed that the current jam prototype doesn't care
>> too much about compile switches and the PCH implementation is broken by
>> concept :(. so the resulting output of the current jam build is good
>> for measurement at best.
>>
>> using the broken PCH implementation for measurement (it may be fixable
>> for the majority of object files but adds yet another bit of
>> complexity, regardless of jam or dmake), the times went down a bit more:
>>
>>  47m17s - dmake with hacked PCH
>>
>> taking into account that jam doesn't compile with optimization, and
>> restricting the dmake build to the targets a pure jam build is able to
>> do, i got:
>>
>>  40m50s - almost comparable dmake build
>>
>> this build is still not using batched compiling which will give yet
>> another speedup...
>>
>>
>> regarding reduced complexity, my current impression is that the jam
>> build system is somehow "oversimplified", to a degree where the output
>> is unusable without a major rewrite. the whole gathering of compiler
>> switches and defines boils down to an almost static compiler line. the
>> PCH usage doesn't care if the precompiled files are compiled with the
>> same options, thus producing inconsistently compiled object files.
>> this of course gives really simple Jamfiles and also a bit of speedup.
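For what it's worth, guarding against mismatched options doesn't take
much. A minimal sketch of such a check (the stamp-file scheme and all
names here are my invention; neither build system actually works this
way):

    import os

    def pch_usable(stamp_file, flags):
        # the stamp file records the options the PCH was built with;
        # reuse the PCH only when the current options match exactly
        if not os.path.exists(stamp_file):
            return False
        with open(stamp_file) as f:
            return f.read().strip() == ' '.join(flags)

    flags = ['/O2', '/DWIN32', '/EHsc']       # hypothetical option set
    if not pch_usable('sc.pch.flags', flags):
        print('PCH stale or missing - rebuild it before compiling')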
>>
>> i'm still fighting not to compare apples with oranges, but my current
>> feeling is that using jam instead of build/dmake will give a speedup of
>> 10-20 percent at best.
>>
>> tschau...
>>
>> ause
>>
>>
>>
> 
> 

-- 
= http://wiki.services.openoffice.org/wiki/Debug_Build_Problems  =
PGP/GPG key  (ID: 0x9F8A785D)  available  from  wwwkeys.de.pgp.net
key-fingerprint 550D F17E B082 A3E9 F913  9E53 3D35 C9BA 9F8A 785D
