Re: libtool performance status
* Bob Friesenhahn wrote on Tue, Apr 22, 2008 at 05:18:34PM CEST:
> On Tue, 22 Apr 2008, Ralf Wildenhues wrote:
>>
>> Can you please check whether
>>   libtool --tag=CXX --config
>> is identical for each of the builds you timed?
>
> It is really not as easy as you presume for me to relibtoolize
> GraphicsMagick for testing.  I don't maintain many different autotools
> installs, and with 70+ builds simultaneously sharing one source tree,
> such changes become tedious and time consuming.

Here's a cheap trick: unless you have extravagant ways of configuring
libtool, you can just override LIBTOOL:

  make clean all LIBTOOL=/my/libtool-2.2.2/libtool
  make clean all LIBTOOL=/my/libtool-current/libtool

The libtool scripts often don't even have to be installed.

Cheers,
Ralf
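Ralf's trick works because make command-line assignments override the
Makefile's own LIBTOOL setting.  A minimal self-contained sketch (the
Makefile is a toy stand-in for a configured tree, and the libtool paths
are placeholders):

```shell
#!/bin/sh
# Sketch: show that a command-line LIBTOOL= override reaches the build
# rules, so two libtool versions can be timed without relibtoolizing.
set -e
dir=$(mktemp -d)
trap 'rm -rf "$dir"' EXIT
# Toy Makefile standing in for a configured GraphicsMagick tree; its
# one rule just reports which libtool it would invoke.
printf 'LIBTOOL = ./libtool\nall:\n\t@echo "using: $(LIBTOOL)"\n' > "$dir/Makefile"
( cd "$dir" && make -s all LIBTOOL=/my/libtool-2.2.2/libtool )
( cd "$dir" && make -s all LIBTOOL=/my/libtool-current/libtool )
```

In a real session each make invocation would be `make clean all` under
`time`, once per libtool build being compared.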
Re: libtool performance status
On Tue, 22 Apr 2008, Ralf Wildenhues wrote:
> To put some sort of proof to my claims, here's what I get building GM
> unoptimized (CFLAGS=-g) on GNU/Linux with Libtool 2.2.2 and current
> master (all timings best of three):

[ stuff removed ]

> Looks like a small but definite improvement to me.  :-)

Good!  With libtool 2.2.X I am really not noticing all that much
overhead for users to complain about.  There is not enough time to make
a mad dash to the coffee machine, much less make it back, in the time
spent by libtool.  Many of the things that libtool does are necessary.
Even a total dolt could eventually come to realize this.

Note that I have not tested on the XO laptop (http://www.laptop.org/) to
know what the actual impact is on children using libtool in
disadvantaged countries with pedal or solar power.  I know that the FSF
does support this project, since they had an XO laptop in their booth.

> Also note that there is only 1.05s of unaccounted-for elapsed time;
> the lower page fault count is also a pretty good indicator that things
> have not gotten worse.

Probably just a difference in how the OS performs its accounting.

> Hmm.  That may or may not be libtool's fault, though; linking in
> itself isn't so cheap, I/O-wise.
>
> Can you please check whether
>   libtool --tag=CXX --config
> is identical for each of the builds you timed?

It is really not as easy as you presume for me to relibtoolize
GraphicsMagick for testing.  I don't maintain many different autotools
installs, and with 70+ builds simultaneously sharing one source tree,
such changes become tedious and time consuming.

Bob
--
Bob Friesenhahn
[EMAIL PROTECTED], http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,    http://www.GraphicsMagick.org/
Re: libtool performance status
* Bob Friesenhahn wrote on Mon, Apr 21, 2008 at 05:14:00PM CEST:
> On Mon, 21 Apr 2008, Ralf Wildenhues wrote:
>>
>> If they were lower in between, and since increased, there must have
>> been at least one regression along the way.  Can you use git bisect
>> to identify one?
>
> Remember that you fixed an expr-related bug which was impacting
> FreeBSD and causing an error message to be displayed rather than code
> being executed.  Perhaps this fix adds a small cost?

Unlikely.  It removes a fork&exec and adds a pattern match and a
${foo#bar} substitution.  Both of the latter should be pretty quick
compared with the former.

To put some sort of proof to my claims, here's what I get building GM
unoptimized (CFLAGS=-g) on GNU/Linux with Libtool 2.2.2 and current
master (all timings best of three):

2.2.2:
110.31user 30.73system 2:21.01elapsed 100%CPU (0avgtext+0avgdata 0maxresident)k
0inputs+0outputs (0major+5714405minor)pagefaults 0swaps

master:
107.42user 28.04system 2:16.51elapsed 99%CPU (0avgtext+0avgdata 0maxresident)k
0inputs+0outputs (0major+5069147minor)pagefaults 0swaps

Looks like a small but definite improvement to me.  :-)

Also note that there is only 1.05s of unaccounted-for elapsed time; the
lower page fault count is also a pretty good indicator that things have
not gotten worse.

To break things down further, I recorded (with make -n) most libtool
calls, split them into --mode=compile and --mode=link, put them in a
shell script, and added '-n --silent' so we only measure libtool script
overhead.
compile mode 2.2.2:
5.72user 2.30system 0:07.93elapsed 101%CPU (0avgtext+0avgdata 0maxresident)k
0inputs+0outputs (0major+537316minor)pagefaults 0swaps

compile mode master:
4.95user 1.21system 0:06.14elapsed 100%CPU (0avgtext+0avgdata 0maxresident)k
0inputs+0outputs (0major+292254minor)pagefaults 0swaps

link mode 2.2.2:
12.11user 7.92system 0:19.74elapsed 101%CPU (0avgtext+0avgdata 0maxresident)k
0inputs+0outputs (0major+1662149minor)pagefaults 0swaps

link mode master:
11.91user 7.10system 0:18.94elapsed 100%CPU (0avgtext+0avgdata 0maxresident)k
0inputs+0outputs (0major+1450595minor)pagefaults 0swaps

Again not much, but consistently better (and much better than 1.5.x).

> Since only 54% of the time is attributed to user+system time, the rest
> of the time must be spent doing things like moving the disk drive
> heads, waiting for I/O, servicing interrupts, or running other
> programs (none in this case).

Hmm.  That may or may not be libtool's fault, though; linking in itself
isn't so cheap, I/O-wise.

Can you please check whether
  libtool --tag=CXX --config
is identical for each of the builds you timed?

Thanks,
Ralf
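The fork&exec-versus-expansion point above can be illustrated in a few
lines: stripping a prefix with the POSIX ${foo#bar} parameter expansion
yields the same answer as spawning expr, without any subprocess.  The
example string and pattern are illustrative, not the actual libtool
code:

```shell
#!/bin/sh
# Toy illustration of the kind of substitution discussed above.
foo="-Wl,-rpath -Wl,/usr/local/lib"
# Old style: one fork&exec of expr per call (the leading X guards
# against strings that expr would parse as options).
via_expr=$(expr "X$foo" : 'X-Wl,\(.*\)')
# New style: pure shell parameter expansion, no subprocess.
via_param=${foo#-Wl,}
echo "$via_expr"     # -rpath -Wl,/usr/local/lib
echo "$via_param"    # -rpath -Wl,/usr/local/lib
```

Repeated thousands of times over a large build, avoiding the expr
process per call is where the saving would come from.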
Re: libtool performance status
On Mon, 21 Apr 2008, Ralf Wildenhues wrote:
> If they were lower in between, and since increased, there must have
> been at least one regression along the way.  Can you use git bisect to
> identify one?

Remember that you fixed an expr-related bug which was impacting FreeBSD
and causing an error message to be displayed rather than code being
executed.  Perhaps this fix adds a small cost?

Since only 54% of the time is attributed to user+system time, the rest
of the time must be spent doing things like moving the disk drive heads,
waiting for I/O, servicing interrupts, or running other programs (none
in this case).

Bob
--
Bob Friesenhahn
[EMAIL PROTECTED], http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,    http://www.GraphicsMagick.org/
Re: libtool performance status
Hi Bob,

* Bob Friesenhahn wrote on Mon, Apr 21, 2008 at 12:55:08AM CEST:
> I am saddened to report that as of yesterday, build times for git
> libtool are now similar to libtool 2.2.2.  This is not to say that
> there has not been progress.  The reported system times have gone down
> (from 113 to 104) but the overall build times have crept back up so
> that they are similar to before.  This means that users end up waiting
> just as long for the build to complete.

If they were lower in between, and since increased, there must have been
at least one regression along the way.  Can you use git bisect to
identify one?

Thanks,
Ralf
libtool performance status
Since libtool performance has become a concern, I have been measuring
build times for a reference project in order to evaluate progress.  I
selected a project at random and ended up with one called
'GraphicsMagick'.  I am testing the build on a FreeBSD 7.0 system with
two 2.4GHz Intel Xeon CPUs.

I am saddened to report that as of yesterday, build times for git
libtool are now similar to libtool 2.2.2.  This is not to say that there
has not been progress.  The reported system times have gone down (from
113 to 104), but the overall build times have crept back up so that they
are similar to before.  This means that users end up waiting just as
long for the build to complete.

I have attached the raw data, which includes some timings that I did
with FreeBSD's /bin/sh and ksh93.

Bob
--
Bob Friesenhahn
[EMAIL PROTECTED], http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer,    http://www.GraphicsMagick.org/

PATH=/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/usr/games:/usr/local/sbin:/usr/X11R6/bin:.
/home/bfriesen/src/graphics/GraphicsMagick-head/configure 'CFLAGS=-g -O' \
  'CXXFLAGS=-g -O' '--enable-libtool-verbose' '--with-quantum-depth=16' \
  --enable-shared '--with-modules' --without-perl

Libtool 2.2.2:
gmake -j 2  383.14s user 112.49s system 184% cpu 4:28.39 total
gmake -j 2  387.55s user 112.91s system 184% cpu 4:31.33 total
gmake -j 2  381.21s user 113.93s system 184% cpu 4:27.80 total
gmake -j 2  383.57s user 110.94s system 184% cpu 4:28.25 total

ltmain.sh (GNU libtool 1.2634 2008/04/11 17:21:54) 2.2.3a:
gmake -j 2  373.45s user 104.48s system 183% cpu 4:19.99 total
gmake -j 2  366.93s user 106.25s system 182% cpu 4:19.03 total
gmake -j 2  375.07s user 104.58s system 183% cpu 4:21.72 total
gmake -j 2  373.90s user 104.60s system 183% cpu 4:20.25 total

Using CONFIG_SHELL=/bin/sh:
gmake -j 2  345.53s user 82.93s system 180% cpu 3:56.79 total
gmake -j 2  356.35s user 84.13s system 179% cpu 4:05.26 total

Using CONFIG_SHELL=/usr/local/bin/ksh93:
gmake -j 2  350.80s user 88.18s system 178% cpu 4:05.52 total
gmake -j 2  349.59s user 87.31s system 179% cpu 4:03.20 total

ltmain.sh (GNU libtool 1.2960 2008-04-19) 2.2.3a:
gmake -j 2  381.18s user 102.72s system 184% cpu 4:22.74 total
gmake -j 2  381.66s user 103.63s system 182% cpu 4:25.81 total
gmake -j 2  383.43s user 104.83s system 182% cpu 4:27.74 total
gmake -j 2  384.38s user 104.25s system 182% cpu 4:27.78 total
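As a sanity check on the raw data, the "% cpu" column in these zsh-style
time lines is simply (user + system) / elapsed.  A sketch that reproduces
it from the first Libtool 2.2.2 row (4:28.39 converted to 268.39s):

```shell
#!/bin/sh
# Recompute the %CPU figure from the first 2.2.2 timing row above:
# 383.14s user, 112.49s system, 4:28.39 (268.39s) elapsed.
user=383.14 system=112.49 elapsed=268.39
cpu=$(awk -v u="$user" -v s="$system" -v e="$elapsed" \
      'BEGIN { printf "%d", (u + s) / e * 100 }')
echo "${cpu}% cpu"    # 184% cpu, matching the reported figure
```

Values near 200% would mean both Xeon CPUs fully busy, so 178-184%
leaves some headroom that shows up as elapsed time not attributed to
user or system.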