Re: [Scons-dev] Mini announcement: v2.4 is near...
I almost wonder whether it's related to the Ubuntu rpmbuild setup, since rpmbuild is native to the RHEL systems (Fedora, CentOS), but Ubuntu (Debian, etc.) uses apt and dpkg.

On Fri, Aug 7, 2015 at 3:20 AM, Dirk Bächle tshor...@gmx.de wrote:

> William,
>
> On 07.08.2015 03:39, William Blevins wrote:
>> I ran that test in a loop for 10m or so and never got a failure, though it might only happen when you thread it with other tests? I see two potential issues:
>>
>> 1. WhereIs('rpm') vs. WhereIs('rpmbuild'); those two processes have been split out for a very long time.
>> 2. If rpm_build_root is not unique, then it could conflict with the other rpmbuild tests.
>>
>> I tried running all the rpm tests with -j6 in a loop; again there were no errors, so I don't know for sure. Do you get a stack trace or something?
>
> There is no stack trace; the test fails because an update is triggered within the build when there should be none (see below). But this happens only spuriously... calling the single test seems to lower the frequency of failure, while running all rpm tests makes it occur more often (can't back this up with data right now, just a first impression).
>
> Dirk
>
> ==
>
> dirk@ubuntu:~/workspace/scons_commit$ python runtest.py test/packaging/rpm
> 1/6 (16.67%) /usr/bin/python -tt test/packaging/rpm/cleanup.py
> STDOUT =
> 1,6c1,6
> scons\:\ Reading\ SConscript\ files\ \.\.\.\
> scons\:\ done\ reading\ SConscript\ files\.\
> scons\:\ Building\ targets\ \.\.\.\
> scons\:\ \`\.\'\ is\ up\ to\ date\.\
> scons\:\ done\ building\ targets\.\
> .*
> ---
> scons: Reading SConscript files ...
> scons: done reading SConscript files.
> scons: Building targets ...
> tar -zc -f foo-1.2.3.tar.gz foo-1.2.3/SConstruct foo-1.2.3/src/main.c foo-1.2.3/foo-1.2.3.spec
> TAR_OPTIONS=--wildcards LC_ALL=C rpmbuild -ta --buildroot /tmp/testcmd.3749._NfA8E/rpm_build_root /tmp/testcmd.3749._NfA8E/foo-1.2.3.tar.gz
> scons: done building targets.
> FAILED test of /home/dirk/workspace/scons_commit/src/script/scons.py
>   at line 605 of /home/dirk/workspace/scons_commit/QMTest/TestCommon.py (_complete)
>   from line 701 of /home/dirk/workspace/scons_commit/QMTest/TestCommon.py (run)
>   from line 390 of /home/dirk/workspace/scons_commit/QMTest/TestSCons.py (run)
>   from line 427 of /home/dirk/workspace/scons_commit/QMTest/TestSCons.py (up_to_date)
>   from line 88 of test/packaging/rpm/cleanup.py
> 2/6 (33.33%) /usr/bin/python -tt test/packaging/rpm/explicit-target.py PASSED
> 3/6 (50.00%) /usr/bin/python -tt test/packaging/rpm/internationalization.py PASSED
> 4/6 (66.67%) /usr/bin/python -tt test/packaging/rpm/multipackage.py PASSED
> 5/6 (83.33%) /usr/bin/python -tt test/packaging/rpm/package.py PASSED
> 6/6 (100.00%) /usr/bin/python -tt test/packaging/rpm/tagging.py PASSED
> Failed the following test:
>   test/packaging/rpm/cleanup.py
> dirk@ubuntu:~/workspace/scons_commit$

_______________________________________________
Scons-dev mailing list
Scons-dev@scons.org
https://pairlist2.pair.net/mailman/listinfo/scons-dev
Re: [Scons-dev] Mini announcement: v2.4 is near...
Run the test with PRESERVE=1, then go to the created dir and run scons --debug=explain?

On Fri, Aug 7, 2015 at 7:20 AM, William Blevins wblevins...@gmail.com wrote:

> Would it be wise to try something like changing that line to a build with --explain, so we might be able to figure out why SCons thinks it's out of date?
>
> On Fri, Aug 7, 2015 at 10:18 AM, William Blevins wrote:
>> I almost wonder whether it's related to the Ubuntu rpmbuild setup, since rpmbuild is native to the RHEL systems (Fedora, CentOS), but Ubuntu (Debian, etc.) uses apt and dpkg.
>>
>> On Fri, Aug 7, 2015 at 3:20 AM, Dirk Bächle tshor...@gmx.de wrote:
>>> There is no stack trace; the test fails because an update is triggered within the build when there should be none. But this happens only spuriously... calling the single test seems to lower the frequency of failure, while running all rpm tests makes it occur more often (can't back this up with data right now, just a first impression).
>>>
>>> Dirk
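The rerun-and-inspect workflow suggested above can be sketched as a small helper; PRESERVE and runtest.py are the real SCons test-harness knobs mentioned in the thread, while the helper function itself is hypothetical:

```python
import os

def preserved_rerun(test_path):
    """Build the environment and command line for rerunning a flaky test
    with PRESERVE=1, so the temporary test directory is kept afterwards
    and can be inspected with `scons --debug=explain`."""
    env = dict(os.environ)
    env["PRESERVE"] = "1"  # tell the test harness not to delete its temp dir
    cmd = ["python", "runtest.py", test_path]
    return env, cmd

env, cmd = preserved_rerun("test/packaging/rpm/cleanup.py")
```

After the failing run, change into the preserved directory printed by the harness (e.g. a /tmp/testcmd.* path) and run scons --debug=explain there to see why SCons considered the target out of date.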
Re: [Scons-dev] Mini announcement: v2.4 is near...
William,

On 06.08.2015 20:59, William Blevins wrote:
> Dirk, I don't see information on the hardware and/or threads used for your profile attachment. Also, it is interesting that the update time slightly increased. I assume this is a side-effect of the lazy loading overhead. I am not worried about it though. Good work :)

Thanks for your feedback. I could list things like OS/machine/RAM... but this is all about the relative comparison of one commit to the other. In both runs the same machine and settings were used... does that help, or do you need more information?

@all: I just pushed a new commit fixing a few tests/tools that were still accessing the Node attributes directly, mainly .abspath and .path. This doesn't usually show up, because the __getattr__ fallback is in place in Node/FS.py... but I like to have the core sources clean.

During testing I found a problem with test/packaging/rpm/cleanup.py. It will usually PASS, but every now and then, when repeatedly calling "python runtest.py test/packaging/rpm/cleanup.py", it fails. I couldn't find an underlying scheme yet and will try further, but is anyone else seeing this behaviour on their machine... or has an idea what could go wrong?

Regards,
Dirk
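For readers wondering how direct accesses like .abspath can keep working after the attributes moved: below is a minimal, hypothetical sketch of a __getattr__ fallback of the kind mentioned for Node/FS.py. The real SCons code differs in detail; this only illustrates the mechanism.

```python
class Node:
    # With __slots__, instances have no __dict__, so old-style direct
    # attribute accesses would normally raise AttributeError. A __getattr__
    # fallback can forward them to wherever the values now live.
    __slots__ = ('_attrs',)

    def __init__(self, abspath):
        self._attrs = {'abspath': abspath}  # hypothetical storage

    def get_abspath(self):
        return self._attrs['abspath']

    def __getattr__(self, name):
        # Only called when normal lookup fails, i.e. for legacy accesses.
        try:
            return self._attrs[name]
        except KeyError:
            raise AttributeError(name)

n = Node('/tmp/foo')
assert n.abspath == n.get_abspath()  # legacy direct access still works
```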
Re: [Scons-dev] Mini announcement: v2.4 is near...
On Thu, Aug 6, 2015 at 6:01 PM, Dirk Bächle tshor...@gmx.de wrote:

> Thanks for your feedback. I could list things like OS/machine/RAM... but this is all about the relative comparison of one commit to the other. In both runs the same machine and settings were used... does that help, or do you need more information?

I was curious about the thread count due to the slight reduction in full build time. I imagine that savings grows with the number of threads, since the bottleneck here was the taskmaster forking new threads (and the memcopy happening per fork). I was just curious because I noticed a small savings in my builds at work.

We already have posix spawn wrapper changes on the list of upcoming work, but it might be a point of data for optimizing SCons further by examining work that happens in a single-threaded context that *could* possibly be pushed elsewhere? I know that the java emitter code falls into this category, but that's a separate issue.

> During testing I found a problem with test/packaging/rpm/cleanup.py. It will usually PASS, but every now and then, when repeatedly calling "python runtest.py test/packaging/rpm/cleanup.py", it fails. I couldn't find an underlying scheme yet and will try further, but is anyone else seeing this behaviour... or has an idea what could go wrong?

I will try to give it a skim when I get home.
Re: [Scons-dev] Mini announcement: v2.4 is near...
On Thu, Aug 6, 2015 at 9:23 PM, William Blevins wblevins...@gmail.com wrote:

> I ran that test in a loop for 10m or so and never got a failure, though it might only happen when you thread it with other tests? I see two potential issues:
>
> 1. WhereIs('rpm') vs. WhereIs('rpmbuild'); those two processes have been split out for a very long time.
> 2. If rpm_build_root is not unique, then it could conflict with the other rpmbuild tests.

I tried running all the rpm tests with -j6 in a loop; again there were no errors, so I don't know for sure. Do you get a stack trace or something?

V/R,
William
Re: [Scons-dev] Mini announcement: v2.4 is near...
I ran that test in a loop for 10m or so and never got a failure, though it might only happen when you thread it with other tests? I see two potential issues:

1. WhereIs('rpm') vs. WhereIs('rpmbuild'); those two processes have been split out for a very long time.
2. If rpm_build_root is not unique, then it could conflict with the other rpmbuild tests.

V/R,
William

On Thu, Aug 6, 2015 at 6:55 PM, William Blevins wblevins...@gmail.com wrote:

> On Thu, Aug 6, 2015 at 6:01 PM, Dirk Bächle tshor...@gmx.de wrote:
>> During testing I found a problem with test/packaging/rpm/cleanup.py. It will usually PASS, but every now and then, when repeatedly calling "python runtest.py test/packaging/rpm/cleanup.py", it fails. I couldn't find an underlying scheme yet and will try further, but is anyone else seeing this behaviour... or has an idea what could go wrong?
>
> I will try to give it a skim when I get home.
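If the second suspicion holds (a shared rpm_build_root colliding between concurrently running tests), the usual fix is to give each test run its own build root. A minimal sketch using Python's tempfile module; the helper name is made up here, and the actual SCons test fixtures may handle this differently:

```python
import tempfile

def make_unique_build_root(prefix="rpm_build_root."):
    # mkdtemp creates a fresh directory with an unpredictable suffix,
    # so parallel rpmbuild test runs can never share a --buildroot
    # and stomp on each other's staged files.
    return tempfile.mkdtemp(prefix=prefix)

root_a = make_unique_build_root()
root_b = make_unique_build_root()
assert root_a != root_b  # each test gets a distinct build root
```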
Re: [Scons-dev] Mini announcement: v2.4 is near...
Sounds like a plan :)

On Aug 6, 2015 3:15 AM, Dirk Bächle tshor...@gmx.de wrote:

> Hi there,
>
> just wanted to let you know that I have prepared a new trunk (default) with the change to slots in the core sources on my local machine. Full testsuite (runtest.py -a) shows no errors on my side... and I'm currently running it through the scons_testsuite (https://bitbucket.org/dirkbaechle/scons_testsuite) to see if real projects might be affected somehow.
>
> My plan is to push these changes to the main repo tonight (23-?? UTC), unless someone objects...
>
> Best regards,
> Dirk
Re: [Scons-dev] Mini announcement: v2.4 is near...
On 06.08.2015 16:26, Bill Deegan wrote:
> Dirk,
>
> Push when you are ready.
>
> -Bill

Ahhh, tsss... pushed it ( https://www.youtube.com/watch?v=cMBh8P1m9Wo )! Only for the sexy people... :)

Hi,

the slots changes are online now. For all people interested, please find a comparison of before vs. after the merge attached.

Best regards,
Dirk

Comparing slotmerge_ref to slotmerge_done
-----------------------------------------

wonderbuild

  Times           run [s]     update [s]   update_implicit [s]
    Previous      1043.7      16.9         16.6
    Current       1030.6      18.1         17.7
    Factor        0.99        1.07         1.06

  Memory          run [MByte]   update [MByte]
    Previous      364.0         337.4
    Current       243.3         211.8
    Factor        0.67          0.63

  Profiling       run [s]     update [s]   update_implicit [s]
    Previous      1137.4      26.3         26.3
    Current       1064.9      28.3         27.8
    Factor        0.94        1.08         1.06

bombono

  Times           run [s]     update [s]   update_implicit [s]
    Previous      364.5       6.9          4.4
    Current       362.7       7.1          4.6
    Factor        1.00        1.03         1.04

  Memory          run [MByte]   update [MByte]
    Previous      137.3         132.5
    Current       117.4         111.7
    Factor        0.85          0.84

  Profiling       run [s]     update [s]   update_implicit [s]
    Previous      392.7       11.2         7.3
    Current       378.9       12.1         8.0
    Factor        0.96        1.08         1.09

mongo

  Times           run [s]     update [s]   update_implicit [s]
    Previous      3518.9      42.3         22.8
    Current       3500.8      43.2         22.8
    Factor        0.99        1.02         1.00

  Memory          run [MByte]   update [MByte]
    Previous      360.3         359.5
    Current       290.4         290.7
    Factor        0.81          0.81

  Profiling       run [s]     update [s]   update_implicit [s]
    Previous      3571.4      62.9         34.6
    Current       3539.0      64.0         36.2
    Factor        0.99        1.02         1.04

ascend

  Times           run [s]     update [s]   update_implicit [s]
    Previous      30.6        2.3          1.7
    Current       30.8        2.3          1.8
    Factor        1.01        1.02         1.04

  Memory          run [MByte]   update [MByte]
    Previous      99.2          100.2
    Current       86.1          87.1
    Factor        0.87          0.87

  Profiling       run [s]     update [s]   update_implicit [s]
    Previous      32.0        3.3          2.5
    Current       32.4        3.4          2.7
    Factor        1.01        1.02         1.06

sconsbld

  Times           run [s]     update [s]   update_implicit [s]
    Previous      422.9       30.3         21.5
    Current       385.1       32.1         23.0
    Factor        0.91        1.06         1.07

  Memory          run [MByte]   update [MByte]
    Previous      395.7         416.9
    Current       252.6         258.0
    Factor        0.64          0.62

  Profiling       run [s]     update [s]   update_implicit [s]
    Previous      515.9       46.3         32.3
    Current       474.8       50.9         36.3
    Factor        0.92        1.10         1.12

questfperf

  Times           run [s]     update [s]   update_implicit [s]
    Previous      964.4       21.5         14.4
    Current       940.3       22.9         15.5
    Factor        0.97        1.06         1.08

  Memory          run [MByte]   update [MByte]
    Previous      299.7         307.7
    Current       205.5         213.8
    Factor        0.69          0.69

  Profiling       run [s]     update [s]   update_implicit [s]
    Previous      972.1       31.1         20.8
    Current       966.1       34.5         23.8
    Factor        0.99        1.11         1.15

lumiera

  Times           run [s]     update [s]   update_implicit [s]
    Previous      428.1       6.0          4.4
    Current       426.4       6.2          4.6
    Factor        1.00        1.03         1.03

  Memory          run [MByte]   update [MByte]
    Previous      121.9         128.6
    Current       107.0         110.3
    Factor        0.88          0.86

  Profiling       run [s]     update [s]   update_implicit [s]
    Previous      459.3       9.2          6.8
    Current       462.5       9.6          7.2
    Factor        1.01        1.05         1.06

mapnik

  Times           run [s]     update [s]   update_implicit [s]
    Previous      889.4       10.2         5.5
    Current       890.2       10.4         5.5
    Factor        1.00        1.02         1.00

  Memory          run [MByte]   update [MByte]
    Previous      161.8         163.3
    Current       127.2         137.0
    Factor        0.79          0.84

  Profiling       run [s]     update [s]   update_implicit [s]
    Previous      895.6       15.5         8.5
    Current       900.0       16.2         8.8
    Factor        1.00        1.04         1.05
Re: [Scons-dev] Mini announcement: v2.4 is near...
Dirk,

I don't see information on the hardware and/or threads used for your profile attachment.

Also, it is interesting that the update time slightly increased. I assume this is a side-effect of the lazy loading overhead. I am not worried about it though.

Good work :)

On Thu, Aug 6, 2015 at 1:22 PM, Dirk Bächle tshor...@gmx.de wrote:

> Hi,
>
> the slots changes are online now. For all people interested, please find a comparison of before vs. after the merge attached.
>
> Best regards,
> Dirk
[Scons-dev] Mini announcement: v2.4 is near...
Hi there,

just wanted to let you know that I have prepared a new trunk (default) with the change to slots in the core sources on my local machine. Full testsuite (runtest.py -a) shows no errors on my side... and I'm currently running it through the scons_testsuite (https://bitbucket.org/dirkbaechle/scons_testsuite) to see if real projects might be affected somehow.

My plan is to push these changes to the main repo tonight (23-?? UTC), unless someone objects...

Best regards,
Dirk
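For context on the change being announced: declaring __slots__ on a class removes the per-instance __dict__, which is where the memory savings reported later in the thread come from, at the cost of having to declare every attribute up front. A toy illustration, not the actual SCons Node class:

```python
class NodeWithDict:
    # Ordinary class: every instance carries a __dict__ for its attributes.
    def __init__(self, path):
        self.path = path

class NodeWithSlots:
    # __slots__ fixes the attribute set and drops the per-instance
    # __dict__, shrinking each instance. With millions of Nodes in a
    # large build, this adds up to substantial memory savings.
    __slots__ = ('path',)
    def __init__(self, path):
        self.path = path

d = NodeWithDict('src/main.c')
s = NodeWithSlots('src/main.c')
assert hasattr(d, '__dict__') and not hasattr(s, '__dict__')
```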