On 05/23/2012 12:02 PM, Bruce Dubbs wrote:
> Bruce Dubbs wrote:
>
>> The second time through, the build/test run not going through ssh
>> seems to have a lot fewer errors, but I've probably still got a couple
>> of hours to go.  I'll give a final result tomorrow.
> OK, here are some results:
>
> Sources directory     195M
> Build directory:      6.8G  (at one time grew to at least 7.2G)
> /opt/icedtea-bin      423M
> /opt/OpenJDK-1.7.0.3  422M
> rhino1_7R3             17M
> /usr/share/java       2.5M
>
> Build time (mm:ss)  74:35  ( 44 SBU)
> Test time  (mm:ss) 216:59  (129 SBU)
>
> Internally in the log I found:
>
> -- Build times ----------
> Target all_product_build
> Start 2012-05-22 21:16:55
> End   2012-05-22 22:09:15
> 00:06:56 corba
> 00:07:29 hotspot
> 00:00:18 jaxp
> 00:01:06 jaxws
> 00:35:52 jdk
> 00:00:39 langtools
> 00:52:20 TOTAL
>
>
> -- Build times ----------
> Target all_product_build
> Start 2012-05-22 22:09:22
> End   2012-05-22 22:30:12
> 00:01:38 corba
> 00:07:10 hotspot
> 00:00:20 jaxp
> 00:00:24 jaxws
> 00:10:41 jdk
> 00:00:37 langtools
> 00:20:50 TOTAL
>
> -- Build times ----------
> Target all_product_build
> Start 2012-05-22 22:30:51
> End   2012-05-22 22:33:01
> 00:00:02 corba
> 00:00:07 hotspot
> 00:00:03 jaxp
> 00:00:07 jaxws
> 00:01:46 jdk
> 00:00:05 langtools
> 00:02:10 TOTAL
>
> -- Build times ----------
> Target all_product_build
> Start 2012-05-22 22:33:03
> End   2012-05-22 22:35:24
> 00:00:04 corba
> 00:00:11 hotspot
> 00:00:03 jaxp
> 00:00:05 jaxws
> 00:01:54 jdk
> 00:00:04 langtools
> 00:02:21 TOTAL
>
> Test results: passed: 144
> Report written to test/hotspot/JTreport/html/report.html
>
> Test results: passed: 3,959; failed: 136; error: 10
> Report written to test/jdk/JTreport/html/report.html
>
> Test results: passed: 1,920; failed: 1
> Report written to test/langtools/JTreport/html/report.html
>
> ===============
>
> The hotspot and langtools failure counts are the same as DJ's, but I
> got 136 failures in the jdk suite where DJ only got 78.  In particular,
> I got 28 failures in com/sun/jdi/ where DJ got none.
Hmm... JDI is the Java Debug Interface. I don't have an immediate 
explanation for that. Do you have traditional debugging tools available 
in the environment? They shouldn't have an effect, but they were not 
available at the time I tested.
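
If you want to chase that down, re-running just that group might tell 
us whether the failures reproduce in isolation. A rough sketch; the 
test path and JDK path are my assumptions, adjust to your tree:

    # Re-run only the JDI group against the installed JDK (both paths
    # assumed).  -othervm gives each test a fresh JVM, ruling out
    # damage left behind by an earlier test.
    jtreg -othervm -jdk:/opt/OpenJDK-1.7.0.3 jdk/test/com/sun/jdi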

> I also got 29 failures
> in java/awt/ where DJ got 12.

This could be due to the window manager; I always sit and watch the 
tests run. There were a couple of failures where my login window was to 
blame while the harness was trying to iconify and restore windows. 
Also, did you run as root or as an unprivileged user? My results were 
obtained as an unprivileged user, using TWM on a fairly minimal system 
(just enough to meet the required and recommended dependencies).

On a side note, 2.2 is scheduled for release next Wednesday (5/30). I 
don't anticipate any changes in the options or dependencies. 2.2 will 
basically be the same as Oracle's 7u4, plus some of the security 
updates slotted for 7u5, the typical build fixes for newer system 
software, and the closed parts of the software replaced by free 
software (easily arguable as _better_ replacements; PulseAudio in place 
of the old SGI audio, for example). Basically, just disregard the 
upstream fixes patch and build as before. Existing 2.1 builds should 
work fine as a bootstrap compiler; however, we'll be pulling the 
downloads from git again (make download if you have wget installed; 
the all target will also fetch them for you in one step).
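
In command form, the 2.2 sequence should look roughly like the sketch 
below. The tarball name and the --with-jdk-home switch are assumptions 
on my part (check the actual configure line in the book); the make 
targets are the ones mentioned above.

    # Unpack and configure as for 2.1, skipping the upstream fixes
    # patch.  Tarball name and configure switch are assumptions; the
    # bootstrap path is an existing 2.1-era install.
    tar -xf icedtea-2.2.tar.gz &&
    cd icedtea-2.2 &&
    ./configure --with-jdk-home=/opt/OpenJDK-1.7.0.3

    # Fetch the OpenJDK source drops up front (requires wget)...
    make download

    # ...or let the default target pull them in as part of the build.
    make all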
> I don't know if these differences are due to different HW (e.g. cheap
> video), installed packages, or even kernel configuration.
> For instance, I got an error in NoUpdateUponShow where DJ did not:
>
> JavaTest Message: Test threw exception:
> sun.awt.SunToolkit$OperationTimedOut: 10001
>
> I did take some pains to ensure the screensaver did not kick in.
>
> ==============
>
> The open question is how we should address the test failures in the book.
>

Unfortunately, until I hear back from the other maintainers on their 
test suite results, I really have nothing to go on. One of the core ITW 
developers mentioned in a private message that he was working on a 
public repo for test results, but as I understand it, he has about as 
much free time as I do lately. :-) Perhaps I should just install Fedora 
or Debian and build from their respective sources to make a meaningful 
comparison. In the grand scheme of things, the failure counts are small 
compared to the total number of tests, and not that different from what 
is in the book now, which was previously deemed OK (by me, using 
comparison with the other distros)...at least once the mystery of the 
JDI tests is solved. Maybe we should consider dropping the -samevm 
flag, since one failure could conceivably cause a whole group of tests 
to fail if the running environment gets trashed. The downside is that a 
new VM is then created for each test, which has a rather obvious effect 
on the wall clock.
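
For reference, the trade-off in jtreg terms would look roughly like 
this (-samevm and -othervm are the actual jtreg modes; the JDK and 
test paths are just illustrative):

    # -samevm: every test shares one JVM.  Fast, but a test that
    # trashes the environment can take the rest of its group with it.
    jtreg -samevm -jdk:/opt/OpenJDK-1.7.0.3 jdk/test

    # -othervm: a fresh JVM per test.  Isolated failures, at the cost
    # of a rather obvious hit to the wall clock.
    jtreg -othervm -jdk:/opt/OpenJDK-1.7.0.3 jdk/test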

-- DJ Lucas

