2010/11/5 Peter Firmstone <j...@zeus.net.au>

> That's quite an achievement for all involved, is that all the tests?
>

We have enabled most known QA test categories now. A few are still left out,
but those are mostly empty.

Some of the QA tests were deliberately set to be ignored, like the Kerberos
tests (also because we need a KDC for those).


> For jtreg it's about time I got the KDC server set up, not sure what to do
> for a squid proxy server though, do you think it would be safe to run squid
> on the same server with the KDC, seeing as they're there mostly for tests?
>

I guess we need to pose the question to INFRA: if we need external
infrastructure in place for our tests, like a KDC or a proxy server, what
would be the best way to set that up, if at all possible? I believe an
initial discussion took place at one point, but I'm not sure what conclusion
(if any) was drawn from it.
Setting up a separate zone just for a KDC seems like overkill to me, but I'm
not a system admin, and any client (the Hudson builders) would need Kerberos
client software installed anyway.
On my machine, I installed and configured a KDC (there is a JIRA ticket
explaining what I did for that, on Ubuntu), and I am running the tests from
the same machine (pointing QA config to my KDC). That seems to work fine.
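For reference, the Ubuntu setup boils down to something like the following sketch. The exact steps (and the realm/principal names used) are in the JIRA ticket mentioned above; the realm name and principal below are placeholders, not the ones actually configured:

```shell
# Install the MIT Kerberos KDC and admin server (Ubuntu package names)
sudo apt-get install krb5-kdc krb5-admin-server

# Make sure /etc/krb5.conf points default_realm at the test realm, e.g.
#   [libdefaults]
#       default_realm = EXAMPLE.COM

# Create the realm database (prompts for a master password)
sudo krb5_newrealm

# Add a test principal for the QA config to use (placeholder name)
sudo kadmin.local -q "addprinc -randkey test/host.example.com"
```

After that it is a matter of pointing the QA test configuration at the KDC host and the principals you created.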

An additional hurdle for the jtreg suite is the jtreg software itself, which
would need to be installed on the Hudson builders. Jtreg was originally meant
to regression test the JDK, and because JERI & co. were originally intended
to be included in the standard Java spec (see JSR-76 and JSR-78), these jtreg
tests came to exist. Things changed dramatically after both JSRs were
rejected, though.
Maybe we can migrate the valuable tests in that jtreg suite to either JUnit
or QA tests? I am aware there are caveats (test isolation levels, for
instance), but they seem manageable. I'm talking about a gradual
process here, converting test after test over a long period of time. We are
talking about around 100 jtreg tests, with varying complexity and isolation
levels.
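To illustrate what such a migration would look like: a jtreg test is typically a plain class whose main() throws on failure, driven by @test tags in a header comment, and the JUnit version keeps the assertion logic while dropping the harness tags. The class and test case below are a made-up example for illustration, not one of the actual River jtreg tests:

```java
/* jtreg style (hypothetical example):
 *
 *   @test
 *   @summary Equal objects must produce equal marshalled forms
 *   @run main MarshalledObjectEquality
 */
public class MarshalledObjectEquality {

    // The actual check, factored out so a JUnit @Test method could
    // call it directly with an assertTrue(...)
    static boolean marshalledFormsEqual() throws Exception {
        java.rmi.MarshalledObject<String> a =
                new java.rmi.MarshalledObject<String>("hello");
        java.rmi.MarshalledObject<String> b =
                new java.rmi.MarshalledObject<String>("hello");
        // MarshalledObject.equals compares the serialized forms,
        // ignoring codebase annotations
        return a.equals(b);
    }

    // jtreg convention: main() throws to signal failure
    public static void main(String[] args) throws Exception {
        if (!marshalledFormsEqual()) {
            throw new RuntimeException("marshalled forms differ");
        }
        System.out.println("PASS");
    }
}
```

The JUnit translation would annotate a test method with @Test, replace the throw with an assertion, and drop the harness-specific header comment entirely; the test logic itself carries over unchanged.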

Reducing the number of ways to test things is probably also good for general
understanding :-)



> Regards,
>
> Peter.
>
>
> Jonathan Costers wrote:
>
>> Hooray!
>>
>> All 1409 tests passed on ubuntu, taking 17hrs to run:
>>
>>
>>     [java]
>>     [java] # of tests started   = 1409
>>     [java] # of tests completed = 1409
>>     [java] # of tests skipped   = 47
>>     [java] # of tests passed    = 1409
>>     [java] # of tests failed    = 0
>>     [java]
>>     [java] -----------------------------------------
>>     [java]
>>     [java]    Date finished:
>>     [java]       Thu Nov 04 03:00:44 UTC 2010
>>     [java]    Time elapsed:
>>     [java]       62200 seconds
>>
>>
>> Thanks to all for getting RIVER-301 out of the way!
>>
>> At this point, I believe we are sufficiently armed to validate some of the
>> interesting proposals, improvements and experiments that have been posted
>> to
>> this list the last couple of months.
>>
>> Closing RIVER-301 (which had been open/in progress for a very long time)
>> is
>> also another step towards graduation from the incubator, since one of the
>> main goals was to setup a testing framework.
>>
>> 2010/11/4 Apache Hudson Server <hud...@hudson.apache.org>
>>
>>
>>
>>> See <https://hudson.apache.org/hudson/job/River-trunk-QA/51/changes>
>>>
>>>
>>>
>>>
>>>
>>
>>
>>
>
>
