On 4/19/06, Stepan Mishura [EMAIL PROTECTED] wrote:
On 4/18/06, Mark Hindess wrote:
On 4/13/06, Stepan Mishura wrote:
Hi Mark,
HARMONY-331 was committed to the trunk, so is there any chance to have
classlib test suite status emails sent to the commits list?
Ok. I've added
On 4/13/06, Stepan Mishura [EMAIL PROTECTED] wrote:
Hi Mark,
HARMONY-331 was committed to the trunk, so is there any chance to have
classlib test suite status emails sent to the commits list?
Ok. I've added notifiers to both our linux and windows jobs that are
doing build and test
On 4/18/06, Mark Hindess wrote:
On 4/13/06, Stepan Mishura wrote:
Hi Mark,
HARMONY-331 was committed to the trunk, so is there any chance to have
classlib test suite status emails sent to the commits list?
Ok. I've added notifiers to both our linux and windows jobs that are
doing
Mikhail, you missed the champion!
I've just looked at the generated test report: the total time is 1143.767 and the time for
tests.api.java.net.Inet6AddressTest.test_getByNameLjava_lang_String is 645.167!!!
I'd exclude this test to speed up the test suite run by a factor of two.
Thanks,
Stepan.
On 4/12/06, Mikhail
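A minimal JUnit 3 sketch of how a single slow method could be excluded without dropping the rest of the class; the suite class name here is invented, and in practice such an exclusion could just as well be handled in the Ant build:

package tests.api.java.net;

import java.util.Enumeration;

import junit.framework.Test;
import junit.framework.TestCase;
import junit.framework.TestSuite;

// Hypothetical suite that runs Inet6AddressTest without its slow DNS lookup case.
public class FastNetSuite {
    public static Test suite() {
        TestSuite all = new TestSuite(Inet6AddressTest.class);
        TestSuite fast = new TestSuite("Inet6AddressTest without the slow lookup");
        for (Enumeration e = all.tests(); e.hasMoreElements();) {
            TestCase t = (TestCase) e.nextElement();
            // Skip only the test that spends ~10 minutes in name resolution.
            if (!"test_getByNameLjava_lang_String".equals(t.getName())) {
                fast.addTest(t);
            }
        }
        return fast;
    }
}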
Interesting, this is true for Linux but not for Windows ...
Thanks,
Stepan
On 4/14/06, Stepan Mishura wrote:
Mikhail, you missed the champion!
I've just looked at the generated test report: the total time is 1143.767 and the time for
tests.api.java.net.Inet6AddressTest.test_getByNameLjava_lang_String
what could it possibly be doing for 10 minutes?
Stepan Mishura wrote:
Mikhail, you missed the champion!
I've just looked at the generated test report: the total time is 1143.767 and the time for
tests.api.java.net.Inet6AddressTest.test_getByNameLjava_lang_String is 645.167!!!
I'd exclude this test to
DNS timeouts are usually quite high ... on the order of 30-60 seconds,
so a number of these could possibly add up to 10 minutes. Given the
meaningful test name (I won't join that discussion yet!) this
seems like quite a likely source of problems.
Regards,
Mark.
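If resolver timeouts really are the cause, one way to confirm it (and to keep one bad lookup from stalling a whole run) is to put a hard cap on the call. A sketch only; the helper class and the five-second figure are illustrative, not anything in the tree:

import java.net.InetAddress;
import java.net.UnknownHostException;
import java.util.concurrent.*;

// Bound a name lookup so a dead resolver costs seconds, not minutes.
public class BoundedLookup {
    public static InetAddress getByName(final String host, long timeoutSeconds)
            throws Exception {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        try {
            Future<InetAddress> f = pool.submit(new Callable<InetAddress>() {
                public InetAddress call() throws UnknownHostException {
                    return InetAddress.getByName(host); // may block on resolver timeouts
                }
            });
            return f.get(timeoutSeconds, TimeUnit.SECONDS); // fail fast instead of waiting
        } finally {
            pool.shutdownNow();
        }
    }
}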
On 4/14/06, Geir Magnusson Jr
at the end of the run.
Hi Mark,
HARMONY-331 was committed to the trunk, so is there any chance to have
classlib test suite status emails sent to the commits list?
Thanks,
Stepan.
The diff to make this change would be so much easier if we'd refactor
the build files - perhaps as I suggested in HARMONY-293
On 4/12/06, Stepan Mishura [EMAIL PROTECTED] wrote:
Hi George,
Your example looks good to me and I think everybody agreed that we should
organize testing to avoid running all tests for each update: if you fix a bug
in the 'net' module you don't have to run tests for, say, the 'awt' module, but if you
LvJimmy,Jing wrote:
Hi:
2006/4/12, Stepan Mishura [EMAIL PROTECTED]:
On 4/11/06, Paulex Yang wrote:
[SNIP]
I've run tests on Linux. They fail on the same assertion:
[junit] Testcase: testReceiveSend_Block_Normal(
Hi,
Thanks for your comments.
Working on the "walk first, then try to run" principle, I will try to
start adding in run.dependants.tests targets to individual modules'
build.xml scripts over the next few days. Text seems like a good
candidate to start.
Best regards,
George
Anton Avtamonov
On 4/12/06, George Harley [EMAIL PROTECTED] wrote:
Hi,
Thanks for your comments.
Working on the "walk first, then try to run" principle, I will try to
start adding in run.dependants.tests targets to individual modules'
build.xml scripts over the next few days. Text seems like a good
candidate
I think the dependencies for a module belong under the module
directory, not at the top level, but I agree it might be nice to find a
way to pull them out of the build.xml files.
Regards,
Mark.
On 4/12/06, Anton Avtamonov [EMAIL PROTECTED] wrote:
On 4/12/06, George Harley [EMAIL PROTECTED]
I've fixed the tests that change the security manager and the set of providers;
now the fork mode for the security tests is 'once', and the run time is reduced
by ~7 minutes.
So now there are other champions; should we go through them and
reduce/exclude/move them to another suite?
Thanks,
Mikhail
2006/4/11, Mark Hindess
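For context, the kind of cleanup that makes such tests safe to share a single VM is a save/restore of the security manager and the provider list around each test. A sketch, not the actual Harmony fix; the class name is invented, and it assumes any security manager a test installs still permits setSecurityManager in tearDown:

import java.security.Provider;
import java.security.Security;
import junit.framework.TestCase;

// Base class sketch: snapshot the security environment before each test
// and put it back afterwards, so forkmode "once" cannot leak state between tests.
public abstract class RestoringSecurityTest extends TestCase {
    private SecurityManager savedManager;
    private Provider[] savedProviders;

    protected void setUp() throws Exception {
        super.setUp();
        savedManager = System.getSecurityManager();
        savedProviders = Security.getProviders();   // snapshot of installed providers
    }

    protected void tearDown() throws Exception {
        System.setSecurityManager(savedManager);    // undo any manager the test installed
        // Remove whatever is registered now and re-register the originals in order.
        Provider[] current = Security.getProviders();
        for (int i = 0; i < current.length; i++) {
            Security.removeProvider(current[i].getName());
        }
        for (int i = 0; i < savedProviders.length; i++) {
            Security.addProvider(savedProviders[i]);
        }
        super.tearDown();
    }
}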
On 4/12/06, Mikhail Loenko [EMAIL PROTECTED] wrote:
I've fixed the tests that change the security manager and the set of providers;
now the fork mode for the security tests is 'once', and the run time is reduced
by ~7 minutes.
Yippee! Thanks Mikhail. This makes testing much easier.
Regards,
Mark.
--
Mark Hindess
Hi Anton,
It is a good idea if it helps developers build up a map of how the class
library modules relate to one another. I am not currently sure how such
a top level description file can be incorporated into the kind of scheme
I was thinking about. I will certainly look for opportunities to
Why must we test the hostname? Why not test the actual address,
127.0.0.1? That should be much more reliable and less platform
dependent.
Regards,
Mark.
On 4/12/06, LvJimmy,Jing [EMAIL PROTECTED] wrote:
Hi:
2006/4/12, Stepan Mishura [EMAIL PROTECTED]:
On 4/11/06, Paulex Yang wrote:
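A minimal sketch of the 127.0.0.1 check Mark suggests above; the class and method names are made up. Parsing a literal address never consults the resolver, so it cannot stall on DNS or vary with /etc/hosts:

import java.net.InetAddress;
import junit.framework.TestCase;

public class LoopbackAddressTest extends TestCase {
    public void test_loopback_byLiteralAddress() throws Exception {
        // A literal IPv4 address is parsed, not resolved, so no DNS is involved.
        InetAddress addr = InetAddress.getByName("127.0.0.1");
        assertTrue(addr.isLoopbackAddress());
        assertEquals("127.0.0.1", addr.getHostAddress());
    }
}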
Top 12 winners out of 758 tests (1.6%) according to the JUnit report take 458 sec out of 572 (80%).
The winners are:
IdentityScopeTest     112.031
IdentityTest           85.233
Inet6AddressTest       57.944
NTLoginModuleTest      55.199
OutOfMemoryErrorTest   30.724
SignatureTest
Mark Hindess wrote:
Why must we test the hostname? Why not test the actual address,
127.0.0.1? That should be much more reliable and less platform
dependent.
Regards,
Mark.
Sounds good to me.
Paulex, in your view would it compromise the worth of this test if
instead of doing:
--
Stepan Mishura wrote:
I ran the tests on SUSE LINUX ES 9; the /etc/hosts file contains the following entry:
127.0.0.1 localhost
I wonder if we should start logging the different platforms people are
testing/working on so we get a sense of coverage
geir
Anton Avtamonov wrote:
On 4/12/06, Stepan Mishura [EMAIL PROTECTED] wrote:
Hi George,
Your example looks good to me and I think everybody agreed that we should
organize testing to avoid running all tests for each update: if you fix a bug
in the 'net' module you don't have to run tests, say, for
On 4/12/06, Geir Magnusson Jr [EMAIL PROTECTED] wrote:
Why have a static file? Why not just generate it from the modules'
manifests? Then the error can only be in one place - the dependent
module... right?
If they have enough info - of course! That would be excellent.
I suppose it is a bit not
BTW, AFAIK we do not test whether the manifests list all imports/exports.
So if we create the data from the manifests then it could be insufficient,
and we might have the same failures we had yesterday.
Thanks,
Mikhail
2006/4/12, Anton Avtamonov [EMAIL PROTECTED]:
On 4/12/06, Geir Magnusson Jr [EMAIL
Well, you will need to build up the dependency info anyway, right?
Therefore that would probably be a good chance to revise and update
the manifests :-)?
--
Anton Avtamonov,
Intel Middleware Products Division
On 4/12/06, Mikhail Loenko [EMAIL PROTECTED] wrote:
BTW AFAIK we do not test whether manifests
Mikhail,
While that is true, maintaining separate dependency information is
just as (un)likely to be correct. I think we should use the manifests
and fix them if they are inaccurate.
Regards,
Mark.
On 4/12/06, Mikhail Loenko [EMAIL PROTECTED] wrote:
BTW AFAIK we do not test whether
Well, I did not suggest maintaining a separate set of manifests :)
Can we somehow test the validity of the [existing] manifests?
Thanks,
Mikhail
2006/4/12, Mark Hindess [EMAIL PROTECTED]:
Mikhail,
While that is true, maintaining separate dependency information is
just as (un)likely to be correct. I
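A rough sketch of one way such a check could start: read Import-Package from a module's MANIFEST.MF and compare it with whatever dependency list the build uses. The path in main() is invented, and real OSGi headers can carry attributes and quoted commas, so a naive comma split is only an approximation:

import java.io.FileInputStream;
import java.io.IOException;
import java.util.jar.Attributes;
import java.util.jar.Manifest;

// Pull the Import-Package header out of a module manifest so the declared
// dependencies can be cross-checked against the build's dependency data.
public class ManifestImports {
    public static String[] importedPackages(String manifestPath) throws IOException {
        FileInputStream in = new FileInputStream(manifestPath);
        try {
            Attributes attrs = new Manifest(in).getMainAttributes();
            String imports = attrs.getValue("Import-Package");
            return imports == null ? new String[0] : imports.split(",");
        } finally {
            in.close();
        }
    }

    public static void main(String[] args) throws IOException {
        // Illustrative path only; adjust to wherever the module keeps its manifest.
        String[] pkgs = importedPackages("modules/text/META-INF/MANIFEST.MF");
        for (int i = 0; i < pkgs.length; i++) {
            System.out.println(pkgs[i].trim());
        }
    }
}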
Mikhail Loenko wrote:
BTW, AFAIK we do not test whether the manifests list all imports/exports.
So if we create the data from the manifests then it could be insufficient,
and we might have the same failures we had yesterday.
Yes... but we already have the manifests. They have to be right for OSGi.
So
2006/4/12, Stepan Mishura [EMAIL PROTECTED]:
On 4/11/06, Paulex Yang wrote:
[SNIP]
I've run tests on Linux. They fail on the same assertion:
[junit] Testcase: testReceiveSend_Block_Normal(
org.apache.harmony.tests.java.nio.channels.DatagramChannelTest):
FAILED
George Harley wrote:
Mark Hindess wrote:
Why must we test the hostname? Why not test the actual address,
127.0.0.1? That should be much more reliable and less platform
dependent.
Regards,
Mark.
Sounds good to me.
Paulex, in your view would it compromise the worth of this test if
Yes. I was using the failureproperty mechanism. Trying to get this
property propagated back to the top-level Ant file was what I was
having trouble with.
Using a file as you suggest might help. I'll give that a try shortly...
Incidentally, I'm seeing 12 failures and 3 errors on r393111. (And
On 4/11/06, Mark Hindess wrote:
Yes. I was using the failureproperty mechanism. Trying to get this
property propagated back to the top-level Ant file was what I was
having trouble with.
Using a file as you suggest might help. I'll give that a try shortly...
Incidentally, I'm seeing 12
Stepan Mishura wrote:
Hi,
This morning I checked out the latest updates, built the code base and ran the
tests ... and there are 24 test failures!
There are 9 test failures in
org.apache.harmony.tests.java.nio.channels.DatagramChannelTest – I have seen these
failures from time to time before. It seems
No. These:
F
org.apache.harmony.security.asn1.der.DerGeneralizedTimeEDTest.testGeneralizedEncoder
E
org.apache.harmony.security.asn1.der.DerGeneralizedTimeEDTest.testGeneralizedEncoderDecoder01
E
org.apache.harmony.security.asn1.der.DerGeneralizedTimeEDTest.testGeneralizedEncoderDecoder02
F
The same for me + DatagramChannelTest
Thanks,
Stepan.
On 4/11/06, Mark Hindess wrote:
No. These:
F
org.apache.harmony.security.asn1.der.DerGeneralizedTimeEDTest.testGeneralizedEncoder
E
org.apache.harmony.security.asn1.der.DerGeneralizedTimeEDTest.testGeneralizedEncoderDecoder01
E
Hi,
It *seems* like things started failing after I committed the changes for
HARMONY-205 last night. I'm looking into this now. If the investigation
begins to take up too much time I will back the changes out.
Best regards,
George
Stepan Mishura wrote:
The same for me +
Mark Hindess wrote:
George is taking a look at the ones I mentioned - which are from our
Linux build.
I see quite a few more (including DatagramChannelTest) on our Windows build.
Our internal Windows build has not been clean for a while now. I am
currently not looking at anything beyond the
Mark Hindess wrote:
Yes. I was using the failureproperty mechanism. Trying to get this
property propagated back to the top-level Ant file was what I was
having trouble with.
I had the same problem. I wanted the whole thing to halt on any
failure, and it didn't work... the top level ant
Just curious (and this isn't a criticism - I'm just as guilty of not
doing this)...
Don't you run the tests before committing?
geir
George Harley wrote:
Hi,
It *seems* like things started failing after I committed the changes for
HARMONY-205 last night. I'm looking into this now. If the
I've submitted a JIRA containing a fix that is something close to what
Stepan suggested. The module test targets append the name of the
module to a build/test_report/test.errors (or test.failures) file, and the
top-level target fails if those files exist at the end of the run.
The diff to make this
Geir Magnusson Jr wrote:
Just curious (and this isn't a criticism - I'm just as guilty of not
doing this)...
Don't you run the tests before committing?
Hi Geir,
Depends what you mean by 'the tests'.
The change was completely encapsulated in the text component. I ran the
text tests. I
On 4/11/06, Paulex Yang wrote:
Stepan Mishura wrote:
Hi,
This morning I checked out the latest updates, built the code base and ran the
tests ... and there are 24 test failures!
There are 9 test failures in
org.apache.harmony.tests.java.nio.channels.DatagramChannelTest – I have seen
these
On 4/11/06, Mark Hindess wrote:
Personally, obviously, I'd expect people to run the tests before
committing.
However, I notice that since enabling the security tests - which fork
for every test - the tests take over half an hour to run now on
our Linux build machine. So I can see why
So the answer is no?
George Harley wrote:
Geir Magnusson Jr wrote:
Just curious (and this isn't a criticism - I'm just as guilty of not
doing this)...
Don't you run the tests before committing?
Hi Geir,
Depends what you mean by 'the tests'.
The change was completely encapsulated in the
Stepan Mishura wrote:
On 4/11/06, Mark Hindess wrote:
Personally, obviously, I'd expect people to run the tests before
committing.
However, I notice that since enabling the security tests - which fork
for every test - the tests take over half an hour to run now on
our Linux build
I forgot the smiley.
I don't think this problem is so odd. Do you really think that
side-effects like this will be that rare?
geir
Geir Magnusson Jr wrote:
So the answer is no?
George Harley wrote:
Geir Magnusson Jr wrote:
Just curious (and this isn't a criticism - I'm just as guilty of
Hi Geir,
From my point of view the answer is yes, I ran the tests for the
patched module. The build/test server picked up the downstream breakages
in the other modules. I fixed matters so that the builds ran again.
Elsewhere in this thread I am trying to cooperate with others to make it
George Harley wrote:
Hi Geir,
From my point of view the answer is yes, I ran the tests for the
patched module. The build/test server picked up the downstream breakages
in the other modules. I fixed matters so that the builds ran again.
Right - I thought we considered the build/test
Stepan Mishura wrote:
On 4/11/06, Paulex Yang wrote:
Stepan Mishura wrote:
Hi,
This morning I checked out the latest updates, built the code base and ran the
tests ... and there are 24 test failures!
There are 9 test failures in
Geir Magnusson Jr wrote:
I forgot the smiley.
:-)
I don't think this problem is so odd. Do you really think that
side-effects like this will be that rare?
Seriously, I don't really know. The case we have been discussing today
was the first concrete example that I have encountered and I
Geir Magnusson Jr wrote:
George Harley wrote:
Hi Geir,
From my point of view the answer is yes, I ran the tests for the
patched module. The build/test server picked up the downstream
breakages in the other modules. I fixed matters so that the builds
ran again.
Right - I thought we
George Harley wrote:
Geir Magnusson Jr wrote:
George Harley wrote:
Hi Geir,
From my point of view the answer is yes, I ran the tests for the
patched module. The build/test server picked up the downstream
breakages in the other modules. I fixed matters so that the builds
ran again.
On 4/11/06, Geir Magnusson Jr [EMAIL PROTECTED] wrote:
George Harley wrote:
Geir Magnusson Jr wrote:
George Harley wrote:
Hi Geir,
From my point of view the answer is yes, I ran the tests for the
patched module. The build/test server picked up the downstream
breakages in the
Mark Hindess wrote:
On 4/11/06, Geir Magnusson Jr [EMAIL PROTECTED] wrote:
There is a cultural element to it - maybe we keep tabs on who breaks the
build, and that decides who buys the beer next time we all meet...
Not that I object to George buying me beer, but this seems a little
harsh
So it really happens... platform dependency, or platform-release dependency,
makes a mess of things.
In my opinion, George Harley's method of grouping tests above, and in the
threads on matching reference implementation exception behaviour, is fairly
good.
And another serious problem remains here: how
In my opinion, the tests should be independent of each other: every test
starts up with its own environment and releases its resources when finished.
In this way, there's no dependency chain and no mess at all.
Yes, we've met several side-effect test cases; how about making a list to
warn the
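A minimal JUnit 3 sketch of that "own environment per test" idea, using a fresh DatagramChannel per test method; the class name is made up:

import java.nio.channels.DatagramChannel;
import junit.framework.TestCase;

// Each test opens its own channel in setUp and closes it in tearDown,
// so no state leaks from one test method to the next.
public class IsolatedChannelTest extends TestCase {
    private DatagramChannel channel;

    protected void setUp() throws Exception {
        super.setUp();
        channel = DatagramChannel.open();       // fresh resource for every test method
    }

    protected void tearDown() throws Exception {
        if (channel != null) {
            channel.close();                    // release it even if the test failed
        }
        super.tearDown();
    }

    public void testChannelIsOpen() throws Exception {
        assertTrue(channel.isOpen());
    }
}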
On 4/11/06, Paulex Yang wrote:
[SNIP]
I've run tests on Linux. They fail on the same assertion:
[junit] Testcase: testReceiveSend_Block_Normal(
org.apache.harmony.tests.java.nio.channels.DatagramChannelTest):
FAILED
[junit] expected:... but was:localdomain
[junit]
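The "localdomain" in that message looks like the machine's own name for the loopback address leaking into the comparison. A small probe (names invented) shows how much of this is decided by the local /etc/hosts rather than by the code under test; on some Linux installs 127.0.0.1 maps to "localhost.localdomain localhost", so the canonical name varies by box:

import java.net.InetAddress;

public class LocalNameProbe {
    public static void main(String[] args) throws Exception {
        InetAddress local = InetAddress.getLocalHost();
        System.out.println("host name:      " + local.getHostName());
        System.out.println("canonical name: " + local.getCanonicalHostName());
        System.out.println("loopback canon: "
                + InetAddress.getByName("127.0.0.1").getCanonicalHostName());
    }
}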
Hi George,
Your example looks good to me and I think everybody agreed that we should
organize testing to avoid running all tests for each update: if you fix a bug
in the 'net' module you don't have to run tests for, say, the 'awt' module, but if you
update 'luni' then you have to run tests for all modules.
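An illustrative sketch (the module list and class name are mine, not the real module graph) of that rule as code: run a module's own tests plus the tests of every module whose transitive dependencies include the changed one:

import java.util.*;

public class DependentTests {
    // module -> modules it depends on; a tiny made-up slice for illustration.
    static Map<String, Set<String>> deps = new HashMap<String, Set<String>>();
    static {
        deps.put("luni", new HashSet<String>());
        deps.put("net",  new HashSet<String>(Arrays.asList("luni")));
        deps.put("awt",  new HashSet<String>(Arrays.asList("luni")));
        deps.put("text", new HashSet<String>(Arrays.asList("luni")));
    }

    // Everything whose transitive dependencies include the changed module gets retested.
    static Set<String> modulesToTest(String changed) {
        Set<String> result = new TreeSet<String>();
        for (String m : deps.keySet()) {
            if (m.equals(changed) || transitiveDeps(m).contains(changed)) {
                result.add(m);
            }
        }
        return result;
    }

    static Set<String> transitiveDeps(String module) {
        Set<String> seen = new HashSet<String>();
        LinkedList<String> work = new LinkedList<String>(deps.get(module));
        while (!work.isEmpty()) {
            String d = work.removeFirst();
            if (seen.add(d)) {
                work.addAll(deps.get(d));
            }
        }
        return seen;
    }

    public static void main(String[] args) {
        System.out.println("fix in net  -> test " + modulesToTest("net"));   // [net]
        System.out.println("fix in luni -> test " + modulesToTest("luni"));  // everything
    }
}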
Hi:
2006/4/12, Stepan Mishura [EMAIL PROTECTED]:
On 4/11/06, Paulex Yang wrote:
[SNIP]
I've run tests on Linux. They fail on the same assertion:
[junit] Testcase: testReceiveSend_Block_Normal(
org.apache.harmony.tests.java.nio.channels.DatagramChannelTest):
FAILED
Hi,
This morning I checked out the latest updates, built the code base and ran the
tests ... and there are 24 test failures!
There are 9 test failures in
org.apache.harmony.tests.java.nio.channels.DatagramChannelTest – I have seen these
failures from time to time before. It seems that the tests depend on some