Re: Two especially slow performance tests on trunk

2018-05-14 Thread Mark Thomas
On 14/05/18 09:58, Rainer Jung wrote:
> On 11.05.2018 at 15:23, Mark Thomas wrote:
>> On 11/05/18 10:17, Rainer Jung wrote:
>>> Running the unit tests for trunk on my relatively slow Solaris machine,
>>> two of the performance tests run especially long:
>>>
>>> javax.websocket.TestContainerProviderPerformance: about 25 minutes per
>>> connector.
>>>
>>> org.apache.jasper.runtime.TestTagHandlerPoolPerformance: about 5 minutes
>>> per connector.
>>
>> These are both intended to be run outside of the unit tests. They are
>> there to check performance when experimenting with different approaches.
>> I'd suggest renaming them to Tester... so they are not included in the
>> user tests.
> 
> That would at least scratch my itch :)
> 
> For the sake of consistency: the following trunk tests are of type
> Performance.java but use the "Test" naming instead of "Tester" (longest
> running first):
> 
> Test Duration(ms) on slow machine
> *org.apache.juli.TestOneLineFormatterPerformance 80355
> *org.apache.catalina.webresources.TestAbstractFileResourceSetPerformance 12779
> +org.apache.catalina.connector.TestResponsePerformance 7233
> *javax.servlet.jsp.el.TestScopedAttributeELResolverPerformance 392
> 
> The ones marked with "*" do not contain any test assertion; the ones
> with "+" have at least one. Which of these 5 tests would you put under
> the same "Tester" reasoning (intended to be run outside of the unit tests)?

The ones without any assertions are good candidates.

> +org.apache.catalina.mapper.TestMapperPerformance 16856

This is there primarily as a safety check in case we make a change to
the Mapper that significantly impacts performance without noticing. I'd
leave it as is. I wonder about reducing the threshold below the current
5000ms. Gump is normally one of the slowest machines we run the tests
on. The current run hasn't reached this test yet but it would be worth a
look to see how long it is taking.
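
For illustration, that kind of wall-clock guard can be as simple as the
sketch below. The class name, loop body and iteration count are
placeholders rather than the actual Mapper test code; only the 5000ms
threshold comes from the existing test.

    import static org.junit.Assert.assertTrue;

    import org.junit.Test;

    public class TesterDurationGuardSketch {

        private static final long THRESHOLD_MS = 5000;

        @Test
        public void testHotPathStaysFastEnough() {
            long start = System.currentTimeMillis();
            for (int i = 0; i < 1_000_000; i++) {
                doMeasuredWork(); // stand-in for the real Mapper lookups
            }
            long duration = System.currentTimeMillis() - start;
            // A change that slows the hot path shows up as a test failure.
            assertTrue("Took " + duration + "ms, expected < " + THRESHOLD_MS
                    + "ms", duration < THRESHOLD_MS);
        }

        private void doMeasuredWork() {
            // placeholder for the operation whose performance is guarded
        }
    }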

> Alternatively at least for TestOneLineFormatterPerformance we could add
> a speed comparison assertion for the two implementations that this test
> measures. On my slow machine, the faster impl is 10 times as fast, so an
> expected factor of 2 might be safe to test against, at least worth a try.

No objections.
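
For illustration, such a relative-speed assertion could look roughly
like the sketch below. The two implementation methods and the timing
helper are placeholders; only the factor of 2 (and the roughly 10x
margin observed above) comes from this thread.

    import static org.junit.Assert.assertTrue;

    import org.junit.Test;

    public class TesterRelativeSpeedSketch {

        @Test
        public void testNewImplAtLeastTwiceAsFastAsOldImpl() {
            long oldMillis = time(() -> oldImplementation());
            long newMillis = time(() -> newImplementation());
            // The faster impl was about 10x quicker on a slow box, so
            // requiring only a factor of 2 leaves headroom for noise.
            assertTrue("new=" + newMillis + "ms, old=" + oldMillis + "ms",
                    newMillis * 2 <= oldMillis);
        }

        private long time(Runnable r) {
            long start = System.nanoTime();
            for (int i = 0; i < 100_000; i++) {
                r.run();
            }
            return (System.nanoTime() - start) / 1_000_000;
        }

        private void oldImplementation() {
            // placeholder for the slower formatter implementation
        }

        private void newImplementation() {
            // placeholder for the faster formatter implementation
        }
    }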

The others we could potentially restructure that way with a little
imagination. To be honest they are mainly there because I wrote them to
test relative performance when working on performance bugs and I didn't
want to just throw them away.

Mark




Re: Two especially slow performance tests on trunk

2018-05-14 Thread Rainer Jung

On 11.05.2018 at 15:23, Mark Thomas wrote:
> On 11/05/18 10:17, Rainer Jung wrote:
>> Running the unit tests for trunk on my relatively slow Solaris machine,
>> two of the performance tests run especially long:
>>
>> javax.websocket.TestContainerProviderPerformance: about 25 minutes per
>> connector.
>>
>> org.apache.jasper.runtime.TestTagHandlerPoolPerformance: about 5 minutes
>> per connector.
>
> These are both intended to be run outside of the unit tests. They are
> there to check performance when experimenting with different approaches.
> I'd suggest renaming them to Tester... so they are not included in the
> user tests.


That would at least scratch my itch :)

For the sake of consistency: the following trunk tests are of type 
Performance.java but use the "Test" naming instead of "Tester" (longest 
running first):


Test Duration(ms) on slow machine
*org.apache.juli.TestOneLineFormatterPerformance 80355
+org.apache.catalina.mapper.TestMapperPerformance 16856
*org.apache.catalina.webresources.TestAbstractFileResourceSetPerformance 12779
+org.apache.catalina.connector.TestResponsePerformance 7233
*javax.servlet.jsp.el.TestScopedAttributeELResolverPerformance 392

The ones marked with "*" do not contain any test assertion; the ones
with "+" have at least one. Which of these 5 tests would you put under
the same "Tester" reasoning (intended to be run outside of the unit tests)?


Alternatively at least for TestOneLineFormatterPerformance we could add 
a speed comparison assertion for the two implementations that this test 
measures. On my slow machine, the faster impl is 10 times as fast, so an 
expected factor of 2 might be safe to test against, at least worth a try.


Thanks and regards,

Rainer





Re: Two especially slow performance tests on trunk

2018-05-11 Thread Mark Thomas
On 11/05/18 10:17, Rainer Jung wrote:
> Running the unit tests for trunk on my relatively slow Solaris machine,
> two of the performance tests run especially long:
> 
> javax.websocket.TestContainerProviderPerformance: about 25 minutes per
> connector.
> 
> org.apache.jasper.runtime.TestTagHandlerPoolPerformance: about 5 minutes
> per connector.

These are both intended to be run outside of the unit tests. They are
there to check performance when experimenting with different approaches.
I'd suggest renaming them to Tester... so they are not included in the
user tests.

Mark





Two especially slow performance tests on trunk

2018-05-11 Thread Rainer Jung
Running the unit tests for trunk on my relatively slow Solaris machine, 
two of the performance tests run especially long:


javax.websocket.TestContainerProviderPerformance: about 25 minutes per 
connector.


org.apache.jasper.runtime.TestTagHandlerPoolPerformance: about 5 minutes 
per connector.


I know that I can disable all performance tests using 
test.excludePerformance, but apart from those two all others run 
reasonably fast.


In test/javax/websocket/TestContainerProviderPerformance.java, there's
an iteration count (currently 25). Is there a special reason why it
is so high? Would fast machines still generate a reasonable test result
with something much smaller, e.g. 1? Phrased differently: how
fast does that test currently run on your machine?


The other test,
test/org/apache/jasper/runtime/TestTagHandlerPoolPerformance.java, uses
an iteration count of 500 and, at least on my 2-core test system,
doesn't scale well. It takes "just" 5 minutes, but the question remains
whether we could lower the iteration count, maybe to 100, without
making the test useless.


Note that the tests do not seem to have any success or failure assertion.

Regards,

Rainer
