On Fri, Sep 12, 2014 at 9:43 AM, [email protected] <[email protected]> wrote:
> Hi devs,
>
> I think it’s high time we start to keep our performance under control (page
> load time, memory consumption, scalability).
>
> Here’s what I envision:
>
> * At each release (milestone) we get a report of the following (precise
> details to be defined together):
>
> ** page load times
"page load times" alone is not very clear; it should probably be named "page
reloading/refresh time", since loading a page for the very first time gives
very different results than reloading it (a lot of variable stuff gets put in
the cache, including the page itself, if it did not already end up in the
cache before even being explicitly loaded for some reason).

> *** empty page with “get” action (ie no skin)
> *** empty page with “view” action
> *** Dashboard.WebHome with “view” action

*** Main.WebHome with "view" action: even if it's mostly the same as
Dashboard.WebHome, what we really care about is Main.WebHome because that's
the first page a user sees
*** Main.WebHome with "get" action: when we want to test the performance of a
specific page, the 'get' action is more interesting IMO as it's not polluted
by UI performance variations

> *** <others?>
>
> ** memory consumption
> *** <scenarios to be defined>

*** memory after jetty startup
*** memory after a full SOLR index of a farm of 200 wikis (number to be
defined); this will also catch many possible memory leaks

An accurate memory test on a simple page load is hard, I think, since a lot
may be happening in the background at the same time, but refreshing a page
1000 times may be interesting to catch memory leaks.
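The first-load vs. reload distinction could be captured by simply timing repeated HTTP GETs. A minimal Python sketch (the base URL, host, port, and page names in the usage comment are illustrative, not a statement of what ci.xwiki.org actually runs):

```python
import time
import urllib.request


def time_request(url):
    """Return the wall-clock duration (seconds) of one full GET request."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        resp.read()  # drain the body so the whole transfer is measured
    return time.perf_counter() - start


def load_times(url, reloads=5):
    """Time one 'first load' of `url`, then `reloads` warm reloads.

    Returns (first_load_seconds, [reload_seconds, ...]).
    """
    first = time_request(url)
    return first, [time_request(url) for _ in range(reloads)]


# Illustrative usage against a hypothetical local XWiki instance:
#   base = "http://localhost:8080/xwiki/bin"
#   for action in ("get", "view"):
#       first, warm = load_times(f"{base}/{action}/Main/WebHome")
#       print(f"{action}: first={first:.3f}s "
#             f"avg_reload={sum(warm) / len(warm):.3f}s")
```

Note that only the very first request after a server restart is truly "cold"; later runs measure warm caches, which is exactly why the two numbers are worth reporting separately.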
>
> ** scalability
> *** max TPS (transactions per second) with a defined page navigation scenario,
> injected with lots of users, and see how the TPS evolves with # of users
> *** <precise scenarios to be defined>

*** access the first page with various numbers of wikis
*** time spent by SOLR to fully index a farm of 200 wikis (number to be defined)
*** time spent by SOLR to check that there is nothing to index at startup in a
farm of 200 wikis (number to be defined)
*** copy a standard subwiki
*** delete a page containing various numbers of attachments, with trash
*** delete a page containing various numbers of attachments, without trash
*** display the history of a page having various numbers of versions
*** load an older version of a page having various numbers of versions

> * Store the results on http://xwiki.org to be able to compare them against
> past releases and see how it’s progressing

Note that having comparable speed results when you rerun the tests 1 year
later is not very easy (it's not easy to keep a perfectly identical
environment), so we might want to store a comparison with the last "super
stable" release, for example, instead of raw times (which means redoing the
tests for the previous version before doing those for the newly released
version).

> Note that Thomas started by installing the jenkins performance plugin with a
> jmeter job on ci.xwiki.org but it’s not enough and more importantly it’s not
> used. What we need is a report at each release with a quick analysis of how
> it compares with past releases and some suggested actions.

The job contains statistics for loading, 10 times, all the pages present in a
standard jetty/hsqldb XE instance, with both 'view' and 'get' actions.

> WDYT?
>
> Thanks
> -Vincent
> _______________________________________________
> devs mailing list
> [email protected]
> http://lists.xwiki.org/mailman/listinfo/devs

--
Thomas Mortagne
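The "store a comparison with the last super stable release instead of raw times" idea could be sketched like this. A minimal Python sketch with made-up numbers (the scenario names and timings are illustrative, not real XWiki measurements); both result sets are assumed to have been re-measured in the same environment:

```python
def relative_report(baseline, current):
    """Compare two {scenario: seconds} result sets, release vs. baseline.

    Returns {scenario: ratio}; a ratio above 1.0 means the current
    release is slower than the baseline on that scenario.  Scenarios
    missing from either result set are skipped.
    """
    return {name: current[name] / baseline[name]
            for name in baseline if name in current}


# Made-up example numbers (seconds), re-measured on the same machine:
last_stable = {"view Main.WebHome": 0.80, "get Main.WebHome": 0.30}
this_release = {"view Main.WebHome": 0.88, "get Main.WebHome": 0.27}

for name, ratio in sorted(relative_report(last_stable, this_release).items()):
    print(f"{name}: x{ratio:.2f} vs last super-stable release")
```

Storing the ratios (plus the baseline version they were computed against) sidesteps the hardware drift problem, at the cost of rerunning the baseline's tests alongside each release's.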

