I have played around with this. 

I am using two WebTest client machines so that the load generation itself
doesn't become the bottleneck; each of them starts 16 instances of WebTest
with the same script (see the launcher sketch below).
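
Starting the instances is just a shell loop on each client. A minimal
sketch, assuming one prepared directory per instance and that webtest.sh
(or whatever ant wrapper you use to run the script) is on the PATH -- the
directory names are just examples:

# hypothetical launcher: one WebTest instance per prepared directory
for i in $(seq 1 16); do
  (cd "run$i" && webtest.sh > run.log 2>&1) &
done
wait   # block until all 16 instances are done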
In config.xml I specify to save a summary but not the full results (writing
an HTML file for each page seems to slow execution down a bit).
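
For reference, the relevant part of my config.xml looks roughly like this.
This is a sketch from memory -- the summary and saveresponse attribute names
(and the placeholder host/port values) should be checked against the WebTest
manual for your version:

<config host="myhost" port="8080"
        summary="true"
        saveresponse="false"/>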
Then I end up with 16 directories on each of the two "clients", each
directory containing a WebTestReport.xml.
The following grep/cut pipeline works in bash (tested on Kubuntu, but it
should work on other Linux flavours as well) and extracts the description
and the time in milliseconds for every step that has a description:

for f in */WebTestReport.xml; do
  xml2 < "$f" | tr '\n' ';' |
    sed -e "s#/summary/testresult/results/step;#\n#g" |
    grep description | grep -v Implementation |
    sed -e "s#/summary/testresult/results/step/##g" |
    tr '=' ';' | cut -d ';' -f 2,4 | sort
done | sort
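
(For those who don't know xml2: it flattens the XML into one path=value line
per element, e.g. /summary/testresult/results/step/description=..., which is
why a plain grep/sed/cut is enough here. The loop ends up printing one
description;milliseconds pair per step.)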

Then I do some aggregation/averaging in a spreadsheet.
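
If you'd rather stay in the shell, the averaging itself is a one-liner too.
A minimal sketch, assuming the description;milliseconds pairs from above
were redirected to a file called steps.csv (the file name is just an
example):

# average duration per step description
awk -F ';' '{ sum[$1] += $2; n[$1]++ }
  END { for (d in sum) printf "%s;%.1f\n", d, sum[d]/n[d] }' steps.csv | sort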
Are others using a similar approach?

cheers
Ivan


Marc Guillemot wrote:
> 
> Hi,
> 
> Until yesterday I was convinced that WebTest was not an appropriate tool
> for load testing. A mail from Serban Balamaci on HtmlUnit's user mailing
> list seems to show that I was perhaps wrong:
> 

