On 25/04/06, Mark McWhinney <[EMAIL PROTECTED]> wrote:
> I have a load test project for a high-traffic web site.
>
> Each JMeter script on each load server produces an XML file from a Simple
> Data Writer listener.  Each of these files is between 100,000 and 500,000
> lines long.  Files this size choke Excel.
>
> Are there other tools for analyzing these big XML files?

In theory one can use XSLT...

You can also use Perl (or similar) on CSV files - it depends what you want to do.

> Is there a way to produce a smaller output file?

Best to create CSV files unless you need some of the additional fields
that are only saved in XML files.
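For reference, the output format is controlled in jmeter.properties. The property names below are the jmeter.save.saveservice.* keys; check the documentation for your JMeter version before relying on them:

```properties
# Write sample results as CSV rather than XML
jmeter.save.saveservice.output_format=csv
# Don't save assertion messages with each sample
jmeter.save.saveservice.assertion_results=none
# Never save response data (only possible in XML anyway)
jmeter.save.saveservice.response_data=false
```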

You can also save "errors only", but that probably won't help here.

You may be able to combine several samplers using the Transaction Controller.

It's not possible to save only the Transaction Samples, but by
suitable choice of naming convention you could use a script to filter
out the lower-level results from the output file.
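As a sketch of that filtering script, assuming a CSV results file where the sample label is the third field (the default column order - adjust the index for your configuration) and transaction samples are named with a common prefix such as "TX_" (the prefix is your own convention, not anything JMeter imposes):

```java
import java.io.*;
import java.util.*;

public class TransactionFilter {

    // Keep only lines whose label field starts with the given prefix.
    // labelCol is the zero-based index of the label column in the CSV.
    public static List<String> filterByLabel(List<String> lines,
                                             int labelCol, String prefix) {
        List<String> kept = new ArrayList<String>();
        for (String line : lines) {
            String[] fields = line.split(",");
            if (fields.length > labelCol && fields[labelCol].startsWith(prefix)) {
                kept.add(line);
            }
        }
        return kept;
    }

    public static void main(String[] args) throws IOException {
        // Usage: java TransactionFilter results.csv > transactions.csv
        BufferedReader in = new BufferedReader(new FileReader(args[0]));
        List<String> lines = new ArrayList<String>();
        String line;
        while ((line = in.readLine()) != null) {
            lines.add(line);
        }
        in.close();
        for (String s : filterByLabel(lines, 2, "TX_")) {
            System.out.println(s);
        }
    }
}
```

The same one-liner in grep or Perl would do; the point is just to drop the lower-level sampler rows before the file ever reaches a spreadsheet.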

If you are only interested in summary details, try the Summary Post-Processor.

It currently writes to jmeter.log or standard output only.

You could perhaps combine that with "errors only" for a bit more information.
And maybe add some assertions to generate errors for samples that take too long.

The nightly build includes a BeanShell Listener - if you are prepared
to write some Java, you could perhaps use that to do some result
accumulation.
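The accumulation itself is only a few lines of Java. This standalone sketch shows the kind of running-statistics logic you might embed in a BeanShell Listener; inside JMeter you would feed it from the listener's sample variables rather than from main(), and the class and method names here are my own, not part of any JMeter API:

```java
// Running summary of sample count, errors, mean and max elapsed time.
public class ResultAccumulator {
    private long count;
    private long errors;
    private long totalMs;
    private long maxMs;

    // Call once per sample result with its elapsed time and success flag.
    public void add(long elapsedMs, boolean success) {
        count++;
        totalMs += elapsedMs;
        if (elapsedMs > maxMs) {
            maxMs = elapsedMs;
        }
        if (!success) {
            errors++;
        }
    }

    public double meanMs()    { return count == 0 ? 0.0 : (double) totalMs / count; }
    public long maxElapsed()  { return maxMs; }
    public long errorCount()  { return errors; }
    public long sampleCount() { return count; }

    public static void main(String[] args) {
        ResultAccumulator acc = new ResultAccumulator();
        acc.add(120, true);
        acc.add(340, true);
        acc.add(90, false);
        System.out.println(acc.sampleCount() + " samples, "
                + acc.errorCount() + " errors, mean "
                + acc.meanMs() + " ms, max " + acc.maxElapsed() + " ms");
    }
}
```

Writing only these totals at the end of the run keeps the output to a handful of lines instead of hundreds of thousands.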

S.

---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]