I'm using the JMeter client to test the throughput of a certain workload
(PHP+MySQL, 1 page) on a certain server. Basically I'm doing a "capacity
test" with an increasing number of threads over time.

I installed the "Statistical Aggregate Report" JMeter plugin and this was
the result (ignore the "Response time" line): [image: Statistical
Aggregate Report graph]

At the same time I used the "Simple Data Writer" listener to write a log
file ("JMeter.csv"). Then I tried to "manually" calculate the throughput
for every second of the test.

Each line of "JMeter.csv" has this format:

timestamp       elapsedtime   responsecode   success   bytes
1385731020607   42            200            true      325
...             ...           ...            ...       ...

The timestamp refers to the time when the request is made by the client,
not to when the request is served by the server. So I simply did:
*totaltime = timestamp + elapsedtime*.
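
In case it helps to see this step outside Excel, here is a minimal sketch
of it in Python (this is only an illustration, assuming "JMeter.csv" is
comma-separated, has no header row, and the columns appear in the order
shown above; adjust the delimiter and indexes to match your file):

    import csv

    completion_times_ms = []
    with open("JMeter.csv", newline="") as f:
        for row in csv.reader(f):          # assumes comma-separated, no header
            timestamp_ms = int(row[0])     # when the request was sent (epoch ms)
            elapsed_ms = int(row[1])       # how long the request took (ms)
            completion_times_ms.append(timestamp_ms + elapsed_ms)  # totaltime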

In the next step I converted the *totaltime* to a time-of-day format, like
*13:17:01*.
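
I did the conversion in Excel, but the same thing in Python would look
roughly like this (just a sketch; fromtimestamp uses the local time zone):

    from datetime import datetime

    def to_hms(epoch_ms):
        # Truncate to whole seconds and format as HH:MM:SS (local time)
        return datetime.fromtimestamp(epoch_ms // 1000).strftime("%H:%M:%S")

    print(to_hms(1385731020607 + 42))  # e.g. the first sample above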

I have more than 14K samples and with Excel I was able to do this quickly.

Then I counted how many samples there were for each second. Example:

totaltime    samples (requestsServed/second)
13:17:01     204
13:17:02     297
...          ...
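
Putting the whole procedure into one self-contained script (same
assumptions about the CSV layout as above), the counting step would look
roughly like this:

    import csv
    from collections import Counter
    from datetime import datetime

    per_second = Counter()
    with open("JMeter.csv", newline="") as f:
        for row in csv.reader(f):                 # assumes comma-separated, no header
            done_ms = int(row[0]) + int(row[1])   # totaltime = timestamp + elapsedtime
            label = datetime.fromtimestamp(done_ms // 1000).strftime("%H:%M:%S")
            per_second[label] += 1                # one more request finished in that second

    for second in sorted(per_second):
        print(second, per_second[second])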

When I tried to plot the results I obtained the following graph: [image:
manually calculated throughput graph]

As you can see, it is very different from the first graph.

Given that the first graph is correct, what is the mistake in my
formula/procedure for calculating the throughput?
