On 9 February 2012 15:38, Robin D. Wilson <[email protected]> wrote:
> Thanks sebb for the replies...
>
> Here's the deal: I am running the same test script on JM2.4 and JM2.6, in 
> GUI mode. The test script has 3 thread groups,
> but the first and last thread groups are just 'timers' I created to log 
> the total elapsed time of the test (the first and last
> groups have 1 thread and 1 request each, and take less than 1 second to run). 
> The 'real' test is the middle thread group. It has 100
> threads (0 ramp) and runs 100 iterations (10,000 total samples). It simply 
> does a 'POST' to a URL, with 15
>
> So the 'elapsed time' I'm referring to in my test is actually the timestamp 
> taken in the first thread group (in ms since epoch)
> subtracted from the timestamp taken in the 3rd (last) thread group. That part 
> of my test may only add 2 total seconds to the test,
> so while it may skew my results slightly, it doesn't explain the vast 
> difference in the 'average' sample duration. According to the
> Summary Report docs, the "Average" is supposed to be "the average elapsed 
> time of a set of samples". But clearly, if the minimum
> time it takes to actually get the page is 2 seconds (due to the built-in 
> delay in the cgi-script), there is no way I could have an
> 'average' elapsed time of less than 2 seconds, yet I'm showing an average 
> elapsed time of ~750 ms (and my "Max" elapsed time shows as
> only 1198!). When I request the page in Firefox, it takes ~2104 ms (using a 
> status-bar timer), so I think the cgi-script is working
> correctly.
>
> Sebb asked:
>
>>Again, the throughput calculations are based on total test time. Are you sure 
>>the test run times are comparable?
>
> The test run times are automatically calculated by the 1st and 3rd thread 
> groups. The ~210 seconds total elapsed time is accurate
> based on my external measurement too (e.g., it is close to what I can observe 
> with my stopwatch).
>
> Both the JM2.4 test and the JM2.6 test are using the exact same ".jmx" test 
> file.
>
>>There's clearly something else going on here.
>
> I don't believe that the Summary Report is accurately calculating anything 
> except the total number of samples and the Avg. Bytes...

What makes you say that?
Are the Min and Max really incorrect?
Error %?

It's easy enough to check the Summary Results if you can provide the
CSV sample result files.

> The cgi-script I'm using definitely takes 2+ seconds to respond after it gets 
> the request (I've measured this with Firefox directly,
> and it _never_ gets a response in less than 2 seconds). I even changed the 
> 'sleep' to 9 seconds, and JMeter pauses for that long while
> recording results (e.g., it shows 100 threads run, then waits 9 seconds, 
> shows another 100 threads, etc.), but the numbers only go
> up to '1758' Average and '2415' Max (which is impossible, since it is taking 
> 9+ seconds to respond to each request!). It takes over
> 15 minutes to complete 10,000 samples (and that seems about right: 10,000 
> samples / 100 threads * 9 seconds each = 900 seconds).
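The back-of-the-envelope timing above can be checked directly: each thread runs its iterations sequentially, so the wall-clock time is roughly iterations per thread times per-request time (a sketch, ignoring ramp-up and overhead):

```python
# Sanity check of the timing arithmetic above: 10,000 samples spread
# across 100 concurrent threads, at 9 seconds per request.
samples, threads, seconds_per_request = 10_000, 100, 9
iterations_per_thread = samples // threads           # 100 iterations each
total_seconds = iterations_per_thread * seconds_per_request
print(total_seconds)  # 900 seconds, i.e. 15 minutes
```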
>
> I even went so far as to inject a 2 second sleep in the middle of the 
> response (e.g., pause 2 seconds, send part of the response,
> pause 2 more seconds, send the rest), and I'm still getting average times of 
> ~1000 ms. (That's with 4 seconds of built-in delays, and 2
> of those seconds are in the middle of the response.) The browser shows this 
> delay properly, but JMeter isn't calculating it
> properly.
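A sketch of the split-delay variant described above: pause, send part of the body, pause again, then send the rest. The flush between the two halves is what actually splits the response on the wire; header and body text are placeholders, not the original cgi-bin script:

```python
#!/usr/bin/env python3
# Sketch of a split-response delay: delay before the first byte, send a
# partial body, delay again mid-response, then send the remainder.
import sys
import time

def respond(delay=2, out=sys.stdout):
    time.sleep(delay)                               # delay before first byte
    out.write("Content-Type: text/html\r\n\r\n")
    out.write("<html><body>first part")
    out.flush()                                     # push the first chunk out
    time.sleep(delay)                               # delay mid-response
    out.write(" rest of the body</body></html>\n")
    out.flush()

if __name__ == "__main__":
    respond()
```

With `delay=2`, a client that measures time-to-last-byte should see ~4 seconds per request, which is why a ~1000 ms average looks wrong.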
>
>>Please recheck the individual sample response times and see how they compare 
>>to the average.
>
> I'm not sure how to do that in JMeter. I can manually hit the page, and it 
> takes about 100ms longer than the built-in delay I have.

Add a View Results in Table listener, or just check the CSV sample result files.
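To cross-check the Summary Report figures, the per-sample times can be recomputed from the CSV result file. A sketch, assuming the results were saved as CSV with the header row enabled so an "elapsed" column (in milliseconds) is present:

```python
# Recompute sample count, Min, Max, and Average from a JMeter CSV result
# file. Assumes the default CSV format with a header row containing an
# "elapsed" column (milliseconds per sample).
import csv

def summarize(path):
    with open(path, newline="") as f:
        times = [int(row["elapsed"]) for row in csv.DictReader(f)]
    return {
        "samples": len(times),
        "min": min(times),
        "max": max(times),
        "average": sum(times) / len(times),
    }
```

If these numbers disagree with what the Summary Report displays, that would point at the listener; if they agree, the recorded elapsed times themselves are in question.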

>>If there still appears to be a problem, create a Bugzilla issue and attach:
>>- JMX test case
>
> I'm trying to simplify the test case to the bare minimum, so the 
> results will be indisputable. I will also include the
> 'cgi-bin' script that I'm using, so someone else can easily set up the same 
> test.

Thanks.

>
>>- log files for JMeter 2.4 and 2.6
>
> Which log files are these? Is it just the 'jmeter.log' that gets created in 
> the 'bin' folder when I run in GUI mode, or do you need
> another log file?

jmeter.log

>>- CSV result files for 2.4 and 2.6
>
> I can do this.
>
> --
> Robin D. Wilson
> Sr. Director of Web Development
> KingsIsle Entertainment, Inc.
> VOICE: 512-777-1861
> www.KingsIsle.com
>
>
>
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: [email protected]
> For additional commands, e-mail: [email protected]
>
