This is my first time using JMeter. I am using it to test a database with the
JDBC Request samplers, and I have a couple of questions. I found the
explanation of results in the documentation, but it does not tell me what I
need to know.

Label - The label of the sample. 
# Samples - The number of samples for the URL 
Average - The average elapsed time of a set of results 
Min - The lowest elapsed time for the samples of the given URL 
Max - The longest elapsed time for the samples of the given URL 
Std. Dev. - The standard deviation of the sample elapsed time 
Error % - Percent of requests with errors 
Throughput - Throughput measured in requests per second/minute/hour 
Kb/sec - The throughput measured in Kilobytes per second 
Avg. Bytes - average size of the sample response in bytes. (in JMeter 2.2 it
wrongly showed the value in kB) 
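As I understand it, the summary statistics listed above can be computed from the per-sample elapsed times roughly like this. This is a minimal sketch with made-up sample values, not JMeter's actual code; I am assuming the population formula for standard deviation and that elapsed times are recorded in milliseconds:

```python
import statistics

# Hypothetical per-sample values for one label (elapsed times in milliseconds)
elapsed_ms = [0, 3, 5, 454, 8]
response_bytes = [0, 0, 0, 0, 0]   # every response empty, as in a failing request
errors = [True] * len(elapsed_ms)  # every sample failed

num_samples = len(elapsed_ms)
average = sum(elapsed_ms) / num_samples          # "Average", in ms
minimum = min(elapsed_ms)                        # "Min", in ms
maximum = max(elapsed_ms)                        # "Max", in ms
std_dev = statistics.pstdev(elapsed_ms)          # "Std. Dev." (population formula assumed)
error_pct = 100.0 * sum(errors) / num_samples    # "Error %"

test_duration_s = 15.4                           # hypothetical wall-clock span of the samples
throughput = num_samples / test_duration_s       # "Throughput", requests/sec
kb_per_sec = (sum(response_bytes) / 1024) / test_duration_s  # "KB/sec"
avg_bytes = sum(response_bytes) / num_samples    # "Avg. Bytes"
```

With all-error samples returning empty responses, this sketch yields Error % = 100.0 and Avg. Bytes = 0.0, which matches the row I pasted below.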

Below is a sample of one of the rows returned for one of my JDBC requests.
With regard to Average, Min, and Max, what unit of measurement is being
returned? Seconds? Milliseconds?

Also, is it normal for the Error % to be 100% and the Avg. Bytes to be 0?

# Samples - 100 
Average - 8
Min - 0 
Max - 454       
Std. Dev. - 45.33773262967613
Error % - 100.0%
Throughput - 6.5/sec
KB/sec - 0.0    
Avg. Bytes - 0.0

-- 
View this message in context: 
http://www.nabble.com/Summary-Listener-tf4793437.html#a13712897
Sent from the JMeter - User mailing list archive at Nabble.com.


