Think of your last trip to Disney or your favorite amusement park. Let's define 
the capacity of a ride to be the number of people that can sit on the ride per 
turn (think roller coaster). Throughput will be the number of people that exit 
the ride per unit of time. Let's define service time as the amount of time you 
get to sit on the ride. Finally, let's define response time, or latency, as your 
time queuing for the ride (dead time) plus service time. 
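
To put rough numbers on it (these are made up purely for illustration): suppose 
the coaster has 24 seats and a ride lasts 2 minutes. Then

    capacity             = 24 riders per turn
    service time         = 2 minutes
    best-case throughput = 24 riders / 2 minutes = 12 riders per minute
    latency              = time in the queue + 2 minutes

With nobody in line, latency is just the 2 minute service time.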

So, if there are no lineups at your favorite ride, latency is just your time on 
the ride. Capacity remains the same, but measured throughput will be minimal. As 
people start to arrive, you may momentarily have more people than can fit on the 
roller coaster, so they will start to queue. Throughput will start to go up, but 
average latency will also start to increase as a few people have to wait. Even 
so, there will still be plenty of spare capacity (empty seats on the roller 
coaster). As the crowd moves in, the situation changes: from the system's point 
of view, capacity will be well, if not perfectly, utilized, which maximizes 
throughput. But your latency is now time on the ride plus all that dead time in 
the queue. So from an individual's point of view, the rate at which you can work 
(rides per hour) has decreased, but from the system's point of view, its 
transaction rate is maximal.
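
If it helps to see the shape of that curve, here is a tiny back-of-the-envelope 
simulation (class name and all numbers invented, one "turn" standing in for one 
service time, not a model of any real system). It runs the coaster for a while 
at different arrival rates and prints throughput and average latency; throughput 
flattens out at the seat count while latency keeps climbing once arrivals exceed 
capacity:

    import java.util.ArrayDeque;
    import java.util.Deque;

    public class CoasterSim {
        public static void main(String[] args) {
            int seats = 24;    // capacity: riders per turn (made-up number)
            int turns = 1000;  // simulate 1000 turns; one turn == one service time
            for (int arrivalsPerTurn : new int[] {6, 12, 24, 30}) {
                Deque<Integer> queue = new ArrayDeque<>(); // arrival turn of each waiting rider
                long ridden = 0, totalWait = 0;
                for (int t = 0; t < turns; t++) {
                    for (int i = 0; i < arrivalsPerTurn; i++) {
                        queue.add(t);                    // everyone joins the line
                    }
                    for (int s = 0; s < seats && !queue.isEmpty(); s++) {
                        totalWait += t - queue.remove(); // dead time spent in the queue
                        ridden++;
                    }
                }
                double throughput = (double) ridden / turns;
                double avgLatency = 1.0 + (double) totalWait / Math.max(1, ridden); // +1 turn on the ride
                System.out.printf("arrivals/turn=%2d  throughput=%5.1f riders/turn  avg latency=%7.1f turns%n",
                        arrivalsPerTurn, throughput, avgLatency);
            }
        }
    }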

Hardware is a bucket of non-shareable things (seats on a roller coaster). 
Number of seats in a CPU == number of cores; number of seats on the network == 
number of bytes in the packet's payload... and the same goes for bus, memory, 
and disk channels. So there is a relationship between latency and throughput, 
but in a computer it tends to be a lot more complicated than the simple roller 
coaster story. That said, I think the analogy can help you sort out the 
concepts.
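
For what it's worth, the textbook form of that relationship is Little's Law: in 
steady state, the average number of requests in the system equals throughput 
times average response time,

    N = X * R

So, picking numbers purely for illustration, 10 requests in flight with an 
average response time of 0.5 s works out to a throughput of about 20 requests/s.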

Regards,
Kirk

> Hey
> 
> You will have to improve the system to get the same performance time (even
> Google for that matter). If the system remains the same, and you increase
> the number of requests even by one, the response time will increase. You will
> have to add more hardware or tune the system to get the same performance
> time.
> 
> :)
> Deepak

