I was running a test on my HTTP server, and the transfer speed dropped significantly when I switched the server from localhost (on my laptop) to an AWS EC2 t.micro instance.

I want to know the difference between latency and load time (or sample time) when testing with JMeter. Load time appears in "View Results Tree", and sample time in "View Results in Table".

Here are my questions.

  1. When sending a zip file of about 3.5 MB, the transfer takes about 0.5 seconds on localhost. However, when I test against the EC2 server, it takes about 6~8 seconds. I know 3.5 MB is fairly large, but isn't 8 seconds too slow?

  2. During my tests, JMeter shows a latency of about 0.5~1 second while the load time is 6~8 seconds. What is the difference between the two?


Solution 1:

Latency is the difference between the time the request was sent and the time the response started to arrive (time to first byte).

Response time (= sample time = load time = elapsed time) is the difference between the time the request was sent and the time the response was fully received (time to last byte).

So response time is always >= latency.

The larger the file, the larger the difference between response time and latency.
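
To make the distinction concrete, here is a minimal sketch in plain Java (not JMeter itself; the URL is a placeholder) that measures both values for a single download:

    import java.io.InputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class LatencyVsResponseTime {
        public static void main(String[] args) throws Exception {
            // Placeholder URL standing in for the 3.5 MB zip endpoint.
            URL url = new URL("http://example.com/file.zip");

            long start = System.nanoTime();
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();

            try (InputStream in = conn.getInputStream()) {
                byte[] buffer = new byte[8192];

                // The first successful read marks the first byte of the
                // response -- roughly what JMeter reports as latency.
                int read = in.read(buffer);
                long firstByte = System.nanoTime();

                // Drain the rest of the body; the end of the stream marks
                // what JMeter reports as sample/load/elapsed time.
                while (read != -1) {
                    read = in.read(buffer);
                }
                long lastByte = System.nanoTime();

                System.out.printf("Latency (time to first byte): %d ms%n",
                        (firstByte - start) / 1_000_000);
                System.out.printf("Response time (time to last byte): %d ms%n",
                        (lastByte - start) / 1_000_000);
            } finally {
                conn.disconnect();
            }
        }
    }

On a fast localhost connection the two numbers are nearly identical; over a slower network link, the gap between them grows with the size of the payload.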

Solution 2:

Latency = 922 ms means that the first byte of the response arrived 922 ms after the request was made.

Sample Time (or Response Time) = 1232 ms means that the full response was received 1232 ms after the request was made.

Therefore, Response Time = Latency + time to download the rest of the response body. In this example, 1232 - 922 = 310 ms was spent receiving the remainder of the response after its first byte arrived. (Server-side processing time is already included in the latency figure, since the server cannot start responding before it has processed the request.)
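
If you want to check this relationship across a whole test run, a small sketch like the one below can help. It assumes JMeter's default CSV JTL output, where the elapsed and Latency columns hold the two values in milliseconds; results.jtl is a placeholder path, and the naive comma split assumes sampler labels contain no commas:

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.util.Arrays;
    import java.util.List;

    public class JtlDownloadTime {
        public static void main(String[] args) throws Exception {
            try (BufferedReader reader = new BufferedReader(new FileReader("results.jtl"))) {
                // Locate the columns by name in the CSV header line.
                List<String> header = Arrays.asList(reader.readLine().split(","));
                int labelIdx = header.indexOf("label");
                int elapsedIdx = header.indexOf("elapsed");
                int latencyIdx = header.indexOf("Latency");

                String line;
                while ((line = reader.readLine()) != null) {
                    String[] cols = line.split(",");
                    long elapsed = Long.parseLong(cols[elapsedIdx]);
                    long latency = Long.parseLong(cols[latencyIdx]);
                    // elapsed - latency = time spent downloading the body
                    // after its first byte arrived.
                    System.out.printf("%s: latency=%d ms, download=%d ms%n",
                            cols[labelIdx], latency, elapsed - latency);
                }
            }
        }
    }

For a large download over a slow link, the download column will dominate, which matches the 0.5~1 second latency versus 6~8 second load time seen in the question.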