Hello everyone,

I have some problems with the delay and jitter values reported by Iperf.

How does Iperf calculate the jitter and delay? When I measure them with UDP 
over 100 Mb Ethernet, the results look fine (avg delay: 124 µs, avg jitter: 
31 µs).

I'm working on my Bachelor's thesis, and I have to measure the delay, jitter 
and packet loss ratio over a PPP device (3G) to a server on the Internet in 
both directions, uplink and downlink.

Iperf UDP downlink, 1800k (no loss):
avg delay: 8.35 ms
avg jitter: 0.43 ms (standard deviation)

When I vary the bitrate, the delay changes by only about 1 or 2 ms.

This result can't be correct, because with thrulay the average UDP delay is 
75 ms, and with NDT I get an average RTT of 1142 ms.

According to my Internet research, the delay over 3G should be between 
70 ms and 180 ms.

How does Iperf calculate the delay? Is there a problem with this over 3G UMTS?
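
From my reading so far, the UDP jitter that Iperf reports seems to be the
smoothed inter-arrival jitter from RFC 1889 (appendix A.8): the change in
packet transit time between consecutive packets, exponentially averaged with
a gain of 1/16. Here is a small C sketch of my understanding of that
estimator (not Iperf's actual source; the names and sample values are mine):

    /* Sketch of the RFC 1889 smoothed inter-arrival jitter estimator,
     * as I understand Iperf 2 to use it for UDP. "transit" is the receive
     * time minus the sender timestamp carried in the UDP payload. */
    #include <math.h>
    #include <stdio.h>

    static double update_jitter(double jitter, double prev_transit, double transit)
    {
        double d = fabs(transit - prev_transit);  /* transit-time change between packets */
        return jitter + (d - jitter) / 16.0;      /* exponential smoothing, gain 1/16 */
    }

    int main(void)
    {
        /* made-up transit times in ms, just to show the update */
        double transit[] = { 80.0, 81.2, 79.5, 83.0 };
        double jitter = 0.0;
        for (int i = 1; i < 4; i++)
            jitter = update_jitter(jitter, transit[i - 1], transit[i]);
        printf("jitter estimate: %.3f ms\n", jitter);
        return 0;
    }

Because only the difference between consecutive transit times enters the
formula, a constant clock offset between client and server should cancel out
of the jitter; I don't know whether the same holds for the reported delay.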

I'm using Iperf 2.0.4-4 for UDP; the computers run Ubuntu 9.04.

Thanks

Regards
Tobias
