We are attempting to stress a wireless link and monitor the performance of the 
802.11 protocol.  Our unit under test, acting as a client, reports an 
interval bandwidth close to ??M, while the server reports a much lower 
bandwidth in each interval along with high UDP packet loss.  The Wireshark 
data indicates significant 802.11 packet loss.

A different client (a PC) reports a bandwidth in each interval similar to the 
server's report for that interval, and the server does not report high 
UDP packet loss.  The Wireshark captures show a similar pattern of 802.11 
retries, but with more retries on a given sequence frame, resulting in fewer 
drops.

On the PC client, the client's summary bandwidth report is close to, but never 
matches, the server's report, even after accounting for packet loss.
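For context on how the loss figures arise: the iperf UDP server infers loss from a sequence number carried in each datagram's payload, counting gaps as losses and crediting back datagrams that merely arrived out of order. A minimal sketch of that style of accounting (illustrative only, not iperf's actual code):

```python
def tally_udp(seqs):
    """Tally received/lost/out-of-order datagrams from the sequence
    numbers carried in each UDP payload (iperf-style accounting)."""
    expected = 0                      # next sequence number we expect
    received = lost = out_of_order = 0
    for seq in seqs:
        received += 1
        if seq == expected:
            expected += 1
        elif seq > expected:          # gap: intervening datagrams presumed lost
            lost += seq - expected
            expected = seq + 1
        else:                         # arrived late
            out_of_order += 1
            lost -= 1                 # it wasn't lost after all
    return received, lost, out_of_order
```

With accounting like this, the server can only count what actually arrived, which is one reason its totals need not line up with what the client believes it sent.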

We are confused about the inner workings of iperf, which raises questions about 
the validity of the reports under these conditions.


1.      Does the iperf client measure the interval bandwidth by the amount of 
data it wrote to a buffer per interval, by the amount of data the radio 
pulled out of the buffer per interval, or by the amount of data the radio put 
on the air?  The first two should be equivalent.
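My understanding (worth confirming against the iperf source) is that the UDP client counts the bytes accepted by each socket send call per interval, i.e. data written into the local socket buffer, not data the radio put on the air. A rough sketch of that style of interval accounting, with hypothetical names and a placeholder TEST-NET address:

```python
import socket
import time

def udp_send_report(host="192.0.2.1", port=5001, duration=3.0,
                    interval=1.0, payload=bytes(1470)):
    """Send UDP datagrams and report Mbit/s per interval, counting only
    bytes accepted by sendto() -- i.e. handed to the local socket
    buffer -- regardless of what the link layer actually delivers."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    start = time.monotonic()
    next_tick = start + interval
    interval_bytes = 0
    reports = []
    while (now := time.monotonic()) - start < duration:
        interval_bytes += sock.sendto(payload, (host, port))
        if now >= next_tick:
            reports.append(interval_bytes * 8 / interval / 1e6)  # Mbit/s
            interval_bytes = 0
            next_tick += interval
    sock.close()
    return reports
```

If that is right, it would explain the asymmetry you see: the client's per-interval figure reflects what the OS accepted for transmission, while the server's figure reflects what survived the 802.11 retries.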

2.      The other question I should ask but cannot think of.

Thanks in advance for any assistance you can provide.
_______________________________________________
Iperf-users mailing list
Iperf-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/iperf-users