2011/3/30 Peter Hallin <peter.hal...@ldc.lu.se>

> Ok, now we have been doing some testing and probably found the problem.
>
> All tests were done on the same machine with an Intel S5000VSA MB and a
> Xeon E5420 2.5 GHz processor, running OpenBSD 4.8 amd64 GENERIC (SP
> kernel).
>
> We tested the performance with iperf, running two clients connected
> through a bridge.
>
> With the Intel Pro/1000 PCIe (82576) dual port cards with the bridge
> between two cards (in this case em0 to em2) we got the worst
> performance, 150 Mbit/s.
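>
> For reference, the test setup was roughly as follows (a minimal
> sketch; the exact commands may have differed, and the server address
> 10.0.0.2 is only an example):
>
>   # bring up both em(4) ports and bridge them
>   ifconfig em0 up
>   ifconfig em2 up
>   ifconfig bridge0 create
>   ifconfig bridge0 add em0
>   ifconfig bridge0 add em2
>   ifconfig bridge0 up
>
>   # on the host behind em2:
>   iperf -s
>   # on the host behind em0, pushing TCP through the bridge:
>   iperf -c 10.0.0.2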
>
> While testing this and watching with 'systat vmstat', the CPU was 99% busy
> handling interrupts. The number of interrupts was about 3000/s on em0
> (where the iperf client was connected) and 1500/s on em2 (iperf server).
> At the same time 'systat ifs' showed about 10 new livelocks per second.
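>
> (The same counters can also be read like this; a sketch, where 'ifs'
> above is shorthand for the 'ifstat' view:)
>
>   # cumulative interrupt counts and average rates per device
>   vmstat -i
>   # live per-interface counters, including livelocks
>   systat ifstat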
>
> Next we tested regular PCI Intel Pro/1000MT (82545GM) cards and now we
> got the performance we had hoped for in the first place. 910 Mbit/s with
> 8000 intr/s on both cards at 50% CPU (intr). No livelocks.
>
> We thought perhaps the issue was related to the PCIe bus, so we did one
> final test, this time with quad port Intel Pro/1000 QP (82576) PCIe
> cards.
>
> These performed excellently, with 940 Mbit/s, 8200 intr/s per card and 60%
> CPU (intr).
>
> So, it seems the dual port PCIe cards suck and we have to replace them.
>
> //Peter
>
Just out of curiosity:

Did you use both ports of the Intel Pro/1000 PCIe (82576)?

And what about using a single-port PCIe Intel card?
