I have a VMWare Farm at work - 15 boxes all running VMWare Server 1.0.3
under CentOS 4.4. They all talk to a Dell PowerEdge 2900 with 8 x 7200
RPM 500GB SATA drives, backed by a Dell Perc5/i hardware RAID controller.
I have the RAID configured as RAID5 with a single hot spare, so I have 7
spindles for data.
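Just to lay out the math on that layout (a quick sketch, assuming the Perc5/i gives up the equivalent of one drive's capacity to RAID5 parity):

```python
# Capacity check for the setup above: 8 x 500 GB drives, one hot spare,
# RAID5 across the remaining spindles.
DRIVES = 8
HOT_SPARES = 1
DRIVE_GB = 500

spindles = DRIVES - HOT_SPARES          # 7 spindles in the array
usable_gb = (spindles - 1) * DRIVE_GB   # RAID5 loses one drive to parity

print(spindles, usable_gb)
```

So 7 spindles in the array, roughly 3 TB usable.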
I'm exporting storage from the data box over NFS. I have dual bonded GigE nics on the
storage box, and each of my VMWare Server boxes talks to the storage box
over a dedicated storage VLAN. (Each VMWare Server box has a single
Broadcom PCI-e nic dedicated to just storage traffic.)
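For context, the export and mount setup looks roughly like this. This is a sketch, not my exact config: the `/vmware` path and the storage VLAN subnet are placeholders, and the option choices are the usual suspects for NFS-backed VM images rather than anything exotic.

```shell
# Hypothetical /etc/exports entry on the storage box. sync is the safe
# default; async can boost write throughput considerably but risks data
# loss if the box crashes mid-write.
#
#   /vmware  192.168.10.0/24(rw,sync,no_root_squash)
#
# Re-export after editing:
exportfs -ra

# On each VMWare Server box: TCP with larger read/write sizes tends to
# behave better for big sequential VM image i/o than the old UDP defaults.
mount -t nfs -o rw,hard,intr,tcp,rsize=32768,wsize=32768 \
    storage:/vmware /vmware
```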
Performance on the box is abysmal. Any Windows 2003 64-bit VM that runs
off NFS will bluescreen while under any sort of i/o-bound load.
I moved some test images over to our NetApp, and they run perfectly.
I haven't been able to dig into it too deeply, but this experience has
sort of soured me on do-it-yourself NAS devices. They're probably fine
for home use, but DIY storage needs to come a long way to match the i/o
throughput available from dedicated storage boxes.