I would like to add InfiniBand capability to two amd64 systems in my lab. They have PCI-X Tyan S2882 motherboards and run Linux.
Can InfiniBand function properly in a physical point-to-point topology? I would prefer not to invest in switches unless it's absolutely necessary. The documentation I've glanced at so far seems to indicate that InfiniBand is logically point-to-point, but the diagrams of physical topologies all seem to include switches. Other than obviously not being able to communicate across a larger fabric, is any functionality missing from a point-to-point topology? Does it even work?
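From what I've read, even without a switch every InfiniBand subnet still needs a subnet manager running somewhere; with no managed switch in the picture, I assume one of the two hosts would have to run opensm itself. My guess at what a back-to-back bring-up would look like with the OpenIB stack and a Mellanox HCA (module names and addresses are my assumptions, not tested):

```shell
# Load the Mellanox HCA driver and the IB support modules (my guess at the set needed)
modprobe ib_mthca     # driver for MT23108-class Mellanox HCAs
modprobe ib_umad      # userspace MAD access, needed by opensm
modprobe ib_ipoib     # IP-over-InfiniBand, for ordinary TCP/IP traffic

# With no switch in the fabric, run the subnet manager on ONE of the two hosts
opensm &

# Check that the port has trained up -- state should eventually read "Active"
ibstat

# Bring up IPoIB so normal IP tools work over the link
# (192.168.100.x is just an illustrative private subnet; use .2 on the other host)
ifconfig ib0 192.168.100.1 netmask 255.255.255.0
```

If that's roughly right, then a point-to-point link would behave like a two-node subnet and nothing beyond multi-node fabric connectivity would be missing -- but I'd appreciate confirmation from anyone who has actually cabled two HCAs back to back.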
What's the best HCA card for me? I'm drawn to Mellanox MT23108-based cards because OpenIB seems to support them well (https://openib.org/tiki/tiki-index.php?page=MellanoxHcaFirmware). I'm considering purchasing two HP NC570C cards (http://h30094.www3.hp.com/product.asp?sku=2603660&), which are based on that Mellanox chip. Does anybody know whether those cards would be functional in non-HP gear? In other words, does HP put proprietary goodies in the cards that would cause them to break in generic systems such as mine? Are there better cards for me?
Thank you very much for your time and advice.
Sincerely,
Mark Travis
_______________________________________________ openib-general mailing list [email protected] http://openib.org/mailman/listinfo/openib-general
