The HP SL6500 is among the first servers to include I/O virtualization (IOV) on the motherboard. The server comes with a dual-port Mellanox ConnectX-2 VPI adapter on the system board, providing a 40Gbps InfiniBand link and a 10Gbps Ethernet link. The Mellanox chipset supports the Remote Direct Memory Access (RDMA) over Converged Enhanced Ethernet (RoCEE, pronounced "rocky") specification, which runs InfiniBand over Ethernet. Mellanox has been championing RoCEE for some time; the specification was most recently implemented by Xsigo in its I/O Director, an IOV-over-Ethernet product.
Including the chipset on the motherboard is supposed to let server manufacturers eliminate the expansion cards needed for IOV, but additional expansion cards may still be needed. "Some servers still need a x16 slot because not all I/O needs are covered by the IOV on the motherboard, specifically where customers want to place a native Fiber Channel card, a SAS (Serial Attached SCSI)/Serial Advanced Technology Attachment (SATA) RAID controller, a Flash Solid State Drive (SSD) card or a Graphics Processing Unit (GPU), such as a PC-over-IP (PCoIP) card in the server," wrote Craig Thomson, vice president of marketing at Aprius.
IOV on the motherboard also lets IT reduce costs: there is no IOV adapter card to pay for and no time spent installing IOV adapters in the server. Exactly how much the Mellanox IOV chipset will save is unclear, but it was "far less costly than a PCI-Express InfiniBand or 10GbE card," Ed Turkel, manager of worldwide HPC marketing for HP's Enterprise Servers, Storage and Networking group, told The Register.
The inclusion of the Mellanox chipset is good news for IT managers, as it signals that system vendors are taking IOV seriously. With IOV in the server, IT shops will find they can increase the number of VMs per server and simplify the wiring and management of their virtualized clusters.