Historically, the most significant Ethernet price move was probably in the late 1980s, when Novell brought out its NE1000 Ethernet cards for $495, significantly below the $700 to $900 a card that then-market leaders 3Com and IBM were charging. Novell realized that driving down the total cost of an office LAN was to its advantage, even if it only broke even on NICs, because cheaper LANs would boost NetWare sales.
Since I last ran the numbers two years ago, in this blog post about Fibre Channel over Ethernet pricing, top-of-rack switches have come down from more than $500 a port to less than $300 a port for switches like Cisco's Nexus 5548 and Dell/Force10's S4810. Similarly, the street price of a 10-Gbps NIC has dropped from almost $2,500 for a first-generation CNA to about $600 for a dual-port Intel X520.
That was before Mellanox announced that its 64-port SX-1016 switch would cost about the same $12,000 that a 20-port Nexus 5010 would have set you back two years ago, or just $188 a port. It's a low-latency (250 ns), non-blocking switch, though data center features like data center bridging and FCoE FIP snooping will have to wait for a future firmware update. Mellanox is also selling its ConnectX EN dual-port card for less than $400, bringing the total cost of dual-homing a server on a 10-Gbps network under the magic number of $1,000.
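For anyone who wants to check the arithmetic, the per-port and dual-homing figures work out like this (a quick back-of-the-envelope sketch using the street prices quoted above; cables and optics are extra, as discussed below):

```python
# Back-of-the-envelope 10-GbE costs from the prices quoted in this post.
# All figures are street prices mentioned above, not vendor list prices.

SWITCH_PRICE = 12_000   # Mellanox SX-1016, 64 ports
PORTS = 64
NIC_PRICE = 400         # ConnectX EN dual-port card (upper bound: "less than $400")

per_port = SWITCH_PRICE / PORTS
print(f"Switch cost per port: ${per_port:.2f}")      # -> $187.50

# Dual-homing a server: one dual-port NIC plus two switch ports
dual_homed = NIC_PRICE + 2 * per_port
print(f"Dual-homed server:    ${dual_homed:.2f}")    # -> $775.00, under $1,000
```

Even taking the full $400 for the NIC, two switch ports plus a dual-port card comes in around $775 per server.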
Mellanox may not yet be a household name in the world of corporate networking, but the company is far and away the leading maker of the ultra-low-latency InfiniBand gear that dominates high-performance computing. That experience, and the volume that comes with it, lets Mellanox develop its own silicon for both its adapters and switches: a single multiprotocol ASIC can run InfiniBand, Ethernet, and even Fibre Channel over Ethernet. I've noticed that Mellanox comes up more and more often when I ask its competitors who they're worried about.
Well, as regular readers of this blog know all too well, NICs and switch ports aren't the whole cost; there's still the little matter of SFP+ modules and cables, which run from $100 a link for twinax direct-attach cables to $1,000 for optics. That's where Emulex's 10GBase-T NIC and NIC/iSCSI CNA work to drive costs down. Using a $5 Cat6A cable to connect a 10GBase-T-equipped server to a top-of-rack switch is a lot more cost-effective than using $1,000 optics or even $100 twinax cables.
The Romley generation of Xeon servers will present a huge opportunity for 10-Gbps Ethernet vendors, as users care a lot more about their 10-Gbps adapters than they ever did about whether a server's 1-Gbps LOM came from Intel or Broadcom. Server vendors may therefore move to mezzanine cards, giving us all more choices.
Emulex has been a client of DeepStorage in the past. I'm meeting with Mellanox's CEO this week and will make him buy lunch.