
A Ruckus Over WLAN Testing

When it comes to WLAN performance, you might assume all access points are created equal because all the vendors are "doing" 802.11n. But a recent competitive test of five vendors conducted at Syracuse University, where I run the wireless network, shows that performance can vary greatly.

According to the testing, APs from Ruckus Wireless took top marks for throughput, besting similar products from Aerohive, Aruba, Cisco and Meraki. However, as the report points out, the testing was conducted in partnership with Ruckus, which provided guidance on the test plans. As you'd expect, this has drawn legitimate complaints from vendors and IT professionals about the results.

I have my own feelings about the testing, but before I get into them, a bit of context is in order. The testing I'm about to describe took place in Hinds Hall on the Syracuse University campus, where SU's School of Information Studies (iSchool) has its Center for Convergence and Emerging Networking Technologies (CCENT). Professor of Practice Dave Molta coordinated the testing with Ruckus Wireless. A select group of graduate students with WLAN skills conducted seven months of testing under Molta's supervision, putting five WLAN vendors' APs (and controllers, where appropriate) through hundreds of tests in a range of real-world scenarios.

Molta happens to be my boss when I wear my Adjunct Faculty hat, and he has a long history with comparative testing and industry analysis, including stints as senior technology editor and editor in chief for Network Computing in the 1990s. Molta is also an occasional contributor to Network Computing and InformationWeek Reports.

I manage Syracuse University's large wireless environment as part of the central IT team. Despite working closely with Molta and the iSchool on many levels, neither I nor SU's central IT support group was involved with the Ruckus-sponsored testing. Got all that?

As for the comparative review, here's my own take. When I compare the approach of Molta's team to boilerplate wireless testing guidelines like this one from Atheros, I have respect for the methodology used at the iSchool. There is nothing easy about testing wireless gear across different vendor platforms.

At the same time, I don't have enough information about each vendor's configuration, or about the specific clients used in the various test scenarios, to say that I completely buy all of the results. I know that Aerohive, Aruba, Cisco and Meraki all have their own opinions about the conclusions reached by Molta's team, as well.

In particular, I have two concerns with the testing. First, even if I completely accepted the outcome, I'd still have to ask, "So what?" My next major purchase decision will be around 11ac products, so outside of professional curiosity, I don't really care which vendor has the best 11n APs. And even if I were looking to extend an existing 11n infrastructure, different vendor APs tend not to play well together, so I'd probably stick with my incumbent anyway.

My other issue is that I'm frustrated by the continued emphasis on throughput as the hallmark of a great WLAN. I'm not foolish enough to say I don't care about throughput, but there's far more to a successful WLAN implementation than simply having the fastest APs on the planet.

For example, Meraki took a beating in the test results, yet Meraki's wireless TCO and ease of administration, driven by its cloud-managed framework, are the stuff of dreams--so much so that I think that combination is a major reason Cisco is buying the company. (On a side note, my own limited performance testing on a 35-AP Meraki WLAN that I run differs greatly from the results found by Molta's team.)
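For what it's worth, here's a minimal sketch of the kind of informal spot check I'm talking about. It's my own illustration, not the methodology from the Syracuse report or from my Meraki testing: a short Python wrapper that drives an iperf3 client against an iperf3 server on the wired side and reports rough upstream and downstream throughput from a wireless client. The server address, test length and stream count are placeholder assumptions.

#!/usr/bin/env python3
"""Rough WLAN throughput spot check using iperf3.

Assumes an iperf3 server is already running on a wired host
(e.g. started with `iperf3 -s`). Run this script from the
wireless client you want to measure.
"""
import json
import subprocess

IPERF_SERVER = "192.168.1.10"   # hypothetical wired iperf3 server
DURATION_SEC = 30               # length of each test run
PARALLEL_STREAMS = 4            # parallel TCP streams

def run_throughput_test(reverse: bool = False) -> float:
    """Run one iperf3 test and return throughput in Mbps.

    reverse=False measures client -> server (upstream from the
    wireless client); reverse=True measures server -> client
    (downstream to the wireless client).
    """
    cmd = [
        "iperf3",
        "-c", IPERF_SERVER,
        "-t", str(DURATION_SEC),
        "-P", str(PARALLEL_STREAMS),
        "-J",  # JSON output so the summary is easy to parse
    ]
    if reverse:
        cmd.append("-R")
    result = subprocess.run(cmd, capture_output=True, text=True, check=True)
    summary = json.loads(result.stdout)["end"]["sum_received"]
    return summary["bits_per_second"] / 1e6

if __name__ == "__main__":
    up = run_throughput_test(reverse=False)
    down = run_throughput_test(reverse=True)
    print(f"Upstream:   {up:,.1f} Mbps")
    print(f"Downstream: {down:,.1f} Mbps")

A pair of numbers like that is exactly the sort of raw-throughput snapshot I'm arguing shouldn't be the whole story; it says nothing about management overhead, TCO or how a WLAN behaves under real client loads.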

Many major WLAN players lose customers to the likes of Meraki and Aerohive because the big guys' management systems are so complex that they cause pain and suffering. I'd love to see a competitive test of vendors' WLAN management systems in which customer satisfaction played a role. Molta and his students do acknowledge this issue in the report: "We did not compare total cost of ownership, ease of management, or advanced features and functionality, all of which may be more important than raw throughput in some environments."

I applaud the Syracuse team for the work it put into the test, but I question how seriously it will be taken. Just as Ruckus will shop the results around to woo potential customers, competitors have legitimate grounds to question their validity. Those of us in the wireless trenches want speed and effective enterprise-class management in equal parts. Looking ahead to 11ac, I'd love to see early, independent testing that measures both.