Commentary
Lee Badman
A Ruckus Over WLAN Testing

A competitive review of WLAN products at Syracuse University stirred up controversy because of a vendor’s involvement in the tests. But the real question is, are they testing the right things?

When it comes to WLAN performance, you might assume all access points are created equal because all the vendors are "doing" 802.11n. But a recent competitive test of five vendors conducted at Syracuse University, where I run the wireless network, shows that performance can vary greatly.

According to the testing, APs from Ruckus Wireless took top marks for throughput, besting similar products from Aerohive, Aruba, Cisco and Meraki. However, as the report points out, the testing was conducted in partnership with Ruckus, which provided guidance about the test plans. As you'd expect, this has raised legitimate complaints from competing vendors and IT professionals about the results.

I have my own feelings about the testing, but before I get into it, a bit of context is in order. The testing I'm about to describe took place in Hinds Hall on the Syracuse University campus, where SU's School of Information Studies (iSchool) has its Center for Convergence and Emerging Networking Technologies (CCENT). Professor of Practice Dave Molta coordinated the testing with Ruckus Wireless. A select group of graduate students with WLAN skills conducted seven months of testing under Molta's supervision. The group put five WLAN vendors' APs (and controllers where appropriate) through hundreds of tests in a number of real-world scenarios.

Molta happens to be my boss when I wear my Adjunct Faculty hat, and he has a long history with comparative testing and industry analysis, including stints as senior technology editor and editor in chief for Network Computing in the 1990s. Molta is also an occasional contributor to Network Computing and InformationWeek Reports.

I manage Syracuse University's large wireless environment as part of the central IT team. Despite working closely with Molta and the iSchool on many levels, neither I nor SU's central IT support group was involved with the Ruckus-sponsored testing. Got all that?

As for the comparative review, here's my own take. When I compare the approach of Molta's team to boilerplate wireless testing guidelines like this one from Atheros, I have respect for the approach used at the iSchool. There is nothing easy about testing wireless gear across different vendor platforms.

At the same time, I don't have enough information on each vendor's configurations and specifics on all clients used in the various testing scenarios to be able to say that I completely buy all of the results. I know that Aerohive, Aruba, Cisco and Meraki all have their own opinions on the conclusions reached by Molta's team, as well.

In particular, I have two concerns with the testing. First, even if I completely accepted the outcome, I'd still have to ask, "So what?" My next major purchase decision will be around 11ac products, so outside of professional curiosity, I don't really care which vendor has the best 11n APs. And even if I were looking to extend an existing 11n infrastructure, different vendor APs tend not to play well together, so I'd probably stick with my incumbent anyway.

My other issue is that I'm frustrated by the continued emphasis on throughput as the hallmark of a great WLAN. I'm not foolish enough to say I don't care about throughput, but there's far more to a successful WLAN implementation than simply having the fastest APs on the planet.
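To put that in perspective, here's roughly what a single-number throughput check looks like in practice: a minimal sketch that drives iperf3 from a Wi-Fi client against a wired server behind the AP under test and averages a few runs. The server address, run count and duration are hypothetical placeholders, and a real comparative test would also have to control client hardware, channel plans, RF conditions and much more.

```python
# Minimal sketch of a single-client Wi-Fi throughput check against an AP under test.
# Assumes iperf3 is installed and a server is already running on the wired side of
# the AP (the address below is a hypothetical placeholder).
import json
import subprocess

IPERF_SERVER = "192.0.2.10"   # placeholder: wired iperf3 server behind the AP
DURATION_SEC = 30             # length of each run
RUNS = 5                      # repeat to smooth out Wi-Fi variability

def run_once() -> float:
    """Run one iperf3 TCP test and return throughput in Mbps."""
    out = subprocess.run(
        ["iperf3", "-c", IPERF_SERVER, "-t", str(DURATION_SEC), "-J"],
        capture_output=True, text=True, check=True,
    )
    result = json.loads(out.stdout)
    return result["end"]["sum_received"]["bits_per_second"] / 1e6

if __name__ == "__main__":
    samples = [run_once() for _ in range(RUNS)]
    print("runs (Mbps):", ", ".join(f"{s:.1f}" for s in samples))
    print(f"average throughput: {sum(samples) / len(samples):.1f} Mbps")
```

A loop like this produces a tidy number, but it says nothing about roaming, client density, management overhead or what the system costs to run, which is exactly my point.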

For example, Meraki took a beating as measured by test results, yet Meraki's wireless TCO and ease of administration, driven by its cloud-managed framework, are the stuff of dreams, so much so that I think it's a major reason Cisco is buying the company. (On a side note, my own limited performance testing on a 35-AP Meraki WLAN that I run differs greatly from the results found by Molta's team.)

Many major WLAN players lose customers to the likes of Meraki and Aerohive because the big guys' management systems are so complex that they cause pain and suffering. I'd love to see a competitive test of vendors' WLAN management systems in which customer satisfaction played a role. Molta and his students do acknowledge this issue in the report: "We did not compare total cost of ownership, ease of management, or advanced features and functionality, all of which may be more important than raw throughput in some environments."

I applaud the Syracuse team for the work it put into the test, but I question how seriously it will be taken. Just as Ruckus will shop around the results to woo potential customers, competitors have legitimate grounds to question the validity of those results. Those of us in the wireless trenches want speed and effective enterprise-class management in equal measure. Looking forward to 11ac, I'd love to see early, independent testing that measures both.

Comments
keith23, 2/20/2013 | 3:04:32 PM
re: A Ruckus Over WLAN Testing
Thanks for this post! I appreciate your effort; there's so much information I can use. You can also check here http://www.wlanpros.com/update... for more information about Wi-Fi technology...
ANON1251128724375, 12/3/2012 | 1:34:16 PM
re: A Ruckus Over WLAN Testing
I agree there are many factors that impact Wi-Fi performance, but if you could build a model that factors in everything, physical-layer signaling would be the most important.

Regarding the value of 11n performance testing, my assertion would be that a vendor's performance with 3-stream, dual-band 11n, the current state of the art for enterprise WLANs, is likely to be highly correlated with the performance it achieves with new 802.11ac products. We will just have to wait to test that hypothesis.

Your broader point about the value of performance testing as it relates to product selection is interesting. As you note, many large organizations, especially colleges and universities, are already heavily invested in a WLAN vendor, which makes it difficult to switch. But that's been a problem throughout the history of networking. Vendors will adhere to standards (most of the time) but add proprietary elements that lock you into their offering. That's especially true of enterprise Wi-Fi solutions. It's likely that as standards continue to evolve, more of the core functionality you care about will be based on standards, but it is unlikely that we will ever see seamless interoperability between different vendors' products. Perhaps the best you can hope for is a cross-platform management framework like Aruba's AMP.

The other important point I would make is that there are a lot of enterprises that are not as committed to a specific vendor as you are, and there are new global markets that are just beginning to be tapped. These organizations still value independent analysis. Further, competitive testing puts pressure on existing vendors to improve their products. Of course, people can still question whether our analysis is objective and/or applicable to a specific environment. I just hope these people will take the time to read our report before concluding this was just another "pay-to-test" project.
LeeBadman, 12/2/2012 | 9:27:49 PM
re: A Ruckus Over WLAN Testing
Thanks, Dave - good commentary and good dialogue. Thrice I shall attempt to make the point that neither you nor Mr. Callisch is touching, as perhaps I didn't stitch my words together for clarity:

As I mentioned in the blog post, I agree that to say throughput isn't important is indeed silly. But Ruckus came to the 11n party late, and where the 11n investment has already been made, the thought of changing vendors mid-stream is neither practical nor economically viable. And multiple WLAN products don't co-exist well in the same environment, as every vendor is making its own magic-in-the-middle ever more proprietary, shutting out inter-vendor interoperability right down to the antenna feed.

That's why, for the third time, I say for environments like mine that 11ac is the next interesting juncture where making a large-scale change to the WLAN may be a consideration. For those shopping for 11n now, yes, the testing you did should provoke consideration and a further look at what Ruckus is about.

I will half-disagree with your statement "The reasons one WLAN vendor outperforms another are mostly related to physical-layer signaling and mitigation of RF interference" as it is incomplete.

Having lived through it a time or three, I would add that how crappy the code underneath the controller/AP operation is, with regard to various timers and murky, hidden performance-impacting settings that only the 8th circle of Tech Support is privy to, can also make a tremendous impact on how a given vendor appears to perform with all or specific clients. It may all equal physical-layer signaling at the end of the day, but the same radios and antennas can be horrific or wonderful depending on whether a given vendor's QA department had its coffee before pushing out a specific code version.
ANON1251128724375, 12/2/2012 | 6:29:16 PM
re: A Ruckus Over WLAN Testing
As the person who has overseen this testing and the report preparation, I have been following blog and Twitter reaction with interest.

I was involved in hundreds of competitive testing projects while working as an editor with Network Computing between 1991 and 2008, the first 7 years while also serving as Director of Network and System Services at Syracuse University (in the organization where Lee currently works), the last 10 years as a professor of practice and program director in the School of Information Studies. During that time, I'm sure I would have looked at a project of this type with suspicion, and from that perspective, it wasn't easy for me to say yes when the folks at Ruckus approached us to do testing. However, I finally concluded that there was value to my students and to the industry in doing this work and that, in fact, it was the only way to do this kind of testing. In a perfect world, I would take the Consumer Reports approach, purchasing all of the gear independently, but there isn't a viable business model for that. And vendors will not agree to voluntarily participate in independent tests as they sometimes did during Network Computing's glory days. Even in those days, it was tough to get the leading vendors to participate.

The observation of Lee and others that performance is not the only factor IT managers consider when purchasing network products is an assertion I totally agree with, and we explicitly acknowledged that fact in the report. However, to conclude that performance is unimportant, at a time of explosive growth in demand for WLAN access, strikes me as a little silly. The fact is that there were significant differences in performance among these products, and these differences do matter. That doesn't mean you can't succeed in the market with a product that underperforms relative to the competition. In many enterprise environments, 802.11a/g performance is fully adequate to meet current needs. But most network managers want some performance headroom, as they have for many years, and new applications and higher user densities are making performance ever more important.

As Lee points out, 802.11ac promises to further up the ante on performance. That's good news, though I personally feel it's of secondary importance in terms of the real impact of 11ac on the enterprise. The most significant benefit of 11ac for IT folks lies in the fact that it will force client vendors to use dual-band radios. Beyond that, 11ac's increased performance comes primarily from wider channels, more advanced modulation, and more spatial streams. I don't understand Lee's conclusion that a test of 3-stream MIMO offerings is not interesting because 11ac is coming soon. If anything, it is MORE interesting. The reasons one WLAN vendor outperforms another are mostly related to physical-layer signaling and mitigation of RF interference. I believe our tests provide support for that hypothesis.
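For rough context on those three factors, here is the standard back-of-the-envelope PHY-rate arithmetic, sketched with the nominal per-stream parameters for 3-stream 11n (40 MHz, 64-QAM 5/6) and 3-stream 11ac (80 MHz, 256-QAM 5/6) using the short guard interval; these are theoretical signaling rates, not real-world throughput.

```python
# Nominal PHY data rates: 3-stream 802.11n vs. 3-stream 802.11ac (theoretical, short GI).
# rate (Mbps) = data_subcarriers * bits_per_symbol * coding_rate * streams / symbol_time_us
SYMBOL_TIME_US = 3.6  # OFDM symbol duration with the short guard interval

def phy_rate_mbps(data_subcarriers, bits_per_symbol, coding_rate, streams):
    return data_subcarriers * bits_per_symbol * coding_rate * streams / SYMBOL_TIME_US

print(phy_rate_mbps(108, 6, 5/6, 3))  # 11n,  40 MHz, 64-QAM 5/6  -> 450.0 Mbps
print(phy_rate_mbps(234, 8, 5/6, 3))  # 11ac, 80 MHz, 256-QAM 5/6 -> 1300.0 Mbps
```

Wider channels and denser modulation account for the roughly threefold jump; the spatial-stream count is the same in this comparison, which is part of why 3-stream 11n results remain a useful baseline.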

It is fair to question whether Ruckus cooked these tests so it would finish first. In answering that question, it's important to distinguish between the different tests we ran. The first test, rate versus range with a single client per band, is a simple test performed using common industry tools that every credible enterprise vendor has run, or should have run, in-house as part of its quality control process. However, while simple single-client tests do provide meaningful data, they aren't nearly as interesting as high-density testing. Frankly, that's where our report is most vulnerable, because you can't expect other vendors to actively participate in complex tests that have been sponsored by one vendor. We understand that. We did our best to optimize the performance of each product for those tests (we sometimes ran multiple tests for vendors to identify the configuration that worked best), but it is probably the case that some knob or dial could have been tweaked to produce better throughput. We are willing to make our configuration files available to anyone who is interested.

Dave Molta
LeeBadman, 12/1/2012 | 5:40:55 PM
re: A Ruckus Over WLAN Testing
You lost me with the last comment, but I am glad you took the time to share your opinions. Best of luck to you and Ruckus, and Happy Holidays.
fecklish, 12/1/2012 | 5:04:20 PM
re: A Ruckus Over WLAN Testing
Yes. OK. I read the article three times. The tone and direction were clearly... well, let's say... cynical at best. But hey, you're a semi-journalist :)
LeeBadman, 11/30/2012 | 11:52:04 PM
re: A Ruckus Over WLAN Testing
David,

It was a bad day for the IT world when Network Computing had to shutter the Real World Labs of old, where many years of quality testing were done on a range of network technologies. But business models and times change, and I agree that there is a void in testing. I take some issue with your reaction, as I'm not raising the bullshit flag on the project. If ever this sort of testing could be in good hands, Molta's hands are the ones to put it in. My points are:

- that other vendors will pan the results because of the Ruckus involvement, as Ruckus would likely cast suspicions if another vendor were in play in this scenario

- that where a network is already entrenched in an 11n deployment, like my own, going to another vendor is not easy because there is zero interoperability among WLAN systems. That sucks, because some of us would actually have an interest in trying other wireless solutions if these were more like standards-based Ethernet switches, where whose sticker is on the box is somewhat irrelevant when it comes to basic core functionality. I never said I don't believe Ruckus scored well; I'm saying that for my environment it's largely irrelevant, because I won't be contemplating a change of vendor until 11ac knocks on the door in its mature clothes. New 11n customers will rightly see things differently.

- as mentioned, as the manager of one of the largest WLAN environments in my part of the country, I can say that the management system is absolutely as important as the APs, and at times even more so. Dazzle me with speed during testing, but if there is no testimony on the management side, I don't hear a complete song being sung.

- On the Meraki point, I did limited "testing" with MR16 APs against Cisco 1430s and 3500s and found (and still find daily) that they both serve dozens of real-world wireless clients as well as you'd hope, from the perspective of the guy providing access. It seems to me that if Meraki were really as bad as the testing showed, it probably wouldn't have stayed in business this long.

Saying that I don't quite buy the entire validity of the testing as presented is a far cry from denouncing or poo-pooing all of it.

Respectfully,

Lee Badman
fecklish, 11/30/2012 | 10:53:56 PM
re: A Ruckus Over WLAN Testing
Appreciate your perspective and comments, Lee. Sincerely. And as you mention, it's difficult to conduct comparative tests with Wi-Fi because everything changes all the time.

So, quite frankly, performance or not, you need an adaptive type system.

In fact, it's become so difficult to provide the market with conclusive and quantitative data that publications, like Network Computing, have simply given up attempting to even do such testing. So the onus has fallen on vendors.

But when we try to do them, as independently as we can, people cry bullshit. No problem. Try it yourself. Publish your results. We are simply trying to give enterprise managers, who don't have the time, facilities, products, test tools or people to do such testing, SOME help. You can poo-poo it all you want. At least we are making some (yes, always self-serving to some degree) effort.

We specifically approached Syracuse because we believed that they would approach such a project with integrity, independence and an objective perspective. And they did.

There are MANY tests that we did not win. But yes, we won a lot. That's probably because our APs work pretty well - whether you believe it or not.

david callisch
ruckus wireless