Lee H. Badman

Network Computing Blogger



A Ruckus Over WLAN Testing

When it comes to WLAN performance, you might assume all access points are created equal because all the vendors are "doing" 802.11n. But a recent competitive test of five vendors conducted at Syracuse University, where I run the wireless network, shows that performance can vary greatly.

According to the testing, APs from Ruckus Wireless took top marks for throughput, besting similar products from Aerohive, Aruba, Cisco and Meraki. However, as the report points out, the testing was conducted in partnership with Ruckus, which provided guidance on the test plans. As you'd expect, this has drawn legitimate complaints about the results from competing vendors and IT professionals.


I have my own feelings about the testing, but before I get into them, a bit of context is in order. The testing I'm about to describe took place in Hinds Hall on the Syracuse University campus, where SU's School of Information Studies (iSchool) has its Center for Convergence and Emerging Networking Technologies (CCENT). Professor of Practice Dave Molta coordinated the testing with Ruckus Wireless. A select group of graduate students with WLAN skills conducted seven months of testing under Molta's supervision. The group put five WLAN vendors' APs (and controllers where appropriate) through hundreds of tests in a number of real-world scenarios.

Molta happens to be my boss when I wear my Adjunct Faculty hat, and he has a long history with comparative testing and industry analysis, including stints as senior technology editor and editor in chief for Network Computing in the 1990s. Molta is also an occasional contributor to Network Computing and InformationWeek Reports.

I manage Syracuse University's large wireless environment as part of the central IT team. Despite working closely with Molta and the iSchool on many levels, neither I nor SU's central IT support group was involved with the Ruckus-sponsored testing. Got all that?

As for the comparative review, here's my own take. When I compare the approach of Molta's team to boilerplate wireless testing guidelines like this one from Atheros, I have respect for the approach used at the iSchool. There is nothing easy about testing wireless gear across different vendor platforms.

At the same time, I don't have enough information on each vendor's configurations, or on the specific clients used in the various testing scenarios, to say that I completely buy all of the results. I know that Aerohive, Aruba, Cisco and Meraki all have their own opinions on the conclusions reached by Molta's team, as well.

In particular, I have two concerns with the testing. First, even if I completely accepted the outcome, I'd still have to ask, "So what?" My next major purchase decision will be around 11ac products, so outside of professional curiosity, I don't really care which vendor has the best 11n APs. And even if I were looking to extend an existing 11n infrastructure, different vendor APs tend not to play well together, so I'd probably stick with my incumbent anyway.

My other issue is that I'm frustrated by the continued emphasis on throughput as the hallmark of a great WLAN. I'm not foolish enough to say I don't care about throughput, but there's far more to a successful WLAN implementation than simply having the fastest APs on the planet.

For example, Meraki took a beating as measured by test results, yet Meraki's wireless TCO and ease of administration, driven by its cloud-managed framework, are the stuff of dreams--so much so that I think it's a major reason Cisco is buying the company. (On a side note, my own limited performance testing on a 35-AP Meraki WLAN that I run differs greatly from the results found by Molta's team.)
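Neither the iSchool team's methodology nor my own spot checks are reproduced here, but for readers who want a feel for what even a bare-bones, single-client throughput check involves, here's a minimal sketch in Python. The server address, port and the assumption of a wired sink that simply discards whatever it receives are all hypothetical; real comparative testing layers in many clients, traffic mixes and RF conditions on top of anything this simple.

import socket
import time

# Hypothetical wired host on the LAN running a simple TCP sink
# (anything that accepts a connection and discards what it reads).
SERVER = "192.0.2.10"
PORT = 5001
DURATION = 30            # seconds to stream
CHUNK = b"\x00" * 65536  # 64 KB send buffer

def measure_throughput():
    """Push data from the wireless client to the wired sink and report Mbps."""
    sent = 0
    with socket.create_connection((SERVER, PORT)) as conn:
        deadline = time.monotonic() + DURATION
        while time.monotonic() < deadline:
            conn.sendall(CHUNK)
            sent += len(CHUNK)
    mbps = (sent * 8) / (DURATION * 1_000_000)
    print("Sent %.1f MB in %ds -> %.1f Mbps" % (sent / 1e6, DURATION, mbps))

if __name__ == "__main__":
    measure_throughput()

A single TCP stream from one laptop tells you almost nothing about how an AP behaves with dozens of mixed clients contending for airtime, which is exactly why I'm wary of leaning on any one throughput number.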

Many major WLAN players lose customers to the likes of Meraki and Aerohive because the big guys' management systems are so complex that they cause pain and suffering. I'd love to see a competitive test of vendors' WLAN management systems in which customer satisfaction played a role. Molta and his students do acknowledge this issue in the report: "We did not compare total cost of ownership, ease of management, or advanced features and functionality, all of which may be more important than raw throughput in some environments."

I applaud the Syracuse team for the work it put into the test, but I question how seriously it will be taken. Just as Ruckus will shop around the results to woo potential customers, competitors have legitimate grounds to question the validity of those results. Those of us in the wireless trenches want both speed and effective enterprise-class management in equal parts. Looking forward to 11ac, I'd love to see early, independent testing that measures both.

