Lee H. Badman

Network Computing Blogger


WLAN Stress Tests: What’s the Use?

Independent testing of vendor products is a rare and valuable thing. That's certainly true in the wireless networking market, as WLAN makers are pushing out ever more access points and associated hardware, and wireless architectures are becoming staggeringly complex. How do potential customers know what product is right for them, or if their current vendor is delivering the goods as advertised?

Independent test results can help--as long as they are kept in proper context. One of the more interesting independent tests comes from Keith Parsons, of Wireless LAN Professionals.


Parsons' "Wi-Fi Stress Test" is touted as a repeatable, vendor-independent access point analysis. The goal of the test was simple: pit an increasing number of Apple iPads against a single AP until the AP crumbled, and measure the same data points along the way for each unit under test.

APs from Aerohive, Aruba, Cisco, HP, Juniper, Linksys, Meraki, Ruckus, Ubiquiti and Xirrus were provided by their respective manufacturers, and the rest of the test environment was composed of client devices and test gear owned by Wireless LAN Professionals.

Parsons' testing team included representatives from seven WLAN vendors, along with two dozen vendor-unaffiliated volunteers eager to learn from some of the best in the industry. Two weeks ago, I spent time with Parsons and fellow industry experts and analysts at Wireless Field Day, and the Stress Test got a lot of attention.

Rather than regurgitate the results, I want to share my impressions of what I liked about Parsons' approach, and what I'm not so keen on.

The Ups and the Downs

If you read Parsons' report, you'll see he goes to great pains to explain the parameters of his test, as well as its limitations. In other words, he's not making this test out to be anything more than what it is.

His goal was to see which APs could stand up to the maximum load, measured in iPads pushing known traffic. That's it. He wasn't crowning anyone with the title of The Best Wireless System. He was clear about his methods, he had a great team of testers, and he kept an eagle eye on the vendors that participated: no configuration tweaking or performance optimization was tolerated where vendors had an active hand in the testing. At the end of it, Ruckus did well in this exercise, with Cisco close behind.

What didn't I care for in this analysis? Though Parsons made it crystal clear that he doesn't consider this a "real-world" test of WLAN products, I was puzzled that all testing used 20-MHz-wide channels in the 5 GHz band of 802.11n. I don't know of a single network configured for anything other than 40 MHz channels, as channel bonding was one of the prime drivers for migrating to 11n from legacy 11a/g technology. I also feel conflicted about an iPad-only testbed; in my own network experience, Apple products are maddeningly inconsistent depending on the combination of their OS version and the specific code running on the infrastructure APs and controllers.
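For context on the channel-width point, the rough 802.11n arithmetic below (my own back-of-the-envelope numbers, not figures from Parsons' report) shows why 40 MHz operation was such a draw: bonding channels more than doubles the per-stream PHY rate, because 108 data subcarriers are used instead of 52.

# Approximate 802.11n PHY rates at MCS 7 (64-QAM, rate-5/6 coding), one spatial
# stream, for 20 MHz vs. 40 MHz channels. Values match the familiar 65/72.2 and
# 135/150 Mbps figures from the 802.11n rate tables.
DATA_SUBCARRIERS = {20: 52, 40: 108}
DATA_BITS_PER_SUBCARRIER = 6 * 5 / 6       # 64-QAM carries 6 coded bits; 5/6 coding leaves 5 data bits
SYMBOL_TIME_US = {"long_gi": 4.0, "short_gi": 3.6}

def phy_rate_mbps(width_mhz: int, gi: str = "long_gi", streams: int = 1) -> float:
    bits_per_symbol = DATA_SUBCARRIERS[width_mhz] * DATA_BITS_PER_SUBCARRIER * streams
    return bits_per_symbol / SYMBOL_TIME_US[gi]

for width in (20, 40):
    print(f"{width} MHz: {phy_rate_mbps(width):.1f} Mbps (long GI), "
          f"{phy_rate_mbps(width, 'short_gi'):.1f} Mbps (short GI)")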

Parsons also points out that his analysis excluded many other critical aspects of a modern business WLAN: feature sets, management interfaces, location services and all of the other pieces that add up to TCO. Put another way, even though this stress test was not billed as real world (and yes, everyone's version of that notion varies), it was a bit too far from real world for my liking. I learned which APs best stand up to huge volumes of traffic as delivered under the test's specific parameters, but that is arguably a relatively minor data point in the bigger story of WLAN ownership.

At the same time, I truly appreciate Parsons' effort to undertake this task and gather the resulting data. I look forward to seeing how he builds on it in future tests, and I wouldn't mind participating in the next round.

