Testing Enterprise Wireless: Good, Bad and Ugly

Following a small post-review storm we recently weathered, Dave Molta reflects upon the power and limits of independent technology evaluations, with specific focus on how we test products.

Dave Molta

March 11, 2005


That point was reinforced recently when we published, in the February 17, 2005 issue of Network Computing, our analysis of the WLAN switch market, including a product review by Frank Bulk of our Syracuse lab. Now that the post-review storm has settled a bit, it seemed like a good time for some reflection about the power and limits of independent technology bakeoffs.

Our Process

Although we plan our edit calendar at least a year in advance, we usually have about six months to plan and execute a typical Network Computing cover story and product review. Drawing on our understanding of the technology, we define initial evaluation criteria and a draft test plan. From a big-picture perspective, we evaluate key system attributes critical to enterprise IT, including performance, scalability, manageability, availability and cost of ownership. Where possible, we use automated test tools, often supplemented by field testing, and we usually develop pricing scenarios based on MSRP to provide objective cost comparisons, knowing that the actual acquisition cost is almost always negotiable. Like price comparisons, benchmarking provides us with comparative data, but it is at best a simulation of reality, and we always try to point out its limitations. Still, we feel we are able to provide IT professionals with objective information relevant to their own product selection.

Beyond complaints from vendors whose products don't win our Editor's Choice designation, experienced IT professionals often react to our reviews, pointing out weaknesses in test tools and methodology and often critiquing our weighting of evaluation criteria. Those comments led us to develop our Interactive Report Card, which allows readers to tailor evaluation weightings to their environment. We've also hired some of our biggest critics over the years and challenged them to do a better job.

Most vendors loathe the review process, but still do their best to control the evaluation procedures from start to finish. The best ones provide high-quality information and ensure that we have quick access to their most capable staff, just in case something goes wrong, as it often does, especially when we are pushing the testing envelope. Like you, when we have problems during testing, we expect a quick response, not excuses. By the end of the review process, and usually a month or more prior to publication, vendors have a general sense for how they fared. Short of final results, we share as much as we can with them, especially problems we encountered. By the time publication rolls around, vendors are well-positioned to spin the story to industry analysts, prospective customers, their internal employees and whoever else is willing to listen.
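At its core, the Interactive Report Card is just a weighted average of category grades that readers can re-weight to suit their own priorities. Here's a minimal sketch of that idea in Python; the categories, weights and vendor scores are hypothetical placeholders, not our actual report card data or formula.

```python
# Illustrative weighted "report card" -- hypothetical categories, weights
# and scores, not Network Computing's actual grading data.

# Reader-tailored weights (should sum to 1.0).
weights = {"performance": 0.30, "scalability": 0.20, "manageability": 0.20,
           "availability": 0.15, "cost": 0.15}

# Per-vendor category scores on a 1-5 scale (made up for illustration).
scores = {
    "Vendor A": {"performance": 4.5, "scalability": 4.0, "manageability": 4.5,
                 "availability": 4.0, "cost": 3.5},
    "Vendor B": {"performance": 4.0, "scalability": 4.5, "manageability": 4.0,
                 "availability": 4.5, "cost": 4.0},
}

def weighted_grade(vendor_scores, weights):
    """Return the weighted average of a vendor's category scores."""
    return sum(vendor_scores[cat] * weights[cat] for cat in weights)

# Rank vendors from highest to lowest weighted grade.
for vendor, s in sorted(scores.items(),
                        key=lambda kv: weighted_grade(kv[1], weights),
                        reverse=True):
    print(f"{vendor}: {weighted_grade(s, weights):.2f}")
```

Shift more weight toward, say, cost or security and the ranking can flip, which is exactly why we let readers adjust the weightings to match their own environment.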

The Spin

In our recent enterprise WLAN infrastructure review, we set a high bar for participation, something we call a product filter. We required centralized management, multi-layer security, dual-band radio support, QoS and advanced RF management capabilities, all features readers have told us are important for their next-generation WLAN designs. Only four vendors--Airespace, Aruba, Cisco and Trapeze--agreed to participate. You can't necessarily conclude that those who didn't show had something to hide, but you have to respect the vendors who did participate. As we noted in our cover story, all four should be on the short list of most enterprises.

Flattering participants by suggesting they are all winners in our book didn't hold much weight when the final results were published last month. The first e-mail came from Airespace, the winner, congratulating us on our fine work. Gee thanks, guys. You really think we're that good? It took a little longer for Aruba and Trapeze to react, but react they did. Cisco, perhaps because of regulations surrounding the Airespace acquisition, kept pretty quiet.

It's entirely legitimate for vendors to highlight factual errors in reviews, and though most assertions of factual error turn out to be more grey than black or white, we are obliged to seriously consider these complaints. There were assertions that Airespace and Aruba had provided incorrect pricing to improve their scores in our pricing scenarios. After assessing the situation, we concluded that both were guilty, if not of deception then certainly of negligence. Airespace quoted us pricing for a single-radio AP that it should have known we would not recommend. Aruba quoted us a "direct-sale" price on its AP and "bundled pricing" on switch software, even though we explicitly requested MSRP for all software and hardware components. More accurate pricing would not have significantly impacted the Interactive Report Card grades (Aruba likely would have finished third rather than tied for second with Cisco). We'll leave it to our readers to pass further judgment.

Beyond factual errors, there is room for legitimate disagreement about other points. For example, Trapeze was penalized for lack of WPA2 certification (it was in process during the review), even though its product passed our WPA2 test. Maybe we were too tough on that one, but readers have told us that Wi-Fi Alliance certification, with all its warts, is very important to product selection, and we felt it would have been unfair not to reward vendors that had completed the certification process. Likewise, Aruba felt we didn't give it enough credit for its integrated firewall feature. We just didn't feel it warranted as much recognition as the company did. As a prospective customer, if you feel that's the killer feature, you should seriously consider Aruba's offering. That's what Frank said in his analysis.

Behind the Scenes

The really interesting reactions come behind the scenes, often through private documents distributed to internal employees and, sometimes, we suspect, via planted postings on Web sites. In an error-filled internal document forwarded to me by one of its sales prospects, Trapeze blasted us for including VPN support as one of many elements in our security grade, while also noting that the company plans to add that capability in a future release.

We saw one Web posting suggesting Airespace had bought the win by hiring a former graduate student who had worked on a previous review in our lab. Another post questioned Frank Bulk's technical competence, an attack that would be laughable if it weren't so sad. If Frank is incompetent, then so too are the vast majority of network professionals to whom these products are being sold. Attempts to follow up with these posters by e-mail resulted in no response. Judge for yourself.

In the end, Network Computing stands strongly by our analysis of wireless switches and by our recommendation that all the products we reviewed are of sufficient quality to be actively considered by every serious network planner. Like the children of Lake Wobegon, they're all above average. Our review represents a snapshot in time, one publication's six-month-long quest for better understanding of some very complex technology. Determining which of the many products is the best fit for your environment will likely require at least as much effort on your part.

Dave Molta is Network Computing's senior technology editor. Write to him at [email protected]
