10 Years Of Wi-Fi: Lessons Learned

It's been about 10 years since I tested the first batch of 802.11 products for Network Computing. A lot has happened with Wi-Fi since then, almost all of it good. As a grizzled veteran of the broadband wireless industry, I reflect on some of the lessons I've learned.

Dave Molta

August 14, 2009


It's been about 10 years since I tested the first batch of 802.11 wireless products for Network Computing. It was very cool technology, to be sure, and it had obvious long-term potential. Unfortunately, it was slow (throughput of just over 1 Mbps); it was expensive (PC-Card wireless NICs cost several hundred dollars each); and interoperability pretty much sucked. Part of the interoperability problem stemmed from compromises in the standards-development process, where vendors could implement either of two different spread spectrum radio technologies (frequency hopping and direct sequence) and still call it 802.11. A lot has happened with Wi-Fi since then, almost all of it good.

As a grizzled veteran of the broadband wireless industry, I thought a retrospective piece would be an appropriate first blog posting on this site. If you've looked at my bio, you'll know that I have divided my time over the past 10 years, working as a faculty member and Assistant Dean at Syracuse University's School of Information Studies and as an editor with Network Computing. As I thought about my work over the years, I compiled the following list of lessons learned. I hope it stimulates some feedback about where we've been and where we're going. For this posting, I'll focus on Wi-Fi. In the future, I'll address other sectors of the wireless industry.

1. If a vendor promises seamless interoperability, show them the door. In the early days of 802.11, interoperability was hit or miss at best. The emergence of the Wi-Fi Alliance's product certification program was a big step in the right direction, but getting Wi-Fi products from multiple vendors to work together can still be a headache, especially when new releases break things that used to work. There are still plenty of seams to overcome. A great example is Wi-Fi support on version 3.0 of Apple's iPhone operating system. Never think of interoperability in terms of black and white. It's always grey.

2. Never trust the .0 release. Over the years, I've learned a lot about the sausage-making process employed by technology vendors, and it isn't always pretty. Vendors are under immense pressure to get products out the door, to beat competitors to the punch, to meet artificial marketing deadlines, and to stimulate purchases from customers who have postponed buying while waiting for the next release. Unfortunately, these .0 releases are almost always glorified beta releases that cause problems for IT professionals. Yet waiting for a stable release not only prevents you from taking advantage of new capabilities, it can also prove to be an unfulfilled dream, with a different set of problems surfacing in the maintenance release.

3. Wireless performance is a quantitative quagmire. In my earlier days of managing Ethernet networks, benchmarking performance and provisioning networks was pretty easy. Performance tests were easily repeatable, and results in the lab correlated highly with experiences in the field. With Wi-Fi, the variables impacting performance are impossible to model accurately, despite efforts by a number of vendors that have developed extremely sophisticated tools. Worse, the tried-and-true Ethernet method of over-provisioning (throwing bandwidth at the problem) isn't always practical with Wi-Fi. It's true that 802.11n is moving us in that direction, with improved raw performance and better consistency of coverage. Unfortunately, the transition to 11n will take a few years as older products age out of production. In the meantime, most shops will need to trade performance for compatibility with legacy clients.

4. Spectrum is limited, so use as much as you can. In an era of immense distrust in government, let's give credit where credit is due. The decision of government policy makers to designate portions of the electromagnetic spectrum for use by unlicensed wireless devices stimulated the development of the Wi-Fi technologies we use today. Unfortunately, radio spectrum with the best propagation characteristics is extremely valuable, so we don't get much of it for Wi-Fi. Even during the early days of wireless, I strongly encouraged enterprises to deploy dual-band (2.4 GHz and 5 GHz) products. Unfortunately, far too many network designers saw this as a choice of one or the other, and 2.4 GHz usually won. That's still true to a certain extent today, but most enterprises have gotten the message. The only way to scale Wi-Fi performance in the enterprise is to use all available spectrum.
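The measurement problem in lesson 3 comes down to variance: wired benchmark runs cluster tightly around a mean, while Wi-Fi runs scatter. A minimal sketch of how you might quantify that difference, using hypothetical throughput numbers (not real measurements) and the coefficient of variation as the spread metric:

```python
import statistics

# Hypothetical throughput samples (Mbps) from repeated runs of the same
# file-transfer benchmark. Illustrative numbers only, not real test data.
ethernet_runs = [94.1, 94.3, 94.0, 94.2, 94.1]   # switched 100 Mbps Ethernet
wifi_runs     = [22.5, 14.8, 19.3, 8.7, 17.2]    # 802.11g with interference

def summarize(name, runs):
    """Report mean, standard deviation, and coefficient of variation."""
    mean = statistics.mean(runs)
    stdev = statistics.stdev(runs)
    cv = stdev / mean  # relative spread: higher means less repeatable
    print(f"{name}: mean={mean:.1f} Mbps, stdev={stdev:.2f}, CV={cv:.1%}")
    return cv

summarize("Ethernet", ethernet_runs)
summarize("Wi-Fi", wifi_runs)
```

With numbers like these, the Wi-Fi coefficient of variation runs orders of magnitude higher than Ethernet's, which is why a single Wi-Fi benchmark run tells you very little and lab results transfer so poorly to the field.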

5. Interference is a design reality. I remember the early days of Ethernet networks, when media-related problems were often at the root of performance and reliability issues. Those problems were pretty much solved as UTP and fiber interconnect technology improved. But interference-related media challenges on Wi-Fi networks will never go away, despite vendor efforts to automate channel and power-level selection. Interference from other networks, within and outside your facility, is still a big issue. Using 5 GHz spectrum is currently the best way to combat that problem, but as usage increases, interference will still need to be mitigated. And then there's the worst-case scenario: physical-layer RF jamming attacks. These denial-of-service attacks cannot be prevented. The best hope is quick identification and mitigation.

6. Wi-Fi is a LAN, not a WAN technology. From the beginning, I was skeptical about the notion of metro-area Wi-Fi networks. There were lots of non-technical reasons I never bought into this movement, including policy considerations about government ownership and competition as well as untenable business plans. But mostly, my concerns were technical in nature. The 802.11 standard was designed as a contention-based protocol for small-cell LANs. While it can be made to work across wide areas, doing so introduces a range of design and operational challenges and often requires an extremely high density of APs. It was clear from the beginning that vendors weren't being honest about this, and many communities were sold on promises that didn't pan out. That doesn't mean Wi-Fi isn't effective in hot-spots, like retailers, or even hot-zones, like public parks. It's just not reliable enough for wide-area deployments.

7. Wireless security is all about risk and reward. When 802.11 products first hit the market, I wrote about how WEP was broken. At that time, the encryption algorithms in WEP hadn't yet been cracked, but the shared-key architecture simply wasn't viable for enterprise deployments. Give Cisco credit for implementing the first dynamic 802.11 security system based on 802.1X, EAP, and RADIUS, technology that was eventually standardized in 802.11i. It's a great security architecture, but it doesn't meet all needs. In particular, most enterprises find it increasingly necessary to accommodate guest access to their WLAN, and for guests, network segmentation is almost always a better strategy than trying to extend the full authentication infrastructure to visitors. Yes, there is some additional risk, but that's the price we sometimes have to pay to make a network easy to use.

8. Most Wi-Fi users need nomadic rather than mobile services. Since Wi-Fi came to market at a time when cell phone adoption was rapidly increasing, it's no surprise that mobility has always been a focus of product engineering efforts. Vendors have long touted their products' ability to provide seamless mobility on enterprise networks, a claim that has almost always had a few asterisks attached. But the reality is that WLAN adoption has largely been driven by convenience, providing users with nomadic data services as they carry their laptop computers (and more recently, Wi-Fi enabled smartphones) from office to conference room to home.

9. There is no killer app for Wi-Fi. Since the early days of Wi-Fi, vendors have searched for the killer app that would cost-justify WLAN implementations. But wireless VoIP and location-based services, two of the more prominent killer apps advanced by vendors, have seen only modest uptake. Instead, it is the freedom afforded to laptop computer users that has primed the pump for widespread WLAN adoption. With this need driving infrastructure rollouts, a gradual approach that provides wireless access where needed is a rational transition toward some future day when IP-based voice and location-based applications are mainstream.

10. Wi-Fi ROI models don't deliver simple answers. IT professionals feeling the pinch of stagnant or shrinking budgets have to make tough choices about which systems to deploy. Modeling return on investment is a classic approach to informing IT management decision-making. However, ROI models for enterprise WLAN deployment have never been reliable. Assumptions of increased productivity based on one's ability to stay connected to the network ignore many key dynamics that contribute to organizational effectiveness. Yes, having wireless access during a meeting may enhance your ability to process e-mail, but the adverse impact on group dynamics, especially in meetings where interaction is paramount, is easily recognized by nearly everyone involved.
