Last week, as I was getting ready to leave the office for the day, my phone rang. You know the routine: You look at the display to see whose number shows up so you can decide whether to answer it. In this case, I recognized a colleague whom I enjoy chatting with, and so I opted to take the call. “Hi, Paula,” I said. And then she unknowingly opened a can of Oh Jeeze, Please Shut Up. “Lee, what should I tell people to expect for speeds on both our wired and wireless networks?”
The wired part of the question was easy. She has Gigabit and Fast Ethernet ports in her area, and as long as someone didn’t sneak in a mini-hub and all NICs behave, pretty much all connections would run somewhere near 100 Mbps or 1,000 Mbps, as appropriate. But the wireless half of the inquiry? I started by saying, “Paula, how much time do you have for this?”
I run a big network. We accommodate all sorts of wireless client devices, but they break down roughly along the lines of 50% Windows users, 35% Mac OS X, and 15% Linux, iOS, Android, and whatever else pops up. Of our almost 3,000 access points, slightly more than half are running dual-band 802.11n, while the balance are 802.11a/g (until they’re upgraded in the near future). About 50% of all of our clients are 802.11g only, while the rest land in the 5-GHz range, either on 802.11a or the high band of 11n. Most of our locations have full, capacity-driven coverage, but there are areas where simply having a good signal from a single AP, at whatever data rates can be achieved, is suitable.
After laying all of this out for Paula, I continued to describe how even seemingly alike clients can behave differently on the same access point. (I have one Mac that does a 300-Mbps data rate, but another that will only do a 270-Mbps rate because of differences in how the radio card handles something called the guard interval.) Actual throughput is also influenced by which specific access point a user hits, which band the client lands on, and what other clients on the same access point are doing, even when your wireless utility shows stellar data rates of 300 Mbps for 11n or 54 Mbps for 11a/g. Paula is a patient, intelligent person, but I could hear her eyes start to glaze over.
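For readers who want to see where those two Mac data rates come from, here's a minimal sketch (my own illustration, not from the column) of the 802.11n PHY rate arithmetic. The function and its parameter names are mine; the constants come from the 802.11n spec: an OFDM symbol lasts 4.0 µs with the long (800 ns) guard interval and 3.6 µs with the short (400 ns) one, so a radio that can't do short GI tops out at 270 Mbps rather than 300 Mbps for MCS 15 (two spatial streams on a 40-MHz channel).

```python
def ht_data_rate_mbps(streams, bits_per_subcarrier, coding_rate,
                      data_subcarriers, short_gi):
    """802.11n PHY rate = bits carried per OFDM symbol / symbol duration."""
    bits_per_symbol = streams * bits_per_subcarrier * coding_rate * data_subcarriers
    symbol_time_us = 3.6 if short_gi else 4.0  # short vs. long guard interval
    return bits_per_symbol / symbol_time_us    # Mbps

# MCS 15: 2 streams, 64-QAM (6 bits/subcarrier), rate-5/6 coding,
# 108 data subcarriers on a 40-MHz channel.
print(ht_data_rate_mbps(2, 6, 5/6, 108, short_gi=True))   # 300.0
print(ht_data_rate_mbps(2, 6, 5/6, 108, short_gi=False))  # 270.0
```

The same math explains the familiar 54-Mbps 802.11a/g ceiling: one stream, 64-QAM, rate-3/4 coding, 48 data subcarriers, 4-µs symbols.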
Because it was the end of the day, I spared her the diatribe on how our Cisco CleanAir feature set reveals hundreds of fleeting, transient interference sources (Bluetooth, microwave ovens, gaming consoles, ad hoc clients, etc.) every day that tend to hit and fade, but likely have some impact on local wireless connectivity while they persist. Whether the degradation is perceivable or not is a whole other discussion. She already knew the value of up-to-date drivers, so we skipped that one. I pointed out that even when things are perfect, we still have a per-user cap at our Internet edge that influences perceived connection speed, and that in-house speed tests would be faster than those done off network. I even touched on the code bug we have on one model of controller that cuts our 11n speeds to roughly half of what they should be in one direction.
But she refused to be bored to sleep. “OK, so all that being said, what can I tell clients that they should be getting on wireless?” she said. “At least 10 Mbps in general? Just a good-feeling connection? Give me something, anything simple, to tell people.”
How would you answer Paula?