Network Performance: Packet Loss Vs. Delay
In this video, Tony Fortunato demonstrates how to take some basic latency measurements.
February 22, 2017
I was troubleshooting a network performance problem with a client while teaching a class, and we got into a discussion of which is worse: packet delay or packet loss. As a consultant, I argued the standard "it depends." It really does depend on many factors, including network latency and application or protocol behavior.
I suggested taking some basic network latency measurements with a network emulator to see what the impact would be. Some complained this would take too long, but I said it would easily take less than an hour to figure out.
Using an Apposite Linktropy Mini2 WAN Emulator, I set up a quick lab and we had results within half an hour. We used iPerf as an easy way to test, but other options are readily available. Our first test was to establish a baseline, since I wasn't familiar with their laptops or switch, so we set the delay to 1 millisecond with no packet loss. This resulted in over 9.5 Mbps upload and download.
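The article doesn't show the exact commands we ran, but here is a minimal sketch of that kind of upload/download measurement, assuming iperf3 with JSON output and a server already listening on the far side of the emulator (the 10.0.0.2 address is hypothetical):

```python
import json
import subprocess

# Hypothetical setup: "iperf3 -s" is already running on the far side of the
# WAN emulator at 10.0.0.2; this machine acts as the client.
SERVER = "10.0.0.2"

def run_iperf(reverse: bool = False) -> float:
    """Run a 10-second iperf3 test and return throughput in Mbps.

    reverse=False measures upload (client -> server);
    reverse=True  measures download (server -> client).
    """
    cmd = ["iperf3", "-c", SERVER, "-t", "10", "--json"]
    if reverse:
        cmd.append("--reverse")
    result = subprocess.run(cmd, capture_output=True, text=True, check=True)
    report = json.loads(result.stdout)
    # sum_received counts the bytes that actually arrived end to end.
    return report["end"]["sum_received"]["bits_per_second"] / 1e6

if __name__ == "__main__":
    print(f"Upload:   {run_iperf():.1f} Mbps")
    print(f"Download: {run_iperf(reverse=True):.1f} Mbps")
```

Rerunning the same pair of tests after each change on the emulator gives a quick upload/download comparison for every scenario.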
The second test was configured with 33 ms delay and no packet loss, which resulted in over 9 Mbps upload and 1.7 Mbps download.
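One way to reason about why added delay alone pulls throughput down is the window-limited bound on TCP: with a fixed window, a sender can move at most one window of data per round trip. The numbers below are purely illustrative, assuming a 33 ms round-trip time and example window sizes, not measurements from our lab:

```python
def window_limited_mbps(window_bytes: int, rtt_seconds: float) -> float:
    """Upper bound on TCP throughput with a fixed window: window / RTT."""
    return window_bytes * 8 / rtt_seconds / 1e6

# Illustrative only: treating 33 ms as the round-trip time.
print(window_limited_mbps(64 * 1024, 0.033))  # ~15.9 Mbps -- enough to fill a 10 Mbps link
print(window_limited_mbps(8 * 1024, 0.033))   # ~2.0 Mbps  -- a small window caps throughput quickly
```

The longer the round trip, the larger the window needed to keep the link full.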
Our last test covered their worst-case scenario of 10% packet loss and no added delay; performance dropped to approximately 1 Mbps upload and 750 kbps download.
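For loss, a common back-of-the-envelope model is the Mathis approximation of steady-state TCP throughput, roughly (MSS/RTT) × 1.22/√p for loss rate p. The figures below are illustrative rather than taken from our runs, and at a loss rate as high as 10% the model actually understates the damage, since retransmission timeouts start to dominate:

```python
import math

def mathis_mbps(mss_bytes: int, rtt_seconds: float, loss_rate: float) -> float:
    """Mathis et al. approximation of steady-state TCP throughput."""
    bps = (mss_bytes * 8 / rtt_seconds) * (1.22 / math.sqrt(loss_rate))
    return bps / 1e6

# Illustrative only: 1460-byte MSS and an assumed 33 ms round trip.
print(mathis_mbps(1460, 0.033, 0.01))  # ~4.3 Mbps at 1% loss
print(mathis_mbps(1460, 0.033, 0.10))  # ~1.4 Mbps at 10% loss
```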
In their specific environment, we saw that packet loss had a more negative impact than delay.
In summary, I can't emphasize enough the importance of running your applications through a network emulator to see how they behave under different network conditions. It's worth noting that the emulator I used ran the actual application over the impaired link, rather than a simulation that merely predicts how the protocol or application will behave.