How to Perform Application Baselining or Profiling: Part 1
Whenever I mention application baselining or profiling, I get quite a few emails asking, "How do you do that?" and "Is there a standard template you use?" Spoiler alert: unfortunately, there isn't a standard template or a standard way to perform a baseline. So I thought it would be a good idea to create a series of articles walking you through how I do it.
It starts with the setup. I need to know the behavior and performance of the equipment I'm using. I decided to use some standard computers rather than traffic generators. The goal is to show you a methodology you can use with whatever you happen to have at work, not to make you purchase more equipment. I find that when clients learn how to do something themselves, they are more likely to purchase a commercial product and to understand how products in that space work.
I chose a desktop and laptop for my tests for the reasons mentioned earlier. The first thing I need to do is determine the performance of the computers in case I decide to use them for throughput testing as well as application testing. Another reason for choosing a laptop is that in the real world I would use a laptop to test WiFi and remote networks back to a wired desktop.
I used iperf3 to get a measurement within the computers as well as across my network. My goal is to ensure that the devices can generate over 1 Gbps internally before I introduce them to the network. It's important to test upload (transmit) as well as download (receive); I've seen many cases where download is much higher than upload. It's not that I'm going to go into troubleshooting mode or anything, but it's important to know moving forward.
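The local measurement can be taken with iperf3 alone. Here's a sketch of how I'd run it on a single machine, using standard iperf3 flags (the two-terminal loopback setup is my assumption, not something prescribed by the tool):

```shell
# Terminal 1: start an iperf3 server on the machine under test
iperf3 -s

# Terminal 2: run a loopback test against the same machine.
# Default direction is client-to-server (upload/transmit).
iperf3 -c 127.0.0.1 -t 10

# Add -R to reverse the direction (server sends, i.e. download/receive)
iperf3 -c 127.0.0.1 -t 10 -R
```

Running both directions is what surfaces the upload/download asymmetry mentioned above.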
From the video, you will see that the desktop got 5.96 Gbps upload and 9.93 Gbps download. The key here is to take multiple measurements. Before starting any tests, I try to disable any applications or processes that I've seen affect throughput. In this scenario, I disabled the firewall and real-time antivirus scanning. I am not sure whether these processes would have affected anything, but I thought I would play it safe. I also checked Task Manager to ensure that the CPU wasn't taxed and memory usage was reasonable. In the video, you will see that CPU usage was 7 percent and memory usage was 28 percent. When checking system resources, don't take a single snapshot; watch them over a few minutes, since processes can pop up at any time.
In some videos, you will see me take five measurements, drop the high and low, then average the remaining three. In this case, I used all five since the results were relatively close. The laptop got 6.55 Gbps upload and 6.53 Gbps download.
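The drop-the-high-and-low averaging is easy to script with standard shell tools. A minimal sketch (the five sample readings are made up for illustration):

```shell
# Hypothetical throughput readings in Mbps from five iperf3 runs
readings="878 930 912 905 921"

# Sort the readings, drop the lowest and highest,
# then average the remaining three with awk
trimmed_mean() {
  echo "$@" | tr ' ' '\n' | sort -n | sed '1d;$d' \
    | awk '{ sum += $1; n++ } END { printf "%.1f\n", sum / n }'
}

trimmed_mean $readings   # → 912.7
```

Trimming the extremes like this keeps one odd run (a background process kicking in mid-test, for example) from skewing your baseline.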
Now that I have a local baseline, I introduce the network. In previous videos, I started with a cable between the two devices. In this lab, I started with the two computers connected to a switch. I preferred this method because my switch is a DHCP server, which saved me some configuration work.
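Across the switch, the same iperf3 commands apply; only the target address changes. A sketch, assuming the desktop is the server and picked up 192.168.1.10 from DHCP (a hypothetical address for illustration):

```shell
# On the desktop (server side):
iperf3 -s

# On the laptop (client side), pointed at the desktop across the switch.
# Client-to-server first (upload from the laptop's perspective):
iperf3 -c 192.168.1.10 -t 10

# Then the reverse direction (download from the laptop's perspective):
iperf3 -c 192.168.1.10 -t 10 -R
```

Comparing these numbers against the local baseline tells you how much the network path itself costs you.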
The network test showed that the desktop's upload was 878 Mbps and download 930 Mbps, while the laptop got 933 Mbps upload and 893 Mbps download.
Now that I know the computers' maximum throughput and that they work well, I can proceed with my testing and application profiling.
In the next articles, I will be reviewing and testing some of the most common protocols we run into out there.