
The Physical-to-Virtual Cookbook Part 1

Migrating applications from a physical to a virtual environment is no easy task. This multipart series presents a real-world migration project. Part 1 looks at the client’s environment, examines the rationale for the migration, and walks through efforts to inventory the data center and map application dependencies.

Before embarking on any actual migration, you must understand the applications and physical devices in your data center. That's easier said than done for many organizations. Map the complete logical and physical application topology before you touch anything, especially for complex, multitier applications.

We used a set of application dependency mapping tools at the client's site, including an appliance connected to the network. A network administrator configured port mirroring to the appliance so it could collect and store transaction data for us to examine. Specifically, we used the OPNET AppResponse Xpert appliance in conjunction with the OPNET AppMapper Xpert application. (OPNET was recently acquired by Riverbed.)
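For readers without access to a commercial appliance, here's a minimal sketch of the underlying idea: a script listening on the mirrored port that tallies who talks to whom, and on which service port. It assumes Python with the scapy library and an interface name (eth1) that receives the mirrored traffic; a real appliance does far more.

```python
# Minimal sketch of what a dependency-mapping appliance does with a
# mirrored port: passively record (source, destination, service port)
# tuples. Assumes scapy is installed and eth1 receives the mirrored
# traffic -- both are illustrative assumptions.
from collections import Counter
from scapy.all import sniff, IP, TCP

connections = Counter()

def record(pkt):
    # Count each (source, destination, destination port) tuple observed.
    if IP in pkt and TCP in pkt:
        connections[(pkt[IP].src, pkt[IP].dst, pkt[TCP].dport)] += 1

# Sniff 10,000 packets off the mirror port; because the traffic is a
# copy, this puts no load on the production links.
sniff(iface="eth1", prn=record, store=False, count=10000)

for (src, dst, dport), hits in connections.most_common(20):
    print(f"{src} -> {dst}:{dport}  ({hits} packets)")
```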

We ran the appliance for about one week and collected interconnect information across 100 applications. In addition to server-to-server connectivity, the appliance also mapped details such as application connections at the port level. This gave us a complete picture of the client's data center environment. We also discovered some custom applications that didn't show up in the client's software inventory spreadsheets.
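One simple way to surface such undocumented applications is to diff the endpoints observed on the wire against the inventory spreadsheet. The sketch below does exactly that; the file names and column layouts are hypothetical.

```python
# Flag services seen on the network that are absent from the software
# inventory. CSV file names and column names here are hypothetical.
import csv

def load_endpoints(path, host_col, port_col):
    with open(path, newline="") as f:
        return {(row[host_col].strip(), row[port_col].strip())
                for row in csv.DictReader(f)}

observed = load_endpoints("observed_listeners.csv", "server_ip", "port")
inventoried = load_endpoints("software_inventory.csv", "server_ip", "port")

# Anything on the wire but not in the spreadsheet is a candidate custom
# or forgotten application worth investigating before migration.
for server_ip, port in sorted(observed - inventoried):
    print(f"Undocumented service: {server_ip}:{port}")
```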

With our dependency data in hand, we then needed performance metrics for the applications. This would allow us to accurately size the virtual servers that would be used in the migration. Our client already had Paessler PRTG Network Monitor installed. The tool collected CPU, memory, I/O throughput, and disk statistics, and luckily it had been running for 12 months, so we had a good baseline of metrics to draw from.
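To illustrate how baseline data feeds sizing, the sketch below assumes a hypothetical CSV export with one CPU sample per row and sizes against a high percentile of the baseline rather than the raw peak, so a single anomalous spike doesn't inflate the estimate.

```python
# Hypothetical sizing pass over exported monitoring data: compute the
# 95th percentile of baseline CPU so one spike doesn't drive the sizing.
# Assumes a CSV export with one CPU-percent sample per row ("cpu_pct").
import csv
import statistics

with open("server42_cpu_12mo.csv", newline="") as f:
    samples = [float(row["cpu_pct"]) for row in csv.DictReader(f)]

p95 = statistics.quantiles(samples, n=100)[94]   # 95th percentile
print(f"mean={statistics.mean(samples):.1f}%  p95={p95:.1f}%  "
      f"peak={max(samples):.1f}%")
# A common rule of thumb: size for p95 plus headroom, not for the peak.
```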

This isn't the case for everyone; in many of our clients' environments, we need to install agents on the application servers to collect performance metrics, and then allow at least 30 to 45 days to capture baseline performance data. Agent installation can be time-consuming and expensive, particularly if you only plan to use the agents during a migration, so consider open-source options like Cacti or Helios if you do not regularly monitor your application environment.
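If you do end up rolling your own lightweight collector, the following sketch shows the basic shape, using the cross-platform psutil library as a stand-in for a commercial agent: sample CPU, memory, and disk I/O on an interval and append to a log for later analysis.

```python
# Minimal stand-in for a metrics agent using the psutil library: sample
# CPU, memory, and disk I/O and append to a CSV the sizing analysis can
# read later. A real agent adds buffering, rotation, and central
# collection; this only shows the shape of the loop.
import csv
import time
import psutil

with open("perf_baseline.csv", "a", newline="") as f:
    writer = csv.writer(f)
    for _ in range(5):                       # loop indefinitely in practice
        disk = psutil.disk_io_counters()
        writer.writerow([
            time.time(),
            psutil.cpu_percent(interval=1),  # percent CPU over one second
            psutil.virtual_memory().percent,
            disk.read_bytes,
            disk.write_bytes,
        ])
```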

In a typical P2V migration you should analyze your existing hardware, but we were confident that the client's physical inventory was solid and that it had a good record of hardware type, CPU socket and core counts, memory, and local storage. However, even with a detailed inventory and IP addresses for servers and applications, we still had to physically locate these servers in the data center.
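Where you don't fully trust the spreadsheet, the hosts themselves can confirm it. Below is a rough, Linux-only sketch that reads /proc to report socket count, logical cores, and memory; it won't cover every platform in a real data center.

```python
# Verify inventory records against a live Linux host: socket count from
# distinct "physical id" values in /proc/cpuinfo, memory from the
# MemTotal line in /proc/meminfo. Spreadsheets drift; hosts don't lie.
import os

def hardware_facts():
    sockets = set()
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith("physical id"):
                sockets.add(line.split(":")[1].strip())
    with open("/proc/meminfo") as f:
        mem_kb = int(f.readline().split()[1])   # first line is MemTotal
    return {
        "sockets": len(sockets),
        "logical_cores": os.cpu_count(),
        "memory_gb": round(mem_kb / 1024 / 1024, 1),
    }

print(hardware_facts())
```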

In addition to gathering technical details, we decided to send out application questionnaires and conduct interviews with some of the key stakeholders to determine if there were areas for improvement. IT pros rightly focus on improving efficiency and lowering costs, but it's also important to check in with users to see whether their service expectations are being met and whether they are happy with the applications IT provides.

We found the questionnaire to be an effective way to communicate with the client's key users about the migration and to verify the information we'd collected. We also discussed future growth plans and organizational needs that would increase the load on the client's infrastructure, and identified applications that were no longer supported or were being phased out. The questionnaires and interviews also helped bring people on board by making them part of the process, which is always valuable when introducing change.

In Part 2 of this series, we'll delve more deeply into application dependency mapping and gathering performance metrics.

Don Magrogan is CTO of Fusion PPT, a cloud computing strategy and technology solution firm.
