
Data Transformation and “Tool Sprawl”

(Image source: Pixabay)

Most network engineers have, at some point, had to grind through a seemingly endless backlog of network change requests that forces them to dart across multiple network technologies (CLI devices, API-enabled controllers, public cloud, etc.) and then make sure adjacent tools are updated to reflect each change. To make these processes less onerous, some enterprise networking teams jump (albeit tentatively) into network automation, first by writing a few scripts to handle common tasks. From there, they often move on to infrastructure automation, agreeing on a source of truth and building processes to keep it up to date. Then they realize they could make a few other tasks self-service to improve delivery times. Before they know it, instead of simplifying their processes, these well-intentioned networking teams are suffering from "tool sprawl."

While “tool sprawl” may sound nasty (and possibly contagious), it’s increasingly common and can be both diagnosed and alleviated if caught and managed early.

Data transformation benefits the automation journey

Enterprise network teams know automation can help, but how do they do it?

One reason automating "the network" has become so complex is that today's network infrastructure consists of several different types of networks, such as physical, cloud, and virtual, all of which have different methods of management (CLI, API, GUI dashboards). Multiple IT systems, network services, and applications must also be brought into the automation journey, adding further complexity.

The automation process starts with identifying a network automation platform that is robust, flexible, accessible to teams of all skill sets, and doesn't insist you throw out your existing scripts and playbooks. Instead, the best platforms allow teams to bring all of the disparate parts of the network under a common, aligned automation model. Such a platform should lead an organization to Scalable Enterprise Network Automation -- being able to start with simple task-based scripts and move toward more scalable network automation, addressing a larger number of activities with each iteration. This will free networking teams to focus on larger business initiatives.
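For illustration, here is a minimal sketch of the kind of task-based script many teams begin with: pushing a single VLAN to one CLI-managed switch. The device details, credentials, and VLAN values are hypothetical, and netmiko is just one common library choice, not a requirement of any particular platform.

```python
# A minimal sketch of a simple task-based script: push one VLAN to one
# CLI-managed switch. All device details and values below are hypothetical.
from netmiko import ConnectHandler

device = {
    "device_type": "cisco_ios",   # hypothetical platform type
    "host": "10.0.0.10",          # hypothetical management address
    "username": "netops",
    "password": "example-only",   # in practice, pull credentials from a vault
}

def add_vlan(vlan_id: int, vlan_name: str) -> str:
    """Push a single VLAN to one device and return the device output."""
    commands = [f"vlan {vlan_id}", f"name {vlan_name}"]
    with ConnectHandler(**device) as conn:
        return conn.send_config_set(commands)

if __name__ == "__main__":
    print(add_vlan(120, "APP-SERVERS"))
```

Scripts like this are a useful starting point, but each one typically knows about only one device type and one tool, which is where the sprawl described below begins.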

As teams start with simple network changes and progress to the more complex work of integrating various systems, messaging, and other sources of truth, the number of automation tasks grows, and automating the data transformation process becomes a necessity.

Data transformation is the process of taking in data from one or more sources, transforming it into an appropriate format, and sending it out to be used by another step in a workflow. Today, most network teams do this manually (read: inefficiently), swivel-chairing between systems and dashboards to reformat, copy, and paste data as needed.
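As a rough sketch of that definition, the snippet below takes device records from two hypothetical sources (an IPAM-style JSON export and a CSV inventory), normalizes them into a single shape, and emits a payload that the next step in a workflow could consume. The field names and data are illustrative only.

```python
# Sketch of the transformation step: pull records from two hypothetical
# sources, normalize them into one record shape, and hand the result to
# the next workflow step. Field names here are purely illustrative.
import csv
import io
import json

ipam_json = '[{"hostname": "edge-rtr-01", "ip": "192.0.2.1", "site": "NYC"}]'
inventory_csv = "name,mgmt_ip,location\ncore-sw-01,192.0.2.10,NYC\n"

def transform(ipam_raw: str, csv_raw: str) -> list[dict]:
    """Merge two source formats into one normalized record shape."""
    records = [
        {"device": r["hostname"], "mgmt_ip": r["ip"], "site": r["site"]}
        for r in json.loads(ipam_raw)
    ]
    for row in csv.DictReader(io.StringIO(csv_raw)):
        records.append(
            {"device": row["name"], "mgmt_ip": row["mgmt_ip"], "site": row["location"]}
        )
    return records

# Hand the normalized payload to the next step (here, just print it as JSON).
print(json.dumps(transform(ipam_json, inventory_csv), indent=2))
```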

Don’t get lost in translation

When connecting systems and integrating data, a commonly overlooked aspect is the level of translation that needs to occur. While some transformations are straightforward, others require a greater level of translation – manipulating the information until it makes sense for the customer – not just the ability to deliver the payload.

Let's use the post office as an analogy. The post office can deliver a letter from one person to another. But what if that letter is written in Mandarin and the person receiving it can only read English? The post office was able to deliver the letter – it provided the integration function of connecting one person to the other – but now, some level of translation needs to happen. In network automation, that's what the data transformation function accomplishes. The payload has been delivered, and now the data transformation function needs to manipulate the data to make it readable and consumable at the other end.
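A small example of what that translation can look like in practice, assuming a hypothetical cloud-style rule format on one side and a CLI-style ACL on the other: the payload has been "delivered," but it still has to be reshaped before the receiving workflow can use it.

```python
# Sketch of the "translation" step: a rule pulled from a hypothetical cloud
# API arrives intact, but the receiving workflow only understands CLI-style
# ACL statements, so the data must be reshaped. Both formats are illustrative.
cloud_rule = {
    "direction": "ingress",
    "protocol": "tcp",
    "port_range": {"from": 443, "to": 443},
    "source_cidr": "203.0.113.0/24",
}

def to_acl_line(rule: dict, acl_name: str = "INBOUND-WEB") -> str:
    """Translate a cloud-style rule into a CLI-style ACL statement."""
    ports = rule["port_range"]
    port_expr = (
        f"eq {ports['from']}"
        if ports["from"] == ports["to"]
        else f"range {ports['from']} {ports['to']}"
    )
    return (
        f"ip access-list extended {acl_name}\n"
        f" permit {rule['protocol']} {rule['source_cidr']} any {port_expr}"
    )

print(to_acl_line(cloud_rule))
```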

Of course, it isn't quite that simple. The more sources of data you have, the more complicated the process becomes, since you're translating between technologies that vary tremendously (e.g., cloud technologies vs. traditional hardware). And while network teams can solve translation challenges by writing a script, a function, or a piece of software that performs text manipulation, other challenges lie in wait, including the sheer number of tools being used.

Because teams have to retrieve and manipulate different types of information available in different places, a different set of tools is required for each system. In other words, tool sprawl: the unnecessary buying of new tools, inefficient spending, and disparate data.

Reducing tool sprawl via automated data transformation

DevOps engineers understand the burden of writing and maintaining code when integrating with different IT systems. Automating even the most basic IT services can require integration with the ITSM system, sources of truth, compute, storage, and multiple networking systems. Writing the code to integrate and manipulate data between these systems is a tremendous amount of work, and because the IT ecosystem continues to evolve and change over time (e.g., new systems are added and removed, APIs change), the task of managing and updating integration code for every automation becomes monumental.

Automating data transformation, by using modern tools to develop transformation rules as objects that automations can reference, can help reduce tool sprawl. And it can do so in a way that is flexible and portable while also saving time, reducing manual errors, and increasing efficiency. Data transformations that can be built quickly and operate as reusable assets let network teams start faster and avoid starting from scratch with every automation. In addition to automating the task itself, the automation solution must automate data transformations; otherwise, human intervention will always be required to format data between network tasks, leading to inefficiencies, increased costs, and errors.
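One way to picture transformation rules as reusable, referenceable objects, sketched here without reference to any particular product's model, is a simple registry: each rule is defined and named once, and any automation can look it up instead of re-implementing the mapping. The rule names and record shapes below are hypothetical.

```python
# Sketch of transformation rules as reusable, named objects: register each
# rule once, then let any automation reference it by name. Rule names and
# record shapes are hypothetical.
from typing import Callable

TRANSFORMS: dict[str, Callable[[dict], dict]] = {}

def transform_rule(name: str):
    """Decorator that registers a transformation under a reusable name."""
    def register(fn: Callable[[dict], dict]) -> Callable[[dict], dict]:
        TRANSFORMS[name] = fn
        return fn
    return register

@transform_rule("ipam_to_ticket")
def ipam_to_ticket(record: dict) -> dict:
    # Reshape an IPAM record into the fields a change-ticket API expects.
    return {"ci_name": record["hostname"], "ip_address": record["ip"]}

def run_automation(rule_name: str, payload: dict) -> dict:
    """Any automation can reference a rule by name instead of re-coding it."""
    return TRANSFORMS[rule_name](payload)

print(run_automation("ipam_to_ticket", {"hostname": "edge-rtr-01", "ip": "192.0.2.1"}))
```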

When data transformation is performed correctly, and only the necessary tools are used, NetOps and DevOps teams can access data efficiently, reduce manual errors and time spent, and extract insights that inform future automations.

Morgan Stern is VP of Automation Strategy at Itential.
