
From AI to 5G, How Global Trends Will Transform Data Centers in 2021


Over the past three years, the number of data centers worldwide has shrunk to 7.2 million from 8.6 million, according to industry tracker Statista, and there are only a few hundred hyperscale centers operated by the likes of Facebook, Microsoft, and Amazon.

As the Bob Dylan song goes, however, the times, they are a-changin'. In 2021, expect to see the floodgates open on data center growth as businesses, startups, and industries like healthcare and retail require more processing power at the edge, in the cloud, and everywhere in between, driven by the growing competitive need to put big data and AI to work boosting revenue.

Servers, racks, and cooling units will start to look vastly different thanks to GPU- and DPU-accelerated computing, smart NICs, and AI-enabled software. Even the definition of a data center will change, as powerful on-premises systems and autonomous cars handle AI tasks that require near-instantaneous outcomes where people are.

Open networking will become more valuable as the trillions of terabytes of data generated by consumers and businesses grow exponentially, thanks to an increase in IoT devices and to the COVID-19 pandemic, which has accelerated online activity, from shopping to remote work.

Here are some of the key trends we expect in 2021:

Accelerating Change in the Data Center: Accelerated applications will be offloaded from CPUs onto GPUs, and SmartNICs based on programmable data processing units (DPUs) will accelerate data center infrastructure services such as networking, storage, and security. The addition of these GPUs and DPUs will deliver expanded application acceleration to all enterprise workloads and provide an extra layer of security. Virtualization and scalability will be faster, while CPUs will be freed up to run traditional apps faster and offer accelerated services.
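As a rough illustration of that CPU-to-GPU offload, the sketch below runs the same matrix multiplication on the host with NumPy and on a GPU with CuPy. It assumes a CUDA-capable GPU and the cupy package are available; the matrix size and crude timing are illustrative only, not a benchmark.

```python
# Minimal sketch: offloading a dense matrix multiplication from CPU to GPU.
# Assumes a CUDA-capable GPU and the `cupy` package are installed.
import time

import numpy as np
import cupy as cp

N = 4096
a_cpu = np.random.rand(N, N).astype(np.float32)
b_cpu = np.random.rand(N, N).astype(np.float32)

# CPU path: NumPy runs on the host cores.
t0 = time.perf_counter()
c_cpu = a_cpu @ b_cpu
cpu_s = time.perf_counter() - t0

# GPU path: copy the operands to device memory, multiply there, then synchronize
# so the timing reflects the finished kernel, not just the asynchronous launch.
a_gpu = cp.asarray(a_cpu)
b_gpu = cp.asarray(b_cpu)
t0 = time.perf_counter()
c_gpu = a_gpu @ b_gpu
cp.cuda.Stream.null.synchronize()
gpu_s = time.perf_counter() - t0

# Sanity check that both paths computed the same result (to float32 tolerance).
assert np.allclose(c_cpu, cp.asnumpy(c_gpu), rtol=1e-3)
print(f"CPU: {cpu_s:.3f}s  GPU: {gpu_s:.3f}s")
```

The pattern of keeping data on the accelerator and synchronizing only when a result is needed is what GPU-accelerated frameworks apply at much larger scale.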

The new data center architecture will leverage software-defined, hardware-accelerated virtualization, which provides manageability, security, and flexibility. It will also support containers, which ease the adoption and management of AI frameworks.

A DPU is a new class of programmable processor: a system on a chip (SoC) that combines three key elements:

  • An industry-standard, high-performance, software programmable, multi-core CPU, typically based on the widely-used Arm architecture, tightly coupled to the other SOC components
  • A high-performance network interface capable of parsing, processing, and efficiently transferring data at line rate to GPUs and CPUs
  • A rich set of flexible and programmable acceleration engines that offload and improve application performance for AI and machine learning, security, telecommunications, and storage, among other workloads.

All these DPU capabilities are critical to enable isolated, bare-metal, cloud-native computing that will define the next generation of cloud-scale computing.
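To make the "line rate" requirement above concrete, here is a back-of-the-envelope sketch of how many Ethernet frames per second a DPU's network interface would have to handle. The 200 Gb/s port speed and frame sizes are assumptions chosen for illustration, not figures from this article.

```python
# Back-of-the-envelope: what "line rate" means in packets per second.
# The 200 Gb/s port speed and the frame sizes below are illustrative assumptions.
LINE_RATE_BPS = 200e9    # assumed 200 Gb/s link
ETHERNET_OVERHEAD = 20   # preamble (8) + inter-frame gap (12) bytes per frame

for frame_bytes in (64, 512, 1500):
    bits_on_wire = (frame_bytes + ETHERNET_OVERHEAD) * 8
    pps = LINE_RATE_BPS / bits_on_wire
    print(f"{frame_bytes:>5}-byte frames: {pps / 1e6:6.1f} million packets/s")
```

At small frame sizes that works out to hundreds of millions of packets per second, which is why parsing and steering traffic in dedicated hardware, rather than on host CPU cores, matters.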

Industry efforts are underway to deliver an end-to-end enterprise platform for AI. Such a platform will integrate AI software, making it easier to deploy and manage AI. Every industry, from financial services to healthcare and manufacturing, will be able to deploy AI workloads using containers and virtual machines on the same platform.

End-user spending on global data center infrastructure is projected to climb to $200 billion in 2021, up 6 percent from 2020, according to the latest forecast from Gartner. GPUs and DPUs will be a part of this data center infrastructure and will result in significant performance, efficiency, and security improvements.

AI as a Service: Companies that are reluctant to spend time and resources investing in AI, whether for financial reasons or otherwise, will begin turning to third-party providers to achieve rapid time to market. The broad ecosystem of AI companies developing on the NVIDIA CUDA framework and AI platforms will become key partners by providing access to software, infrastructure, and solutions.

Transformational 5G: Companies will begin defining what "the edge" is. Autonomous driving is essentially a data center in the car, allowing AI to make instantaneous decisions while also sending data back for model training that will improve the in-car inference decisions. Similarly, with robots in the warehouse and the workplace, there will be inference at the edge and training in the core. Just as 4G spawned a transformational change in transportation with Lyft and Uber, 5G will bring transformational new capabilities and business opportunities. It won't happen all at once, but you'll start to see the beginnings of companies seeking to take advantage of the confluence of AI, 5G, and new computing platforms. The important attribute of these GPU- and DPU-accelerated AI platforms is that they are fully software-defined, which allows businesses to quickly adapt to evolving technologies and rapidly deploy new services and business models.
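A highly simplified sketch of that "inference at the edge, training in the core" split is below. It assumes PyTorch is installed; the tiny model, the in-memory uplink buffer, and the function names are stand-ins, and a real system would add transport, labeling, model versioning, and redeployment.

```python
# Illustrative split of inference (edge) and training (core) for a tiny model.
# The model, data, and in-memory "uplink" queue are all stand-ins for illustration.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 2))
uplink = []  # stands in for data shipped from the edge back to the core

def edge_inference(sensor_batch: torch.Tensor) -> torch.Tensor:
    """Run the current model at the edge and buffer the inputs for later training."""
    with torch.no_grad():
        decisions = model(sensor_batch).argmax(dim=1)
    uplink.append((sensor_batch, decisions))
    return decisions

def core_training(labelled: list[tuple[torch.Tensor, torch.Tensor]]) -> None:
    """Retrain in the core on data collected at the edge, then redeploy the weights."""
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = nn.CrossEntropyLoss()
    for inputs, labels in labelled:
        optimizer.zero_grad()
        loss = loss_fn(model(inputs), labels)
        loss.backward()
        optimizer.step()

# Edge side: make latency-critical decisions on incoming sensor data.
edge_inference(torch.randn(32, 8))

# Core side: here the edge's own decisions stand in for curated labels.
core_training([(x, y) for x, y in uplink])
```

The point of the split is that the latency-critical decision happens locally, while the slow, data-hungry learning step happens wherever compute is plentiful.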

Hybrid Cloud: In addition to moving certain workloads into the public cloud, companies are also designing their own private cloud in an on-premises data center, controlling a dedicated private network and virtualized infrastructure. As a result, modern data centers are expected to provide features such as low-latency networking, built-in virtualization and container platforms, or even native support for databases and other advanced applications.

These software-defined, hardware-accelerated stacks will take advantage of DPUs to accelerate networking, storage, security, and management applications.

DPUs are an essential element of modern data centers. In this model, the data center is the new unit of computing in which CPUs, GPUs, and DPUs combine into a single computing unit that's fully programmable, AI-enabled, and can deliver greater levels of security, performance, and efficiency.

This transforms the entire data center into a massive software-programmable unit of computing that can be provisioned and operated as a service. The modern data center can then be used not only for high-performance computing and deep learning but also for data analytics, remote workstations, and application virtualization.
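The division of labor behind that combined CPU/GPU/DPU unit can be summed up in a purely conceptual sketch. The enum, the placement table, and the workload names below are illustrative only and do not correspond to any vendor's API.

```python
# Purely conceptual sketch of the CPU/GPU/DPU division of labor described above.
# The enum, mapping, and function names are illustrative, not any vendor's API.
from enum import Enum, auto

class Engine(Enum):
    CPU = auto()   # general-purpose application logic
    GPU = auto()   # accelerated AI and other data-parallel workloads
    DPU = auto()   # infrastructure: networking, storage, security offload

PLACEMENT = {
    "web app": Engine.CPU,
    "database": Engine.CPU,
    "model training": Engine.GPU,
    "inference service": Engine.GPU,
    "packet processing": Engine.DPU,
    "storage virtualization": Engine.DPU,
    "encryption / firewalling": Engine.DPU,
}

def place(workload: str) -> Engine:
    """Pick an engine for a workload; unknown workloads default to the CPU."""
    return PLACEMENT.get(workload, Engine.CPU)

for name in ("model training", "packet processing", "web app"):
    print(f"{name:>24} -> {place(name).name}")
```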

“Someday, trillions of computers running AI will create a new internet — the internet-of-things — thousands of times bigger than today’s internet-of-people,” says Jensen Huang, NVIDIA’s CEO.

It’s already begun.