
How New Chip Innovations Will Drive IT

(Image: DPU. Credit: Edelweiss via Adobe Stock)

Fifty-eight years ago, Fairchild Semiconductor’s Gordon Moore, who later co-founded Intel, posited Moore’s Law: the prediction that the number of transistors in an integrated circuit would double every two years. In 1965, when the prediction was first made, the target seemed daunting. But the bar it set has driven the semiconductor industry ever since, and the continued growth of chip capacities has spearheaded continuous advances in computing and IT.
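
To put that doubling cadence in rough perspective, here is a back-of-the-envelope illustration (the 58-year span comes from the article; the baseline count N_0 and the arithmetic are illustrative assumptions, not figures from the piece):

```latex
% Back-of-the-envelope illustration of a two-year doubling cadence.
% N_0 is whatever baseline transistor count you start from; t is years elapsed.
N(t) = N_0 \cdot 2^{t/2}
\qquad\Longrightarrow\qquad
N(58) = N_0 \cdot 2^{29} \approx 5.4 \times 10^{8}\, N_0
```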

CIOs keep their ears to the ground when it comes to semiconductors because advances there drive new hardware, software, and business capabilities that IT leaders must plan and train for.

In 2023, that planning and training exercise is no different from years past: new chip designs just coming online will drive a fresh round of IT innovations that must be penciled into IT roadmaps.

Here are some of the latest chip innovations and what they portend for IT.

Faster Parallel Processing in the Data Center

Enterprise demand for data processing units (DPUs) is on the rise, and for good reason: more IT workloads are shifting toward analytics and big data processing, work that requires parallel computing able to operate on many different streams of data simultaneously.

A DPU offloads work from a single, central CPU onto many “mini-CPUs” embedded in its circuitry. Each DPU contains a CPU, a network interface controller (NIC), and programmable data engines, and it is designed to efficiently process network packets, storage requests, and analytics requests. Its data acceleration engines are tailored for the parallel processing of data, a prerequisite for handling large volumes of unstructured big data.
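
A DPU does this offloading in dedicated silicon, so there is no host code to show for the device itself, but the underlying pattern can be sketched in software. The short Python example below is a conceptual analogy only: the stream names and the toy checksum work are assumptions made for illustration, not a DPU SDK or any vendor’s API.

```python
# Conceptual sketch only: a DPU performs this kind of offload in hardware engines,
# but the same idea -- many independent streams handled in parallel rather than by
# one central CPU loop -- can be illustrated with Python's multiprocessing module.

from multiprocessing import Pool


def process_stream(stream):
    """Stand-in for a DPU data engine: handle one stream's packets independently."""
    name, packets = stream
    # Toy per-packet work (a checksum here) standing in for packet inspection,
    # storage handling, or analytics done per stream in parallel.
    total = sum(sum(packet) % 256 for packet in packets)
    return name, total


if __name__ == "__main__":
    # Three independent "streams" that a single central CPU would otherwise
    # have to work through one after another.
    streams = [
        ("network", [bytes([i, i + 1]) for i in range(100)]),
        ("storage", [bytes([i, 2 * i % 256]) for i in range(100)]),
        ("analytics", [bytes([3 * i % 256]) for i in range(100)]),
    ]

    # Each worker process plays the role of one embedded "mini-CPU"/data engine.
    with Pool(processes=len(streams)) as pool:
        for name, result in pool.map(process_stream, streams):
            print(f"{name}: checksum {result}")
```

Each worker process here takes the role of one embedded data engine handling its own stream; on a real DPU, those engines run in hardware alongside the onboard CPU and NIC rather than as host processes.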

As more DPUs are installed in data centers, CPU loads will become more distributed and will be processed in parallel. The impact on the data center is that more data, especially big, unstructured data, can be processed. DPUs will deliver new data processing capacity, but they will also force a rearchitecting of data center computing, which will become more decentralized, even within the data center itself.

From a hardware, software, and budgetary standpoint, the on-ramp of DPUs is likely to force upgrades that must be accounted for in IT budgets, along with explanations of why the upgrades are needed. IT operations personnel are likely to require training so they can run and maintain parallel-processing architectures. The DPU evolution won’t be lost on IT vendors, either: more software package vendors are likely to add enhanced analytics, and even big data processing, to their offerings.

Read the rest of this article on InformationWeek.
