Network Computing is part of the Informa Tech Division of Informa PLC


Akamai to Deploy Advanced AI Across its Global Edge Network

Edge AI
(Credit: Denis Putilov / Alamy Stock Photo)

Cloud and content delivery network provider Akamai Technologies last week teamed with Neural Magic to deploy advanced artificial intelligence (AI) software across its global edge server network.

The duo's efforts could provide businesses with lower latency, higher service levels, and faster response times. Adding the software could enable use cases such as AI inference, immersive retail, and spatial computing.

Akamai long ago built a distributed global network of edge servers that cache content close to users, cutting delivery times and boosting performance for rich media such as streaming video. Now, the provider is using that same network to run Neural Magic's AI much closer to the sources of user data.

The company said it intends to “supercharge” its deep learning capabilities by leveraging Neural Magic’s software, which enables AI workloads to run more efficiently on traditional central processing unit (CPU)-based servers, as opposed to more advanced hardware powered by graphics processing units (GPUs).
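Neural Magic's actual engine is proprietary, but the core idea behind efficient CPU inference — exploiting sparsity in pruned models so the processor skips multiplications by zero rather than paying for specialized GPU hardware — can be sketched with a generic example. This is an illustration of the general technique only, not Neural Magic's software or API:

```python
import numpy as np
from scipy.sparse import csr_matrix

# Toy "pruned" weight matrix for one model layer: 90% of entries zeroed
# out, as sparsification of a trained network typically produces.
rng = np.random.default_rng(0)
dense_w = rng.standard_normal((512, 512))
dense_w[rng.random((512, 512)) < 0.9] = 0.0

x = rng.standard_normal(512)

# Dense path: multiplies every entry, zeros included.
dense_out = dense_w @ x

# Sparse path: CSR storage records only the nonzero weights, so the
# multiply touches roughly a tenth of the data — the kind of saving
# that makes CPU-only inference practical.
sparse_w = csr_matrix(dense_w)
sparse_out = sparse_w @ x

# Both paths produce the same layer output.
print(np.allclose(dense_out, sparse_out))
```

The point of the sketch is that sparsity reduces both compute and memory traffic, which is why a pruned model can run acceptably on the commodity CPU servers already deployed across an edge network.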

Potential Business Benefits of AI at the Edge

One expert sees several potential benefits for Akamai’s content delivery network (CDN) business customers from the addition of Neural Magic’s AI acceleration software.

“This could potentially lower the cost of service and still meet the requirements for the AI workloads,” said Baron Fung, Senior Research Director at Dell’Oro Group, a global telecom market research and consulting firm. “Lower cost can be achieved because the service provider (Akamai) can use general-purpose servers that uses traditional infrastructure, rather than expensive dedicated AI/GPU servers and infrastructure.”

Applications could also benefit: "Because these nodes are situated at the network edge, close to where the user or machines are located, faster response time of applications for customers could be realized, especially for workloads that are AI related."

Higher service levels could be attained as well. “Because of the scalable nature of the solution, new CDN nodes suitable for AI workloads could be scaled quickly in high-demand regions.”

Running AI Workloads Close to Data Sources

In February, Akamai launched its Generalized Edge Compute (GECKO) initiative, which focuses on embedding cloud computing capabilities in the provider’s massive edge network. The initiative will efficiently support modern applications and workloads, wrote Zacks Investment Research. “These workloads will span a wide range of next generation use cases such as AI inference, immersive retail, and spatial computing.”
