Akamai to Deploy Advanced AI Across its Global Edge Network

With partner Neural Magic's software, user benefits may include lower latency, higher service levels, and faster response times.


Cloud and content delivery network Akamai Technologies last week teamed with Neural Magic to deploy advanced artificial intelligence (AI) software across its global edge server network.

The duo's efforts could provide businesses with lower latency, higher service levels, and faster response times. Adding the software could enable use cases such as AI inference, immersive retail, and spatial computing.

Akamai long ago built a distributed global network of edge servers that cache content close to users, cutting delivery times and boosting performance for rich media such as streaming video. Now the provider is using the same network to run Neural Magic's AI much closer to the sources of user data.

The company said it intends to “supercharge” its deep learning capabilities by leveraging Neural Magic’s software, which enables AI workloads to be run more efficiently on traditional central processing unit-based servers, as opposed to more advanced hardware powered by graphics processing units (GPUs).
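The core idea behind running AI on CPU-based servers, as publicly described by Neural Magic, is sparsity: when most of a model's weights are pruned to zero, a runtime that stores and processes only the nonzero weights can skip the corresponding multiplications. The sketch below is a simplified illustration of that principle, not Neural Magic's actual implementation; the compressed-row layout and toy weight matrix are assumptions for demonstration.

```python
# Conceptual sketch: why a sparsified model can run faster on CPUs.
# A compressed layout stores only nonzero weights, so the runtime
# skips every multiply-by-zero. (Illustration only, not Neural
# Magic's actual engine.)

def dense_matvec(weights, x):
    """Plain dense matrix-vector product: every weight is multiplied."""
    return [sum(w * xi for w, xi in zip(row, x)) for row in weights]

def to_compressed(weights):
    """Keep only (column, value) pairs for nonzero weights per row."""
    return [[(j, w) for j, w in enumerate(row) if w != 0.0]
            for row in weights]

def sparse_matvec(rows, x):
    """Multiply using only the stored nonzeros, skipping zeros."""
    return [sum(w * x[j] for j, w in row) for row in rows]

# A heavily pruned 4x4 layer: most weights are zero.
W = [
    [0.0, 0.0, 2.0, 0.0],
    [0.0, 0.0, 0.0, 0.0],
    [1.5, 0.0, 0.0, 0.0],
    [0.0, 0.0, 0.0, -1.0],
]
x = [1.0, 2.0, 3.0, 4.0]

# Both paths produce the same result; the sparse path touches
# only 3 of the 16 weights.
assert dense_matvec(W, x) == sparse_matvec(to_compressed(W), x)
```

In a production engine the savings come from cache-friendly compressed layouts and vectorized CPU instructions rather than Python loops, but the arithmetic shortcut is the same.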

Potential Business Benefits of AI at the Edge

One expert sees several potential benefits for Akamai’s content delivery network (CDN) business customers in pairing the network with Neural Magic’s AI acceleration software.

“This could potentially lower the cost of service and still meet the requirements for the AI workloads,” said Baron Fung, Senior Research Director at Dell’Oro Group, a global telecom market research and consulting firm. “Lower cost can be achieved because the service provider (Akamai) can use general-purpose servers that use traditional infrastructure, rather than expensive dedicated AI/GPU servers and infrastructure.”

Applications could also benefit, Fung said, "because these nodes are situated at the network edge, close to where the user or machines are located, faster response time of applications for customers could be realized, especially for workloads that are AI related."

Higher service levels could be attained as well. “Because of the scalable nature of the solution, new CDN nodes suitable for AI workloads could be scaled quickly in high-demand regions.”

Running AI Workloads Close to Data Sources

In February, Akamai launched its Generalized Edge Compute (GECKO) initiative, which focuses on embedding cloud computing capabilities in the provider’s massive edge network. The initiative will efficiently support modern applications and workloads, wrote Zacks Investment Research. “These workloads will span a wide range of next generation use cases such as AI inference, immersive retail, and spatial computing.”


About the Author

Bob Wallace, Featured Writer

A veteran business and technology journalist, Bob Wallace has covered networking, telecom, and video strategies for global media outlets such as International Data Group and United Business Media. He has specialized in identifying and analyzing trends in enterprise and service provider use of enabling technologies. Most recently, Bob has focused on developments at the intersection of technology and sports. A native of Massachusetts, he lives in Ashland and can be reached at [email protected] or @fastforwardbob
