
AI in Data Centers: Increasing Power Efficiency with GaN

(Credit: Image Source / Alamy Stock Photo)

The digital world is undergoing a massive transformation powered by the convergence of two major trends: an insatiable demand for real-time insights from data and the rapid advancement of generative artificial intelligence (AI). Leaders like Amazon, Microsoft, and Google are in a high-stakes race to deploy generative AI to drive innovation. Bloomberg Intelligence predicts that the generative AI market will grow at a staggering 42% per year over the next decade, from $40 billion in 2022 to $1.3 trillion by 2032.

Meanwhile, this computational force is creating a massive surge in energy demand, with serious consequences for today's data center operators. Current power conversion and distribution technologies in the data center can't handle the increased demand from cloud and machine learning workloads, let alone from power-hungry generative AI applications. The quest for innovative data center power solutions has never been more critical.

Gallium nitride (GaN) semiconductors are emerging as a pivotal solution to data center power concerns, helping to counter the challenges generative AI creates. Below, we look at how generative AI affects data centers, the advantages of GaN, and a prevailing industry perception of the Power Usage Effectiveness (PUE) metric that is creating headwinds despite GaN's otherwise robust adoption. With generative AI intensifying power demands, swift measures are essential to reshape this perception and propel GaN adoption even further.

The rising impact of Generative AI on the data center

Today's data center infrastructure, designed for conventional workloads, is already strained to its limits. Meanwhile, the volume of data worldwide doubles roughly every two years, and the data center servers that store this ever-expanding information require vast amounts of energy and water to operate. McKinsey projects that the U.S. alone will see 39 gigawatts of new data center demand, about 32 million homes' worth, over the next five years.

The energy-intensive nature of generative AI is compounding the data center power predicament. According to one research article, the recent class of generative AI models requires ten to a hundred times more computing power to train than the previous generation. Generative AI applications create significant demand for computing power in two phases: training the large language models (LLMs) that form the core of generative AI systems, and then operating applications with those trained LLMs (inference).

Consider that a single Google search uses roughly enough energy to power a 100W lightbulb for 11 seconds; it is mind-boggling, then, that one ChatGPT session is estimated to consume 50 to 100 times more energy than a comparable Google search. Data centers are not prepared to handle this surge in energy consumption. One CEO estimates that $1 trillion will be spent over the next four years upgrading data centers for AI.
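To put rough numbers on those comparisons, here is a back-of-the-envelope sketch in Python that uses only the figures quoted above; the values are illustrative estimates, not measurements:

# Back-of-the-envelope arithmetic using the figures quoted above.
# These are illustrative estimates, not measured values.
BULB_WATTS = 100      # a 100W lightbulb...
BULB_SECONDS = 11     # ...powered for 11 seconds per Google search

search_wh = BULB_WATTS * BULB_SECONDS / 3600        # ~0.31 Wh per search

# A ChatGPT session is cited as consuming 50 to 100 times more energy.
chatgpt_wh_low, chatgpt_wh_high = 50 * search_wh, 100 * search_wh

print(f"Google search:   ~{search_wh:.2f} Wh")
print(f"ChatGPT session: ~{chatgpt_wh_low:.0f} to {chatgpt_wh_high:.0f} Wh")

Even at tens of watt-hours per session, the totals add up quickly when multiplied across the enormous number of daily AI interactions.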

Unfortunately, while technologies like immersion cooling, AI-driven optimizations, and waste heat utilization have emerged, they offer only partial solutions to the problem. A critical need exists for power solutions that combine high efficiency, compact form factors, and substantial power output. Power electronics based on silicon are comparatively inefficient; their losses show up as heat that data center cooling systems must then remove to maintain safe temperatures.

GaN: Unparalleled performance and efficiency

GaN offers unparalleled performance and efficiency compared to traditional silicon-based power supply designs, making it an ideal option for today's data centers, particularly as generative AI usage escalates. GaN transistors switch faster than their silicon counterparts and have superior input and output figures of merit. These features translate into system benefits, including operating efficiency that exceeds 80 PLUS Titanium levels and increased power density.

GaN transistors enable data center power electronics to achieve higher efficiency levels, curbing energy waste and generating significantly less heat. The impact is impressive: in a typical data center environment, each cluster of ten racks powered by GaN transistors can yield a yearly profit increase of $3 million, cut CO2 emissions by 100 metric tons annually, and reduce OPEX by $13,000 per year. These benefits will only grow as the power demands of generative AI increase and rack power density rises 2-3x.

While the benefits of GaN are profound, why aren't even more data center operators swiftly incorporating the technology? Adoption faces headwinds from what we call the "PUE loophole," an often-overlooked weakness within the widely accepted PUE metric.

The PUE Loophole

The PUE metric is the standard tool for assessing data center energy efficiency, calculated by dividing the total facility power consumption by the power utilized by IT equipment. The metric helps shape data center operations and guides efforts to reduce energy consumption, operational costs, and environmental impact.

Data center operators continuously strive to monitor and improve PUE as an indicator of reduced energy consumption, carbon emissions, and associated costs. However, the PUE metric measures how efficiently power is delivered to the servers; it omits power conversion efficiency within the server itself. As a result, the PUE calculation does not provide a comprehensive view of a data center's energy efficiency, creating a blind spot for operators.
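To make that blind spot concrete, here is a minimal sketch in Python with hypothetical numbers: a fixed compute load, an assumed 30 percent facility overhead for cooling and distribution, and two server power-supply efficiencies (echoing the converter figures discussed below). Both scenarios report the same PUE even though one wastes noticeably more energy:

# Minimal sketch of the PUE blind spot. All numbers are hypothetical.

def pue(total_facility_kw, it_equipment_kw):
    # PUE = total facility power / power delivered to IT equipment
    return total_facility_kw / it_equipment_kw

useful_compute_kw = 1000.0   # DC power the server electronics actually need

for label, psu_efficiency in [("94% AC/DC converters", 0.94),
                              ("96% AC/DC converters", 0.96)]:
    it_load_kw = useful_compute_kw / psu_efficiency  # PSU losses count as "IT" power
    overhead_kw = 0.3 * it_load_kw                   # assumed cooling/distribution overhead
    total_kw = it_load_kw + overhead_kw
    print(f"{label}: facility draw {total_kw:,.0f} kW, PUE = {pue(total_kw, it_load_kw):.2f}")

Both cases print a PUE of 1.30, yet the more efficient converters cut the facility's total draw by roughly 2 percent; because the conversion losses sit inside the "IT equipment" term, the metric simply cannot see them.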

Consider that many servers still use AC/DC converters that are 94 percent efficient or less. While that may sound impressive, it means that 10 percent or more of all the energy in a data center is lost. This not only increases costs and CO2 emissions, but it also creates extra waste heat, putting additional demands on cooling systems.

GaN is remarkably effective in addressing the PUE loophole. For instance, the latest generation of GaN-based server AC/DC converters is 96 percent efficient or better, which means that more than 50 percent of the energy currently wasted can instead be used effectively. Across the entire industry, this could translate into more than 37 billion kilowatt-hours saved every year, enough to run 40 hyperscale data centers.
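How much of the wasted energy is recovered depends on the efficiency of the converter being replaced. A quick arithmetic sketch, using illustrative round numbers within the "94 percent efficient or less" range cited above, shows how the recovered share scales:

# Share of conversion losses eliminated by moving to a 96%-efficient
# GaN converter, for several illustrative legacy efficiencies.
GAN_EFF = 0.96
DC_LOAD_KW = 1.0   # DC power the server actually needs

for legacy_eff in (0.90, 0.92, 0.94):
    legacy_loss_kw = DC_LOAD_KW / legacy_eff - DC_LOAD_KW  # kW wasted by the old converter
    gan_loss_kw = DC_LOAD_KW / GAN_EFF - DC_LOAD_KW        # kW wasted by the GaN converter
    recovered = (legacy_loss_kw - gan_loss_kw) / legacy_loss_kw
    print(f"{legacy_eff:.0%} -> 96%: about {recovered:.0%} of the wasted energy recovered")

The lower the efficiency of the supply being replaced, the larger the share of today's conversion losses that GaN eliminates.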

GaN provides an immediately cost-effective way to close the PUE loophole and save substantial amounts of energy. But because PUE doesn't account for AC/DC conversion efficiency inside the server, the metric itself offers little incentive to make those converters more efficient.

A Sustainable Path for Data Centers and Generative AI

As the era of generative AI ushers in new applications, the imperative to meet energy demands without compromising sustainability is now paramount. According to Harvard Business Review, “While observers have marveled at the abilities of new generative AI tools such as ChatGPT, BERT, LaMDA, GPT-3, DALL-E-2, MidJourney, and Stable Diffusion, the hidden environmental costs and impact of these models are often overlooked.

The development and use of these systems have been hugely energy-intensive, and maintaining their physical infrastructure entails power consumption. Right now, these tools are just beginning to gain mainstream traction, but it's reasonable to think that these costs are poised to grow — and dramatically so — soon.” According to Gartner, “The Generative AI frenzy shows no signs of abating.”

HBR and other experts stress the urgent need for both technology suppliers and their users to make AI greener so it can be deployed broadly without harming the environment. GaN, with its unprecedented efficiency and performance, offers a clear path forward for data centers. Only GaN can simultaneously boost efficiency and density to the levels demanded by generative AI and the next generation of IT.

By enabling energy conservation, reducing cooling requirements, and enhancing cost-effectiveness, GaN is reshaping the data center power landscape. The interplay between generative AI and GaN presents an exciting opportunity to shape a more efficient, sustainable, and robust future for data centers.

Paul Wiener is Vice President of Strategic Marketing at GaN Systems.
