AI compute demands on data centers have grown so much that hyperscalers are turning to nuclear power to generate electricity. While expanding the available supply of electricity is their chief concern, semiconductor researchers have ramped up work on energy-efficient chip materials and designs.
Semiconductor engineers concede that many existing chip designs only incrementally reduce the electricity used by AI tools like ChatGPT, given the overwhelming scale of the demand, but they argue even a small improvement is meaningful.
In one example, Onsemi says its power-efficient semiconductors can reduce data center energy consumption by 1%. The company achieves that reduction through the use of silicon carbide in its chips and through medium-voltage MOSFETs, according to Aditya Jain, director of the multi-market power division in the power solutions group at Onsemi. A MOSFET, or metal-oxide-semiconductor field-effect transistor, is used to control the flow of electricity in a circuit.
Jain claimed Onsemi is the only chip provider with medium-voltage MOSFET technology currently available, which he said puts the company a year ahead of Infineon. He spoke in an interview with Fierce Electronics. Also of note, AMD’s MI300 accelerator takes advantage of vertical power stages to make its boards for data center racks more power efficient, largely because the PCBs have shorter traces and lower electrical resistance than with traditional approaches, he added. Hyperscalers broadly require silicon carbide technology for converting voltages down to what an individual GPU needs, usually about eight-tenths of a volt.
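A rough illustration shows why shorter traces and lower resistance matter so much at GPU voltages; the wattage and trace-resistance values below are assumed for the example, not figures from Onsemi or AMD. At 0.8 V, a 1,000 W GPU draws roughly 1,250 A, and conduction loss in the delivery path scales with the square of that current:

\[
I = \frac{P_{\text{GPU}}}{V} \approx \frac{1000\ \text{W}}{0.8\ \text{V}} = 1250\ \text{A}, \qquad P_{\text{loss}} = I^{2}R
\]
\[
R = 0.1\ \text{m}\Omega \;\Rightarrow\; P_{\text{loss}} \approx 156\ \text{W}, \qquad R = 0.05\ \text{m}\Omega \;\Rightarrow\; P_{\text{loss}} \approx 78\ \text{W}
\]

Halving the resistance of the path, whether through shorter traces or vertical power delivery, halves the conduction loss, which is why board layout and conversion stages draw so much attention.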
“It’s not like we make these products so efficient that you don’t need nuclear power for data centers,” he said. “We are making incremental improvements” with chip designs and materials. “You still need giant amounts of electricity to power all the infrastructure. These changes are not avoiding a nuclear future.”
RELATED: All the cloud hyperscalers have gone nuclear—Now the PR battle begins
Even so, Onsemi believes a 97.5%-efficient chip delivering a greater-than-1% efficiency boost over the industry average is a big deal. If global data center power consumption doubles to more than 1,000 TWh in 2026, as the International Energy Agency predicts, then a 1% reduction would yield a 10 TWh savings in electricity, enough to power 926,000 homes for a year.
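That homes figure works out if one assumes U.S. average household consumption of roughly 10,800 kWh per year, an assumption not stated in the projection itself:

\[
1000\ \text{TWh} \times 1\% = 10\ \text{TWh} = 10^{10}\ \text{kWh}, \qquad \frac{10^{10}\ \text{kWh}}{10{,}800\ \text{kWh/home/yr}} \approx 926{,}000\ \text{homes}
\]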
In slides shared with Fierce, Onsemi said its 650-volt AC/DC power supply unit is 97.5% efficient, an improvement of 1.5 percentage points over prior generations. Separately, its eFuse and IBC (intermediate bus converter) T10 has moved to 98% efficiency, up from 97%. (Onsemi’s eFuses, or electronic fuses, are IC protection devices used across the industry to limit current and voltage and protect against overcurrent, overvoltage, over-temperature, reverse polarity, short-circuit faults and more.) Onsemi has taken steps to describe its power semiconductor prowess, including what it calls an industry-leading vertically integrated process for creating MOSFETs from silicon carbide granules, and has posted a video online describing the process.
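One back-of-the-envelope way to read those figures, rather than a claim from Onsemi’s slides, is in terms of wasted energy instead of efficiency: at 97% efficiency a conversion stage loses 3% of the energy passing through it as heat, while at 98% it loses 2%.

\[
1 - 0.97 = 3\%\ \text{lost} \;\longrightarrow\; 1 - 0.98 = 2\%\ \text{lost}, \qquad \frac{3\% - 2\%}{3\%} \approx 33\%\ \text{less waste heat}
\]

Because the same power typically passes through several such conversion stages between the grid and the GPU, percentage-point gains at each stage compound across a facility.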
Some have criticized Nvidia for its latest Grace Blackwell chips, which each use up to 1,200 watts of power, but Jain said he is not in that group and takes a realistic view of the coming energy demand for chips from Nvidia and others. Nvidia CEO Jensen Huang has argued that the productivity improvements attained with AI will make businesses more energy efficient overall, justifying the energy consumption of its GPUs when a customer looks at total cost of ownership (TCO).
“We are not questioning what AI is doing with productivity. We’re taking it as it is and not questioning it. If that global energy consumption forecast is true and if all the data centers become real, they will not be as efficient, but we can still save electricity,” Jain said.
The White House held a roundtable in mid-September with data center operators, AI companies and hyperscalers to consider strategies for meeting future clean energy needs, but the discussion focused less on chip energy efficiency than on the requirements for developing large AI data centers and the power infrastructure to support them. Huang from Nvidia participated, as did leaders from Alphabet, Amazon and Microsoft. Meanwhile, cloud providers are developing their own AI accelerator hardware separate from Nvidia’s approach.
The Biden administration has recognized the value of producing semiconductors and other components with new materials such as silicon carbide, the same material used in some of Onsemi’s products. A recent preliminary CHIPS Act award of $750 million went to Wolfspeed for construction of a new silicon carbide wafer manufacturing facility in Siler City, North Carolina, and a planned expansion of its silicon carbide device manufacturing facility in Marcy, New York. Wolfspeed is considered the world’s leading manufacturer of silicon carbide wafers and devices, which are considered more energy efficient and durable than those made from traditional silicon. Wolfspeed makes 200mm silicon carbide wafers.
The focus of the White House and Commerce Department is clearly on making targeted investments with CHIPS Act funds up and down the semiconductor supply chain to secure US technological leadership, a goal that sometimes includes energy efficiency for future chips.
On Friday, the Biden administration announced a funding opportunity under the CHIPS Act to make $1.6 billion available in multiple awards across five semiconductor advanced packaging R&D areas. One focus of advanced packaging is improving chip performance while lowering cost and power consumption. A meeting on the funding opportunity is scheduled for Oct. 22.
A day earlier, Commerce officials announced a proposed $93 million CHIPS Act investment in Infinera for a new fab in San Jose and an advanced test and packaging facility in Bethlehem, Pennsylvania, to increase the company’s domestic manufacturing of indium phosphide-based photonic integrated circuits. In an official statement, officials highlighted the need for energy efficiency: “As the United States becomes more reliant on larger amounts of data driving increased energy usage, Infinera’s indium phosphide-based photonic integrated circuits (InP PICs) are increasingly important, using light to transfer information with greater energy efficiency.”
So, there you have it. Chip energy efficiency is on the minds of industry and government, even if generating more electricity remains the paramount concern.