By Stephen Nellis
(Reuters) – Onsemi on Wednesday unveiled a lineup of chips designed to make the data centers that power artificial intelligence services more energy efficient by borrowing a technology it already sells for electric vehicles.
Onsemi is one of a handful of suppliers of chips made of silicon carbide, an alternative to standard silicon that is pricier to manufacture but more efficient at converting power from one form to another. In recent years, silicon carbide has found wide use in electric vehicles, where swapping in the chips between the vehicle’s battery and motors can give cars a boost in range.
Simon Keeton, president of the power solutions group at Onsemi, said that in a typical data center, electricity gets converted at least four times between when it enters the building and when it is ultimately used by a chip to do work. Over the course of those conversions, about 12% of the electricity is lost as heat, Keeton said.
“The companies that are actually using these things – the Amazons and the Googles and the Microsofts – they get double penalized for these losses,” Keeton said. “Number one, they’re paying for the electricity that gets lost as heat. And then because it gets lost as heat, they’re paying for the electricity to then cool” the data center, Keeton said.
Onsemi believes it can reduce those power losses by a full percentage point. While a percentage point does not sound like much, the estimates of how much power AI data centers will consume are staggering, with some groups projecting up to 1,000 terawatt hours annually in less than two years.
One percent of that total, Keeton said, “is enough to power a million houses for a year. So that puts it into context of how to think about the power levels.”
(Reporting by Stephen Nellis in San Francisco; Editing by Chris Reese)