By Stephen Nellis
(Reuters) – Cerebras Systems, the Silicon Valley startup making the world’s largest computer chip, said on Tuesday it can now weave together almost 200 of the chips to drastically reduce the power consumed by artificial-intelligence work.
Cerebras is one of several startups making chips designed specifically for AI, aiming to challenge current market leaders Nvidia Corp and Alphabet Inc’s Google. The company has raised about $475 million in venture capital and has secured deals with pharmaceutical firms GlaxoSmithKline Plc and AstraZeneca Plc to use its chips to speed up drug discovery.
Traditionally, hundreds or even thousands of computer chips are manufactured on a 12-inch (30 cm) silicon disc called a wafer, which is later sliced up into individual chips. Cerebras, by contrast, uses the entire wafer as a single chip, allowing it to hold far more data at once than a conventional processor.
But AI researchers now work with models, called neural networks, that are too big for any single chip to hold, so they must be split across many chips. The biggest current neural networks are still only a fraction as complex as the human brain, but they use far more energy, because the systems that run them become less power-efficient as more chips are added.
Cerebras said it can now link together 192 of its chips to train huge neural networks while keeping power efficiency constant as chips are added. In other words, Cerebras can double the amount of computing its chips do for double the power, unlike current systems, which need more than twice as much power to double their computing capacity.
Current AI systems “are in the realm where you’re talking about tens of megawatts of power, and you’re doing it over months. You’re using the equivalent of a small city’s power to train these networks,” Cerebras Chief Executive Andrew Feldman told Reuters. “Power is extremely important.”
(Reporting by Stephen Nellis in San Francisco; Editing by Matthew Lewis)