(Reuters) – Major tech firms, including Meta, Microsoft, Advanced Micro Devices and Broadcom, said on Thursday they have developed a new industry standard for networking in AI data centers, the latest effort to break the dominance of market leader Nvidia.
The “Ultra Accelerator Link” is an attempt to establish an open standard for communication between artificial intelligence accelerators – the systems that help process the vast amounts of data used in AI tasks.
Other members of the group include Alphabet-owned Google, Cisco Systems, Hewlett Packard Enterprise and Intel.
WHY IT’S IMPORTANT
Nvidia, the biggest player in the AI chip market with a share of around 80%, is not part of the grouping.
Tech giants like Google and Meta are keen to reduce their dependence on Nvidia, whose networking business forms an essential part of the package that enables its AI dominance.
Broadcom’s main rival in the networking and custom chip market, Marvell Technology, is also not part of the group.
KEY QUOTE
“An industry specification becomes critical to standardize the interface for AI and Machine Learning, HPC (high-performance computing), and Cloud applications for the next generation of AI data centers and implementations,” the companies said in a statement.
CONTEXT
Tech companies are pouring billions of dollars into the hardware required to support AI applications, boosting demand for AI data centers and the chips that they run on.
The Ultra Accelerator Link group has designed specifications governing the connections between AI accelerators within a data center.
The specifications will be available in the third quarter of 2024 to companies that join the Ultra Accelerator Link (UALink) Consortium.
THE RESPONSE
Nvidia and Marvell did not immediately respond to a Reuters request for comment.
(Reporting by Arsheeya Bajwa in Bengaluru; Editing by Tasim Zahid)