Broadcom unveiled its new Thor Ultra networking chip on Tuesday, expanding its push into the AI data center market and intensifying competition with Nvidia. The chip connects large clusters of AI processors, allowing cloud operators to train and run massive AI models like ChatGPT more efficiently.

The Thor Ultra doubles the bandwidth of its predecessor and plays a critical role in moving data within distributed computing systems. “In the distributed computing system, the network plays an extremely important role in building these large clusters,” said Ram Velaga, Broadcom’s senior vice president.

The launch comes one day after Broadcom announced a 10-gigawatt chip deal with OpenAI for 2026, positioning itself as a major force in both networking and custom AI chips. CEO Hock Tan said the company’s AI-related markets could reach $90 billion by 2027, split between networking products and processors co-developed with partners like Google and OpenAI.

The Thor Ultra continues Broadcom’s design philosophy of building modular, high-performance chips for the data center ecosystem. Engineers at the company’s San Jose labs tested the chip extensively, focusing on heat management, power efficiency, and scalability.

Broadcom does not sell complete servers; instead, it offers system-level reference designs to partners to guide data center infrastructure planning. “For every dollar we invest in our silicon, there is at least $6 to $10 invested by our ecosystem partners,” Velaga said.

The new chip cements Broadcom’s growing presence in AI networking, a segment that has become increasingly vital as data centers worldwide race to support surging AI workloads.