AI

AMD already leveraging soon-to-be acquired ZT Systems

As 2024 comes to a close and AMD eyes ways to gain traction against Nvidia in the AI chip market in the new year, it can look forward to closing the ZT Systems acquisition it announced last summer.

Speaking at the UBS Annual Technology Conference this week, Forrest Norrod, executive vice president and general manager of the Data Center Solutions Business Group, said the acquisition remains on track to close in the first half of 2025.

“We're still very confident of closing the transaction in the first half of next year,” Norrod said. “We've already gotten regulatory approval in the U.S. and a number of other geos. We're waiting for a few, but that all looks like it's on track.”

After the deal closes, AMD plans to sell ZT Systems’ manufacturing unit, which competes against many of AMD’s server partners and counts Nvidia, among others, as a customer. That will leave AMD with the system integration and engineering side of ZT Systems, the reason it initiated the $4.9 billion acquisition.

While regulatory approval and a divestment deal await in the months to come, Norrod suggested AMD is not waiting until then to leverage the benefits of the ZT Systems acquisition for its rising Instinct MI family of GPUs.

“We're still two different companies, so we can't operate as one yet, but we can put in place strong contractual agreements that allow us to engage resources on forward-looking products,” Norrod said. “We have already done so on 355 (MI355X), on the… 400 (MI400X) series, and quite candidly, beyond, so… you will see some contribution from the ZT Systems resources in the 350 series systems… but certainly see a major contribution from the ZT Systems engineering teams on 400 and beyond.”

The MI355X, widely expected to be the first AMD GPU to compete directly with Nvidia’s Blackwell GPU family in terms of performance, memory, model size, and data-type support, is scheduled to be ready in the second half of next year. The MI400X is expected to follow in 2026.

Norrod also explained AMD’s rationale for acquiring a systems integrator with deep visibility into how customers deploy GPUs. “The first is that, as we're designing for these 200 kilowatt-plus rack systems, you really do need to comprehend the requirements at the rack and cluster level as you're designing silicon. Being able to do that system and cluster level design very early on allows you to define and design a better piece of silicon.

“The second part is we want to support the ecosystem by adding value to our solution, so we don't want to take a one-size-fits-all approach. We're not trying to take the Henry Ford approach of ‘You can have your hyperscale data center rack any color you want as long as the color is black.’ We're investing in our systems engineering not only to produce a great set of basic designs and elements, but also to allow others in the ecosystem to do variations, to add their own value. That actually takes a little bit more engineering upfront to add the hooks and design the components so that others can do that, but we think by doing so we better harness the engineering talent across the industry to accrue value.”

That flexibility could be the key for AMD as it looks to chip away at Nvidia’s dominance in AI. The increasing need for inference, a strength of the current MI300X GPU, is believed to have already helped AMD achieve close to 10% market share. While Norrod declined to be specific about AMD’s near-term market share goals, he made clear that AMD’s progress so far is validating, but not fully satisfying.

“Our intent is to continue leadership in inference across the board, which we think is more and more important as chain-of-thought models and similar approaches become more important, and to continue developing ourselves on training as well,” Norrod said. “Long term, we don't aspire to be just an inference solution. We aspire to be a provider of great AI solutions on the training side, on the inference side, on any models of any size.”