Micron Technology was among the Nvidia partners making news at the company’s massive GTC Spring event, unveiling its 128 GB small outline compression attached memory module (SOCAMM). The product was co-developed with Nvidia and will be a key part of Nvidia’s upcoming GB300 Blackwell Ultra Superchip.
Power efficiency and size are among the key advances. Micron said in a statement that by leveraging LPDDR5X memory, SOCAMM products consume one-third the power of standard DDR5 registered dual in-line memory modules (RDIMMs), boosting AI architecture performance.
In terms of form factor, the SOCAMM measures 14 x 90 mm, one-third the size of the industry-standard RDIMM, and could enable more compact server designs. It also delivers more than 2.5 times the bandwidth of RDIMMs at the same capacity, which Micron said allows faster access to larger training datasets and more complex models, as well as increased throughput for inference workloads. The latter benefit could not come at a better time for Nvidia, as much of the talk about the AI future at GTC centered on the market’s shift from training to reasoning and inference.
Nvidia’s Blackwell Ultra is due out in the second half of 2025, and with the SOCAMM launch, Micron claims to be the first memory supplier shipping both SOCAMM and HBM3E memory components. The company said its HBM3E 12H (12-high stack) 36 GB part is designed into the Nvidia HGX B300 NVL16 and GB300 NVL72 platforms, while the HBM3E 8H 24 GB part is available for the Nvidia HGX B200 and GB200 NVL72 platforms.
These design wins could be a big reason why Micron just reported fiscal second-quarter 2025 revenue of slightly over $8 billion, beating estimates, and why it is forecasting around $8.8 billion in revenue for the current quarter.
“We anticipate HBM3E 12-High will comprise the vast majority of our HBM shipments in the second half of calendar 2025,” said Micron CEO and President Sanjay Mehrotra, on the company’s earnings call. “We are making good progress on additional platforms and customer qualifications with HBM.”
Regarding the new SOCAMM product, he added later on the call, “LPDRAM in a SOCAMM form factor enables easier server manufacturability and serviceability and helps drive broader LP adoption in the server market. We are on track to deliver multi-billion dollars in revenue in fiscal 2025 from our portfolio of high-capacity D5 modules and LP products for the data center.”
HBM4 memory also will not be far behind, as it is expected to be a major feature of Nvidia’s Vera Rubin GPU launching next year. Nvidia has yet to announce its memory suppliers for that chip, but Micron competitor SK hynix recently accelerated its sampling of HBM4 chips to Nvidia, according to published reports.