ZeroPoint announces new compression tech for foundational AI models

Sweden-based ZeroPoint Technologies AB announced a new compression technology to tackle the exploding need for greater memory efficiency in data centers focused on AI compute, including large language models.

On Thursday, the startup announced AI-MX, a memory optimization product shipping in the second half of 2025. The company claims AI-MX will expand effective memory capacity in hyperscaler and enterprise data centers by up to 50%. That would allow an end user to store 150GB of model data in 100GB of HBM capacity, potentially saving companies billions of dollars each year in large data centers devoted to AI, according to Klas Moreau, CEO of ZeroPoint.
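The arithmetic behind that claim is straightforward. A minimal sketch of the capacity math using the figures from the announcement (the helper function is illustrative, not ZeroPoint's actual API or algorithm):

```python
def effective_capacity_gb(physical_gb: float, compression_ratio: float) -> float:
    """Model data that fits in a given physical memory at a given compression ratio."""
    return physical_gb * compression_ratio

# A 1.5x compression ratio (a 50% capacity expansion) lets 100 GB of
# physical HBM hold 150 GB of model data, as the article describes.
print(effective_capacity_gb(100, 1.5))  # -> 150.0
```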

“We see enormous value in cost savings or added performance, even billions,” Moreau said in an interview with Fierce.

ZeroPoint, founded in 2017, has raised $15 million, plus $2.5 million in soft funding, and now employs 30 professionals. It currently operates as an IP licensing company but remains open to being absorbed by a big company in the space, Moreau said.

“With the challenge in memory with more computing that feeds it, why not use this memory resource? If we are absorbed by a big company, so be it, but as long as there is none, we will continue to launch,” Moreau said. He said the company has seen “significant interest” from potential customers in Asia.

ZeroPoint relies on proprietary hardware-accelerated compression, compaction and memory management technology operating at low-nanosecond latencies, which the company says is 1,000 times faster than traditional compression algorithms. The resulting 1.5x increase in the capacity and bandwidth of existing memory also improves performance per watt. AI-MX works across memory types including HBM, LPDDR, GDDR and DDR.

For general uses not related to foundational models, ZeroPoint said it can increase memory capacity by up to 4x while offering 50% greater performance per watt, which together can reduce TCO for hyperscaler data centers by up to 25%.

ZeroPoint released an AI-MX product sheet with its announcement along with the following slide:

[Slide: ZeroPoint AI-MX announcement chart]