Synaptics, Google team up on Edge AI for IoT

As 2025 dawns, there’s a growing sense that it is time for AI to prove itself in the real world. Enough with the hype, the proof-of-concept projects, and the wonky discussions about AI training. Let’s start seeing AI put to work in edge devices like vehicles, wearables, and smart appliances.

It’s a good bet that CES 2025 will be a showcase for such advancements, but Synaptics and Google decided not to wait, kicking off 2025 with the announcement of a collaboration that could have major implications for multi-modal AI processing in wearables, appliances, entertainment devices, embedded hubs, monitoring applications, and control functions across consumer, automotive, enterprise, and industrial systems.

The partnership will involve integrating Synaptics’ Astra Edge IoT processors with Google’s open-source machine learning (ML) core, which is compliant with the multi-level intermediate representation (MLIR) compiler infrastructure. The companies said the combination will accelerate the development of AI devices for the IoT that can process vision, image, voice, sound, and other modalities, providing the context needed for seamless interactivity in edge AI devices and applications. This aligns with Synaptics’ previous work on multi-modal sensing for IoT devices, which collects the data to be processed by AI.

Back in November, during Synaptics' fiscal first quarter 2025 earnings call, CEO Michael Hurlston noted that the company grew its Core IoT revenue by 55% year over year, and said the Astra platform was poised to help Synaptics capture Edge IoT and AI opportunities in 2025.

He said at the time, “Our goal is probably first quarter, second quarter next year, to really finish the exploration process [for market opportunities] and narrow down on two or three segments and really go after it. So we feel good about where we are… A lot of opportunity in this area.”

News of the collaboration with Google also comes a couple of weeks after Synaptics said it would be demonstrating its Astra IoT platform during CES, which is scheduled for Jan. 7-10 in Las Vegas.

“Synaptics’ embrace of open software and tools and proven AI hardware makes the Astra portfolio a natural fit for our ML core as we ramp to meet the uniquely challenging power, performance, cost, and space requirements of Edge AI devices,” said Billy Rutledge, Director of Systems Research in Google Research, in a statement. “We look forward to working together to bring our capabilities to the broad market.”

Vikram Gupta, Senior Vice President and General Manager of IoT Processors, Chief Product Officer at Synaptics, added, “We are on the brink of a transformative era in Edge AI devices, where innovation in hardware and software is unlocking context-aware computing experiences that redefine user engagement. Our partnership with Google reflects a shared vision to leverage open frameworks as a catalyst for disruption in the Edge IoT space. This collaboration underscores our commitment to delivering exceptional experiences while validating Synaptics’ silicon strategy and roadmap for next-generation device deployment.”