Arm on Monday announced 11 partners in the Internet of Things industry who plan to test or use Arm’s new machine learning (ML) processor and an industry-first neural processing unit.
Arm announced the Cortex-M55 ML processor and the Arm Ethos-U55 neural processing unit (NPU) for Cortex-M along with a slate of early support from Amazon, Google, Dolby, NXP, STMicroelectronics and others.
The two innovations are designed to help device and chip makers bring more artificial intelligence (AI) computing power to tiny edge devices and in many cases eliminate the network trip to the cloud for computations, according to statements from the partners and Arm.
“No device is left behind as on-device ML on the tiniest devices will be the new normal, unleashing the potential of AI securely across a vast range of life-changing applications,” said Dipti Vachani, senior vice president and general manager of Arm’s auto and IoT business, in a statement.
More on-device intelligence means smaller, less costly devices that can be smarter with less reliance on the cloud or Internet, Arm said. The combination of IoT with AI advances and the rollout of 5G makes the new Arm microcontrollers timely. Arm claimed the technology can reduce silicon and development costs and speed up time to market for product makers building devices that handle digital signal processing and ML right on the device.
The new microcontrollers build on an existing installed base of 50 billion Cortex-M chips. The Cortex-M55 will deliver 15 times faster ML performance and five times faster digital signal processing (DSP) performance, Arm said. The Ethos-U55 is designed to speed up ML inference functions in tiny IoT devices. Arm claimed the two devices together will boost ML performance 480 times. Arm posted product specs online for the Cortex-M55 and the Ethos-U55.
In one partner statement, a Dolby executive suggested the Cortex-M55 would make a difference in small devices like smart speakers because of its higher DSP performance and power efficiency, which can be used to bring more Dolby Atmos to products.
A Google executive said the new Arm processors will be used for ML at the endpoint. With the technology, endpoint devices “can run neural network models on batteries for years and deliver low-latency inference directly on the device,” said Ian Nappier, product manager for TensorFlow Lite at Google.
Other supporters of the new technology announced by Arm included Qeexo, Shoreline IoT, Cypress, Alif Semiconductor, Au-Zone and Bestechnic.
In a blog post, Vachani said the use cases for the new microcontrollers include voice and gesture machine interaction and predictive-failure sensor systems. In one example, she described how a connected walking stick for blind or partially sighted users could benefit from AI-enabled visual sensing with a 360-degree camera, replacing reliance on ultrasound positioning.
Vachani also said that an Arm survey showed that up to 75% of consumers prefer their data to be processed on-device, with only crucial data sent to the cloud. “By elevating the AI capability of devices so they don’t have to rely on cloud compute, we enable a richer framework for trust,” she said in the blog post. “Sensitive data can be controlled between the compute layers.”
An Arm spokeswoman said the Cortex-M55 is now available to silicon partners, and the Ethos-U55 will be released in June.