NXP Semiconductors on Tuesday announced eIQ Auto, a deep-learning toolkit for automotive application designers. It lets designers run machine-learning algorithms on embedded NXP processors for use in building future generations of connected and autonomous vehicles.
The eIQ Auto application programming interface (API) is designed to help developers move more quickly from development to implementation of AI applications that meet strict automotive standards, the company said in a release.
A designer could train a neural network in a desktop, cloud, or GPU environment and then deploy it onto a supported S32 processor, which is especially useful for deep-learning algorithms in vision-based systems. Deep-learning approaches are generally considered more accurate at object detection than conventional computer vision algorithms.
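Moving a network trained on a desktop or GPU onto an embedded automotive processor typically involves converting its float32 weights to a compact integer format. The sketch below illustrates generic symmetric int8 post-training quantization in NumPy; it is a minimal illustration of the kind of conversion a deployment toolchain performs, not NXP's eIQ Auto API, and all function names here are invented for the example.

```python
import numpy as np

def quantize_int8(weights):
    """Symmetric post-training quantization of float32 weights to int8.

    Returns the int8 tensor plus the scale factor needed to dequantize.
    Illustrative only; real toolchains add per-channel scales,
    calibration, and operator fusion on top of this idea.
    """
    scale = np.max(np.abs(weights)) / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover an approximate float32 tensor from the int8 weights."""
    return q.astype(np.float32) * scale

# Example: a tiny weight matrix round-trips with small error.
w = np.array([[0.5, -1.27], [0.0, 1.27]], dtype=np.float32)
q, s = quantize_int8(w)
w_hat = dequantize(q, s)
```

The payoff on an embedded target is a 4x smaller model and integer arithmetic that fixed-point accelerators can execute directly, at the cost of a bounded rounding error per weight.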
NXP claimed the API's automated selection process lowers the cost of choosing and programming an embedded compute engine for each layer of a deep-learning algorithm. According to an NXP benchmark that ran dual APEX-2 vision accelerators on an S32V234 processor, this approach delivers 30 times higher performance than competitors.
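The automated per-layer selection NXP describes can be pictured as a cost-based dispatch: for each layer type, pick the compute engine with the lowest estimated cost. The toy sketch below shows the idea; the engine names and cycle counts are invented for illustration, and NXP has not published eIQ Auto's actual selection mechanism at this level of detail.

```python
# Hypothetical per-layer engine selection via a cost table.
# Engines and costs are made up; only the dispatch pattern is real.
LAYER_COSTS = {
    # layer type -> {engine: estimated cycles}
    "conv2d":  {"cpu": 900, "apex": 40},
    "dense":   {"cpu": 300, "apex": 120},
    "softmax": {"cpu": 20,  "apex": 60},
}

def assign_engines(layers):
    """Pick the lowest-cost engine for each layer in a network."""
    return [(layer, min(LAYER_COSTS[layer], key=LAYER_COSTS[layer].get))
            for layer in layers]

plan = assign_engines(["conv2d", "dense", "softmax"])
# Vision-heavy layers land on the accelerator; cheap ones stay on the CPU.
```

Automating this choice is what saves designers from hand-tuning each layer's placement, which is the cost reduction the claim refers to.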
Kamal Khouri, vice president of advanced driver assistance solutions at NXP, said current autonomous test-vehicle implementations are “bulky, power hungry and impractical for volume auto production.” The new eIQ Auto helps customers deploy neural networks in embedded processors “with the highest levels of safety and reliability,” he added.
NXP is innovating in automotive technology in other ways as well, including the use of Ultra-Wideband for car security, which will appear in VW models later in 2019.