NVIDIA and Arm are partnering to bring deep learning inference to the billions of mobile, consumer electronics, and Internet of Things (IoT) devices. The two companies will integrate the open-source NVIDIA Deep Learning Accelerator (NVDLA) architecture into Arm’s Project Trillium platform for machine learning. According to the companies, the collaboration will make it simpler for IoT chip makers to integrate AI into their designs and help put intelligent, affordable products into the hands of billions of consumers worldwide.
Based on NVIDIA’s Xavier SoC, NVDLA is a free, open architecture that promotes a standard way to design deep learning inference accelerators. NVDLA’s modular architecture is scalable, configurable, and designed to simplify integration and portability, which speeds the adoption of deep learning inference. It is supported by NVIDIA’s suite of developer tools, including upcoming versions of TensorRT, NVIDIA’s programmable inference optimizer and runtime. Because the design is open source, cutting-edge features can be added regularly, including contributions from the research community.
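To give a concrete feel for what “configurable” means for an accelerator like this, here is a minimal illustrative sketch. The parameter names (`atomic_c`, `atomic_k`), clock rates, and configuration names below are assumptions for illustration only, not the actual NVDLA hardware spec format; the point is simply that one architecture can be scaled from a tiny IoT-class MAC array up to a much larger edge-class one.

```python
# Illustrative sketch only: parameter names and figures are assumed,
# not taken from the real NVDLA hardware specification.
from dataclasses import dataclass

@dataclass
class AcceleratorConfig:
    name: str
    atomic_c: int    # MAC-array input-channel dimension (assumed parameter)
    atomic_k: int    # MAC-array output-channel dimension (assumed parameter)
    clock_mhz: int   # assumed target clock

    @property
    def macs(self) -> int:
        # Total multiply-accumulate units in the array.
        return self.atomic_c * self.atomic_k

    def peak_gops(self) -> float:
        # 2 ops (multiply + accumulate) per MAC per cycle.
        return 2 * self.macs * self.clock_mhz / 1e3

# A small IoT-class instance and a larger edge-class instance
# of the same scalable design (hypothetical numbers).
small = AcceleratorConfig("small-iot", atomic_c=8, atomic_k=8, clock_mhz=500)
large = AcceleratorConfig("large-edge", atomic_c=32, atomic_k=64, clock_mhz=1000)

print(small.macs, small.peak_gops())   # 64 MACs → 64.0 GOPS
print(large.macs, large.peak_gops())   # 2048 MACs → 4096.0 GOPS
```

Scaling a single parameterized design this way, rather than designing separate accelerators per market segment, is the kind of portability the modular architecture is meant to enable.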