Nvidia unveils Orin chip for robots and AVs

Nvidia's new Orin chip will power a Drive AGX Orin platform for autonomous vehicle and robot training. It has 17 billion transistors. (Nvidia)

Nvidia announced on Wednesday a software-defined platform for autonomous vehicles and robots powered by a new chip called Orin.

The Orin system-on-chip (SoC) has 17 billion transistors and relies on Nvidia's next-generation GPU architecture and Arm Hercules CPU cores. Coupled with new deep learning and computer vision accelerators, Orin delivers 200 trillion operations per second, seven times the performance of the previous-generation Xavier SoC.

The software platform that Orin powers is called Nvidia Drive AGX Orin. It is designed to run the deep neural networks that drive robots and autonomous vehicles. Both Orin and Xavier are programmable through CUDA and TensorRT APIs, which gives programmers forward compatibility for their applications.


Nvidia recognizes the complexity and expense of building safe autonomous vehicles (AVs). Nvidia CEO Jensen Huang called the task “perhaps society’s greatest computing challenge.” The Orin family of products will be available to carmakers for their 2022 production timelines.

Virtually every company working on AVs is using Nvidia technology, according to Sam Abuelsamid, an analyst at Navigant Research.

Nvidia also announced that it will provide vehicle makers and researchers with access to Nvidia Drive deep neural networks (DNNs) for AV development on the Nvidia GPU Cloud container registry.

Nvidia also announced the availability of software tools that let developers customize DNNs using their own datasets. The tools support training DNNs with active learning, federated learning and transfer learning, which lets developers speed development of perception software by building on Nvidia's own AV work.

In addition, Nvidia and Didi Chuxing announced they are working together to develop AV and cloud computing solutions.

Didi, operator of a leading mobile transport platform, will use Nvidia GPUs in data center servers to train machine learning algorithms, as well as Nvidia Drive for inference on its Level 4 AVs. Drive fuses data from sensors such as cameras, lidar and radar, then uses DNNs to comprehend the 360-degree environment around the vehicle.
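Nvidia's fusion pipeline is proprietary, but the general idea of merging detections from multiple sensors into one picture of the vehicle's surroundings can be sketched in plain Python. Everything below, including the class, field names and the overlap rule, is a hypothetical illustration of late fusion, not Nvidia's Drive API.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str      # "camera", "lidar", or "radar" (hypothetical labels)
    bearing: float   # direction to the object, degrees around the vehicle
    range_m: float   # estimated distance in meters

def fuse(detections, bearing_tol=5.0, range_tol=2.0):
    """Toy late fusion: detections from different sensors that point at
    roughly the same spot are merged into a single fused object."""
    fused = []
    for det in detections:
        for obj in fused:
            if (abs(obj["bearing"] - det.bearing) <= bearing_tol
                    and abs(obj["range_m"] - det.range_m) <= range_tol):
                obj["sensors"].add(det.sensor)  # same object, extra sensor
                break
        else:
            fused.append({"bearing": det.bearing,
                          "range_m": det.range_m,
                          "sensors": {det.sensor}})
    return fused

objects = fuse([
    Detection("camera", 42.0, 18.5),
    Detection("lidar",  41.2, 18.9),   # same pedestrian, slightly different reading
    Detection("radar", 180.0, 60.0),   # separate vehicle behind us
])
```

A real system would fuse continuously over time with calibrated sensor geometry and feed the fused tracks to perception DNNs; this sketch only shows why combining overlapping camera, lidar and radar readings yields fewer, more trustworthy objects than any single sensor alone.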

