Platform Enables Low-Latency AI At The Edge

NVIDIA unveils EGX, an accelerated computing platform that enables companies to perform low-latency artificial intelligence (AI) at the edge, perceiving, understanding, and acting in real time on continuous streaming data from 5G base stations, warehouses, retail stores, factories, and beyond. The platform addresses the growing demand for instantaneous, high-throughput AI at the edge, where the data is created, with guaranteed response times and a reduced volume of data that must be sent to the cloud.

EGX starts with the NVIDIA Jetson Nano, which delivers one-half trillion operations per second (TOPS) of processing in just a few watts. It scales up to a full rack of NVIDIA T4 servers, delivering more than 10,000 TOPS for real-time speech recognition and other live AI workloads.

NVIDIA has partnered with Red Hat to integrate and optimize NVIDIA Edge Stack with OpenShift, the leading enterprise-grade Kubernetes container orchestration platform. Edge Stack is optimized software that includes NVIDIA drivers, a CUDA Kubernetes plugin, a CUDA container runtime, CUDA-X libraries, and containerized AI frameworks and applications, including TensorRT, TensorRT Inference Server, and DeepStream. NVIDIA Edge Stack is optimized for certified servers and is downloadable from the NVIDIA NGC registry. For more details, visit NVIDIA.
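To make the Kubernetes piece concrete, the sketch below shows how a containerized inference workload might be deployed on a cluster where the NVIDIA device plugin and container runtime are installed: the pod pulls a TensorRT Inference Server image from the NGC registry and requests a GPU declaratively. This is an illustrative manifest, not from the announcement; the pod name is arbitrary and the image tag is a placeholder that varies by release.

```yaml
# Minimal sketch: assumes the NVIDIA device plugin and container runtime
# are already set up on the cluster (as provided by NVIDIA Edge Stack).
apiVersion: v1
kind: Pod
metadata:
  name: trt-inference-server   # arbitrary example name
spec:
  containers:
  - name: trtserver
    # TensorRT Inference Server container from the NGC registry;
    # <release> is a placeholder for an actual release tag.
    image: nvcr.io/nvidia/tensorrtserver:<release>-py3
    resources:
      limits:
        nvidia.com/gpu: 1     # GPU exposed via the NVIDIA device plugin
```

Scheduling GPUs through the standard Kubernetes resource model is what lets the same manifest run unchanged on OpenShift or any other conformant distribution.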