Nvidia wins overall in GPU benchmark for AI work

Nvidia said its A100 GPUs swept all of the latest MLPerf benchmark tests for AI inference, and that a wide variety of companies are using its products even as most companies worry about the financial returns of AI. (Nvidia)


On Wednesday, Nvidia again touted its prowess in AI inference performance in data center and edge computing.

The GPU maker won all the tests in the latest round of MLPerf benchmarks. It noted that eight A100 GPUs in a single DGX A100 system can deliver the same compute performance as nearly 1,000 dual-socket CPU servers on some applications.


Nvidia GPUs are used in systems from a wide range of server providers including Cisco, Dell EMC, Fujitsu, and Lenovo. In a blog post by Paresh Kharya, senior director of product management and marketing at Nvidia, the company noted that MLPerf benchmarks are relied upon by Arm, Facebook, Google, Intel, Lenovo and Microsoft.

In the blog post, Nvidia said AI breakthroughs are having a profound impact on natural language processing, medical imaging and recommendation systems. The company’s GPUs are being used in automotive, robotics, retail, manufacturing and financial services by companies such as American Express, BMW, Capital One, Domino’s, Ford, Kroger, and Toyota.

Separately on Wednesday, Synopsys announced a collaboration with IBM Research’s AI Hardware Center to advance AI compute performance by 1,000 times over the coming decade, equivalent to roughly doubling AI compute performance every year.
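A quick sanity check of the compounding math behind that target (the figures here are derived from the stated goal, not from Synopsys or IBM): 1,000x over ten years works out to an annual multiplier of just under 2x, since 2^10 = 1,024.

```python
# What annual growth multiplier yields a 1,000x gain over 10 years?
target_gain = 1000
years = 10

annual_multiplier = target_gain ** (1 / years)  # 10th root of 1,000

# Just under 2.0 -- i.e., roughly an annual doubling
print(f"Required annual multiplier: {annual_multiplier:.3f}")
```

Running this prints a multiplier of about 1.995, which is why the ten-year goal is commonly summarized as "doubling every year."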

Nvidia’s promotion of its GPUs for AI inference and other industry efforts to improve AI compute performance stand in stark contrast to a recent finding that just 11% of companies say they have seen a significant financial return on investment from AI deployments.  The finding was based on a survey of 3,000 managers globally and interviews conducted by Boston Consulting Group in partnership with MIT Sloan Management Review.

RELATED: Just 11% of companies using AI reap significant financial returns, study finds

“Compute performance is important, but not as important as how you’ve trained your AI program and how well you’ve defined the AI parameters,” said Jack Gold, an independent analyst at J. Gold Associates, in an email to Fierce Electronics.

When it comes to AI, Gold said, “what matters is how good your algorithms are, and more importantly, how good your learning data is…AI benchmarks are troublesome in that there are many and they may not apply that closely to what your AI process is actually doing. I take all benchmarks with a large grain of salt.”
