Intel's AI Everywhere strategy shows it is leading the charge: Gold

Jack Gold

Intel recently held an impressive event in New York about its prowess in AI across many of its business units – from endpoint computing all the way up to data center, cloud and hyperscaler infrastructure. This broad-based approach to AI is probably one of the most important strategy statements Intel has made in a long time, as the market for AI is expanding dramatically. With an across-the-compute-spectrum presence, Intel is showing its inherent strength compared to companies offering solutions for only one aspect of the AI processing continuum.

There’s a growing intersection between AI and general compute which Intel can, and is prepared to, exploit. Indeed, it is in the strongest position of any processor vendor to do so. Here are some of the key market segments it will be addressing.

AI on the PC

With Intel committed to shipping over 100 million PC processors in the next few years, it is expected to power AI in 80% of all PCs. Further, Microsoft has committed to adding a number of AI-powered features to its next generation of operating systems. This is an inflection point, opening new ways to interact and new opportunities for advanced functions, with many new companies emerging to pursue them. Just as we went from the CPU alone to an integrated GPU on chip, nearly all processors going forward will include a built-in NPU (neural processing unit) AI accelerator. It's the new battleground and an enabler of advanced functions that will change many aspects of software apps.

Looking at the new Intel Core Ultra AI capabilities, it's not just about the hardware. With an expansive ecosystem of partners and open-source tools, Intel is showing its strength in moving the market. Further, by citing how it outperforms the competition, Intel is positioning itself as the premium processor for both performance and power efficiency. There is definitely going to be a battle with AMD to power next-gen devices, but Intel currently has the lead. Core Ultra is available now in consumer PCs, and next quarter in commercial PCs; the support Intel is getting for Core Ultra from virtually every PC maker is impressive.

Edge, data centers and cloud

The AI focus for Intel is not just on the PC. It’s also on the latest generation of Xeon processors for the data center and cloud. This is a critical area for Intel, as it couples its high-end compute solutions with its AI acceleration technology and offers a range of products from power-efficient edge compute all the way up to massive cloud data center processors. It’s important to remember that many current and future AI workloads will run on processors with integrated acceleration. For many modest model-training needs, and especially for inference, Xeon is a strong alternative to special-purpose, high-cost Nvidia chips.

Even as major hyperscalers build their own custom chips specific to their needs, the majority of cloud workloads still run on Intel processors. Next-gen Xeon will continue this trend, and I expect a huge number of Xeons to end up in hyperscaler data centers. The new Intel Xeon chips continue to show how Intel is advancing the general purpose processor to enhance the latest AI workloads, and how it is powering an impressive ecosystem in the cloud and on-prem.

Competing against Nvidia

Though not always recognized, Intel has a competitive system for AI model training in its special purpose Gaudi chips. Using an open ecosystem approach that supports many open-source and custom platforms and tools (e.g., oneAPI, PyTorch, TensorFlow, OpenAI, LLaMA), it offers a cost advantage for many workloads and an alternative to high-end Nvidia GPUs. Gaudi is making inroads into many cloud instances for both training and inference.

At the New York event, Intel announced the new Gaudi 3 chip that had just come out of its fab, demonstrating the major advance that Intel is creating for AI workloads. While currently a standalone processor, we expect the Gaudi functionality will be integrated first into Intel’s GPU Max in the next 1-2 years, followed shortly thereafter by integration into Xeon, just as GPUs were integrated many years ago. That will give general purpose processors some very high-end model training and inference capabilities, even edging out some AI-specific processors for many workloads.

Bottom Line: One of Intel's advantages in AI is its breadth, from endpoint devices through edge, cloud and HPC data centers, as well as its commitment to open software models. AI is a vast market, with much larger potential for inference than for training, and inference is where Intel has an advantage with its AI-accelerated general purpose processors. Focusing only on high-end model training misses a major part of the market as AI moves to more personal devices. Integrated AI capability via an NPU across Intel's full range of processors, and especially in the PC, is probably the biggest change since GPUs were integrated into PC processors, and it will usher in the same advancement of applications that the GPU brought, only probably more so. Within 2-3 years, having a PC without AI will be a major disadvantage. Intel is leading the charge.

Jack Gold is founder and principal analyst at J.Gold Associates, LLC. With more than 45 years of experience in the computer and electronics industries, and as an industry analyst for more than 25 years, he covers the many aspects of business and consumer computing and emerging technologies. Follow him on Twitter @jckgld or LinkedIn at https://www.linkedin.com/in/jckgld.