The day before Microsoft's Build 2024 event, Windows on Arm finally had its day in the sun. As Microsoft unveiled its new Copilot+ PC brand for the AI PC, Qualcomm and its Snapdragon X Series line of AI PC processors stood peerless. The unveiling ushered in what Microsoft CEO Satya Nadella claimed was the next era of PC computing, one that will “turn the world itself into a prompt.”
The pivotal nature of this moment can be traced back to March of 2021, when Qualcomm completed its acquisition of Nuvia, a highly touted chip design house working on Arm-based chips for the server market. While Apple's M Series line of processors was shaking up the personal computing world with industry-leading performance per watt across the Mac portfolio, Qualcomm feverishly worked to adapt and optimize Nuvia's unique CPU core IP and chip engineering for the PC market.
In the fall of 2022, Qualcomm announced its Oryon CPU core at Snapdragon Summit. There was no chip yet, and ChatGPT had not made its unexpectedly sensational debut, but the stage was set for the Snapdragon X Series family of AI PC processors. Snapdragon X Elite was announced last fall with its all-performance, 12-core CPU design. In April, Qualcomm added the 10-core Snapdragon X Plus with a matching 45 TOPS (INT8) of NPU compute, which currently leads the industry, ahead of even Apple's recently launched M4 processor, which clocks in at 38 TOPS.
Why is the Copilot+ PC announcement important for Qualcomm and Microsoft?
First, it is a huge and highly visible endorsement of Qualcomm by Microsoft as a strategic partner for Windows computing at large. In prior years, it appeared as if Microsoft was deprioritizing Windows on Arm and ramping down support. Based on Satya Nadella's keynote, that is not the case, if it ever was.
Qualcomm has benefited from great timing with its Snapdragon X Series processors that have caught the x86 world flat-footed. The timely arrival of the Oryon core and a big bet on industry-leading NPU compute packaged into a PC processor provides Microsoft with the AI compute headroom it needed to realize its on-device Copilot ambitions.
While Intel will assert that its Core Ultra processors are the first of the AI PC era, Microsoft's bar for NPU compute is set at 40 TOPS, the minimum specification required to run Windows Copilot on device with what Microsoft considers acceptable performance. The Core Ultra delivers around 11 TOPS. AMD, with its latest Ryzen 7040 and 8040 laptop processors, also finds itself left out of the Copilot+ PC party for the moment, with less than half the minimum required NPU compute.
OEM support for the Snapdragon X Series has been resounding, with Dell, Lenovo, HP, Samsung, Asus, and Microsoft collectively announcing 20 models, all running the only Copilot+ PC-qualifying processor, set to hit the shelves on June 18th. Dell plans to release five Snapdragon X Series models, which were prominently featured at Dell Technologies World in Las Vegas.
Why will the AI PC and Copilot+ PCs matter?
The biggest question that has dogged the AI PC has been why it matters.
For the user of the AI PC, that answer largely boils down to privacy and confidentiality. A model deployed entirely on device could potentially allay the concerns and risks to individual privacy and enterprise confidentiality, provided protections are designed into the model and the Copilots that interface with it.
There are also the benefits of offline, local access to a generative AI model, as well as improved responsiveness depending on the task. These factors will undoubtedly be a boon for creators such as graphic artists who use generative AI tools in their workflows. Generative design tends to involve many iterations of prompting and refinement that could become costly using a GenAI cloud service.
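To make the cost point concrete, here is a back-of-envelope calculation with purely hypothetical numbers; the per-image price, iteration count, and monthly volume below are illustrative assumptions, not quoted pricing from any provider.

```python
# Back-of-envelope: cloud image-generation cost for an iterative design workflow.
# All figures are hypothetical placeholders, not actual provider pricing.
price_per_image = 0.05       # assumed cloud cost per generated image (USD)
iterations_per_concept = 40  # prompt/refine cycles a designer might run per concept
concepts_per_month = 50      # concepts explored per month

monthly_cost = price_per_image * iterations_per_concept * concepts_per_month
print(f"Estimated monthly cloud spend: ${monthly_cost:,.2f}")  # -> $100.00
```

Under those assumptions a single designer racks up a recurring three-figure bill, whereas the same iteration loop running on an on-device model carries no per-prompt charge.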
For laptop users, the NPU provides power-efficient AI compute, enabling persistent on-device generative AI inference with lower impact on battery life versus a GPU.
With the high cost of GenAI platform services a growing concern among developers, the AI PC makes it possible for Copilots and the models that serve them to be deployed locally on power-efficient NPUs. This approach helps developers avoid the variable and potentially high cost of procuring GenAI platform services from providers such as OpenAI, Google, or Anthropic to enable their Copilot applications. Provided sufficient compute is available on device to deliver a quality experience and outcome, the developer can avoid passing on usage costs or imposing usage caps on their customers.
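As a rough sketch of what local deployment can look like, the snippet below loads a small ONNX model with ONNX Runtime and requests Qualcomm's QNN execution provider (the NPU path on Snapdragon-based Windows on Arm machines), falling back to the CPU provider if it is unavailable. The model path, input shape, and provider options are placeholder assumptions, not a recipe for any particular Copilot+ PC stack.

```python
import numpy as np
import onnxruntime as ort

# Hypothetical path to a small ONNX model exported for on-device inference.
MODEL_PATH = "models/small_assistant.onnx"

# Prefer the Qualcomm QNN execution provider (NPU); fall back to CPU if absent.
providers = [
    ("QNNExecutionProvider", {"backend_path": "QnnHtp.dll"}),  # NPU backend, assumed Windows on Arm
    "CPUExecutionProvider",
]

session = ort.InferenceSession(MODEL_PATH, providers=providers)
print("Active providers:", session.get_providers())

# Placeholder input: adjust the name, shape, and dtype to match the real model.
input_name = session.get_inputs()[0].name
dummy_input = np.zeros((1, 128), dtype=np.int64)
outputs = session.run(None, {input_name: dummy_input})
print("Output shapes:", [o.shape for o in outputs])
```

The design point is simply that the application asks for the NPU first and degrades gracefully, so the same binary runs on machines with and without a qualifying NPU.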
The AI PC also works out for Microsoft, which will be able to offload high-cost GenAI inference services and associated workloads onto SLMs (Small Language Models) and small multimodal models such as Phi-3-vision deployed on device. More NPU compute on device means more options for Microsoft and developers to explore hybrid AI architectures and deployments.
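One way to picture that hybrid pattern is a thin routing layer that keeps routine requests on the device's SLM and escalates heavier ones to a hosted model. The sketch below is illustrative only: run_local_slm and call_cloud_llm are hypothetical stand-ins for whatever local runtime and cloud API an application actually uses, and the routing threshold is an arbitrary placeholder.

```python
def route_prompt(prompt: str, needs_long_context: bool = False) -> str:
    """Hybrid AI routing sketch: prefer the on-device SLM, escalate to the cloud
    only when the request exceeds what the local model handles well."""
    LOCAL_PROMPT_LIMIT = 2_000  # hypothetical threshold, tuned per model and device

    if not needs_long_context and len(prompt) <= LOCAL_PROMPT_LIMIT:
        return run_local_slm(prompt)   # hypothetical: NPU-backed small model
    return call_cloud_llm(prompt)      # hypothetical: hosted frontier model


def run_local_slm(prompt: str) -> str:
    # Placeholder for an on-device runtime call (e.g., an SLM served via the NPU).
    raise NotImplementedError


def call_cloud_llm(prompt: str) -> str:
    # Placeholder for a hosted GenAI service call.
    raise NotImplementedError
```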
Say hello to heavy client AI computing.
The Snapdragon X Series AI PC edge
Qualcomm's legacy in smartphone compute probably has something to do with its seemingly serendipitous pole position in the AI PC race. For over a decade, Qualcomm has been building its Hexagon NPU into its SoC designs and developing AI software frameworks to enable the bevy of on-device AI applications that have characterized mobile computing on the smartphone.
Qualcomm brought that mobile AI computing philosophy and engineering to its Snapdragon 8cx series of Windows on Arm processors, which topped out with the Gen 3 model and an impressive 29 TOPS of pre-ChatGPT NPU compute, a significant bump from the previous generation of Snapdragon 8cx PC processors.
In contrast, the x86 camp has only started its journey with the NPU. AMD introduced the first x86 processor with an NPU in mid-2023 with its Ryzen Pro 7040 series of mobile processors. Intel threw its hat into the AI PC ring with Core Ultra, on devices shipping at the end of last year.
Much like Apple, Qualcomm has deep experience in leading-edge SoC (System on a Chip) engineering for the energy-efficient performance essential to highly constrained devices such as the smartphone. Qualcomm has demonstrated that its Snapdragon 8 Gen 3 processor can run a 7-billion-parameter Llama 2 LLM (Large Language Model) and a 1-billion-plus-parameter Stable Diffusion model that renders images in less than a second. Based on demos that Qualcomm ran at Snapdragon Summit last year and at Mobile World Congress 2024 in Barcelona, the Snapdragon X Elite produces similar results and then some.
On the other hand, the x86 PC world has been moving toward chiplets, or what Intel calls a tile architecture, capitalizing on advanced packaging and integration technologies to scale chip designs and economics.
It will be interesting to see how these contending chip design and architectural approaches compete for mindshare and market share going forward as the AI PC quickly evolves over the next couple of years.
This is just the beginning of the AI PC race
A great deal is at stake for the x86 camp, especially Intel, as Qualcomm makes good on the first Arm-based PC processor in generations to present a real threat to the PC incumbency.
For the moment, Qualcomm can bask in the glory of Microsoft's spotlight, but Intel is slated to introduce its Lunar Lake processor this fall. It will join the currently exclusive Copilot+ PC club toward the end of 2024 with over 50 AI PC models powered by more than 100 platform TOPS and 45 NPU TOPS. No doubt, AMD has a Copilot+ PC-compliant offering in the works.
The real test of Qualcomm's Snapdragon X Series family of AI PC processors will come when the first batch of 20 Copilot+ PC devices hits the shelves and gets into the hands of customers and benchmarking specialists, who are eagerly waiting to find out independently how Qualcomm's first Oryon-powered processors stack up against the competition.
From the looks of it, Qualcomm seems to have finally delivered the Arm processor chip that Microsoft always wanted at just the right time to usher in the era of Copilot+ PC.
Leonard Lee is founder and executive analyst at neXt Curve, a research advisory firm focused on Information and Communication industry and technology research. Follow Leonard on LinkedIn: linkedin.com/in/leonard-lee-nextcurve