AI

Intel chief Gelsinger talks AI PCs, battling Nvidia, and more

Intel CEO Pat Gelsinger’s keynote speech at Intel Vision 2024 this week included an update on the company’s aggressive push into AI PCs, the unveiling of its latest weapons aimed at grabbing AI market share from Nvidia, and among other things a pledge to enable open enterprise AI environments.

Touting Intel’s latest progress in the AI PC market, Gelsinger told the audience at the Phoenix, Arizona, event that the company has shipped more than 5 million units so far, and is on track to ship at least 40 million by the end of this year and 100 million next year. Gelsinger said Intel is driving the growth of a new enterprise segment by focusing on increasing volume and honing in on the platform evolution.

“We standardized the Wi-Fi for several years, and sort of went nowhere because there were few access points and so on, and then we put it into a volume platform that we drove into the industry, and all of a sudden every coffee shop, every hotel, every office became wirelessly enabled to change the platform and change the use cases,” he said. “And this is exactly what we're doing with the AI PC. Intel is driving this new platform evolution, the AI PC. We just launched the commercial versions in February of Core Ultra [AI PC processors]. We're expanding our AI PC developer program… and of course working with our OEM partners. Our roadmap is strong, and we’re off to a good start.”

But while Intel takes a lot of credit for putting AI PCs in the spotlight, it is not resting on what it has done so far. Gelsinger also formally announced the company’s widely anticipated second-generation processor for AI PCs, code-named Lunar Lake, which will be ready late this year and promises to deliver more than 100 platform tera operations per second (TOPS) and more than 45 neural processing unit (NPU) TOPS for next-generation AI PCs. “While competitors are getting ready to launch their first [AI PC] chips, we are already launching our second,” Gelsinger said.

The CEO also unveiled the Gaudi 3 AI accelerator chip, which Gelsinger claimed will beat Nvidia’s H100 on AI performance and power consumption, delivering on average 50% better inference and 40% better power efficiency than the H100, and at a lower price point. (That claim appears to be based on Intel projections for the chip’s performance on some Llama2 and Falcon large language model parameters, compared to Nvidia’s stated performance for the H100.)

The Gaudi 3 will be available to OEMs, including Dell Technologies, Hewlett Packard Enterprise, Lenovo, and Supermicro, in the second quarter of 2024.

Gelsinger also said Intel is teaming up with other companies to create a more open generative AI ecosystem. A company statement elaborated:

“In collaboration with Anyscale, Articul8, DataStax, Domino, Hugging Face, KX Systems, MariaDB, MinIO, Qdrant, RedHat, Redis, SAP, VMware, Yellowbrick and Zilliz, Intel announced the intention to create an open platform for enterprise AI. The industry-wide effort aims to develop open, multivendor GenAI systems that deliver best-in-class ease-of-deployment, performance and value, enabled by retrieval-augmented generation. RAG enables enterprises’ vast, existing proprietary data sources running on standard cloud infrastructure to be augmented with open LLM capabilities, accelerating GenAI use in enterprises.

“As initial steps in this effort, Intel will release reference implementations for GenAI pipelines on secure Intel Xeon and Gaudi-based solutions, publish a technical conceptual framework, and continue to add infrastructure capacity in the Intel Tiber Developer Cloud for ecosystem development and validation of RAG and future pipelines. Intel encourages further participation of the ecosystem to join forces in this open effort to facilitate enterprise adoption, broaden solution coverage and accelerate business results.”