What happens when you invite 14 quantum computing executives to the same panel session to defend their companies’ lives and technology choices? You get a chaotic, sometimes redundant, but ultimately fairly illuminating three-part, two-hour argument for why quantum computing is so important, what "useful" means, and how classical and quantum computers will co-exist.
But before all that, let’s acknowledge the quantum bear in the room. That would be Nvidia CEO Jensen Huang, who sent the stock values of publicly traded quantum computing companies reeling in January when he observed, in answer to a question in front of Wall Street analysts, that “very useful” quantum computers are still 20 years away.
As he took the stage for “Quantum Day” at Nvidia’s GTC event, Huang started by sheepishly and impishly walking back that comment, at least part way: “I'm a public company CEO and every so often, someone asks me a question, and most of the time, well, some of the time–I'm gonna try to lower the bar here–I say something right, and sometimes it comes out wrong,” he said. “Somebody asked me how long before a quantum computer will be useful and remember, this is from somebody who's built a computing platform, and to me, building Nvidia and building CUDA and turning it into the computing platform that it is today, it has taken us over 20 years. So, a time horizon of 5, 10, 15, 20 years is really nothing to me… And so the idea that it would take years to achieve was something that I would expect because of the complexity of it and the grand impact it could have. So when I said the answer, the next day I discovered that several companies’ stock, apparently the whole industry’s stock, went down 60%. And my first reaction was: I didn’t know they were public.”
It was a funny moment that dissolved much of the lingering concern anyone in quantum computing might have had about Nvidia’s commitment to the sector, and it went a long way toward explaining why Nvidia, amid all the GTC talk of GPUs, AI, robotics, and digital twins, handed the stage to many (but not all, Huang was quick to clarify) of its quantum ecosystem partners.
The list of execs who shared their visions and defended their positions during the session included:
Alan Baratz, CEO, D-Wave
Ben Bloom, Founder and CEO, Atom Computing
John Levy, CEO and Co-Founder, SEEQC
Krysta Svore, Technical Fellow, Microsoft
Loïc Henriet, Chief Executive Officer, Pasqal
Matthew Kinsella, CEO, Infleqtion
Mikhail Lukin, Joshua and Beth Friedman University Professor, QuEra Computing
Pete Shadbolt, Co-Founder and Chief Scientific Officer, PsiQuantum
Peter Chapman, Executive Chair, IonQ
Rajeeb Hazra, President and CEO, Quantinuum
Rob Schoelkopf, Chief Scientist and Co-Founder, Quantum Circuits
Simone Severini, General Manager, Quantum Technologies, AWS
Subodh Kulkarni, CEO, Rigetti
Théau Peronnin, CEO and Co-Founder, Alice & Bob
That’s quite a list, and while some of the comments that followed might have sounded familiar to anyone who has tracked quantum computing for even a little while, there were still interesting moments.
Among them, Huang confirmed that Nvidia’s role in the quantum computing industry is much the same as its role in the robotics and automotive industries. It doesn’t make robots, cars, or quantum computers, but is focused on enabling each of those industries to achieve more with a little boost from accelerated computing. “We don't build quantum computers, but we are deeply integrated into quantum computing. CUDA-Q is a programming model for hybrid, classical-accelerated quantum computing. We have cuQuantum libraries that help you simulate quantum circuits and DGX Quantum [a GPU-accelerated quantum computing system] to do error correction of quantum computers. We partner with them, we support them, we help them in any way we can.”
In that sense, it was clear that Huang does not see quantum computers as an existential threat to Nvidia, as some have suggested, but as an emerging group of partners/customers. It seems less likely than it ever has that Nvidia will complicate that relationship by designing its own quantum processors.
But, in adopting a more bullish attitude about quantum, Huang also laid out a series of big questions for the companies that actually do build quantum computers to answer: Should a quantum computer be reframed as an instrument of scientific discovery rather than a computer in the traditional sense? Where does this evolution stand? How quickly can it reach a “useful” stage, however that might be interpreted?
Many of Huang’s guests homed in on the role of quantum computers–or more specifically quantum processors–as complementary additions to classical computers, and on how the two should work in concert.
IonQ’s Chapman noted, “We use your GPUs to design our chips, to often do simulation to make sure that the quantum computers are working. When we look to the future for quantum computing, it's going to be a set of classical systems sitting right next to a quantum computer, and the two of them are going back and forth. And so it isn't something where one is replacing the other. They're working together…It is already a synergistic relationship between classical computing and quantum, and the strange thing is, our quantum computers are almost entirely classical. The only quantum part happens to be a little chip and a couple of atoms at the center.”
But D-Wave’s Baratz, arguably the most vocal rebutter of Huang’s original quantum comments, took issue with any framing of quantum computers as purely scientific instruments, or as machines not yet able to stand on their own to handle some types of applications.
“I don't know how to think of a quantum computer as an instrument when it's being used for materials discovery, when it's being used for blockchain, when it's being used [by companies like] NTT Docomo to improve cell tower resource utilization,” Baratz said. “I mean, it's true that there are many applications I would never try to run on a quantum computer, but for applications that require extensive processing power, these machines are very powerful, and I think go well beyond just instrumentation or measurement.”
Huang responded, mock-defensively, “I was just trying to help.”
“We’ve seen your help,” was Baratz’s sharp, funny retort, clearly a reference to Huang’s January comments, but one that seemed to go unnoticed.
Chapman remained conciliatory, saying, “It does take a long time to go from kind of a startup to where [Nvidia is] today. And it's completely fine to sit down and say for the quantum industry, it's going to be another 10 to 15 years to get to where Nvidia and all the other giants are.”
Quantinuum’s Hazra suggested that as quantum computers evolve, they will have to fulfill some of the same market requirements classical computers face, such as delivering reasonable (or better) performance per watt and performance per dollar.
He observed, “If you look at it through the lens of big problems, you want to fundamentally solve with the figure of merit [being] solving accurately and with less energy and cost, then we are getting to the point of what is your scale of computation? That's usually qubits, but also what your fidelity is and error rates you can sustain to make those qubits useful. I'm not saying there's a perfect ratio of those things now, but they're generally leading us to” useful quantum computers that will not necessarily replace classical computers.
But a more practical measurement of “useful” goes back to what Baratz said about the applications D-Wave is tackling now, and the applications other speakers highlighted during the session.
On that note, Hazra added, “We're seeing applications today… We kind of focus on what is the big problem for a customer or a partner we want to solve.”