My Qualcomm Snapdragon test ride redux at CES 2023: Hamblen

A 20-minute ride in a (somewhat) self-driving sedan along congested Interstate 15 and connecting highways in Las Vegas during CES 2023 provided updated insight into the state of vehicle autonomy--at least in the US, which many describe as well behind China.

There’s the tech, of course--ever faster processors and impressive AI software--but there’s also the way we talk about self-driving. It turns out many respected engineers working in ADAS and autonomous driving pretty much hate having to describe their cars as SAE Level 2-plus, 3, 4 or 5.

Many engineers simply want to show what their tech can do (automatic lane changing, acceleration, self-parking and more) and stop using overarching labels. That’s nothing against the Society of Automotive Engineers.

But back to that ride in Vegas: Many people already use Tesla’s Full Self-Driving and are probably well aware of how today’s newest AI-assisted cars perform on a highway. We also know people have fallen asleep while cruising at 70 mph or faster on a straight freeway in broad daylight with some form of self-driving tech engaged.

In the demo I took, Qualcomm engineers described some differences from my Qualcomm Snapdragon test ride at CES 2020, mainly around data harvesting, but the actual ride was surprisingly similar to that earlier one. That doesn’t mean Qualcomm has not been busy! For context, I have done many self-driving test rides with safety drivers, going back to my first in May 2016 in Singapore, when MIT researchers were part of a team showing off the state of the art in a small passenger vehicle. That day we almost hit a construction crew on the side of the test path, but the vehicle sensed an orange traffic cone and veered away quickly, and everybody was fine. But I digress…

RELATED: Qualcomm launches Snapdragon Ride for assisted and self-driving with test rides in Vegas

The CES 2023 test drive with updated Snapdragon Ride tech was done with two Qualcomm engineers, one at the wheel as a safety driver and the other behind him monitoring data. We drove at legal speeds up to 70 mph on I-15 South to Highway 215 East and back again for about 20 minutes. I rode in the front passenger seat and was not allowed to take photos or video from inside.

The Ride platform’s automation was not turned on until we reached the highways, and once it was, the Lincoln MKZ, equipped with eight cameras and six radar sensors (and the Ride brain), performed admirably. Merging onto an interstate is scary for almost any driver, and that feeling came up when we merged from an I-15 on-ramp into the main lanes directly beside a semi-tractor trailer barreling along at more than 60 mph in the middle lane. Our Lincoln stayed centered in the outer lane we had just joined while the big rig seemed to drift a bit to the right toward the edge of its middle lane, right next to us. The massive truck and our vehicle were just inches apart, but there were no awkward adjustments.

At another point, we took an off-ramp at a speed I thought was too fast for the curve, but I attribute that to my conservative driving style. (I worry about taking curves too fast when it is wet, having spun out once years ago. It was dry that day in Vegas, but it was wet the next day and no tests were allowed.) Speed on curves, especially tight cloverleaf curves on freeways, is likely reflected in the data sets used to train self-driving AI, and one wonders whether those data sets could be unintentionally biased toward younger, more aggressive drivers.

After we returned to surface streets with the automation turned off, we parked and got a quick rundown of the massive amount of real-time data collected during the short trip. We scrolled through thousands of data points and saw several highlighted anomalies the AI apparently had not seen before. One entry was a front-camera view from our vehicle of a parked car in the right lane with its driver-side door open. The AI had likely seen open doors on parked cars before, but maybe this one was slightly different.

That entry serves as an example of the nature of AI training: like the human brain, an AI system with sensors can see a thing such as a tree, a truck or a child walking along the side of the road--or a parked car with its door open--and generalize whether it falls into the category of threatening or most likely benign. In this case, the Ride platform and all its associated bells and whistles apparently decided the object was not a threat and told our vehicle to keep moving without slowing or swerving. To be honest, I had not even seen the parked car with its door open.
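To make that idea concrete, here is a minimal sketch of how a perception loop might flag low-familiarity detections for later data harvesting while still making a simple threat-or-benign call in real time. The class names, fields and thresholds are my own hypothetical illustration, not Qualcomm’s actual Ride pipeline.

```python
# Hypothetical sketch, not Qualcomm's Ride pipeline: flag "rarely seen"
# detections for data harvesting while making a crude threat/benign call.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str              # e.g. "parked_car_door_open" (illustrative label)
    familiarity: float      # 0..1, how closely this matches training data
    lateral_gap_m: float    # distance from the edge of our lane
    closing_speed_mps: float

NOVELTY_THRESHOLD = 0.6     # below this, log the frame for retraining (assumed value)
MIN_SAFE_GAP_M = 1.0        # assumed minimum lateral clearance

def process(detection: Detection, anomaly_log: list) -> str:
    # Harvest anything the model has rarely seen, even if it turns out benign.
    if detection.familiarity < NOVELTY_THRESHOLD:
        anomaly_log.append(detection)

    # Simple threat call: an object matters only if it intrudes on our path
    # or is closing on us; otherwise keep driving without slowing or swerving.
    if detection.lateral_gap_m < MIN_SAFE_GAP_M or detection.closing_speed_mps > 0.5:
        return "slow_or_evade"
    return "continue"

log: list = []
door_open_car = Detection("parked_car_door_open", familiarity=0.4,
                          lateral_gap_m=2.3, closing_speed_mps=0.0)
print(process(door_open_car, log))   # -> "continue", but the frame is logged
```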

It is important to note that our test ride followed a planned route that had been programmed for multiple demo drives with other riders during CES 2023. For a very short stretch, about 10 to 20 seconds at freeway speeds, the safety driver dropped the vehicle out of the pre-programmed course, meaning the Lincoln was relying entirely on its own ability to sense and maneuver. Nothing bad happened, of course, but there were also no cars or crazy Vegas drivers veering into our path, and no curves or potholes to avoid. Presumably, this is what Tesla offers with Full Self-Driving when drivers go off on their own for entire trips, but it’s not possible to compare accurately.

Forced to put a label on it, Qualcomm called the ride I took a test of Level 2-plus tech, going back to the SAE terminology. Rajat Sagar, senior director of ADAS/AD stack products at Qualcomm, politely told me the SAE labels are “loaded” terminology, which was less a criticism of SAE than of the various ways marketing and business interests, and even government safety experts, use the terms.

Tesla, for example, insists its Full Self-Driving drivers remain in control of their vehicles, but that message apparently hasn’t gotten through, given well-publicized accidents. Federal safety officials are weighing what Tesla’s role should be in marketing its advanced tech, or more precisely, how its legal sales agreements describe it. The bigger question for government and insurers will be, of course, how to protect the public beyond the scope of legal agreements by Tesla or any other car company in a land where cars and fast driving are treated as a right of free expression under the First Amendment. (That is not an exaggeration of how many drivers feel.)


Some agreement with Mobileye

Sagar is right, of course, about loaded terminology, and even Mobileye CEO Amnon Shashua told an audience at CES 2023 that his company mostly does not use the SAE labels to describe autonomous driving functions. “Using level 2, etcetera, is good for engineers,” he said, implying it is not necessarily good for everybody else.

In a detailed presentation, Prof. Shashua described how Mobileye uses Operational Design Domain (ODD) and Minimum Risk Maneuver (MRM) approaches to designate differences in complexity between, say, robotaxi operations and average consumer self-driving expectations. “There’s no technology uncertainty with robotaxis,” he told the audience. “It’s more about building a healthy business.”
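For readers unfamiliar with those terms, here is a simplified sketch of the general idea, under my own assumptions rather than Mobileye’s implementation: the vehicle continually checks whether current conditions fall inside its Operational Design Domain, and if they do not, it asks the driver to take over and, failing that, executes a Minimum Risk Maneuver such as slowing and pulling over. The ODD limits and field names below are invented for illustration.

```python
# Illustrative ODD/MRM logic only; the specific limits and fields are assumed,
# not taken from Mobileye's stack.
from dataclasses import dataclass

@dataclass
class Conditions:
    road_type: str        # "freeway", "surface_street", ...
    weather: str          # "dry", "rain", "snow"
    speed_limit_mph: int

# Hypothetical ODD for a consumer hands-off freeway feature.
ODD = {
    "road_types": {"freeway"},
    "weather": {"dry"},
    "max_speed_limit_mph": 70,
}

def within_odd(c: Conditions) -> bool:
    return (c.road_type in ODD["road_types"]
            and c.weather in ODD["weather"]
            and c.speed_limit_mph <= ODD["max_speed_limit_mph"])

def step(c: Conditions) -> str:
    if within_odd(c):
        return "automation_active"
    # Leaving the ODD: request a driver takeover; if none comes, perform a
    # minimum risk maneuver (slow down, pull to the shoulder, stop).
    return "request_takeover_then_MRM"

print(step(Conditions("freeway", "dry", 65)))    # automation_active
print(step(Conditions("freeway", "rain", 65)))   # request_takeover_then_MRM
```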

He also said the challenge of consumer AVs is “not about building moonshots” as he compared eyes-off and hands-off operation to L3 and L4 tech. Like other companies, Mobileye and Qualcomm have apparently backed off, for now, from the highest L5 ambitions, where cars have no steering wheel or pedals and can sense their surroundings without a prescribed route in nearly all weather and terrain. (Robotaxis could be an exception in some settings where GPS and crowd-sourced mapping are extensively used.)

Shashua put it well when he described the current state of self-driving: “Seven years ago, AV was just around the corner. Later, people said it would take 50 years. That [recent] concern has dissipated.” He touted how far Mobileye has come, with 233 car models launched in 2022 carrying Mobileye technology inside. Stellantis is a big supporter.

Back at Qualcomm, Sagar said the company is using more, and richer, data harvesting to provide a competitive edge. “We work with all the tier 1 suppliers and almost all the OEMs,” he said. Qualcomm’s purchase of Arriver from SSW Partners last April brought 1,000 engineers into the fold to provide ADAS to automakers, he noted.

Focus on digital cockpit

Marketing claims of providing the best chips and software for future self-driving tech are accelerating. In addition to the large presence of Qualcomm Snapdragon and Mobileye at CES 2023, AMD chipped in with the announcement of a digital vehicle cockpit computing platform in partnership with ECARX and smart, while a promotional email said AMD is providing carmakers with ADAS, autonomous driving and networking tech to serve as a “one-stop shop for silicon and software” in the segment.

Mobileye and Qualcomm tried to outdo each other in CES 2023 blogs and press releases. Mobileye touted its 2022 design wins and an expected $17 billion ADAS pipeline through 2030, among other achievements, while Qualcomm listed its own milestones, including sampling of next-gen Snapdragon Ride and Ride Vision with suppliers for potential use in 2025 global production vehicles.

Both companies displayed cars on the show floor. Qualcomm showed off a concept Snapdragon Digital Chassis vehicle in a private room inside its booth with sleek detailing and an interactive display running across the width of the windshield.

[Photo: A silver concept car with its doors open, showing the interior of the Snapdragon Digital Chassis concept]

One demo showed how gamers could use two seat-back displays to battle a gamer on the front display, with the capability to run third-party games. The company also announced its Snapdragon Ride Flex SoC, which allows the digital cockpit, ADAS and automated driving to run together on the same hardware.

These announcements and others show the work toward self-driving is growing, even if not as fast as once hoped.

Matt Hamblen is editor of Fierce Electronics and holds a perfect driving record, at least in recent years.