Phantom AI CEO Hyunggi Cho details 6-year ADAS moon shot

Applying computer vision and related AI technology to build safe, successful self-driving vehicles ranks among the toughest challenges engineers face.

The work is not as tough, perhaps, as sending a spaceship to Mars or developing a vaccine for the next catastrophic virus to arrive this decade. However, given the obstacles facing self-driving, NASA will probably have men and women on the surface of the Red Planet before many self-driving vehicles at the highest SAE levels, L4 and L5, ever hit US roadways.

It’s not for lack of trying, of course. Slick algorithms alone will not help deep learning experts as they confront the patchwork of state and local laws (or lack thereof) in the US, a job that government policy experts and lawyers at various carmakers are taking on in dribs and drabs.

Driving laws in the US are set by the 50 states and some cities, while federal transportation officials try to set safety standards and provide funds. The US is not exactly the United States when it comes to driving laws, or much else really, but that’s an issue for another day. The point here is that good technology is one thing, but getting permission to use that tech on US roads, beyond just a test, is a very involved process that demands knowledge of both law and technology, preferably at the same time. And, yes, it takes political stamina.

So, consider the poor Silicon Valley startup focused on doing what’s possible today through advanced driver assistance systems (ADAS) at L2 or L3, a couple of steps below true self-driving, as it pitches OEMs to adopt its software.

Not exactly poor, of course, is Phantom AI of Mountain View, a group of 55 engineers that recently announced $36.5 million in a Series C funding round, bringing its total raised to $80.2 million since its founding in 2017.  

“This funding validates investor confidence in not only our mission to save lives through innovative visual perception technology but also our ability to execute on this mission,” said Hyunggi Cho, co-founder and CEO of Phantom AI, in a statement.

CEO Hyunggi Cho (Phantom AI)

Cho sees ADAS as a market poised for continued growth, while venture investment in L4/L5 full automation for robo-taxis and autonomous trucking is being scaled back because it won’t be commercialized for years. ADAS covers a broad array of technologies already shipping in many car models, such as brake assist, which keeps a driver from plowing into a stopped car ahead, and lane centering, which warns the driver or actively corrects the vehicle when it drifts left or right. Cruise control, widely available for years, also traditionally falls into the ADAS category. Higher-level L4 automation applies to vehicles that truly drive themselves in nearly all conditions, using an array of sensors to perceive the road and other cars while relying on rich mapping datasets and GPS for navigation.
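As a rough illustration of the control ideas behind such L2 features, brake assist can be framed as a time-to-collision check and lane centering as a proportional steering correction on measured lateral offset. The sketch below is generic and hypothetical, with made-up function names and thresholds; it is not Phantom AI’s or any carmaker’s implementation.

```python
# Illustrative only: simplified ADAS decision logic, not any vendor's production code.

def time_to_collision(gap_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact if the closing speed stays constant."""
    if closing_speed_mps <= 0:          # not closing on the lead vehicle
        return float("inf")
    return gap_m / closing_speed_mps

def brake_assist(gap_m: float, closing_speed_mps: float, ttc_threshold_s: float = 2.0) -> bool:
    """Request emergency braking when time-to-collision falls below a threshold."""
    return time_to_collision(gap_m, closing_speed_mps) < ttc_threshold_s

def lane_centering_steer(lateral_offset_m: float, gain: float = 0.1, max_cmd: float = 0.3) -> float:
    """Proportional steering correction (radians) back toward the lane center."""
    cmd = -gain * lateral_offset_m      # steer opposite to the drift
    return max(-max_cmd, min(max_cmd, cmd))

# Example: a 20 m gap closing at 12 m/s gives a TTC of about 1.7 s, so braking is requested.
print(brake_assist(gap_m=20.0, closing_speed_mps=12.0))    # True
print(lane_centering_steer(lateral_offset_m=0.4))          # -0.04 rad
```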

Phantom AI plans to use the new funding to accelerate its current line of products for major OEMs. These products focus on computer vision, sensor fusion and vehicle control software for automakers such as Hyundai, Volkswagen and other “potential customers,” as Cho described them in an interview with Fierce Electronics. The company has won design contracts from two well-known vehicle makers and holds a production contract with a global OEM he would not name. Phantom AI offers what Mobileye offers, “minus the chip,” Cho said. Its software currently supports chips from Renesas, Texas Instruments and Nvidia, and the company wants to port its vision stack to chips from Qualcomm and Samsung.

The Series C round reflects Phantom AI’s acceptance of current market realities: the difficulty of reaching higher levels of autonomy and, to some degree, the difficulty of supporting ADAS. “My co-founder and I believe in building ADAS technology and the democratization of ADAS,” Cho said. “Reaching level 3 is by no means easy. In 2017, I thought ADAS was sexy and I was so passionate about it and felt this could be my entire life. Cruise control and lane centering were really doable and would have a meaningful impact. But focusing on level 2 and 3 is not easy, and it’s not easy because of the sensor constraints in real time on the car.”

Cho came from similar work at Tesla, where progress on car automation had to happen quickly. “At Phantom, we cannot do that. It took not just three years, but six years,” he said. “We are hungry for more cash. I thought we could make a production-ready ADAS product in three years, but I didn’t have Tesla’s resources.”

Computer vision is a big field across many industries beyond automotive, and it is crowded with vendors, many of which will consolidate or simply disappear, as happens with many startups.

“The field is crowded,” said Matt Arcaro an analyst at IDC, via email. “I think there was a broad realization that L4 timelines weren’t realistic based on what was proposed in 2015, 2016, 2017 and even 2018 by some of the major OEMs, as well as Tesla. This opened the door for new providers to deliver alternative, ADAS-first approaches to autonomy that worked within the traditional hierarchy of the OEMs and tiered suppliers.

“Phantom AI is a good example of this emerging ADAS startup, alongside more mature veterans like Mobileye, and the chip guys like Nvidia, Xilinx/AMD and Qualcomm and the research initiative built as part of an established tiered supplier or OEM.”

What gives Phantom AI hope for survival and growth? Cho believes it is partly because 75% of the company’s engineers have automotive experience with a Tier 1 supplier or an OEM, including production experience.

Second, Cho said, the company’s AI and deep learning algorithms deliver better accuracy and more features for real-time operation at 33 frames per second. “We can run deep learning faster than anyone, three or four times faster,” Cho said, adding that the work can be done with a $30 SoC with a dedicated accelerator, far cheaper than a $1,000 GPU.
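For context on that 33 fps figure: real-time perception means the entire pipeline must fit inside a budget of roughly 30 milliseconds per frame. The back-of-the-envelope check below is ours, not Phantom AI’s, and the per-stage latencies are invented placeholders.

```python
# Back-of-the-envelope frame budget for 33 fps perception (illustrative numbers only).

TARGET_FPS = 33
frame_budget_ms = 1000.0 / TARGET_FPS          # ~30.3 ms available per frame

# Hypothetical per-stage latencies on an embedded SoC (not measured Phantom AI figures).
stage_latency_ms = {
    "capture_and_preprocess": 4.0,
    "detection_network": 15.0,
    "tracking_and_fusion": 6.0,
    "vehicle_control_output": 2.0,
}

total_ms = sum(stage_latency_ms.values())
print(f"budget {frame_budget_ms:.1f} ms, pipeline {total_ms:.1f} ms, "
      f"{'fits' if total_ms <= frame_budget_ms else 'misses'} real time")
```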

Third, he said, Phantom AI has superior tracking algorithms to estimate the position and velocity of an object detected in the 3D world. “Mobileye has amazing tracking performance as well,” he said, but driving situations call for deep learning detection and tracking that are tightly integrated. “So we have a nice position.”
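The article does not say which tracking method Phantom AI uses. A standard baseline for estimating an object’s position and velocity from noisy per-frame detections is a constant-velocity Kalman filter, sketched below in one dimension for brevity; it is a generic textbook example, not Phantom AI’s or Mobileye’s tracker, and the measurement values are made up.

```python
import numpy as np

# Minimal 1-D constant-velocity Kalman filter: a common baseline for tracking
# the position and velocity of a detected object from noisy per-frame measurements.

dt = 1.0 / 33.0                        # time between frames at 33 fps
F = np.array([[1.0, dt], [0.0, 1.0]])  # state transition over [position, velocity]
H = np.array([[1.0, 0.0]])             # only position is measured
Q = np.eye(2) * 1e-3                   # process noise
R = np.array([[0.05]])                 # measurement noise

x = np.array([[0.0], [0.0]])           # initial state [position, velocity]
P = np.eye(2)                          # initial covariance

for z in [0.31, 0.62, 0.95, 1.24]:     # made-up position measurements (meters)
    # Predict the next state from the constant-velocity model
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the new detection
    y = np.array([[z]]) - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P

print(f"estimated position {x[0, 0]:.2f} m, velocity {x[1, 0]:.2f} m/s")
```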

While it might not be rocket science, Cho said Phantom AI’s work is right up there in levels of difficulty. “Automotive is challenging. It’s challenging to deliver software to cars, especially for safety reasons. Once we prove the product is rock solid, then I think we can get the next customer much easier.”  

RELATED: StradVision CTO Jack Sim on the future of self-driving