The Smart Radar Era Is Now

Sensors Insights by Dr. Bernard Casse

The next generation of high-resolution smart radar sensors for automotive is critical for achieving level 4 and 5 autonomy. Of course, cameras and LIDAR have an important place in the vehicle. Today’s radar has proven useful, particularly for long-range detection, in inclement weather, and in scenarios where other sensors are blinded.


Now’s The Time To Push The Radar Performance Envelope

Innovation in smart radar is happening. Today’s charter in the automotive space is to build a new breed of imaging radars, capable of reconstructing their environment like LIDAR does, and interpreting the world around them like humans do, while outperforming LIDAR and cameras in range and all-weather operation. It is possible to achieve a new radar architecture that uses engineered metamaterial structures capable of beamforming and beamsteering, powered by an Artificial Intelligence (AI) engine that detects, recognizes, tracks, and classifies objects.


The new generation in radar is on the cusp of being delivered. Moving beyond digital beamforming, smart radar, using metamaterials and AI, is critical for autonomous driving. Much education is needed, and this Q&A with Dr. Bernard Casse, CTO and co-founder of Metawave, will discuss an innovative approach to a new kind of radar platform, a few obstacles, and the vision of autonomous driving using the next generation in smart radar.


Why is this the Radar Era?

Stakeholders realize that timelines for autonomous cars depend on both the maturity of the decision-making algorithms and the performance of sensors. To date, there has been significant effort in improving LIDAR and the camera, especially for autonomous vehicles, but we’re hitting a performance threshold in terms of range and speed of operation. However, other sensors, like radar, have been relatively unexplored. Radar has been the underdog because it lacks the resolution, compared to a camera or LIDAR, to interpret the world. But car manufacturers now realize that radar is the only sensor that can operate at long range (> 200 m) and in all-weather conditions. It only suffers from a lack of vision and intelligence.

Figure 1

With advanced technology, we can restore vision and embed intelligence in radar. This is the radar era. There are about a dozen start-ups revisiting all aspects of automotive radar. We expect more start-ups and companies to emerge to tackle that space.


What does the CEO of a top auto company need to know?

Elon Musk is right in placing a lot of emphasis on radar. It is poised to remain the most robust and reliable sensor in the car. Many, including some carmakers, are familiar with old school, old style radar. When they think about radar, they think about a visually impaired sensor that can see blobs of metal (i.e., there is a big piece of metal here; there is a piece of metal there). This is true for current radar, but it doesn’t hold true in the case of next generation radar.

Point-cloud imaging (analogous to LIDAR) can actually map objects by rastering the beam, relying on its algorithms to discriminate objects such as road signs, automobile types, people, lamp posts, and more. With enhanced vision and speed, we can embed intelligence in our radar. That is, now that the radar has an enhanced ‘digital eye,’ it can learn to recognize objects’ specific features and associate them with a corresponding category.
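The point-cloud idea above can be sketched in a few lines: each rastered beam position returns a range and a pointing angle, which converts to a Cartesian point, LIDAR-style. This is a minimal illustration, not Metawave's implementation; the data layout (range, azimuth, elevation tuples) is a hypothetical simplification.

```python
import math

def raster_to_point_cloud(detections):
    """Convert rastered-beam radar detections into Cartesian points.

    Each detection is a hypothetical (range_m, azimuth_deg, elevation_deg)
    tuple; the output is a list of (x, y, z) points in meters, with x
    pointing forward along the boresight.
    """
    cloud = []
    for rng, az_deg, el_deg in detections:
        az, el = math.radians(az_deg), math.radians(el_deg)
        x = rng * math.cos(el) * math.cos(az)  # forward
        y = rng * math.cos(el) * math.sin(az)  # left/right
        z = rng * math.sin(el)                 # up/down
        cloud.append((x, y, z))
    return cloud

# One beam position dead ahead at 100 m, one 10 degrees off to the side
points = raster_to_point_cloud([(100.0, 0.0, 0.0), (50.0, 10.0, 0.0)])
```

A downstream classifier would then operate on clusters of such points to separate pedestrians from lamp posts or vehicles.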


Can you speak more about the importance of artificial intelligence?

AI is critical because it is the driving force behind our human-like decentralized intelligence. We are proponents of decentralized intelligence. We believe that the radar sensor and any other sensors should possess their own brains. The decision-making algorithms in the car should rely on both the sensor fusion (central intelligence), and individual sensors (decentralized intelligence).

This adds another layer of safety to the car. If your car sees a bridge, using the camera and/or LIDAR, now your radar should also be saying: “Of course I'm seeing a bridge.” The notion of decentralized intelligence is even more important for radar, since it’s the only sensor that sees 300 m ahead, making it the earliest warning system. Autonomous driving is safer when it has multiple data points from various sensors, allowing the car to receive information and perceive its surroundings.
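One simple way to picture combining these decentralized sensor "opinions" is noisy-OR fusion: if each sensor independently reports a detection confidence, the fused confidence is the probability that not every sensor missed the object. This is only an illustrative sketch of the fusion concept, not the decision-making algorithm the article describes; the sensor names and probabilities are invented.

```python
def fuse_detections(sensor_probs):
    """Noisy-OR fusion of independent per-sensor detection confidences.

    The object is missed only if every sensor misses it, so the fused
    probability is 1 minus the product of the individual miss rates.
    """
    miss = 1.0
    for p in sensor_probs.values():
        miss *= (1.0 - p)
    return 1.0 - miss

# Camera and LIDAR both see the bridge; radar independently agrees
fused = fuse_detections({"camera": 0.90, "lidar": 0.85, "radar": 0.95})
```

Note how a high-confidence radar report raises the fused confidence even when the optical sensors are degraded, which is the safety argument for giving the radar its own brain.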


What is the state of today’s radar, especially for autonomous driving?

Today’s state-of-the-art radar is 100% digital beamforming (DBF). The automotive industry abandoned analog radars (e.g., phased array antennas) because of their exorbitant cost, power-hungry architecture, and complexity, even though, performance-wise, analog radars remain superior. The automotive industry remains cost sensitive, and since the cost of signal processing will go down eventually, just as with other computation on microchips (Moore’s law), DBF remains an attractive option.

But DBF’s resolution and speed remain limited. The military, which is more focused on performance, still deploys analog radars for ballistic missile detection and tracking. With autonomous driving, cost is not, and should not be, the top priority. Performance and safety are at the very top of the list.


What are the limitations with digital radar?

DBF suffers from three main disadvantages that are somewhat interrelated:

  1. It is slow. DBF takes on the order of milliseconds to scan a scene. Signal processing in the digital domain is non-trivial. Milliseconds of integration time are required to achieve an acceptable signal-to-interference-plus-noise ratio (SINR). Processing through a host of complex analog-to-digital circuitry and assigning digital weights for DBF results in higher computational effort, leading to sluggishness.
  2. DBF lacks resolution — it can’t see narrow objects or pedestrians. It is not a “true” beamforming architecture per se, and beamforming is impractical. That is, it needs a large number of antennas to achieve high resolution. First, implementing many antennas is very costly and requires multiple ports (non-traditional) on radar chipsets. In addition, this would require more snapshots to get to an acceptable SINR (since the noise coming omni-directionally would overwhelm the system). With the traditional three transmit and four receive ports, the resolution is not clear enough to see pedestrians. It works for cars (a nice tradeoff between resolution, SINR, and range), but not for non-metallic objects.
  3. It produces ghost images. DBF is sensitive to highly correlated signals, i.e., DBF enhances the noise coming from multipath signals, and produces ghost images.
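The digital-weighting step the list refers to can be sketched for a uniform linear array: a snapshot from the receive channels is multiplied by the conjugate of a steering vector and summed, and the output power peaks when the look direction matches the target direction. This is a textbook delay-and-sum illustration under idealized assumptions (half-wavelength spacing, a single noiseless return), not any vendor's signal chain.

```python
import math
import cmath

def steering_vector(n_elem, theta_deg, d_over_lambda=0.5):
    """Phase progression across a uniform linear array for angle theta."""
    theta = math.radians(theta_deg)
    return [cmath.exp(-2j * math.pi * d_over_lambda * n * math.sin(theta))
            for n in range(n_elem)]

def dbf_power(snapshot, n_elem, look_deg):
    """Beamformed output power: apply digital weights (the conjugate
    steering vector for the look direction) to one snapshot and sum."""
    w = steering_vector(n_elem, look_deg)
    out = sum(x * wi.conjugate() for x, wi in zip(snapshot, w))
    return abs(out) ** 2

n = 4  # only four receive channels, as in conventional automotive radar
snapshot = steering_vector(n, 20.0)  # ideal return from 20 degrees

p_on = dbf_power(snapshot, n, 20.0)    # looking at the target
p_off = dbf_power(snapshot, n, -20.0)  # looking away from it
```

With only four elements the main beam is broad, which is the resolution limit described in point 2: sharpening it requires many more antennas, and hence more cost and more snapshots.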


What are the innovations in radar now, specifically in the automotive space?

First, there is the usual incremental engineering: the Tier 1 and Tier 2 suppliers are focused on pushing the envelope in MMIC technology. The real out-of-the-box innovations are coming from start-ups: we’ve seen exotic technologies ranging from Luneburg lens antennas, to new algorithms suppressing interference in DBF, to 3D-printed arrays, to name a few. As we venture into the radar era, we anticipate more radar companies will emerge in 2018 and create more innovations. Though we expect some to crash and burn, a few technologies will survive.


What are the obstacles when implementing a new kind of radar?

The frequency window from 76 GHz to 81 GHz is the first big one, because it’s all in the millimeter wave (mm-wave) regime. The components and materials are not readily available, and off-the-shelf components have not been characterized yet. A fair amount of forensic work is needed. And, we’re good at addressing this and other challenges, and we’ve already made significant progress in a very short time.

The obstacle of operating at high frequencies is not a showstopper, and, in fact, it’s actually a blessing; it makes it non-trivial for the competition to catch up, and gives us a leading edge. There are other obstacles that we can’t talk about at this early stage without revealing too much about the platform, but all of them are manageable. So far, nothing has been insurmountable. We are just at the beginning of this exciting journey, having launched earlier this year, but what we are showing, and soon shipping, is very exciting to those interested in seeing level 4 and 5 self-driving cars on the roads in just a few short years.


When do you think we’ll see these radar platforms deployed in autonomous vehicles?

All the top automakers are predicting that we will have level 5 self-driving cars by 2025 at the latest. We are a fast-paced start-up in the midst of an exciting disruption in this industry. We’re going to deliver a product next year, and we’re talking to all the top automakers and strategic partners in the autonomous driving space. I strongly believe that in order to make level 5 cars as safe as they need to be, the next generation in smart radar – both hardware and software – must be deployed in them.


What is Metawave's approach to building a new kind of radar platform?

Metawave’s unique approach lies in the use of adaptive metamaterials and AI. The overarching goal is to build superior radars possessing vision, speed, and intelligence. In our scheme, we are using an all-electronic tunable metamaterials platform to replace costly and power-hungry phase shifters.

The idea, from a hardware perspective, is to remove the elements that have the most cost, complexity, and weight, while providing the same level of performance as the analog phase shifters used by the military today. On the software front, we are developing powerful hardware control, signal processing and decision-making algorithms tailored specifically to our hardware.
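To make the phase-shifter replacement concrete, the quantity any steering aperture must realize is a linear phase gradient across the array. The sketch below computes the per-element phase shifts that steer a uniform linear array at 77 GHz; it is illustrative only, and says nothing about how Metawave's metamaterial structures physically produce these phases without discrete phase shifters.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def element_phases_deg(n_elem, steer_deg, freq_hz=77e9, spacing=None):
    """Per-element phase shifts (degrees, wrapped to [0, 360)) that steer
    a uniform linear array to steer_deg off boresight.

    spacing defaults to half a wavelength at the operating frequency.
    """
    lam = C / freq_hz
    d = spacing if spacing is not None else lam / 2
    theta = math.radians(steer_deg)
    return [math.degrees(-2 * math.pi * d * n * math.sin(theta) / lam) % 360
            for n in range(n_elem)]

phases = element_phases_deg(4, 30.0)  # steer 30 degrees off boresight
```

In a conventional phased array, each of these phases comes from a costly mm-wave phase shifter per element; the article's premise is that a tunable metamaterial surface can impose the same gradient electronically.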

Figure 2

We are leveraging the large body of work in AI, and building on top of it to create our own custom AI engine: Warlord Radar AI Thinking like Humans (WRATH), for augmenting perception and early-warning systems, e.g., predictive analytics.


About the author

Dr. Bernard Casse is a seasoned technical leader and strategist with a demonstrated track record of developing innovative technologies, leading world-class teams, and over-delivering on complex multidisciplinary projects. With 10+ years of post-PhD experience in securing, leading, and managing multi-million-dollar applied R&D contracts from both government agencies and the private sector, Bernard’s current role is to establish Metawave's technical vision, recruit talent, and lead aspects of the company’s technological development. His current duties also involve directing the company’s strategic direction, development, and future growth. Prior to co-founding Metawave, Bernard was an Area Manager at PARC, a Xerox company. Dr. Casse can be reached at [email protected].
