At the height of the Iraq War, DARPA (the Defense Advanced Research Projects Agency) sponsored a driverless race from Barstow, California, to southern Nevada. The intent was to simulate the route and terrain of the road from Baghdad to Al Fallūjah: how much safer, DARPA planners believed, to send autonomous vehicles rather than human drivers from one location to the other. University engineering teams, at DARPA's behest, completed the course in 2005, with the winning team earning $2 million for its school and dramatically raising the profile of autonomous vehicles with the American public. https://www.darpa.mil/news-events/2014-03-13
Now, more than 10 years past that Challenge, the focus on self-driving cars has elevated scientific interest in on-road object-detection systems. The most highly touted systems use a form of radar, in which a series of high-frequency pulses is projected toward a distant object and bounced back. Information on the size and shape of the distant object, as well as its movements, is computed from the distortions of the bounced-back signal. With the help of DSP processors, the radar system can calculate how far the distant object is from the transmitter. Technologies like LiDAR (Light Detection and Ranging) and RADAR (Radio Detection and Ranging) are being developed for self-driving vehicles. Developers are making good progress, if you go by the LiDAR presentations at Sensors Expo in San Jose later this month (see list of presenters).
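The distance calculation described above comes down to time-of-flight: the pulse travels to the target and back, so the range is half the round-trip delay times the speed of light. A minimal sketch (the function name and the sample delay are illustrative, not from any specific radar unit):

```python
# Pulse time-of-flight ranging: the echo covers the distance twice,
# so range = c * t / 2.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def range_from_round_trip(t_seconds: float) -> float:
    """Distance to the target given the round-trip delay of the echo."""
    return SPEED_OF_LIGHT * t_seconds / 2.0

# An echo returning after 1 microsecond implies a target roughly 150 m away.
print(round(range_from_round_trip(1e-6), 1))  # -> 149.9
```

In a real system the DSP estimates that delay (or, for FMCW radar, a frequency shift proportional to it) rather than timing a single pulse directly, but the geometry is the same.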
Radar vs LiDAR
As Sensors Magazine reporter Anne Neal points out, there are competing bounce-back technologies, LiDAR and radar, and both will likely have a role in the autonomous vehicles coming into commercial use in the not-too-distant future. Ride-sharing services like Uber will likely utilize LiDAR-based scanning, while Tesla, says Neal, will likely utilize radar technology. https://www.sensorsmag.com/components/lidar-vs-radar
On the road, it is important to recognize whether an object ahead is a pedestrian, a car, or a brick wall. Advanced Driver Assist Systems (ADAS) will allow the system to predict an object's movement with onboard software, though it will need information on objects less than 500 meters ahead.
The reality is that the modern car must utilize a variety of scanning technologies, some favoring long-distance object detection, others detecting sweaty palms and bad breath inside the cabin. The applications are categorized by the distance from transmitter to sensor target, says Texas Instruments engineer Brian Shaffer. https://e2e.ti.com/blogs_/b/behind_the_wheel/archive/2017/10/25/why-are-automotive-radar-systems-moving-from-24ghz-to-77ghz
- Mid- and Long-Range radar (e.g., for adaptive cruise control, emergency braking, and otherwise automated highway driving)
- Ultra-Short and Short-Range radar (for blind spot detection, rear collision avoidance, pedestrian/cyclist detection)
- Proximity sensing (for driver monitoring)
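Shaffer's distance-based categories can be sketched as a simple lookup. The threshold values below are assumptions chosen for illustration, not figures from the TI article:

```python
# Illustrative mapping from target distance to radar application category.
# The 1 m and 30 m cutoffs are hypothetical example values.

def radar_category(distance_m: float) -> str:
    """Return the application category for a given sensing distance."""
    if distance_m < 1:
        return "proximity sensing (driver monitoring)"
    if distance_m < 30:
        return "ultra-short/short-range (blind spots, rear collisions, pedestrians)"
    return "mid/long-range (adaptive cruise control, emergency braking)"

print(radar_category(0.5))
print(radar_category(15))
print(radar_category(120))
```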
At the same time that demand for automotive radar sensors is increasing, says Shaffer, the sensors themselves are changing. We're entering an age of millimeter-wave signal detection and processing, driven by a shift in operating frequency from 24 GHz to 77 GHz. This will enable, as so many other semiconductor advances do, better accuracy and image resolution from sensor devices in a smaller package.
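One concrete reason the 77 GHz band improves resolution: for an FMCW radar, range resolution is c / (2B), where B is the sweep bandwidth, and the higher band allows much wider sweeps. The bandwidth figures below are typical ballpark values (roughly 250 MHz available near 24 GHz versus up to 4 GHz in the 76-81 GHz band), used here for illustration rather than taken from the TI post:

```python
# FMCW range resolution: the smallest separation at which two targets
# can be told apart is c / (2 * B), where B is the chirp bandwidth.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def range_resolution(bandwidth_hz: float) -> float:
    """Minimum resolvable target separation, in meters."""
    return SPEED_OF_LIGHT / (2.0 * bandwidth_hz)

print(f"24 GHz band (250 MHz sweep): {range_resolution(250e6):.2f} m")
print(f"77 GHz band (4 GHz sweep):   {range_resolution(4e9) * 100:.1f} cm")
```

Under these assumed bandwidths, the resolvable separation shrinks from about 60 cm to under 4 cm, which is the kind of gain that makes distinguishing a pedestrian from a nearby parked car practical.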