High-performance LiDAR is key to safe autonomy

Long-range LiDAR systems are essential for delivering the resolution at range that safe, Level 3-plus autonomy demands. However, these systems must also be high performance, adaptive, and software-configurable to meet the needs of specific OEM use cases and mounting preferences.

Next-generation sensors expand on LiDAR’s intrinsic value by adding agility to the sensor, addressing the industry’s biggest pain points and toughest corner cases.

Software-configurable, high-performance LiDAR helps OEMs tackle specific L3+ use cases in both highway and urban scenarios.

Two leading use cases are trucking and highway passenger vehicles. Although they carry different speed constraints and range-detection requirements, both require adaptive targeting and intelligence in the data-collection process.

For trucking, a LiDAR system requires long-range detection in the forward view at highway speed, as well as the ability to perform in low-speed, highly complex and dynamic scenarios. Conventional LiDAR systems not only fail to achieve sufficient detection range, but also lack the agility to adjust their scanning to optimize for these different operating conditions.
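To make per-use-case scanning concrete, here is a minimal sketch, in Python, of how a software-configurable sensor might switch between a long-range forward profile for highway driving and a wide-field profile for low-speed operation. The profile names, parameter values, and speed threshold are illustrative assumptions, not actual AEye or 4Sight settings.

```python
# Minimal sketch, assuming a software-configurable sensor exposes named scan
# profiles. The profiles and parameters below are hypothetical examples.
from dataclasses import dataclass

@dataclass(frozen=True)
class ScanProfile:
    name: str
    max_range_m: float         # detection range to prioritize
    horizontal_fov_deg: float  # field of view to cover
    frame_rate_hz: float       # full-frame revisit rate

# At highway speed, a truck prioritizes range in a narrower forward cone;
# in a low-speed yard or urban setting, the same sensor trades range for a
# wider field of view and a faster revisit rate.
TRUCK_HIGHWAY = ScanProfile("truck_highway", max_range_m=300.0,
                            horizontal_fov_deg=60.0, frame_rate_hz=10.0)
TRUCK_LOW_SPEED = ScanProfile("truck_low_speed", max_range_m=80.0,
                              horizontal_fov_deg=120.0, frame_rate_hz=20.0)

def select_profile(speed_mps: float) -> ScanProfile:
    """Pick a scan profile from vehicle speed (threshold is illustrative)."""
    return TRUCK_HIGHWAY if speed_mps > 15.0 else TRUCK_LOW_SPEED
```

The point of the sketch is simply that the scanning behavior is a runtime setting rather than a fixed hardware property, which is what allows one sensor to serve both operating conditions.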

For autonomous passenger vehicles on the highway, a LiDAR system must quickly and reliably detect and respond to cut-ins, small objects and hazards, and potholes, while tracking other vehicles on the road at all times and supporting safe lane changes and merges. This requires the sensor to increase resolution and place it where it is needed throughout a scene, improving both the probability of detection and the accuracy of classification.
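As a rough illustration of placing resolution where it is needed, the sketch below models a perception stack that requests extra point density and a faster revisit rate over a region of interest, such as a detected cut-in, while the rest of the field of view keeps baseline coverage. The data structures and field names are hypothetical, not AEye’s actual interface.

```python
# Illustrative sketch only: hypothetical data structures for requesting extra
# LiDAR resolution in regions of interest (ROIs) such as a cut-in vehicle or
# small road debris.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class RegionOfInterest:
    azimuth_deg: Tuple[float, float]    # (min, max) horizontal angle of the ROI
    elevation_deg: Tuple[float, float]  # (min, max) vertical angle of the ROI
    point_density_factor: float         # multiplier over the baseline scan density
    revisit_hz: float                   # how often this ROI should be rescanned

@dataclass
class ScanRequest:
    baseline_density_factor: float = 1.0          # full field-of-view coverage
    rois: List[RegionOfInterest] = field(default_factory=list)

def highway_cut_in_request() -> ScanRequest:
    """Boost resolution and revisit rate around a detected cut-in vehicle
    while keeping baseline coverage of the rest of the scene."""
    cut_in = RegionOfInterest(
        azimuth_deg=(-12.0, -4.0),   # region where the cut-in was detected
        elevation_deg=(-2.0, 2.0),
        point_density_factor=4.0,    # denser sampling improves classification
        revisit_hz=30.0,             # track the maneuver closely
    )
    return ScanRequest(rois=[cut_in])
```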

Built on solid-state, adaptive LiDAR, AEye’s high-performance 4Sight sensors are modular, AI-driven, and software-configurable, enabling them to respond to changing environmental conditions and to be optimized for every use case in automotive, trucking, and beyond.

Software configurability also allows for flexible, discreet sensor placement around the vehicle, including in the grille, behind the windshield, on the roof, or side-mounted (facing forward or rearward).

AEye makes LiDAR that mimics how the human visual cortex focuses on and evaluates the environment, driving conditions, and road hazards. The sensor provides full field-of-view coverage while collecting and analyzing only the data that matters, without missing anything. Ultimately, this approach allows the system to capture more intelligent information with less data, enabling faster, more accurate, and more reliable sensing and path planning, which is key to the safe rollout of autonomous vehicles.

Indu Vijayan is director of product management at AEye. She has also been a software engineer for the autonomous driving team at Delphi/Aptiv and holds a master’s in computer engineering from Stony Brook University. She will speak on high-performance automotive LiDAR at Sensors Converge on Tuesday, Sept. 21, at 11 a.m. PDT and will appear on a panel on autonomous technologies later that day at 3:15 p.m. PDT.