Perception software runs inside autonomous vehicle sensors

Artificial perception company AEye has announced what is reportedly the world’s first commercially available 2D/3D perception system designed to run inside the sensors of autonomous vehicles. This enables autonomous vehicle designers to use sensors not only to search for and detect objects, but also to acquire, classify and track them.

This in-sensor perception system is intended to accelerate the availability of autonomous features in vehicles across all SAE levels of driving automation, allowing automakers to enable the right amount of autonomy for any desired use case and thus provide autonomy “on demand” for ADAS, mobility and adjacent markets.

The software is based on the company’s flexible iDAR platform for intelligent and adaptive sensing. The iDAR platform is based on biomimicry, replicating the perception design of human vision through a combination of agile LiDAR, a fused camera and artificial intelligence. It is the first system to take a fused approach to perception, leveraging iDAR’s Dynamic Vixels, which combine 2D camera data (pixels) with 3D LiDAR data (voxels) inside the sensor. This software-definable perception platform allows disparate sensor modalities to complement each other, with the camera and LiDAR working together to make each sensor more powerful while providing “informed redundancy” that ensures a functionally safe system.
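To make the pixel/voxel fusion idea concrete, here is a minimal sketch of what a fused record might look like. AEye has not published the internal format of Dynamic Vixels, so the data structure, field names and pinhole-projection step below are illustrative assumptions only, not AEye’s actual implementation.

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class Vixel:
    """Hypothetical fused record: one LiDAR return joined with its camera pixel.

    Illustrative only; AEye's actual Dynamic Vixel format is not public.
    """
    x: float          # LiDAR position in metres (camera frame)
    y: float
    z: float
    intensity: float  # LiDAR return intensity
    r: int            # colour of the co-registered camera pixel
    g: int
    b: int


def fuse(points: np.ndarray, image: np.ndarray, K: np.ndarray) -> list[Vixel]:
    """Project LiDAR points (N x 4: x, y, z, intensity) through a pinhole
    camera model with intrinsics K, attaching the colour of the pixel each
    point lands on. Assumes points are already expressed in the camera
    frame, i.e. extrinsic calibration was applied upstream."""
    vixels = []
    for x, y, z, intensity in points:
        if z <= 0:                       # point is behind the camera
            continue
        u_h, v_h, w = K @ np.array([x, y, z])
        u, v = int(u_h / w), int(v_h / w)  # perspective divide
        if 0 <= v < image.shape[0] and 0 <= u < image.shape[1]:
            r, g, b = image[v, u]
            vixels.append(Vixel(x, y, z, intensity, int(r), int(g), int(b)))
    return vixels
```

The point of fusing this early, inside the sensor, is that downstream perception code can reason over a single colour-plus-depth record per return instead of re-associating separate camera and LiDAR streams later in the pipeline.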

AEye’s approach addresses one of the most difficult challenges facing the autonomous industry: delivering perception at speed and at range. It improves the reliability of detection and classification while extending the range at which objects can be detected, classified and tracked. The sooner an object can be classified and its trajectory accurately forecast, the more time the vehicle has to brake, steer or accelerate to avoid a collision.
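A rough back-of-the-envelope calculation shows why classification range buys reaction time; the speeds and ranges below are illustrative, not AEye specifications.

```python
# Rough illustration: time budget gained by classifying an object earlier.
# Speeds and ranges are illustrative assumptions, not AEye specifications.
def time_to_react(detection_range_m: float, speed_kmh: float) -> float:
    """Seconds until the vehicle reaches an object classified at the given
    range, assuming constant speed and a stationary object in its path."""
    speed_ms = speed_kmh / 3.6  # convert km/h to m/s
    return detection_range_m / speed_ms


for rng in (100, 200, 300):
    print(f"Classified at {rng} m, 110 km/h: "
          f"{time_to_react(rng, 110):.1f} s to respond")
# Classified at 100 m, 110 km/h: 3.3 s to respond
# Classified at 200 m, 110 km/h: 6.5 s to respond
# Classified at 300 m, 110 km/h: 9.8 s to respond
```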

The intelligent capabilities of the iDAR platform support applications ranging from ADAS safety augmentation, such as collision avoidance, to selective autonomy, such as highway lane changes, to fully autonomous use cases in closed-loop geo-fenced or open-loop scenarios.

Using the platform, engineers can experiment with software-definable sensors without waiting years for the next generation of hardware. They can adapt shot patterns in less than a second and simulate the impact of those changes to find optimal performance, and they can customize features or power usage through the modular design. Unlike with previous generations of sensors, OEMs and Tier 1s can now also move algorithms into the sensor itself where appropriate.
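As an illustration of what a software-definable shot pattern could look like, here is a hypothetical configuration sketch. The schema, field names and values are invented for this example and do not reflect AEye’s actual API.

```python
from dataclasses import dataclass, field


@dataclass
class ScanRegion:
    """A region of interest within the sensor's field of view (degrees).

    Hypothetical schema for illustration; not AEye's actual configuration."""
    azimuth: tuple[float, float]     # horizontal extent, degrees
    elevation: tuple[float, float]   # vertical extent, degrees
    point_density: float             # relative shot density (1.0 = baseline)
    revisit_hz: float                # how often this region is rescanned


@dataclass
class ShotPattern:
    """A named shot pattern a designer could swap in at runtime."""
    name: str
    regions: list[ScanRegion] = field(default_factory=list)


# Example: a highway pattern that concentrates shots on the road ahead
# while keeping a sparse, lower-rate scan over the full field of view.
highway = ShotPattern(
    name="highway_cruise",
    regions=[
        ScanRegion(azimuth=(-60.0, 60.0), elevation=(-10.0, 10.0),
                   point_density=1.0, revisit_hz=10.0),   # background scan
        ScanRegion(azimuth=(-10.0, 10.0), elevation=(-3.0, 3.0),
                   point_density=4.0, revisit_hz=30.0),   # dense corridor ahead
    ],
)
```

Because a pattern like this is just data, an engineer could swap it, replay recorded drives against it in simulation, and measure the effect on detection range, all without a hardware revision.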

“We believe the power and intelligence of the iDAR platform transforms how companies can create and evolve business models around autonomy without having to wait for the creation of full Level 5 Robotaxis,” said Blair LaCorte, president of AEye, in a statement. “Automakers are now seeing autonomy as a continuum, and have identified the opportunity to leverage technology across this continuum. As the assets get smarter, OEMs can decide when to upgrade and leverage this intelligence. Technology companies that provide software-definable and modular hardware platforms can now support this automotive industry trend.”

AEye’s iDAR software reference library will be available in Q1 2020.