FIR sensor integration for autonomous vehicles

Sensors Insights by Yakov Shaharabani

To eventually achieve Level-3 and higher autonomy and bring fully autonomous vehicles to the mass market, many AV developers have reached a consensus that each vehicle must be equipped with multiple sensors. Specifically, to enable complete detection and the most comprehensive understanding of a vehicle's surroundings, automakers are favoring the use of multiple far infrared (FIR) sensors, as this technology can deliver the highest level of safety.

With reliable detection in dynamic lighting, harsh weather, and night scenes, FIR readily offers the redundancy AVs need for safety. In addition to these sensing capabilities, FIR technology, now undergoing a next-generation revolution for the automotive industry, is also uniquely affordable for mass-market deployment. For these reasons, the market now believes that thermal sensors must be part of autonomous vehicles' sensor suites to deliver the highest levels of safety and the most accurate sight and perception of surroundings.

For OEMs looking to integrate FIR sensors into their vehicles' sensor suites, the principal questions are: What are the best use cases for FIR sensors? And when and how can FIR be fused with other sensing modalities to create a complete sensing solution?

 

Best Use Cases of FIR Sensors

FIR sensors deliver reliable, accurate detection in real time and in any environmental condition. Unlike radar and lidar sensors, which both transmit and receive signals, a FIR camera passively collects signals by detecting the thermal energy that radiates from objects. By sensing this infrared spectrum far beyond visible light, FIR cameras access a different band of the electromagnetic spectrum than other sensing technologies do. Most of the electromagnetic spectrum is blocked by the atmosphere, with only narrow spectral windows letting EM radiation through. The visible-light window roughly spans wavelengths of 400 to 700 nm, whereas the infrared window commonly used in low-cost thermal imagers spans 8 to 14 μm, also known as LWIR (long-wave infrared). Thus, the FIR camera generates a new layer of information, making it an all-weather solution that enables AVs to detect objects that may not otherwise be perceptible to radar, cameras, or lidar.
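As a rough, back-of-the-envelope illustration of why the 8 to 14 μm window suits driving scenes, Wien's displacement law places the peak thermal emission of objects near everyday temperatures inside that band. The short Python sketch below makes the arithmetic explicit; the temperatures chosen are illustrative assumptions, not measured values.

# Wien's displacement law: a blackbody at temperature T (kelvin) peaks at
# wavelength lambda_max = b / T, with b ~= 2898 um*K.
WIEN_B_UM_K = 2898.0

def peak_wavelength_um(temp_kelvin: float) -> float:
    """Peak emission wavelength in micrometers for a blackbody at temp_kelvin."""
    return WIEN_B_UM_K / temp_kelvin

# Illustrative temperatures for objects in a typical driving scene.
for label, temp_k in [("cold road surface (-10 C)", 263.15),
                      ("ambient objects (20 C)", 293.15),
                      ("human skin (~34 C)", 307.15)]:
    print(f"{label}: peak emission near {peak_wavelength_um(temp_k):.1f} um")
# All three peaks land around 9-11 um, inside the 8-14 um LWIR window.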

By accessing a different band of the electromagnetic spectrum and sensing objects' thermal energy, FIR sensors can also readily identify any living object in a vehicle's surroundings. The technology also proves itself for non-living object detection: in addition to reading an object's thermal signature, FIR cameras capture its emissivity, the efficiency with which its surface emits thermal radiation. Emissivity depends on each object's surface material, so objects with different surface properties (e.g., the cracks in the road versus the sidewalk) bear different thermal signatures. Thus, since every material has a different emissivity (and a different reflectance, i.e. the proportion of radiation striking a surface that is reflected off it), a FIR camera can immediately detect and classify objects, living or non-living, in its path.
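To make the role of emissivity concrete, the Stefan-Boltzmann law (radiant exitance M = εσT⁴) shows how two surfaces at the same temperature can radiate very differently. The sketch below is only illustrative; the emissivity values are typical handbook figures assumed for the example, not properties of any specific road scene.

# Radiant exitance M = emissivity * sigma * T^4 (Stefan-Boltzmann law).
SIGMA_W_M2_K4 = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2*K^4)

def radiant_exitance(emissivity: float, temp_kelvin: float) -> float:
    """Thermal power radiated per unit area, in W/m^2."""
    return emissivity * SIGMA_W_M2_K4 * temp_kelvin ** 4

# Two surfaces at the same 20 C (293.15 K) but with different emissivities
# (assumed, typical textbook values).
asphalt = radiant_exitance(0.95, 293.15)      # rough asphalt
metal_trim = radiant_exitance(0.10, 293.15)   # polished metal

print(f"asphalt: {asphalt:.0f} W/m^2, polished metal: {metal_trim:.0f} W/m^2")
# The large gap is what lets a FIR camera tell materials apart even when
# they sit at the same physical temperature.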

Another strong use case for FIR lies in its ability to deliver images that are invariant to lighting conditions: its perception is not compromised by the color of an object, an object's background, the direction of illumination, multiple light sources, specular reflections, or the many other irregularities that variable lighting can introduce into regular CMOS images. For instance, although CMOS cameras are usually quite good at detecting lanes and other road markings, they may struggle to accurately detect the drivable road area, even in daylight, because of the high variance in how the road appears in CMOS images. The road's appearance in a FIR image is much less variable; it retains similar characteristics across many different lighting conditions. Delivering a lighting-invariant image is critical for autonomous vehicles to see their surroundings at the quality needed to understand and react to both living and non-living objects.

 

When FIR Can Be Fused with CMOS for a Complete Sensing Solution

Although FIR's superior image perception enables it to provide coverage in a variety of adverse weather and lighting conditions and to detect pedestrians and animals more reliably than CMOS sensors, the technology's proponents do not suggest that FIR replace all other sensors as the sole means of perception. Rather, they recommend that FIR be fused with a CMOS solution to deliver the more comprehensive sensing capabilities needed to achieve full autonomy. FIR and CMOS can work in synergy on a variety of important tasks.

 

To Read Signs

Together, FIR and CMOS sensing solutions can help autonomous vehicles better see and understand street signs. To effectively respond to a traffic sign, a vehicle must first identify and separate the sign from the background and then read it. However, neither CMOS nor FIR can execute both tasks independently.

Because FIR functions by assessing an object's thermal signature and emissivity, it cannot perceive color or fine visual detail, so it can struggle to read a sign's content. CMOS, on the other hand, cannot reliably detect thermally homogeneous regions and separate them from the background, something FIR accomplishes with ease. Used in conjunction, then, with the FIR sensor isolating the traffic sign from the background and the CMOS solution reading the sign, the two sensors allow the autonomous vehicle to best see and understand its surroundings.
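As a minimal sketch of how such a division of labor might be wired together, assuming the FIR and CMOS frames are already spatially registered: the detector and reader functions below are hypothetical placeholders for whatever segmentation and recognition models an integrator actually uses, not AdaSky or OEM APIs.

from typing import List, Tuple

BoundingBox = Tuple[int, int, int, int]  # x, y, width, height in pixels

def detect_sign_regions_fir(fir_frame) -> List[BoundingBox]:
    """Propose thermally homogeneous, sign-like regions from the FIR frame."""
    raise NotImplementedError  # placeholder for a real FIR segmentation model

def read_sign_cmos(cmos_frame, region: BoundingBox) -> str:
    """Read/classify the sign inside the region using the color CMOS image."""
    raise NotImplementedError  # placeholder for a real recognition model

def fuse_sign_reading(fir_frame, cmos_frame) -> List[Tuple[BoundingBox, str]]:
    # FIR isolates candidate signs from the background; CMOS reads each one.
    return [(region, read_sign_cmos(cmos_frame, region))
            for region in detect_sign_regions_fir(fir_frame)]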

 

To Classify Living and Non-Living Objects

FIR and CMOS can also be used in conjunction to achieve optimum object classification. Because it assesses objects' thermal signatures, a FIR sensor is currently one of the sensing solutions best able to immediately distinguish a living object from a non-living one. For an autonomous vehicle of Level-3 or higher, this is crucial information for control and decision making. Conversely, for general detection of non-living objects, CMOS solutions can offer higher resolution than FIR, but only in good lighting and weather conditions. Thus, by combining the general object detection of a CMOS solution with the thermal information from a FIR sensor, automakers can achieve a more comprehensive understanding of all objects, both living and non-living, in a scene.
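A hedged sketch of what that combination could look like in code, assuming a FIR frame registered to the CMOS image and expressed in degrees Celsius: the 25 to 40 °C band used to flag living objects is an illustrative heuristic, not a published threshold.

import numpy as np

def tag_living(cmos_boxes, fir_celsius: np.ndarray,
               body_temp_range=(25.0, 40.0)) -> list:
    """Label CMOS detections as living/non-living from FIR temperatures."""
    tagged = []
    for (x, y, w, h) in cmos_boxes:
        patch = fir_celsius[y:y + h, x:x + w]      # FIR pixels inside the box
        median_temp = float(np.median(patch))
        living = body_temp_range[0] <= median_temp <= body_temp_range[1]
        tagged.append({"box": (x, y, w, h),
                       "median_temp_c": median_temp,
                       "living": living})
    return tagged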

 

To Detect the Vehicle's Drivable Area

In some circumstances, a CMOS solution is enough to provide an autonomous vehicle with adequate lane detection; however, it may struggle to detect the vehicle's entire drivable area accurately, as there is high variance in the visual appearance of the road in CMOS-generated RGB images, even in daylight. Moreover, CMOS-based drivable-area detection can be severely compromised in partial or complete darkness. A FIR sensing solution, on the other hand, produces a much less variable image of the road surface: the road retains similar characteristics in all environmental conditions. A fusion of FIR and CMOS sensing solutions, then, gives autonomous vehicles a better system for drivable-area detection and, consequently, for any path-planning tasks.
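One simple way to sketch that fusion, assuming both modalities produce per-pixel road-confidence maps in [0, 1] that are already registered to each other: the lighting-based weighting below is an assumption made for illustration, not a production fusion policy.

import numpy as np

def fuse_drivable_area(cmos_conf: np.ndarray, fir_conf: np.ndarray,
                       scene_luminance: float) -> np.ndarray:
    """Blend CMOS and FIR road-confidence maps into one drivable-area mask."""
    # Trust CMOS more in bright scenes and FIR more as the scene darkens.
    cmos_weight = float(np.clip(scene_luminance, 0.0, 1.0))
    fused = cmos_weight * cmos_conf + (1.0 - cmos_weight) * fir_conf
    return fused > 0.5  # boolean mask of pixels considered drivable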

 

FIR Integration—Affordable and Aesthetically Pleasing

Israeli startup AdaSky has been a distinct leader in the ongoing FIR revolution, having developed a FIR sensor economically suitable for mass-market deployment. This has been possible due in part to the sensor's design, size, and low power consumption, as well as its advanced, proprietary thermal calibration equipment and algorithms. AdaSky's Viper is unique in that it does not require a shutter (as other FIR cameras typically do), because it uses the startup's shutterless non-uniformity correction, and because it is the only camera to apply a dedicated system-on-a-chip ISP, accompanied by one of the smallest detector pitches and wafer-level packaging technologies. AdaSky can make Viper as small as 26 mm by 44 mm with a 30.4° FOV, enabling the startup to deliver FIR technology at a low price and in a package small enough to be installed almost anywhere on a vehicle.

While superior detection capabilities and affordability are undoubtedly necessary to bring Level-3 and higher autonomy to the automotive market, OEMs are unlikely to pursue technologies seriously if they are not aesthetically pleasing: at the end of the day, the consumer still has to like the way the car looks. However, the autonomous vehicles being developed and trialed today are not always as aesthetically pleasing as they are technologically advanced. For example, many of these models carry massive lidar units on their roofs, an obtrusive design that OEMs will likely want to abandon if they are to bring autonomous vehicles to the greater public.

For the previous generation of FIR technology, aesthetic integration was difficult to achieve. Because the sensors were big, the only place they could be installed for both effective operation and decent appearance was a vehicle's grille. Now, however, in the FIR revolution, new sensors like AdaSky's Viper can be installed almost anywhere on the vehicle for complete detection, classification, and analysis in an aesthetically pleasing integration.

Level-3 and higher vehicle autonomy can only be realized when vehicles can reliably see and understand their surroundings in any environmental condition. It has become clear to automakers that these capabilities cannot be achieved by one sensor alone but instead require the fusion of several different sensing modalities. As OEMs and Tier 1 suppliers have begun trialing self-driving prototypes globally, FIR has become recognized as a sensing technology vital to the success of vehicle autonomy.

Specifically, AdaSky's Viper has emerged as one of the sensing solutions most capable of delivering the perception and coverage needed to facilitate Level-3 and higher autonomy. Newly affordable for mass-market deployment, today's new-generation FIR technology satisfies key use cases for AVs and offers the needed redundancy for safety. When fused with CMOS cameras, FIR technology becomes the complete, affordable sensing solution that will enable OEMs to bring Level-3 and higher autonomous vehicles to the mass market.

 

About the author

Yakov Shaharabani is the CEO of AdaSky. Yakov is a seasoned business executive and strategic thinker with extensive, proven experience leading large teams, navigating complex environments, and driving successful outcomes even through highly tense situations. He comes to the technology world with more than 30 years of experience in the Israeli Air Force, having started as a young pilot and risen through the ranks to its most senior position as General of the Israeli Air Force. Yakov now leverages his decades of defense experience flying helicopters and aircraft equipped with FIR (far infrared) thermal sensing, an advanced and mature technology in that industry, to bring that knowledge to the automotive market.

An esteemed expert in strategic decision making and practical FIR sensing technology, Yakov has delivered presentations and been a guest speaker on panels about security, strategic military issues, and business strategy, in both Hebrew and English, at events across the US, Europe, and Israel. Yakov is also the founder of SNH Strategies LTD, a company focused on strategic consulting and leadership education, and he has spent the past five years advising technology companies.

He earned his B.S. with honors in economics and computer science, and an M.A. in National Resource Strategy (cum laude) from The National Defense University (NDU) in Washington, D.C., where he received the NDU President Award for Visionary and Strategic Writing. He also co-authored the book "Leadership – Agile nella Complesitta" (Leadership and Agility under Complexity), published in Italy, which discusses leadership and strategy in complex environments. He has also served as Senior Advisor for the Foundation for Defense of Democracies, as well as the Deputy Defense Attaché to the US and Canada.