Optical Sensors Are A Key Technology For The Autonomous Car

Sensors Insights by Rajeev Thakur


In order to keep an eye on their surroundings and their driver at all times and respond appropriately, autonomous cars of the future will combine a variety of technologies. While fully autonomous driving is still limited to prototypes today, the first partly autonomous cars are already on our roads. Optical sensors based on infrared lasers and LEDs are one of the key technologies that allow today's, and tomorrow's, intelligent systems to gradually ease the burden on the driver.

Boredom in slow-moving traffic or on the highway will soon be a thing of the past. Instead, the car will take us to our destination independently, while we relax and devote ourselves to other things. For today's road users this is still wishful thinking, but numerous prototypes and headline-grabbing test drives prove that car manufacturers, suppliers, and software companies are working on concepts for autonomous driving.

For the algorithms that steer the car to make correct and safe driving decisions, they must have detailed knowledge of the driving situation and the car's surroundings. This data is collected by sensors, among other components, usually radar, laser, or camera systems (see figure 1).

Fig. 1: All-round visibility for autonomous driving: In addition to radar, the measurements from cameras and laser sensors, in particular, will make autonomous driving possible in future.

Many of these sensors are already used today in individual driver assistance systems such as the Stop-and-Go Assistant, Parking Assistant, Lane Keeping Assistant, or Emergency Brake Assistant. These systems are gradually gaining autonomy and will initially enable partly automated driving, e.g. on highways, before leading to fully automated driving. On the sensor side, the challenge is to combine the various technologies optimally in order to cover as many functions as possible with the data and to provide the redundant data sources required for safety reasons.


Environment Detection Using Lasers

Laser sensors consist of lasers and detectors and measure distances via the light propagation time; this principle is known as LIDAR, or Light Detection and Ranging (see figure 2). The sensor transmits a light pulse, which is reflected back onto the detector by the object towards which the beam was directed. The distance between object and sensor is obtained from the time the light pulse needs to reach the object and return. The range depends on the laser power, the visibility conditions, and the reflectivity of the object. One of the first automotive applications of LIDAR was adaptive cruise control, which measures the distance to the car ahead and adjusts the car's own speed accordingly.
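The time-of-flight principle described above reduces to a one-line calculation: the distance is half the measured round-trip time multiplied by the speed of light. A minimal sketch (the 200-ns example value is hypothetical):

```python
# Time-of-flight distance measurement as used in LIDAR:
# the pulse travels to the object and back, so the distance
# is half the round-trip time times the speed of light.

C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance_m(round_trip_s: float) -> float:
    """Distance to the reflecting object for a measured round-trip time."""
    return C * round_trip_s / 2.0

# A round trip of 200 ns corresponds to roughly 30 m,
# about the distance to a car ahead in highway traffic.
distance = tof_distance_m(200e-9)
```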

Fig. 2: Laser-Radar (LIDAR): A transmitted light pulse (green) is reflected by an object - in this case the car in front - onto the detector. The distance between the object and car is calculated from the propagation time of the light pulse.

There are various types of LIDAR sensors. In the first type, a laser transmits short light pulses that illuminate the entire scene, and a simple detector array, usually a line array, registers the spatially resolved signals. The system measures the distance to objects in the surrounding area and then uses the time sequences to determine how the objects move. Such laser systems typically cover an angle range of ±4° (vertical) and ±20° (horizontal).

Laser scanners, on the other hand, cover a very wide field of view. They direct a focused beam across the scene using a rotating mirror. The best-known example is the 360° laser scanner on the roof of the Google car.

Due to design considerations, the scanners are usually integrated into the car body. An example is the Audi demonstrator vehicle that drove from San Francisco to Las Vegas for the 2015 Consumer Electronics Show and travelled autonomously on all highway sections. In addition to radar sensors and video cameras, it was equipped with a laser scanner in the radiator grille and another in the rear.

Laser scanners generally supply a point cloud of the surrounding area, in which every point is associated with its actual distance from the car. With their high angular resolution of less than one degree, laser scanners make it possible to identify objects and can distinguish, for example, between a rubbish bin and a pedestrian at the edge of the road. Laser scanners also register obstacles directly in front of the car, a pedestrian's legs, for example, which makes them a suitable supplement to radar sensors.
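Such a scan can be turned into Cartesian points by combining each beam's measured range with its known angle. A sketch under stated assumptions: the 0.5° step and the -20° start angle are hypothetical values consistent with the sub-degree resolution and ±20° horizontal range mentioned above.

```python
import math

def scan_to_points(ranges_m, start_deg=-20.0, step_deg=0.5):
    """Convert a horizontal laser scan (one range reading per beam angle)
    into 2D Cartesian points relative to the sensor."""
    points = []
    for i, r in enumerate(ranges_m):
        theta = math.radians(start_deg + i * step_deg)
        # x points forward, y to the side of the car
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

# Three adjacent beams, each seeing an object 10 m away:
pts = scan_to_points([10.0, 10.0, 10.0])
```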


Lasers For LIDAR Within The Car

LIDAR systems use infrared lasers with short switching times and high output power; 905 nanometers (nm) is a common wavelength. This spectral range is barely perceptible to humans, while the detector is still sensitive to it. A typical optical pulse power is about 25 watts (W).

Osram was the first LED manufacturer to implement LIDAR sensors in the car with its pulse lasers more than ten years ago. To increase the power per laser diode, Osram developed its Nanostack technology, in which three laser diodes are epitaxially stacked in one chip. For example, the pulse laser diode SPL PL90_3 and the smart laser SPL LL90_3 supply an optical power of more than 75 W. The SPL LL90_3 also has integrated driver electronics, which generate a current pulse of about 50 A, facilitating laser pulses with steep edges and pulse lengths of about 20 ns (see figure 3).

Fig. 3: LIDAR systems have been used in cars for more than ten years: The pulse laser diode SPL LL90_3 is equipped with an integrated driver to generate short pulses using high currents.

Avalanche photodiodes (APDs) or the less expensive PIN photodiodes with fast switching times of a few nanoseconds (ns) are used as detectors. Suitable surface-mountable (SMT) PIN photodiodes with the necessary high sensitivity include the BPW 34S and SFH 2400. Thanks to their short light pulses, automotive LIDAR systems fall into laser class 1 and are not harmful to the human eye.

The next development step will be the transition to SMT packages. Today, laser diodes typically come in through-hole packages, partly because edge-emitting laser chips require novel concepts for surface mounting. A series of further developments is conceivable for the future, depending on the application requirements.

Longer wavelengths of about 1550 nm, for example, could make it possible to increase the optical power while still meeting eye-safety standards. This spectral range would, however, require new detector technology. LIDAR systems could also benefit from integrated subsystems, such as a combination of laser and driver electronics or of detector and ASIC, as they facilitate shorter switching times and a higher time-based measuring resolution.


Camera Systems With Infrared Auxiliary Lighting

Cameras form the basis of many parking and lane assistants today. Intelligent processing of camera images and videos makes it possible to capture the area surrounding the car in detail, right down to recognizing traffic signs. The more autonomous driving becomes, the more reliable the interpretation of the camera images must be for decision-making. The best possible image quality is the prerequisite for this.

Additional illumination of the scene with infrared light is therefore useful at dusk or at night. Powerful infrared LEDs (IREDs) with an 850-nm wavelength are suitable light sources. Camera sensors detect this spectral range well, while humans barely perceive it.

One of the first lighting applications for IREDs in the car was the camera-based Night Vision Assistant. These systems illuminate about 150 meters of road with infrared light and generate a grey-scale image that is shown on a display. To achieve the required brightness, the IREDs used must deliver very high optical power and be suitable for continuous operation at high currents.

Osram's Oslon Black SFH 4715A IRED delivers about 800 mW of optical power at 1 A (see figure 4). It features a typical efficiency of 48% at 1 A and a typical thermal resistance of 6.5 kelvin per watt (K/W), allowing continuous operation at up to 1 A.
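The figures quoted above are enough to estimate the thermal budget of such an emitter: the electrical input power follows from the optical power and efficiency, the difference is dissipated as heat, and the thermal resistance converts that heat into a junction temperature rise. A rough sketch; the simple lumped model below is an assumption, not a datasheet calculation:

```python
def led_thermal(optical_w, efficiency, r_th_k_per_w):
    """Estimate electrical input power, dissipated heat, and junction
    temperature rise for an IRED operating point (simple lumped model)."""
    electrical_w = optical_w / efficiency
    heat_w = electrical_w - optical_w      # power not emitted as light
    delta_t_k = heat_w * r_th_k_per_w      # rise above the solder point
    return electrical_w, heat_w, delta_t_k

# SFH 4715A values quoted above: 800 mW optical, 48% efficiency, 6.5 K/W.
# Roughly 1.67 W goes in, about 0.87 W becomes heat, and the junction
# sits only a few kelvin above the solder point.
electrical, heat, dt = led_thermal(0.8, 0.48, 6.5)
```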

Fig. 4: Infrared light for good camera images: Oslon Black is currently one of the most powerful infrared LEDs with an 850-nm wavelength. It delivers 800 mW of optical power (SFH 4715A) at 1 A of current, or even 1370 mW in the stack version (SFH 4715AS).

To achieve even brighter single transmitters, Osram transferred its Nanostack technology to IREDs and implemented a dual emitter. This broke the 1-W mark: in the SFH 4715AS, this chip and package generation achieves 1370 mW of optical power in continuous operation at 1 A.

In pulse mode, currents of up to 3 A are permitted, and the thermal resistance can be lowered to 5.5 K/W. The requirements for the beam angle of the IREDs differ depending on where the cameras are used in the car, which is why the Oslon Black comes in two variants. The 90° version interacts particularly well with external lenses that shape the beam for the respective application. The 150° version illuminates a large area in the vicinity and is also suited to implementing tight light cones with long ranges using reflector-based optics.
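The effect of the beam angle can be made concrete with a flat-target approximation: the illuminated spot diameter grows with the tangent of the half-angle. An illustrative calculation, not from the source:

```python
import math

def spot_diameter_m(distance_m, full_angle_deg):
    """Diameter of the illuminated area at a given distance for an emitter
    with the given full beam angle (flat-target approximation)."""
    return 2.0 * distance_m * math.tan(math.radians(full_angle_deg / 2.0))

# At 10 m, the two Oslon Black variants cover very different areas:
# the 90° version lights a spot 20 m across, the 150° version
# floods a region roughly 75 m wide.
narrow = spot_diameter_m(10, 90)
wide = spot_diameter_m(10, 150)
```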


Cameras In The Car Interior

The use of cameras will also increase in the car interior. An automatically driven car must know what the driver is occupied with so that it can direct his attention back to the traffic in time before handing control back to him in certain situations.

Today, so-called Alertness Assistants mainly monitor whether the driver is becoming tired, for example by analyzing steering and pedal movements and by using the front camera to detect typical drifting between lanes. If a camera watches the driver's face, the blink frequency can be determined to recognize the onset of fatigue. Likewise, determining the viewing direction allows conclusions about the driver's current attentiveness, for example whether he is looking ahead at the road or is distracted at a moment of danger.
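A blink-frequency check of the kind described can be sketched as follows. This is a minimal illustration, not the method of any specific assistant, and the threshold value is purely hypothetical:

```python
def blink_rate_per_min(blink_times_s):
    """Blink frequency over the observed window, in blinks per minute,
    given the timestamps (in seconds) of detected blinks."""
    if len(blink_times_s) < 2:
        return 0.0
    window_s = blink_times_s[-1] - blink_times_s[0]
    return (len(blink_times_s) - 1) / window_s * 60.0

# Hypothetical threshold: a sustained rise in blink rate
# is treated as a sign of fatigue.
FATIGUE_THRESHOLD_PER_MIN = 25.0

def driver_drowsy(blink_times_s):
    return blink_rate_per_min(blink_times_s) > FATIGUE_THRESHOLD_PER_MIN
```

In practice, such systems would combine the blink rate with other cues, such as blink duration and gaze direction, before drawing any conclusion.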

The interior of the vehicle is not bright enough to obtain high image quality at all times of day using available light alone, which is why additional infrared illumination is required.

So as not to distract the driver, longer-wavelength emitters at 940 nm are used instead of 850-nm IREDs, whose light is still perceived in the dark as a weak red glow. The requirements regarding optical power in continuous operation are high, as a relatively large area has to be illuminated with the smallest possible lighting units.

Conclusions

The self-driving car of the future must be very well informed about its surroundings and its driver, in order to safely make the correct driving decisions. Optical technologies are playing an important role in capturing this information.

Sensor development is advancing rapidly in order to measure additional details. Sensor fusion, the merging of systems that previously worked largely independently, will also enable additional functions. This results in new requirements regarding the field of view, the wavelengths, and the required optical power of the light sources.

Continuous development of light sources for optical sensors will play an important role on the road to automated, self-driving vehicles. The previously separate assistance systems will be merged into larger complete systems, changing the requirements that the separate sensors have to meet.

About the Author

Rajeev Thakur is a product marketing manager at OSRAM Opto Semiconductors. His current focus is on LIDAR, driver monitoring, night vision, blind spot detection, and other ADAS applications. Rajeev has prior experience in the Detroit automotive industry, working for companies such as Bosch, Johnson Controls and Chrysler. He holds an MS in Manufacturing engineering from the University of Massachusetts, Amherst and a BE in Mechanical engineering from Anna University in Madras, India. He is a licensed Professional Engineer and holds three patents on occupant sensing.
