How Perseverance landed on Mars with its terrain nav system

The Perseverance landing on Mars in February included 20 nerve-wracking seconds as the rover descended to the surface via parachute and relied solely on a Terrain Relative Navigation (TRN) system to find a safe landing spot in Jezero Crater.

It all worked out well, but Perseverance mission engineers at NASA JPL were on pins and needles until touchdown, including Dr. Andrew Johnson, who headed up development of TRN over six years.

“You worry, did you do it right,” he said in a keynote interview that kicked off Sensors Innovation Week: Summer Edition on Tuesday. (See the full video interview embedded below.) “Prior to landing, all of us were very tense, as expected, for months beforehand. It worked better than we even thought it would. We’re quite happy.”

Johnson is a principal robotic systems engineer at NASA JPL, where he started in 1997 after completing a PhD in robotics at Carnegie Mellon University.

He offered an in-depth view of how TRN worked on its first try on Mars. There was never a way to fully prove the system in a real-world test on Earth, because flying it on an actual rocket would have been too expensive.

Engineers instead relied on hundreds of simulations of how its inertial measurement unit (IMU), a specially built computer and a camera would perform. Pilots did fly a real helicopter 2.6 miles above the California desert (so high they needed oxygen) to field test how well the lander vision system could estimate position.
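
Johnson did not describe JPL's simulation tools in detail, but the basic idea behind such Monte Carlo runs can be sketched: accumulate dead-reckoning error from the IMU during descent, apply a camera-based position fix partway down, and look at the spread of final position errors over many trials. The Python below is an illustration only; every noise value and step count in it is an assumption, not mission data.

```python
# Illustrative Monte Carlo sketch, not JPL's simulation suite: accumulate
# IMU dead-reckoning drift during a ~7 minute descent, optionally apply one
# camera-based "vision fix" on the parachute, and compare the spread of
# final position errors. All noise values are assumptions, not mission data.
import numpy as np

rng = np.random.default_rng(0)
N = 1000                 # simulated descents
steps = 420              # ~7 minutes at one update per second
drift_sigma = 10.0       # metres of random dead-reckoning drift per step (assumed)
fix_step = 400           # step at which the map-relative fix arrives (assumed)
fix_sigma = 40.0         # metres, 1-sigma accuracy of the vision fix (assumed)

def run(use_fix: bool) -> np.ndarray:
    """Return the radial landing-position error for N simulated descents."""
    finals = np.zeros(N)
    for i in range(N):
        err = np.zeros(2)                                  # (east, north) error
        for t in range(steps):
            err += rng.normal(0.0, drift_sigma, size=2)    # IMU drift accumulates
            if use_fix and t == fix_step:
                err = rng.normal(0.0, fix_sigma, size=2)   # fix replaces drift error
        finals[i] = np.linalg.norm(err)
    return finals

print(f"IMU only   : {run(False).mean():6.0f} m mean error")
print(f"with a fix : {run(True).mean():6.0f} m mean error")
```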

Even though things worked well, Johnson revealed some insights regarding the February 18 landing that were not widely reported at the time.

“We expected to land in the middle of the landing ellipse, a very flat and benign region next to these cliffs created by a delta, but what happened was the wind blew us southeast on the parachute,” he explained. The TRN system picked the safest site it could reach with the remaining fuel on board.

“The safest site…was quite a ways from the intended target, which was totally fine and within acceptable,” he said. But there were still large, sharp boulders nearby. In its final resting place, the rover set down in an area with some small rocks.

TRN works by comparing what the downward-facing camera sees, processing one image per second over about 20 seconds while on the parachute with the help of an FPGA, the specially built computer and the IMU. Landmarks in those images are matched against landmarks in maps, taken earlier by Mars orbiters, that are stored in NAND flash aboard Perseverance. Johnson played a short movie showing how the computer compared the pre-recorded surface images with real-time images, marking matches with blue and green squares at prime landing locations.
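
As a rough illustration of that map-matching step, the sketch below slides a simulated descent-camera patch over a stored orbital map and scores each position with normalized cross-correlation. The map, patch size and noise levels are invented for the example; the flight system's actual matcher is far more sophisticated.

```python
# Toy sketch of map-relative localization (not flight code): find where a
# descent-camera patch best matches a stored orbital map by sliding it over
# the map and scoring each position with normalized cross-correlation.
import numpy as np

rng = np.random.default_rng(1)
orbital_map = rng.random((200, 200))          # stand-in for orbiter-derived map

true_row, true_col = 120, 65                  # where the camera is actually looking
patch = orbital_map[true_row:true_row + 32, true_col:true_col + 32].copy()
patch += rng.normal(0.0, 0.05, patch.shape)   # descent image is noisier than the map

def ncc(a, b):
    """Normalized cross-correlation between two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

best_score, best_pos = -1.0, None
h, w = patch.shape
for r in range(orbital_map.shape[0] - h):
    for c in range(orbital_map.shape[1] - w):
        score = ncc(orbital_map[r:r + h, c:c + w], patch)
        if score > best_score:
            best_score, best_pos = score, (r, c)

print("true position:", (true_row, true_col), " estimated:", best_pos)
```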

Scientists knew in advance that Jezero had a “large number of hazards, including steep cliffs, boulder fields and dunes you can’t get out of,” Johnson said. “We had to build a system to avoid the hazards once you knew where you are.”
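
The hazard-avoidance step can be sketched in a few lines: given an onboard hazard map and an estimate of how far the lander can still divert on its remaining fuel, pick the most benign reachable spot. The Python below is a toy version of that idea with a made-up cost map and divert range, not the flight algorithm.

```python
# Hedged sketch of "pick a safe, reachable site": given a hazard-cost map and
# a reachability radius set by remaining divert fuel, choose the lowest-cost
# cell the lander can still reach. Map and numbers are invented for the demo.
import numpy as np

rng = np.random.default_rng(2)
hazard_cost = rng.random((100, 100))     # 0 = benign terrain, 1 = steep/rocky (assumed)

current = np.array([50, 50])             # map cell under the lander right now
reach_cells = 15                         # divert range in cells, set by remaining fuel

rows, cols = np.indices(hazard_cost.shape)
dist = np.hypot(rows - current[0], cols - current[1])
reachable = dist <= reach_cells

masked = np.where(reachable, hazard_cost, np.inf)    # only consider reachable cells
target = np.unravel_index(np.argmin(masked), masked.shape)
print("divert target cell:", target, " hazard cost:", hazard_cost[target])
```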

All of these landing-spot choices were made by Perseverance autonomously, since one-way communications between Mars and Earth took about 11 minutes at the time (Mars was more than 100 million miles away) and the rover traveled from the top of the atmosphere to the surface in seven minutes. “There’s no way to joystick a landing on Mars,” he said. “All of it has to be autonomous and has to work the first time since we never really tested it on Earth.”
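
The 11-minute figure is simply the one-way light travel time over the Earth-Mars distance on landing day. A quick back-of-the-envelope check, using an approximate distance rather than an official value:

```python
# One-way signal delay = distance / speed of light. The distance below is an
# approximate Earth-Mars separation in February 2021 (an assumption for this
# check), not an official figure.
SPEED_OF_LIGHT_MILES_PER_S = 186_282
earth_mars_miles = 127_000_000            # approximate separation at landing (assumed)

delay_s = earth_mars_miles / SPEED_OF_LIGHT_MILES_PER_S
print(f"one-way light time: {delay_s / 60:.1f} minutes")   # roughly 11 minutes
```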

Engineers had previously loaded maps of the surface, taken by the orbiters, that showed hazards over a 20 kilometer by 20 kilometer area, in case something like wind forced Perseverance off course.

TRN’s job on Mars is not done. Engineers have repurposed the Perseverance computer to process stereo vision from cameras on the rover’s mast, tracking how far the rover has moved so it can navigate autonomously around hazards as it drives. TRN has helped Perseverance drive to locations three to five times faster than Curiosity, an earlier Mars rover.
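
How stereo vision turns into distance is straightforward in principle: a feature's range follows from the disparity between the left and right images, and tracking how those ranges and positions change from frame to frame tells the rover how far it has driven. The snippet below is a minimal illustration with assumed camera parameters, not Perseverance's actual values or vision pipeline.

```python
# Minimal stereo-range sketch (illustrative, not the rover's vision pipeline):
# with a calibrated stereo pair, range to a feature falls out of the disparity
# between the left and right images. Baseline and focal length are assumed.
focal_length_px = 1200.0      # focal length in pixels (assumed)
baseline_m = 0.42             # separation between the two cameras in metres (assumed)

def depth_from_disparity(disparity_px: float) -> float:
    """Range to a feature that appears disparity_px apart in the two images."""
    return focal_length_px * baseline_m / disparity_px

# A rock seen 25 px apart in the two images is about 20 m away; watching that
# range change frame to frame gives how far the rover has moved.
print(f"{depth_from_disparity(25.0):.1f} m")
```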

TRN will be used for landings on future Mars missions and can be adapted for landings on the Moon, comets or asteroids. An obvious application on Earth is autonomous package-delivery drones, where it could serve as a backup to GPS. Commercial companies like SpaceX are planning to land on the Moon, where there is no GPS but an accurate landing is essential.

TRN relies on comparing real-time images with stored maps, but in some cases there are no good maps. The Cassini mission mapped Titan, the moon of Saturn where the Dragonfly drone will land on a future mission, but those maps are not considered high resolution. Europa, a moon of Jupiter, will be visited by the Europa Clipper, which will orbit the icy moon to investigate whether it has conditions suitable for life.

The primary mission of Perseverance on Mars is to drive to different locations and collect dust and rock samples that will eventually be returned to Earth for analysis, seeking evidence of ancient microbial life on the Red Planet. Those sample-return missions could come as soon as 2026.


Editor’s Note: Andrew Johnson’s keynote interview led off the first day of Sensors Innovation Week: Summer Edition which continues through Thursday.  Registration is free and his talk and many others are available both live and on-demand.