For five years, NASA’s Perseverance rover has been doing something almost painfully relatable. Every time it got a little lost on Mars, it stopped driving, snapped a panoramic photo, and essentially called home to ask for directions. Given that home is roughly 140 million miles away, that phone call could eat up an entire Martian day.
Not anymore.
Engineers at the Jet Propulsion Laboratory have flipped on a new capability called Mars Global Localization, and it changes how the rover operates. Perseverance can now match its own camera imagery against orbital terrain maps stored onboard, pinpoint its position to within about 10 inches, and keep rolling. The whole process takes roughly two minutes, no Earth-based hand-holding required.
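JPL hasn't published the matcher's internals, but the core idea — recognizing where a locally observed terrain patch sits inside a larger orbital map — can be sketched with plain template matching. The brute-force normalized cross-correlation below is an illustrative assumption, not the flight algorithm:

```python
import numpy as np

def locate_patch(orbital_map: np.ndarray, local_patch: np.ndarray):
    """Slide `local_patch` over `orbital_map` and return the (row, col)
    offset with the highest normalized cross-correlation score."""
    H, W = orbital_map.shape
    h, w = local_patch.shape
    p = local_patch - local_patch.mean()
    p_norm = np.linalg.norm(p)
    best_score, best_pos = -np.inf, (0, 0)
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            window = orbital_map[r:r + h, c:c + w]
            wz = window - window.mean()
            denom = np.linalg.norm(wz) * p_norm
            if denom == 0:
                continue  # flat window: correlation undefined, skip it
            score = float((wz * p).sum() / denom)
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score

# Toy demo: a synthetic "orbital map", with the rover's view cut from a
# known spot. The matcher recovers that spot from the imagery alone.
rng = np.random.default_rng(0)
terrain = rng.random((60, 60))
patch = terrain[17:26, 32:41].copy()
pos, score = locate_patch(terrain, patch)
# pos == (17, 32): the patch is found at its true location
```

A real system would have to handle lighting changes, viewing-angle distortion, and scale differences between rover and orbiter imagery, which is what makes the problem an open one in robotics; this sketch only shows the matching step at its simplest.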
“Imagine you’re alone in a vast desert, with no roads and no maps, and you only get one phone call a day to ask, ‘Where am I?’” said Vandi Verma, a JPL space roboticist. “That’s what NASA’s Perseverance rover has had to do on Mars for five years.”
The problem was always accumulating error. Perseverance tracked its position by photographing geological features every few feet and accounting for wheel slippage in the Martian dust. But small mistakes compounded, and on longer drives, uncertainty about its location could balloon past 100 feet.
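The compounding is easy to simulate. In the toy dead-reckoning model below, each commanded step loses a little distance to wheel slip (the 2% slip and the noise figure are made-up illustrations, not mission values), and the gap between the odometry estimate and ground truth widens with every meter driven:

```python
import numpy as np

rng = np.random.default_rng(7)

step = 0.5        # commanded step length in meters (illustrative)
slip = 0.98       # wheels deliver ~98% of commanded motion (assumed)
noise = 0.01      # random per-step error in meters (assumed)
n_steps = 400     # roughly a 200-meter drive

commanded = np.full(n_steps, step)
actual = commanded * slip + rng.normal(0.0, noise, n_steps)

est = np.cumsum(commanded)   # where odometry believes the rover is
truth = np.cumsum(actual)    # where the rover actually is
drift = est - truth          # accumulated position error

# A 1 cm error per step is invisible at first, but after 400 steps
# (~200 m commanded) the estimate is off by roughly 4 meters.
```

Perseverance's real odometry is far better than this caricature — it corrects with imagery every few feet — but the shape of the problem is the same: without an absolute fix against a map, per-step errors only ever add up.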
If the rover calculated it might be too close to hazardous terrain, it would freeze in place and wait for JPL to radio back confirmation that it was safe to proceed. That back-and-forth was a brutal bottleneck. The rover’s autonomous hazard-avoidance system, called AutoNav, was actually quite good at steering around rocks and slopes, but location uncertainty, not obstacles, was the real leash.
Here’s where the story gets wonderfully scrappy. The computing muscle behind this wasn’t some purpose-built space processor. It was a Qualcomm Snapdragon 801, a smartphone chip from 2014 that Perseverance carried specifically to communicate with Ingenuity, the little helicopter drone that hitched a ride to Mars.
Ingenuity was only supposed to fly five times. It flew 72 before snapping a rotor blade in early 2024.
With Ingenuity grounded, someone at JPL had a lightbulb moment: that Snapdragon chip was just sitting there, doing nothing. Meanwhile, the rover’s primary radiation-hardened processor dates back to 1997. The Snapdragon, though unshielded against cosmic radiation, offered a massive leap in computing power, so why not put it to work?
They did. And it worked. JPL tested the localization algorithm against imagery from 264 previous rover stops, and the software nailed the location every single time.
There’s a catch, of course. That unhardened chip is slowly getting chewed up by solar radiation, and NASA has already detected damage to 25 processing bits. Perseverance isolated the corrupted sections and carried on, living up to its name, but the degradation will worsen over time.
Still, the implications are enormous. This upgrade arrived just weeks after Perseverance completed its first drive fully planned by generative artificial intelligence, which independently assessed terrain hazards and plotted a safe route using orbital imagery. Stack autonomous navigation on top of autonomous route planning, and you’re looking at a rover that barely needs human oversight for daily operations.
“We’ve given the rover a new ability,” said Jeremy Nash, the JPL robotics engineer who led the project. “This has been an open problem in robotics research for decades, and it’s been super exciting to deploy this solution in space for the first time.”
The technology isn’t Mars-specific either. Verma says the algorithm could work for virtually any rover moving fast and far, whether that’s on the Moon, Mars, or somewhere else entirely. Future missions could spend their time doing actual science instead of waiting hours for a position fix from a planet hundreds of millions of miles away.
A broken helicopter, a recycled phone chip, and some clever engineering. That’s how you build GPS on another world.