NASA’s Perseverance Rover Drives 456 Meters on Mars Using AI With No Human Input for Two Days

Illustration of NASA’s Perseverance rover navigating the Martian surface

NASA’s Perseverance rover recently completed a two-day drive across Mars without direct human control, marking a new step in autonomous space exploration.

During a December test, engineers allowed artificial intelligence to plan the rover’s route instead of relying on step-by-step instructions from Earth. Over two separate days, Perseverance traveled a combined 456 meters (1,496 feet) using AI-generated waypoints, navigating the Martian surface on its own.

NASA said the demonstration shows how advanced rover autonomy has become, especially as missions move farther from Earth. A round-trip signal between Earth and Mars can take roughly 25 minutes, depending on the planets' positions, making real-time control impossible, so rovers must often make decisions without immediate guidance.
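
For perspective, that delay is simply light-travel time, and it swings widely as the Earth–Mars distance changes over the orbital cycle. The short sketch below, using approximate distances rather than mission figures, shows the range:

```python
# Round-trip signal delay as a function of Earth-Mars distance.
# Distances below are rough, illustrative values, not mission data.

SPEED_OF_LIGHT_KM_S = 299_792.458  # km/s

def round_trip_delay_minutes(distance_km: float) -> float:
    """Return the two-way light-time delay in minutes for a given distance."""
    one_way_seconds = distance_km / SPEED_OF_LIGHT_KM_S
    return 2 * one_way_seconds / 60

# Earth-Mars separation swings roughly between ~55 million km (closest approach)
# and ~400 million km (when the planets are on opposite sides of the Sun).
for label, km in [("closest", 55e6), ("mid-range", 225e6), ("farthest", 400e6)]:
    print(f"{label:>9}: ~{round_trip_delay_minutes(km):.0f} min round trip")
```

At a mid-range separation of about 225 million kilometers, the round trip works out to roughly 25 minutes; at closest approach it falls to around six.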

“Autonomous technologies like this help missions operate more efficiently, handle difficult terrain, and gather more science data,” said NASA Administrator Jared Isaacman.

Under normal operations, drivers on Earth study images and terrain maps, then send short driving instructions spaced about 100 meters apart. These commands travel through NASA’s Deep Space Network and are relayed by Mars orbiters to the rover.

For this test, AI took over much of that planning. The system analyzed high-resolution orbital images from the Mars Reconnaissance Orbiter’s HiRISE camera along with digital elevation models. It identified hazards such as loose sand, rocks, and steep terrain, then created a safe path using multiple waypoints.
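
The article does not detail how the planner works internally, but the general idea of routing around hazards on a slope map can be pictured with a toy grid search. The sketch below runs a generic A* planner over a made-up elevation grid, with an arbitrary slope threshold standing in for the hazard rules; it is an illustration of the concept, not NASA's software.

```python
# Hazard-aware waypoint planning over a digital elevation model (toy version).
# The DEM, cell size, and slope threshold below are invented for illustration.
import heapq
import math

import numpy as np

CELL_SIZE_M = 1.0          # assumed grid resolution (hypothetical)
MAX_SLOPE_DEG = 20.0       # cells steeper than this are treated as hazards

def slope_degrees(dem: np.ndarray) -> np.ndarray:
    """Approximate per-cell slope from elevation gradients."""
    gy, gx = np.gradient(dem, CELL_SIZE_M)
    return np.degrees(np.arctan(np.hypot(gx, gy)))

def plan_path(dem: np.ndarray, start, goal):
    """A* search that avoids steep cells and prefers gentler terrain."""
    slope = slope_degrees(dem)
    hazard = slope > MAX_SLOPE_DEG
    rows, cols = dem.shape

    def h(p):  # straight-line heuristic to the goal
        return math.hypot(p[0] - goal[0], p[1] - goal[1])

    frontier = [(h(start), 0.0, start, None)]
    came_from, best_cost = {}, {start: 0.0}
    while frontier:
        _, cost, cur, parent = heapq.heappop(frontier)
        if cur in came_from:
            continue
        came_from[cur] = parent
        if cur == goal:
            break
        for dr, dc in [(-1, 0), (1, 0), (0, -1), (0, 1),
                       (-1, -1), (-1, 1), (1, -1), (1, 1)]:
            nxt = (cur[0] + dr, cur[1] + dc)
            if not (0 <= nxt[0] < rows and 0 <= nxt[1] < cols) or hazard[nxt]:
                continue  # off the map or flagged as a hazard
            step = math.hypot(dr, dc) * (1.0 + slope[nxt] / MAX_SLOPE_DEG)
            new_cost = cost + step
            if new_cost < best_cost.get(nxt, float("inf")):
                best_cost[nxt] = new_cost
                heapq.heappush(frontier, (new_cost + h(nxt), new_cost, nxt, cur))

    # Walk back from the goal to recover the waypoint list.
    path, node = [], goal
    while node is not None:
        path.append(node)
        node = came_from.get(node)
    return path[::-1] if path and path[-1] == start else []

# Toy DEM: a flat plain with a steep mound the planner should route around.
dem = np.zeros((40, 40))
dem[15:25, 15:25] = 8.0
print(plan_path(dem, start=(0, 0), goal=(39, 39))[:5], "...")
```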

Perseverance’s onboard auto-navigation software followed those instructions while moving, processing images and adjusting in real time. This allowed the rover to drive farther with fewer stops than traditional methods.
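
Conceptually, the onboard side reduces to a sense-check-step loop: advance a short distance toward the next waypoint, look for hazards in the latest images, and either continue or hold. The sketch below is a bare-bones illustration of that loop with placeholder sensing; it is not Perseverance's actual auto-navigation code.

```python
# Toy drive loop: step toward each uplinked waypoint, re-checking locally
# sensed hazards between steps. Sensing and steering here are placeholders.
import math

def drive(waypoints, position, sense_hazard, step_m=0.5, tolerance_m=1.0):
    """Advance through waypoints, pausing whenever the path ahead looks unsafe."""
    x, y = position
    for wx, wy in waypoints:
        while math.hypot(wx - x, wy - y) > tolerance_m:
            heading = math.atan2(wy - y, wx - x)
            ahead = (x + step_m * math.cos(heading), y + step_m * math.sin(heading))
            if sense_hazard(ahead):          # e.g., a rock spotted in fresh imagery
                print(f"hazard near {ahead}; holding for a local replan")
                return (x, y)
            x, y = ahead                     # take one short, checked step
        print(f"reached waypoint ({wx}, {wy})")
    return (x, y)

# Example run with no hazards and two waypoints.
drive([(5.0, 0.0), (5.0, 5.0)], (0.0, 0.0), sense_hazard=lambda p: False)
```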

Before sending the route to Mars, NASA engineers tested everything using a full-scale twin of the rover at the Jet Propulsion Laboratory’s Mars Yard in California. This engineering model, known as the Vehicle System Test Bed, helps teams simulate drives and fix problems safely on Earth.

Vandi Verma, a roboticist on the mission, said generative AI is improving three key tasks: identifying obstacles, tracking the rover’s location, and planning safe movement. The aim is to support longer drives with less operator involvement.

One challenge remains. The farther the rover travels without human checks, the less certain it becomes about its exact position. NASA is working on AI tools that can re-match ground images with orbital photos to correct this drift automatically.
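
One way to picture that re-matching step is basic template matching: slide a small patch of locally derived terrain over the orbital basemap, find the best correlation, and read off the correction. The toy example below uses synthetic arrays and brute-force normalized cross-correlation; a flight system would be far more sophisticated.

```python
# Drift correction by matching a local terrain patch against an orbital map
# via normalized cross-correlation. Arrays are synthetic, purely illustrative.
import numpy as np

def locate_patch(orbital, patch):
    """Return the (row, col) in the orbital map where the patch fits best."""
    ph, pw = patch.shape
    p_norm = (patch - patch.mean()) / (patch.std() + 1e-9)
    best, best_score = (0, 0), -np.inf
    for r in range(orbital.shape[0] - ph + 1):
        for c in range(orbital.shape[1] - pw + 1):
            window = orbital[r:r + ph, c:c + pw]
            w_norm = (window - window.mean()) / (window.std() + 1e-9)
            score = float((p_norm * w_norm).mean())  # normalized correlation
            if score > best_score:
                best, best_score = (r, c), score
    return best

# Synthetic test: cut a patch from a random "orbital map", pretend onboard
# dead reckoning has drifted a few cells, and recover the offset.
rng = np.random.default_rng(0)
orbital_map = rng.normal(size=(120, 120))
true_pos = (70, 45)
patch = orbital_map[true_pos[0]:true_pos[0] + 20, true_pos[1]:true_pos[1] + 20]

estimated = locate_patch(orbital_map, patch)
believed = (73, 42)   # where dead reckoning thinks the rover is
drift = (estimated[0] - believed[0], estimated[1] - believed[1])
print("estimated position:", estimated, "drift correction:", drift)
```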

Researchers expect future missions to rely even more on AI. Concepts include rovers that travel kilometers at a time, fleets of drones scouting terrain ahead, and smart systems that flag interesting science targets without waiting for instructions from Earth.

NASA’s upcoming Dragonfly rotorcraft mission to Saturn’s moon Titan will also depend heavily on autonomous navigation and onboard decision-making.

As space missions push deeper into the solar system, engineers say these technologies will help robots work longer, travel farther, and collect more data without constant supervision. Perseverance’s recent drive offers an early look at how planetary exploration may operate in the years ahead.