JPL: Autonomously Alone on the Red Planet


When Ingenuity, the little helicopter that could, sprang from the Martian surface into the wispy-thin Martian atmosphere, it notched all kinds of firsts. The first powered, controlled flight on another planet. The first autonomous flight. The first use of an inertial navigation system and visual odometry on another world.

To make this happen, NASA invested $85 million to build Ingenuity, accommodate it onboard Perseverance for the long interplanetary flight and parachute descent, and operate it once it reached distant Mars.

There’s a lot to marvel at in this endeavor, which took the fertile minds of NASA’s Jet Propulsion Laboratory (JPL), NASA Ames Research Center, NASA Langley Research Center and companies that included AeroVironment, Inc. (see accompanying feature) on a six-year journey from inspiration to realization. Awe will be confined in this article to the phenomenon of Ingenuity’s navigation system.


The Mars copter’s flight control system consists of four main subsystems: the Mode Commander, setting the overall mode for the flight control system; the Guidance subsystem, providing reference trajectories for the flight path; the Navigation subsystem, giving estimates of the vehicle state; and the Control subsystem, commanding the actuators based on the reference trajectories and the vehicle state.
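
The division of labor among those four subsystems can be sketched as a single control-cycle function. This is a minimal illustration under assumed names and a toy proportional law, not JPL’s flight code:

```python
from dataclasses import dataclass

@dataclass
class VehicleState:
    """Navigation subsystem output (toy version): position in meters,
    velocity in m/s, attitude as roll/pitch/yaw in radians."""
    position: tuple
    velocity: tuple
    attitude: tuple

def control_cycle(mode: str, guidance_table: dict, state: VehicleState) -> dict:
    # Mode Commander supplies `mode`; Guidance maps it to a reference
    # trajectory; Control compares the reference against the Navigation
    # subsystem's state estimate and emits actuator commands.
    reference = guidance_table[mode]
    altitude_error = reference["altitude"] - state.position[2]
    return {
        "collective": 0.5 + 0.1 * altitude_error,  # toy proportional thrust trim
        "cyclic": (0.0, 0.0),                      # no lateral correction in this sketch
    }

# Hovering 1 m below a 5 m reference nudges collective above the 0.5 trim value.
cmd = control_cycle("HOVER", {"HOVER": {"altitude": 5.0}},
                    VehicleState((0.0, 0.0, 4.0), (0.0, 0.0, 0.0), (0.0, 0.0, 0.0)))
```

The point of the structure is separation of concerns: the mode and reference trajectory change slowly, while the state estimate and actuator commands update every cycle.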

The particular challenges for the navigation system onboard the UAV include:

• A lack of global navigation aids, such as GPS or a strong magnetic field.

• A large communication time lag between Earth and Mars, preventing real-time communication during flight.

• A harsh radiation environment that can adversely affect computing elements.

Because of the time-lag problem, Ingenuity has to perform on its own. Autonomously, in other words. Radio signals from NASA Command take 15 minutes and 27 seconds to travel the 173 million miles (278.4 million kilometers) to Mars. Once on the surface, the more well-endowed Perseverance rover served as a communications relay link so the helicopter and Mission Team on Earth could communicate. It passed flight instructions from NASA’s Jet Propulsion Laboratory in Pasadena, California, to Ingenuity. From a Martian hillock 65 meters away, the four-wheeled rover observed and recorded its four-bladed offspring’s history-making flights.
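
That delay is simply the quoted distance divided by the speed of light, easy to verify:

```python
# One-way light time over the quoted Earth-Mars distance.
distance_km = 278.4e6          # 173 million miles, as quoted in the article
c_km_per_s = 299_792.458       # speed of light in vacuum
delay_s = distance_km / c_km_per_s
minutes, seconds = divmod(delay_s, 60)
# Comes out to roughly 15 and a half minutes; the article's 15 min 27 s
# figure reflects the unrounded distance at the time of the flights.
```

A round trip doubles that, which is why no joystick on Earth could ever fly the helicopter.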

While hovering on its four initial flights, the helicopter’s navigation camera and laser altimeter fed information into the navigation computer to ensure Ingenuity remained not only level, but within the borders of its 10×10 meter airfield, a patch of extraterrestrial real estate chosen for its flatness and lack of obstructions. Because landing hazard avoidance was not prioritized for this technology demonstration, each of those four initial flights began and ended within an area that had been pre-inspected and determined to be safe in terms of obstacles and surface slope.

Ingenuity performed five flights according to its programmed lifetime during a period of 31 Earth days, or 30 sols on Mars. Then came the surprise ending, to date. But more on that later.

For the helicopter’s pre-arranged autonomous test flights, under the NASA rubric of “technology demonstration,” it took off, climbed, hovered, translated between a set of waypoints, then descended to land again (see Figure 1). Although the helicopter did operate independently during flight, the waypoints were specified from Earth prior to flight.


This, however, raises an interesting and somewhat subtle point: is Ingenuity truly autonomous?

It depends on your definition. Engineers at AeroVironment, which built major components of the helicopter but was not involved in the guidance, navigation and control (GNC) system design, weighed in on the issue.

“It really is making autonomous decisions [in managing rotor speed and pitch] to get more cyclic to overcome a wind gust,” said Jeremy Tyler, senior aeromechanical engineer. “It’s managing its altitude, it’s managing its position, all on its own without any external intervention.”

“It’s inherently unstable,” added Matt Keennon, technical lead for rotor system development. “It can’t fly for a half-second without making decisions based on the inertial measurement unit [IMU] and driving the control system.”

“There are no [navigation] decisions being made onboard,” countered Ben Pipenberg, AeroVironment’s engineering lead on the Ingenuity project. “When Perseverance landed, it used terrain-relative navigation, and it was making decisions based on external observable data that it was gathering without human input. That would be an autonomous system. Ingenuity is not doing that. It’s essentially using VIO, visual-inertial odometry, simply to navigate over the ground in a pre-determined flight path, uploaded from Earth.”

Tyler concurred, after a fashion. “It’s doing its own simple autonomy. But really no sophisticated mission planning or decision-making.”


Engineers at JPL under the direction of Håvard Grip, Mars helicopter chief pilot, developed and assembled the visual-inertial navigation system emphasizing robustness, but with a correspondingly limited position accuracy and ability to navigate in complex terrain. In particular, the system assumes that features observed by the navigation camera lie on an approximate ground plane with known slope. This is why the landing field was chosen and why the first four flights did not venture beyond its bounds. The flights took place over relatively flat terrain, with short-term height variations on the order of 10% of the flight height.

The navigation sensors Ingenuity carries are:

• Bosch Sensortec BMI160 IMU, for measuring 3-axis accelerations at 1600 Hz and angular rates at 3200 Hz.

• Garmin LIDAR-Lite v3 laser rangefinder (LRF), for measuring distance to the ground at 50 Hz.

• Downward-looking 640 x 480 grayscale camera with an Omnivision OV7251 global-shutter sensor, providing images at 30 Hz.

• muRata SCA100T-D02 inclinometer, for measuring roll and pitch prior to flight.

All are commercial off-the-shelf (COTS) miniature sensors, largely developed for the cell phone and lightweight drone markets.

Ingenuity also carried a second camera, a 13-megapixel color camera with a horizon-facing view for terrain images, not used for navigation.

Figure 2 shows Ingenuity’s avionics system architecture. A radiation-tolerant field-programmable gate array (FPGA) routes sensor data and traffic between the other computing elements and performs low-level actuator control. Most of the flight control software is hosted on the flight computer (FC).

A separate navigation computer (NC), a 2.26 GHz quad-core Qualcomm Snapdragon 801 processor, provides the throughput for vision-based navigation. On the NC, one core is devoted to camera-image processing and another to the navigation filter, while the remaining cores are used for other activity.

The visual-inertial navigation system provides the control system with real-time estimates of the vehicle state: position, velocity, attitude and angular rates. The state estimate is based on fusing information from the onboard IMU, inclinometer, LRF and navigation camera.


“Before each of Ingenuity’s test flights,” Grip told Inside Unmanned Systems, “we uploaded instructions describing precisely what the flight should look like. But when it came time to fly, the helicopter was on its own and relied on a set of flight-control algorithms that we developed here on Earth, before Ingenuity was even launched to Mars.”

When the copter rests on the ground, preparing to take off, the inclinometer estimates initial roll and pitch. Based on this, initial estimates of the accelerometer and gyro biases are also obtained.
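
The idea can be illustrated with a small sketch (assumed frames and function names, not the flight implementation): at rest, the accelerometer should read only the reaction to Mars gravity, rotated into the body frame by the inclinometer’s roll and pitch; whatever is left over is attributed to sensor bias.

```python
import numpy as np

G_MARS = 3.71  # Mars surface gravity, m/s^2

def accel_bias_from_inclinometer(roll, pitch, accel_measured):
    """Subtract the expected at-rest specific force (reaction to gravity,
    expressed in the body frame) from the measured one; the residual is
    an estimate of the accelerometer bias. Yaw is unobservable here."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])   # roll about x
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # pitch about y
    R = Ry @ Rx                                             # body-to-world
    f_expected = R.T @ np.array([0.0, 0.0, G_MARS])         # body-frame gravity reaction
    return accel_measured - f_expected

# Level ground and an unbiased sensor: the residual comes out (near) zero.
bias = accel_bias_from_inclinometer(0.0, 0.0, np.array([0.0, 0.0, G_MARS]))
```

The gyro bias initialization is even simpler in principle: at rest, any nonzero rate reading is bias.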

Once the vehicle is in motion, integration of the IMU measurements is used to estimate changes in position, velocity and attitude. Only the IMU is used for this critical moment, measuring acceleration and angular rates. After the helicopter reaches 1 meter off the ground, the laser rangefinder and downward-looking camera are added to the navigation solution. This precaution springs from pre-mission concern that the LRF and camera might be obscured by dust kicked up by the copter blades. The IMU will not deliver great accuracy in the long run, but because Ingenuity takes only a few seconds to reach 1 meter, “we can make it work,” Grip said. Ingenuity then begins using its full suite of sensors.
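
A back-of-the-envelope sketch of that IMU-only stretch (illustrative numbers, not flight data): integrating a net upward acceleration at the BMI160’s 1600 Hz accelerometer rate shows the first meter passes in about a second, short enough for IMU drift to stay negligible.

```python
DT = 1.0 / 1600.0   # BMI160 accelerometer sample interval, s

def dead_reckon_vertical(accels):
    """Integrate net vertical acceleration (thrust minus Mars gravity,
    m/s^2, positive up) twice to track climb height, Euler-style."""
    v = z = 0.0
    for a in accels:
        v += a * DT
        z += v * DT
    return z, v

# A constant 2 m/s^2 net upward acceleration covers 1 m in about 1 second,
# consistent with "only a few seconds to reach 1 meter."
z, v = dead_reckon_vertical([2.0] * 1600)
```

Over such a short window, even a consumer-grade accelerometer bias of a few milli-g contributes only millimeters of error, which is why the IMU-only scheme is acceptable here.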

During hover flight, Ingenuity on its semi-autonomous own attempts to maintain a constant altitude, heading and position. The JPL team has to rely on the copter’s estimates of how well it performs this task, as there is limited to no basis for ground truth. But the available data shows that Ingenuity holds its altitude extremely well in hover, to within approximately 1 centimeter, and its heading to within less than 1.5 degrees. Horizontal position can fluctuate up to approximately 25 centimeters, which the team attributes to wind gusts on the Red Planet.


Because of the relatively low accuracy of MEMS-based IMUs, navigation aids must bound the growth in navigation errors as the copter cruises. The LRF provides range measurements between the vehicle and the terrain below, giving vertical velocity and position. With the help of the MAVeN feature-tracking algorithm, the navigation camera tracks visual features on the ground, under the assumption that all features are located on a ground plane with a known slope. This provides horizontal velocity as well as roll and pitch, and helps limit the drift in horizontal position and yaw.

However, the latter two measurements have no absolute reference, and their estimates are subject to long-term drift. Therefore, shortly before touchdown at the end of each flight, a navigation camera image is stored for later transmission to Earth, so that an absolute position and heading fix can be obtained by comparison to the known terrain.

“To develop the flight control algorithms,” Grip wrote in a NASA blog post updating Ingenuity’s followers, “we carried out detailed modeling and computer simulation in order to understand how a helicopter would behave in a Martian environment. We followed that up with testing in a huge 25-meter-tall, 7.5-meter-diameter vacuum chamber here at JPL, where we replicate the Martian atmosphere. But in all of that work, we could only approximate certain aspects of the environment. Now that Ingenuity is actually flying at Mars, we can begin to assess how things stack up against expectations.”

The MAVeN navigation algorithm used “has no absolute references to any landmarks,” according to Grip. “It always operates against a base frame where it sees a bunch of features and tracks them over a limited set of search frames. When it’s done, it requires a completely new base frame. It is always tracking in a relative sense, never tied back to a global frame.”

MAVeN is implemented as an Extended Kalman Filter (EKF) that also uses the difference between the predicted and measured LRF range. MAVeN has a state vector with seven elements: position, velocity, attitude, IMU accelerometer bias, IMU gyro bias, base-image position and base-image attitude, for a total of 21 scalar components.
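
As a sketch of the bookkeeping (the index ordering is an assumption for illustration, not JPL’s internal layout), the seven three-element blocks fill a 21-element state vector:

```python
import numpy as np

# Hypothetical layout of MAVeN's 21-scalar EKF state, as described above.
# Attitude blocks use a minimal 3-parameter (error-state) representation.
STATE_LAYOUT = {
    "position":      slice(0, 3),    # vehicle position, m
    "velocity":      slice(3, 6),    # vehicle velocity, m/s
    "attitude":      slice(6, 9),    # vehicle attitude (minimal parameterization)
    "accel_bias":    slice(9, 12),   # IMU accelerometer bias, m/s^2
    "gyro_bias":     slice(12, 15),  # IMU gyro bias, rad/s
    "base_position": slice(15, 18),  # pose of the base image: position
    "base_attitude": slice(18, 21),  # pose of the base image: attitude
}

x = np.zeros(21)  # full EKF state vector
# Sanity check: the seven blocks tile the state exactly.
assert sum(s.stop - s.start for s in STATE_LAYOUT.values()) == x.size
```

The last two blocks are the SLAM-style augmentation discussed below: instead of mapping many landmarks, only the pose of the current base image is carried in the filter.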

MAVeN only tracks features between the current search image and the base image. Because the base frame is periodically reset as features are lost, MAVeN is effectively a long-baseline visual odometry algorithm: the relative position and attitude between the two images are measured, but not the absolute position and attitude. Absolute position and attitude error, in this case horizontal position and yaw, grow over time. The LRF provides vertical position, which bounds vertical position error. In addition, the visual features and flat-plane assumption provide observability of absolute pitch and roll when the vehicle is moving.

A key advantage of MAVeN over other simultaneous localization and mapping (SLAM) algorithms is that the state only needs to be augmented with six scalar elements: three for position and three for attitude. The LRF and an assumed ground plane enable MAVeN to estimate 3D position and velocity without introducing a scale ambiguity.

The two main disadvantages of MAVeN are sensitivity to rough terrain, due to the ground-plane assumption, and long-term drift in position and heading. For Ingenuity’s technology demonstration phase, this is an acceptable tradeoff, because accuracy degradation is graceful and the algorithm has proven to be highly robust in both simulation and experiments.

Feature detection in base images is performed with an implementation of the FAST algorithm [30], which selects corner-like features that have sufficient contrast between a center pixel and a contiguous arc surrounding the center pixel. An algorithm estimates the displacement of a template from one image to the next, using a gradient-based search that minimizes the difference in pixel intensity (see Figure 3).
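
A simplified version of that corner test can be sketched as follows (illustrative only; production FAST detectors add a high-speed pre-test and non-maximum suppression): a pixel is corner-like when a long contiguous arc of the 16-pixel circle around it is uniformly brighter or darker than the center by a threshold.

```python
import numpy as np

# The 16 pixel offsets of the radius-3 Bresenham circle used by FAST.
CIRCLE = [(0, 3), (1, 3), (2, 2), (3, 1), (3, 0), (3, -1), (2, -2), (1, -3),
          (0, -3), (-1, -3), (-2, -2), (-3, -1), (-3, 0), (-3, 1), (-2, 2), (-1, 3)]

def is_fast_corner(img, r, c, thresh=20, arc=9):
    """True if `arc` contiguous circle pixels are all brighter, or all
    darker, than the center pixel by more than `thresh`."""
    center = int(img[r, c])
    ring = [int(img[r + dr, c + dc]) for dr, dc in CIRCLE]
    for sign in (1, -1):                 # pass 1: brighter arc; pass 2: darker arc
        hits = [sign * (p - center) > thresh for p in ring]
        best = run = 0
        for h in hits + hits:            # doubled list handles wrap-around runs
            run = run + 1 if h else 0
            best = max(best, run)
        if best >= arc:
            return True
    return False

# A bright dot on a dark background: all 16 ring pixels are darker than
# the center, so the test fires.
img = np.zeros((7, 7), dtype=np.uint8)
img[3, 3] = 100
corner = is_fast_corner(img, 3, 3)
```

The contiguity requirement is what distinguishes corners from edges: along an edge, the brighter and darker pixels split the circle into two shorter arcs.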


Landing is an altogether delicate matter.

A rapid sequence of events takes place as Ingenuity descends toward the ground. “First, a steady descent rate of 1 meter per second is established,” Grip wrote. “Once the vehicle estimates that the legs are within 1 meter of the ground, the algorithms stop using the navigation camera and altimeter for estimation, relying on the IMU in the same way as on takeoff. As with takeoff, this avoids dust obscuration, but it also serves another purpose: by relying only on the IMU, we expect to have a very smooth and continuous estimate of our vertical velocity, which is important in order to avoid detecting touchdown prematurely.

“About half a second after the switch to IMU-only, when the legs are estimated to be within 0.5 meters of the ground, touchdown detection is armed. Ingenuity will now consider touchdown to have occurred as soon as the descent velocity drops by 25 centimeters per second or more. Once Ingenuity meets the ground, that drop in descent velocity happens rapidly. At that point, the flight control system stops trying to control the motion of the helicopter and commands the collective control to the lowest possible blade pitch to produce close to zero thrust. The system then waits 3 seconds to ensure the helicopter has settled on the ground before spinning down the rotors.”
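
The detection logic Grip describes can be reconstructed as a short sketch. The 1 m/s nominal rate and 25 cm/s drop threshold come from the quote; the function itself and its names are assumptions:

```python
def detect_touchdown(descent_rates, armed=True,
                     nominal_rate=1.0, drop_threshold=0.25):
    """Scan IMU-derived descent rates (m/s, positive down) after arming;
    declare touchdown at the first sample where the rate has dropped
    25 cm/s or more below the nominal 1 m/s descent."""
    if not armed:                      # legs not yet estimated within 0.5 m
        return None
    for i, rate in enumerate(descent_rates):
        if nominal_rate - rate >= drop_threshold:
            return i                   # sample index at which touchdown fires
    return None

# Steady 1 m/s descent, then the legs meet the ground and the rate collapses:
# the detector fires on the 0.6 m/s sample (a 0.4 m/s drop).
idx = detect_touchdown([1.0, 1.0, 0.98, 0.6, 0.1])
```

The half-second arming delay in the real system serves the same purpose as the `armed` flag here: it prevents sensor-handover transients from being mistaken for ground contact.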

The downward-facing camera takes a number of images on landing, which are factored into the sequence for the next takeoff.


Ingenuity’s planned technology demonstration was to last for five flights. Then, sadly, its pathbreaking life would come to an end, its duty done. Its parent and ride to Mars, the four-wheeled Perseverance rover, would continue for two more years to explore Jezero Crater, site of a lake 3.9 billion years ago, in search of traces of ancient microbial life. Ingenuity would perch immobile forever upon the Martian landscape, the lonely one.

But wait.

“On the last flight, we actually flew somewhere else,” Grip said. “We had scouted that terrain beforehand with the helicopter.

“In that scouting flight, No. 4, we took images using the high-resolution return-to-Earth color camera. We could see on our target airfield individual rocks, ripples, features, that we then georeferenced against a low-resolution satellite image, so we knew exactly where those features were in a global frame. When we went back on flight 5, we could use those features to reference ourselves.”

Flight No. 5’s landing looked great, as good as it could have been. Everything went according to plan.

Then a momentous decision was made in Pasadena: to send Ingenuity further, into an operational demonstration phase, very different, at a lower cadence for helicopter operations. As the Mars Project focuses now on rover Perseverance and the science it delivers, “We’re in a background role,” Grip said, “doing flights every two to three weeks, to demo operational capability, at higher risk, and focused more on aerial imaging capabilities.

“These flights are stretching Ingenuity’s capability in terms of altitude, distance and speed. We’ve covered our fundamentals, shown that a helicopter can fly on Mars, well and confidently. We’re now stretching the parameters of those flights with the hardware and software that we have on the helicopter.”

The increased speed over ground affects the navigation system and how the features the camera is tracking move through the field of view. Additionally, new flights will break the parameter of flying over relatively flat terrain. “We may fly over less flat terrain, which will challenge the navigation algorithm. How much less flat is not factored in an explicit way. We can look at the LRF data after the fact and analyze it, but it’s not being used in real time to navigate the copter.”

“As we continue with our flights on Mars,” Grip concluded, “we will keep digging deeper into the data to understand the various subtleties that may exist and could be helpful in the design of future aerial explorers. But what we can already say is: Ingenuity has met or exceeded our flight performance expectations.”