Inside the Ingenuity Helicopter: Teamwork on Mars

When Ingenuity, the little helicopter that could, sprang from the Martian surface into the wispy, thin Martian atmosphere, it notched all kinds of firsts: the first powered, controlled flight on another planet; the first autonomous flight; the first use of an inertial navigation system and visual odometry on another world.

To make this happen, NASA invested $85 million to build Ingenuity, accommodate it aboard Perseverance for the long interplanetary flight and parachute descent, and operate it once it reached distant Mars.

There’s plenty to marvel at in this venture, which took the fertile minds of NASA’s Jet Propulsion Laboratory (JPL), NASA Ames Research Center, NASA Langley Research Center and companies including AeroVironment, Inc. (see accompanying feature) on a six-year journey from inspiration to realization. Awe will be confined in this article to the phenomenon of Ingenuity’s navigation system.


The Mars copter’s flight control system consists of four main subsystems: the Mode Commander, which sets the overall mode for the flight control system; the Guidance subsystem, which provides reference trajectories for the flight path; the Navigation subsystem, which supplies estimates of the vehicle state; and the Control subsystem, which commands the actuators based on the reference trajectories and the vehicle state.
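As a rough illustration of how those four subsystems hand off to one another, here is a minimal Python sketch of one tick of such a loop. The class names mirror the article, but the interfaces and the simple proportional altitude controller are invented for illustration and do not reflect JPL's flight software.

```python
# Minimal sketch (assumed structure, not flight code) of the four subsystems:
# the Mode Commander picks a mode, Guidance turns it into a reference,
# Navigation estimates the vehicle state, and Control closes the loop.

class ModeCommander:
    def __init__(self):
        self.mode = "HOVER"          # e.g. TAKEOFF, HOVER, TRANSLATE, LAND

class Guidance:
    def reference(self, mode):
        # Hold 5 m altitude in HOVER; other modes omitted in this sketch.
        return {"altitude": 5.0} if mode == "HOVER" else {"altitude": 0.0}

class Navigation:
    def estimate(self, sensors):
        # In reality this is the visual-inertial filter; here, pass-through.
        return {"altitude": sensors["lrf_range"]}

class Control:
    def command(self, ref, est, gain=0.8):
        # Proportional collective command from altitude error (illustrative).
        return gain * (ref["altitude"] - est["altitude"])

def flight_control_step(mc, gd, nav, ctl, sensors):
    """One tick: mode -> reference trajectory -> state estimate -> actuation."""
    ref = gd.reference(mc.mode)
    est = nav.estimate(sensors)
    return ctl.command(ref, est)

cmd = flight_control_step(ModeCommander(), Guidance(), Navigation(), Control(),
                          {"lrf_range": 4.5})
```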

The particular challenges for the navigation system onboard the UAV include:

• A lack of global navigation aids, such as GPS or a strong magnetic field.

• A large communication time lag between Earth and Mars, preventing real-time communication during flight.

• A harsh radiation environment that can adversely affect computing components.

Because of the time-lag problem, Ingenuity has to perform on its own. Autonomously, in other words. Radio signals from NASA Command take 15 minutes and 27 seconds to travel the 173 million miles (278.4 million kilometers) to Mars. Once on the surface, the more well-endowed Perseverance rover served as a communications relay link so the helicopter and Mission Team on Earth could communicate. It passed flight instructions from NASA’s Jet Propulsion Laboratory in Pasadena, California, to Ingenuity. From a Martian hillock 65 meters away, the four-wheeled rover observed and recorded its four-bladed offspring’s history-making flights.
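The quoted one-way signal delay can be checked from the stated Earth–Mars distance with a line of arithmetic:

```python
# Back-of-the-envelope check of the one-way light time quoted above,
# using the stated Earth-Mars distance at the time of the flights.
C_KM_S = 299_792.458          # speed of light, km/s
distance_km = 278.4e6         # approx. 173 million miles
one_way_s = distance_km / C_KM_S
minutes, seconds = divmod(one_way_s, 60)
# one_way_s is about 928.6 s, i.e. roughly 15.5 minutes, matching the
# quoted 15 minutes 27 seconds to within the rounding of the distance.
```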

While hovering on its four initial flights, the helicopter’s navigation camera and laser altimeter fed information into the navigation computer to ensure Ingenuity remained not only level, but within the borders of its 10×10 meter airfield, a patch of extraterrestrial real estate chosen for its flatness and lack of obstructions. Because landing hazard avoidance was not prioritized for this technology demonstration, each of those four initial flights began and ended within an area that had been pre-inspected and determined to be safe in terms of obstacles and surface slope.

Ingenuity conducted five flights according to its programmed lifetime over a period of 31 Earth days, or 30 sols on Mars. Then came the surprise ending-to-date, but more on that later.

For the helicopter’s pre-arranged autonomous test flights, under the NASA rubric of “technology demonstration,” it took off, climbed, hovered, translated between a set of waypoints, then descended to land again (see Figure 1). Although the helicopter did operate independently during flight, the waypoints were specified from Earth prior to flight.


This, however, raises an interesting and somewhat subtle point: is Ingenuity really autonomous?

It depends on your definition. Engineers at AeroVironment, which built major components of the helicopter but was not involved in the guidance, navigation and control (GNC) system design, weighed in on the issue.

“It really is making autonomous decisions [in managing rotor speed and pitch] to get more cyclic to overcome a wind gust,” said Jeremy Tyler, senior aeromechanical engineer. “It’s managing its altitude, it’s managing its position, all by itself without any external intervention.”

“It’s inherently unstable,” added Matt Keennon, technical lead for rotor system development. “It can’t fly for a half-second without making decisions based on the inertial measurement unit [IMU] and driving the control system.”

“There’s no [navigation] decisions being made onboard,” countered Ben Pipenberg, AeroVironment’s engineering lead on the Ingenuity project. “When Perseverance landed, it used terrain-relative navigation, and it was making decisions based on outside observable data that it was collecting without human input. That would be an autonomous system. Ingenuity is not doing that. It’s essentially using VIO—visual-inertial odometry—just to navigate over the ground in a pre-determined flight path, uploaded from Earth.”

Tyler concurred, after a fashion. “It’s doing its own simple autonomy. But really no sophisticated mission planning or decision-making.”


Engineers at JPL under the direction of Håvard Grip, Mars helicopter chief pilot, developed and assembled the visual-inertial navigation system emphasizing robustness, but with a correspondingly limited position accuracy and ability to navigate in complex terrain. In particular, the system assumes that features observed by the navigation camera lie on an approximate ground plane with known slope. This is why the landing field was chosen and why the first four flights did not venture beyond its bounds. The flights took place over relatively flat terrain, with short-term height variations on the order of 10% of the flight height.

The navigation sensors Ingenuity carries are:

• Bosch Sensortec BMI160 IMU, measuring 3-axis accelerations at 1600 Hz and angular rates at 3200 Hz.

• Garmin LIDAR-Lite v3 laser rangefinder (LRF), measuring distance to the ground at 50 Hz.

• Downward-looking 640 x 480 grayscale camera with an Omnivision OV7251 global-shutter sensor, providing images at 30 Hz.

• Murata SCA100T-D02 inclinometer, measuring roll and pitch attitude prior to flight.

All are commercial off-the-shelf (COTS) miniature sensors, largely developed for the cell phone and lightweight drone markets.
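For quick reference, the sensor suite above can be collected into a single table-like structure. Rates and part numbers are as listed in the article; the dictionary layout itself is just illustrative bookkeeping.

```python
# Ingenuity's navigation-sensor suite, as listed above (rates in Hz;
# the inclinometer is used only before flight, hence rate None).
NAV_SENSORS = {
    "imu": {"part": "Bosch Sensortec BMI160", "rate_hz": 1600,
            "measures": "3-axis acceleration (angular rate at 3200 Hz)"},
    "lrf": {"part": "Garmin LIDAR-Lite v3", "rate_hz": 50,
            "measures": "range to the ground"},
    "nav_camera": {"part": "Omnivision OV7251, 640x480 grayscale", "rate_hz": 30,
                   "measures": "global-shutter images"},
    "inclinometer": {"part": "Murata SCA100T-D02", "rate_hz": None,
                     "measures": "roll and pitch attitude, pre-flight only"},
}

# Sensors that contribute in-flight updates:
flight_rates = {k: v["rate_hz"] for k, v in NAV_SENSORS.items() if v["rate_hz"]}
```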

Ingenuity also carried a second camera, a 13-megapixel color camera with a horizon-facing view for terrain images, not used for navigation.

Figure 2 shows Ingenuity’s avionics system architecture. A radiation-tolerant field-programmable gate array (FPGA) routes sensor data and traffic between the other computing elements and performs low-level actuator control. Most of the flight control software is hosted on the flight computer (FC).

A separate navigation computer (NC), a 2.26 GHz quad-core Qualcomm Snapdragon 801 processor, provides the throughput for vision-based navigation. On the NC, one core is devoted to camera-image processing and another to the navigation filter, while the remaining cores are used for other activity.

The visual-inertial navigation system provides the control system with real-time estimates of the vehicle state: position, velocity, attitude and angular rates. The state estimate is based on fusing information from the onboard IMU, inclinometer, LRF and navigation camera.


“Before each of Ingenuity’s test flights,” Grip told Inside Unmanned Systems, “we uploaded instructions describing precisely what the flight should look like. But when it came time to fly, the helicopter was on its own and relied on a set of flight-control algorithms that we developed here on Earth, before Ingenuity was even launched to Mars.”

When the copter rests on the ground, preparing to take off, the inclinometer estimates the initial roll and pitch attitude. Based on this, initial estimates of the accelerometer and gyro biases are also obtained.
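The idea behind that initialization can be sketched as follows: with the vehicle at rest, the inclinometer's roll and pitch determine what the accelerometer should read, and the residual is taken as bias. This is standard practice rather than JPL's published procedure; the z-down body-frame convention and the Mars gravity value are assumptions of the sketch.

```python
import math

G_MARS = 3.71  # Mars surface gravity, m/s^2 (assumed value)

def expected_specific_force(roll, pitch):
    """Accelerometer reading (specific force, body z-down frame) for a
    vehicle at rest on a slope with the given roll and pitch, in radians."""
    return (G_MARS * math.sin(pitch),
            -G_MARS * math.sin(roll) * math.cos(pitch),
            -G_MARS * math.cos(roll) * math.cos(pitch))

def accel_bias(measured, roll, pitch):
    """Bias estimate: measured minus expected specific force at rest."""
    expected = expected_specific_force(roll, pitch)
    return tuple(m - e for m, e in zip(measured, expected))

# Level vehicle, slightly offset accelerometer readings:
bias = accel_bias((0.02, -0.01, -3.66), roll=0.0, pitch=0.0)
```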

Once the vehicle is in motion, integration of the IMU measurements is used to estimate changes in position, velocity and attitude. Only the IMU is used for this critical moment, measuring acceleration and angular rates. After the helicopter reaches 1 meter off the ground, the laser rangefinder and downward-looking camera are added to the navigation solution. This precaution springs from pre-mission concern that the LRF and camera might be obscured by dust kicked up by the copter blades. The IMU will not deliver great accuracy in the long run, but because Ingenuity takes only a couple of seconds to reach 1 meter, “we can make it work,” Grip said. Ingenuity then begins using its full suite of sensors.
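A toy one-dimensional version of that IMU-only stretch shows why a couple of seconds of pure integration is tolerable: dead-reckoning error grows with time, but the first meter is covered before drift matters. The 1600 Hz sample rate comes from the article; the constant-acceleration profile and gravity value are assumptions of the sketch.

```python
# IMU-only dead reckoning for the first meter of climb: Euler-integrate
# bias-corrected vertical specific force minus gravity (1-D toy model).
DT = 1.0 / 1600.0   # accelerometer sample period, s
G_MARS = 3.71       # Mars surface gravity, m/s^2 (assumed value)

def integrate_climb(accel_samples, v0=0.0, z0=0.0):
    """accel_samples: measured upward specific force per tick, m/s^2."""
    v, z = v0, z0
    for a in accel_samples:
        v += (a - G_MARS) * DT   # net vertical acceleration
        z += v * DT
    return z, v

# A constant 1 m/s^2 net upward acceleration for ~1.4 s reaches ~1 m:
n = round(1.4 / DT)
z, v = integrate_climb([G_MARS + 1.0] * n)
```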

During hover flight, Ingenuity on its semi-autonomous own attempts to maintain a constant altitude, heading and position. The JPL team has to rely on the copter’s own estimates of how well it performs this task, as there is little to no basis for ground truth. But the available data shows that Ingenuity holds its altitude extremely well in hover, to within roughly 1 centimeter, and its heading to within less than 1.5 degrees. Horizontal position can vary up to roughly 25 centimeters, which the team attributes to wind gusts on the Red Planet.


Because of the relatively low accuracy of MEMS-based IMUs, navigation aids must bound the growth in navigation errors as the copter cruises. The LRF provides range measurements between the vehicle and the terrain below, yielding vertical velocity and position. With the help of the MAVeN feature-tracking algorithm, the navigation camera tracks visual features on the ground, under the assumption that all features are located on a ground plane with a known slope. This provides horizontal velocity as well as roll and pitch attitude, and helps limit the drift in horizontal position and yaw angle.

However, the latter two measurements have no absolute reference, and their estimates are subject to long-term drift. Therefore, shortly before touchdown at the end of each flight, a navigation camera image is stored for later transmission to Earth, so that an absolute position and heading fix can be obtained by comparison to the known terrain.

“To develop the flight control algorithms,” Grip wrote in a NASA blog post updating Ingenuity’s followers, “we performed detailed modeling and computer simulation in order to understand how a helicopter would behave in a Martian environment. We followed that up with testing in a large 25-meter-tall, 7.5-meter-diameter vacuum chamber here at JPL, where we replicate the Martian atmosphere. But in all of that work, we could only approximate certain aspects of the environment. Now that Ingenuity is actually flying at Mars, we can begin to assess how things stack up against expectations.”

The MAVeN navigation algorithm used “has no absolute references to any landmarks,” according to Grip. “It always operates against a base frame where it sees a bunch of features and tracks them over a limited set of search frames. When it’s done, it acquires an entirely new base frame. It is always tracking in a relative sense, never tied back to a global frame.”

MAVeN is implemented as an Extended Kalman Filter (EKF) that also uses the difference between the predicted and measured LRF range. MAVeN has a state vector with seven elements: position, velocity, attitude, IMU accelerometer bias, IMU gyro bias, base image position and base image attitude, for a total of 21 scalar components.
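To make the seven-block, 21-scalar layout concrete, here is a small bookkeeping sketch. The grouping follows the article; the ordering of blocks within the vector is an assumption, not JPL's actual filter layout.

```python
# Index layout for the 21-element MAVeN state vector described above.
STATE_LAYOUT = {
    "position":      slice(0, 3),    # vehicle position
    "velocity":      slice(3, 6),
    "attitude":      slice(6, 9),    # 3-parameter attitude state
    "accel_bias":    slice(9, 12),   # IMU accelerometer bias
    "gyro_bias":     slice(12, 15),  # IMU gyro bias
    "base_position": slice(15, 18),  # augmented pose of the base image
    "base_attitude": slice(18, 21),
}
STATE_DIM = sum(s.stop - s.start for s in STATE_LAYOUT.values())  # = 21

def get_block(x, name):
    """Pull one named block out of a flat state vector x."""
    return x[STATE_LAYOUT[name]]
```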

MAVeN only tracks features between the current search image and the base image. Because the base frame is periodically reset as features are lost, MAVeN is effectively a long-baseline visual odometry algorithm: the relative position and attitude between the two images are measured, but not the absolute position and attitude. Absolute position and attitude error, in this case horizontal position and yaw, grow over time. The LRF provides vertical position, which bounds vertical position error. In addition, the visual features and the flat-plane assumption provide observability of absolute pitch and roll when the vehicle is moving.

A key advantage of MAVeN over other simultaneous localization and mapping (SLAM) algorithms is that the state only needs to be augmented with six scalar elements: three for position and three for attitude. The LRF and an assumed ground plane enable MAVeN to estimate 3D position and velocity without introducing a scale ambiguity.

The two main disadvantages of MAVeN are sensitivity to rough terrain, due to the ground-plane assumption, and long-term drift in position and heading. For Ingenuity’s technology demonstration phase, this is an acceptable tradeoff, because accuracy degradation is graceful and the algorithm has proven to be highly robust in both simulation and experiments.

Feature detection in base images is performed with an implementation of the FAST algorithm [30], which selects corner-like features that have sufficient contrast between a center pixel and a contiguous arc surrounding the center pixel. An algorithm estimates the displacement of a template from one image to the next, using a gradient-based search that minimizes the difference in pixel intensity (see Figure 3).
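The corner test can be illustrated with a toy FAST-style check: a radius-3 circle of 16 pixels, requiring a contiguous arc of 9 as in the common FAST-9 variant. The flight implementation's exact radius, arc length and threshold are not stated in the article, so those parameters are assumptions here.

```python
# Toy FAST-style corner test: a pixel is a corner if a contiguous arc of
# circle pixels is all brighter or all darker than the center by > t.
CIRCLE = [(0, -3), (1, -3), (2, -2), (3, -1), (3, 0), (3, 1), (2, 2), (1, 3),
          (0, 3), (-1, 3), (-2, 2), (-3, 1), (-3, 0), (-3, -1), (-2, -2), (-1, -3)]

def is_fast_corner(img, x, y, t=20, arc=9):
    c = img[y][x]
    ring = [img[y + dy][x + dx] for dx, dy in CIRCLE]
    for sign in (1, -1):                       # brighter arc, then darker arc
        flags = [(p - c) * sign > t for p in ring]
        run = best = 0
        for f in flags + flags:                # doubled list handles wrap-around
            run = run + 1 if f else 0
            best = max(best, run)
        if best >= arc:
            return True
    return False

# A bright quadrant produces a corner at its tip; a flat patch does not.
img = [[100 if (x >= 3 and y >= 3) else 0 for x in range(7)] for y in range(7)]
flat = [[50] * 7 for _ in range(7)]
corner = is_fast_corner(img, 3, 3)
```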


Landing is an altogether delicate matter.

A rapid sequence of events takes place as Ingenuity descends toward the ground. “First, a steady descent rate of 1 meter per second is established,” Grip wrote. “Once the vehicle estimates that the legs are within 1 meter of the ground, the algorithms stop using the navigation camera and altimeter for estimation, relying on the IMU in the same way as on takeoff. As with takeoff, this avoids dust obscuration, but it also serves another purpose: by relying only on the IMU, we expect to have a very smooth and continuous estimate of our vertical velocity, which is important in order to avoid detecting touchdown prematurely.

“About half a second after the switch to IMU-only, when the legs are estimated to be within 0.5 meters of the ground, touchdown detection is armed. Ingenuity will now consider touchdown to have occurred as soon as the descent velocity drops by 25 centimeters per second or more. Once Ingenuity meets the ground, that drop in descent velocity happens rapidly. At that point, the flight control system stops trying to control the motion of the helicopter and commands the collective control to the lowest possible blade pitch to produce close to zero thrust. The system then waits 3 seconds to ensure the helicopter has settled on the ground before spinning down the rotors.”
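The arming-and-trigger logic Grip describes can be condensed into a few lines. The thresholds are the ones quoted above; the function shape and sample format are illustrative.

```python
# Sketch of the touchdown-detection rule: detection arms when the legs are
# estimated within 0.5 m of the ground, and touchdown is declared when the
# descent rate drops by 25 cm/s or more from the 1 m/s steady descent.
DESCENT_RATE = 1.0   # steady commanded descent, m/s (from the article)
TRIGGER_DROP = 0.25  # m/s drop that signals ground contact
ARM_HEIGHT = 0.5     # leg height that arms the detector, m

def touchdown_detector(samples):
    """samples: iterable of (estimated_leg_height_m, descent_rate_m_s)."""
    armed = False
    for height, rate in samples:
        if height <= ARM_HEIGHT:
            armed = True
        if armed and (DESCENT_RATE - rate) >= TRIGGER_DROP:
            return True  # next: collective to flat pitch, settle 3 s, spin down
    return False
```

Note that the drop check only fires once armed: a gust-induced slowdown higher up does not trigger a false touchdown.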

The downward-facing camera takes several images on landing, which are factored into the sequence for the next takeoff.


Ingenuity’s planned technology demonstration was to last for five flights. Then, sadly, its pathbreaking life would come to an end, its duty done. Its parent and ride to Mars, the four-wheeled Perseverance rover, would continue for two more years to explore Jezero Crater, site of a lake 3.9 billion years ago, searching for traces of ancient microbial life. Ingenuity would perch immobile forever upon the Martian landscape, the lonely one.

But wait.

“On the last flight, we actually flew somewhere else,” Grip said. “We had scouted that terrain beforehand with the helicopter.

“In that scouting flight, No. 4, we took images using the high-resolution return-to-Earth color camera. We could see on our target airfield individual rocks, ripples, features, that we then georeferenced against a low-resolution satellite image, so we knew exactly where those features were in a global frame. When we went back on flight 5, we could use those features to reference ourselves.”

Flight No. 5’s landing looked great, as good as it could have been. Everything went according to plan.

Then a momentous decision was made in Pasadena, to send Ingenuity further: into an operational demonstration phase, very different, at a lower cadence for helicopter operations. As the Mars Project focuses now on rover Perseverance and the science it delivers, “We’re in a background role,” Grip said, “doing flights every two to three weeks, to demo operational capability, at higher risk, and focused more on aerial imaging capabilities.

“These flights are stretching Ingenuity’s capability in terms of altitude, distance and speed. We’ve covered our fundamentals, shown that a helicopter can fly on Mars, properly and confidently. We’re now stretching the parameters of those flights with the hardware and software that we have on the helicopter.”

The increased speed over ground affects the navigation system and how the features the camera is tracking move through the field of view. Additionally, new flights will break the parameter of flying over relatively flat terrain. “We may fly over less flat terrain, which will challenge the navigation algorithm. How much less flat is not factored in an explicit way. We can look at the LRF data after the fact and analyze it, but it’s not being used in real time to navigate the copter.”

“As we continue with our flights on Mars,” Grip concluded, “we’ll keep digging deeper into the data to understand the various subtleties that may exist and could be useful in the design of future aerial explorers. But what we can already say is: Ingenuity has met or exceeded our flight performance expectations.”