After several commercial Moon landing attempts of varying success over the past few years, both the space industry and global onlookers were happy to watch Firefly Aerospace’s Blue Ghost lander safely reach the lunar surface and complete its mission without incident.
One Canadian company, NGC Aerospace (NGC), had their own optical navigation system along for the ride as part of a demonstration funded by the Canadian Space Agency’s (CSA) Lunar Exploration Accelerator Program (LEAP). That system, Crater-Based AbsNAV, uses imagery of lunar craters to determine the lander’s position and help its guidance system reach the lunar surface safely.
The demonstration was hailed as a success by NGC in their recent release on the Blue Ghost mission, and SpaceQ reached out to NGC about their Crater-Based AbsNAV and their contribution to the mission. NGC President Jean de Lafontaine responded with details.
While this Blue Ghost run was just a demonstration, the technology could help ensure that future lunar missions land safely, just as the Blue Ghost did.
Navigation, guidance and control
The letters in the company’s name, NGC, are actually an acronym, standing for “navigation, guidance, and control.” Lafontaine said it was a reference to “the three basic functions that render an aerospace vehicle autonomous and agile,” and one that describes the core of their work. Specifically, he said, the company designs, develops, validates and delivers software for both Attitude and Orbit Control Systems (AOCS) and Guidance, Navigation and Control (GNC): the systems that “determine and control the position and orientation of space vehicles in order to point payloads to their targets with high autonomy, high agility and high accuracy.” Seven ESA satellites are currently being controlled by NGC software.
A corporate focus that’s likely related to the lunar demonstration is their work with ESA on the PROBA program. PROBA (PRoject for On-Board Autonomy) was a program that began in the early 2000s focused on building the first fully autonomous spacecraft within the ESA-Canada context. NGC, Lafontaine explained, developed and delivered key software for that project. The software allowed for spacecraft to make on-board navigational, orbital and attitudinal computations, instead of sending information down to Earth and waiting for terrestrial guidance commands.
Lafontaine said that they also developed an automated process to generate the on-board code and validate it using simulations, “reducing the possibility of human errors and the cost of software development.”
Crater-based AbsNAV
AbsNAV is short for “absolute navigation”: navigation that determines an object’s absolute position, such as its longitude and latitude, rather than its position relative to some other nearby object. Global Navigation Satellite Systems (GNSS), like GPS, are one way to determine these positions, but other solutions are needed when a GNSS fix isn’t available.
One solution already exists: feature-based AbsNAV systems, which determine an object’s position from the terrain the object observes. Lafontaine explained that, in feature-based AbsNAV, the software “reconstructs in real time the topography of the scene the camera should observe from an initial knowledge of where it is, and makes the correlation with what the camera actually sees.” The problem, he said, is that this approach can be “quite demanding in computer memory and computational cost,” especially for the autonomous capabilities of something like a lunar lander.
The crater-based approach is considerably less computationally demanding. Lafontaine said the inspiration for it was the autonomous star trackers used to determine satellite orientations, which compare an on-board star map to the stars a satellite’s sensors are seeing and determine the satellite’s orientation from the match. The crater-based approach is roughly similar: Crater-Based AbsNAV has “an extensive database of crater data in the form of the 5 parameters of the ellipse that best approximate the shape of the crater,” and compares those ellipses to ones generated from the observations of the spacecraft’s cameras. Craters that appear in the camera imagery are converted into the corresponding ellipses, and the database is searched for matches.
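The idea of representing each crater by five ellipse parameters and searching a database for matches can be sketched as follows. This is a minimal illustration, not NGC’s implementation: the parameter layout (center coordinates, semi-axes, orientation), the distance metric, and the tolerance value are all assumptions made for the example.

```python
import math

# A crater approximated by the 5 parameters of its best-fit ellipse:
# (cx, cy) = center in map coordinates, a/b = semi-axes, theta = orientation.
# Field layout is illustrative; NGC's actual data format is not public.

def ellipse_distance(e1, e2):
    """Crude dissimilarity between two 5-parameter ellipses."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(e1, e2)))

def match_crater(observed, database, tolerance=5.0):
    """Return the database crater closest to an observed ellipse,
    or None if nothing falls within the tolerance."""
    best, best_d = None, float("inf")
    for candidate in database:
        d = ellipse_distance(observed, candidate)
        if d < best_d:
            best, best_d = candidate, d
    return best if best_d <= tolerance else None

# Tiny illustrative database of catalogued craters.
db = [
    (100.0, 200.0, 12.0, 10.0, 0.3),
    (400.0, 150.0, 30.0, 28.0, 1.1),
    (250.0, 500.0, 8.0, 7.5, 0.0),
]

seen = (101.2, 198.9, 12.3, 9.8, 0.28)  # ellipse fitted from camera image
print(match_crater(seen, db))  # → (100.0, 200.0, 12.0, 10.0, 0.3)
```

A real system would index the database for fast lookup and match whole constellations of craters at once rather than one ellipse at a time, much as star trackers match star patterns rather than single stars.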
Lafontaine said that one advantage is that the reference imagery for the craters is readily available, as the lunar surface “has already been imaged and mapped by several Lunar exploration missions,” with much of that data publicly available through NASA’s Planetary Data System. “Moon scientists have already built crater databases using this information,” he added, though “we did have to augment/adapt those databases for our application.”
Once matches are made, he said, NGC’s software uses (quite a lot of) math to determine that “if the camera took the picture of these craters appearing in this configuration and from this point of view, the camera must be at this position in lunar coordinates.” While this still requires a lot of computation, it requires far less than the feature-based approach, and can yield similar levels of accuracy without the need for external determination and validation of a lander’s position.

LEAP, Firefly, and NGC’s successful test
Lafontaine said that NGC reached out to Firefly Aerospace about being part of the Blue Ghost mission, and was chosen to join the mission in 2020. He said that “Firefly recognized the opportunity to compare our approach with theirs and learn how each method performs in an operational scenario.”
In order to develop and test this technology, NGC also received a Capacity Development contribution agreement with the CSA, worth $726,249, as part of the CSA’s Lunar Exploration Accelerator Program. In addition to Firefly and the CSA, Lafontaine also mentioned Space-NG as a partner in the demonstration: Space-NG provided “the baseline visual navigation solution for the Blue Ghost mission” and worked with NGC to integrate the system on board the lander.
The Blue Ghost mission was launched on January 15th, and the descent and landing on the Moon happened on March 2nd.
As noted in NGC’s release, the experiment was a success. The system was provided imagery acquired during descent, and then used that imagery to compute the lander’s position. According to NGC, their analysis of the results “shows that the system successfully provided position measurements with an accuracy of 100 m or better using lander imagery from altitudes ranging between 50 km and 20 km,” thanks to imagery of “approximately 25 craters on average in the individual images during the descent trajectory.”
Lafontaine said that, if anything, the demonstration was surprisingly successful. “One good surprise,” he said, “was how the actual system performance in flight closely matched the simulation results.” Despite this being their first involvement in an actual lunar descent, the system “actually behaved very close to what we observed in simulation.” Lafontaine added that “this showed that we had the proper simulation environment and adequately modeled the Moon environment and its uncertainties.”
This may mean that the crater-based solution ends up being even more accurate than originally expected. While the flight demonstration showed that the 100 m accuracy mentioned in the release was achievable from altitudes similar to a Moon descent orbit, Lafontaine said their simulations suggested that “performance actually improves at lower altitudes.” He said that “this was not the objective of the demonstration with Firefly,” but the close match between simulation and flight results suggests that the enhanced performance seen in simulation may well carry over.
NGC will soon have the opportunity for further testing. Lafontaine said that “we have an agreement with ispace to access imagery from their M2 mission to do additional demonstration of our system.” ispace’s Hakuto-R Mission 2 is currently orbiting the Moon, and is scheduled for a landing attempt no earlier than June 6th. He said that “we intend to use this data to further demonstrate and validate our technology under conditions different from the Blue Ghost descent orbit conditions.”
He said that they also plan on having “more significant involvement in upcoming Moon landing missions.” While he couldn’t provide details, he did add that “discussions are on-going with Moon landing service providers in that respect.”