
ESA tests neuromorphic AI for Mars rover

October 28, 2024


UK startup Opteran is working with Airbus Defence and Space to use neuromorphic AI for a Mars rover project by the European Space Agency (ESA).

Airbus is testing the Opteran Mind general-purpose neuromorphic software to provide better imaging systems for a Mars rover. The Opteran software provides image stabilisation and a 360-degree field of view from just four standard cameras, reducing the weight and power consumption of the rover. The images can also provide localisation data.

Opteran is conducting tests with Airbus at its Mars Yard in Stevenage, UK, to give rovers depth perception in the toughest off-world environments.

The neuromorphic visual and perception systems give a rover the ability to understand its surroundings in milliseconds, in challenging conditions, without adding to the robot's critical power consumption. The algorithm enables autonomous machines to move efficiently through the most challenging environments without the need for extensive data or training.

  • Mobile robots use insect inspired neuromorphic AI
  • Russia sanctions hit European Mars rover
  • Lifeline for Rosalind Franklin Mars rover

Successful application of this technology to real-world space exploration will significantly extend navigation capabilities in extreme off-world terrain, providing continuous navigation while being able to drive further and faster.

“This is a collaboration between Airbus and ESA to develop a Mars rover,” said Charlie Rance, Chief Product Officer at Opteran. “We have a grant project to help them understand the scene around them.” 

“The way we are operating is we are not using a high definition version but get more information for the visual algorithms so we can operate on a small ASIC embedded in the camera.”

The algorithm for localisation and perception occupies 1.4 cores of a quad-core chip using Arm Cortex-A53 cores, or 0.9 of a core on the Cortex-A76.
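As a rough illustration of those figures, the sketch below converts the reported core occupancy into a fraction of total chip capacity. The quad-core count is stated for the A53 part; assuming the same core count for the A76 is our own simplification, not a detail from the article.

```python
# Back-of-envelope check of the reported CPU budgets.
# Figures (1.4 cores on A53, 0.9 on A76) are from the article;
# a quad-core layout for the A76 part is an assumption.

def utilisation(cores_used: float, cores_total: int = 4) -> float:
    """Fraction of total multi-core capacity consumed."""
    return cores_used / cores_total

a53 = utilisation(1.4)  # 1.4 of 4 Cortex-A53 cores -> 0.35
a76 = utilisation(0.9)  # 0.9 of 4 Cortex-A76 cores -> 0.225

print(f"A53: {a53:.1%} of a quad-core chip")
print(f"A76: {a76:.1%} of a quad-core chip")
```

On either part, the algorithm leaves well over half of the chip free for the rest of the rover's software, which is the point of running it on a small embedded device.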

“We create a flexible architecture with either distributed processing or a single unit, but we haven’t got to that stage with ESA; the goal is to prove the algorithms,” said Rance. “We’ve got the early test results which are looking promising and we are analysing the data until the end of 2024.”

The project is funded by ESA’s General Support Technology Programme (GSTP) through the UK Space Agency to develop depth estimation for obstacle detection, with the mid-term aim of providing infrastructure-free visual navigation.

Once the results of the initial testing have been presented to ESA, the goal is to move to the next stage of grant funding in early 2025, which would start to focus on deployment and commercialisation.
