In the ocean sciences, robots provide views of the unexplored and can navigate environments not safely accessible to humans. Such dangerous settings make up the majority of Earth’s oceans.
These robots come in many shapes and sizes, and are outfitted with various sensors and cameras used for capturing images, measuring structures, and determining the robot's position underwater.
Northeastern researchers Alan Papalia and David Rosen have developed an algorithm that improves the accuracy of a technology often used in underwater robotics—acoustic navigation, which uses acoustic sensors that emit, detect, and analyze sound waves to help robots understand their positioning in the water.
The paper is published in the journal IEEE Transactions on Robotics.
“You can use these acoustic sensors in a variety of ways: you can attach them to different robots and the robots can measure how far away they are from each other; or you can fix them to the environment (e.g., the seafloor) and measure how far away the robot is from the fixed sensors,” the researchers write in a summary of the report.
“In a lot of ways, this is similar to how GPS works, which just estimates your position by measuring how far away you are from a set of satellites.”
While acoustic sensors are generally cheaper and more accessible than high-end navigation systems, they are less reliable at determining where a robot actually is at any given moment in the water, explains Papalia, a Northeastern University postdoctoral research associate in the department of electrical and computer engineering and the lead author of the research.
One of the key issues with acoustic sensors is that “they don’t precisely tell you where you are; they just tell you how far away you are from another point,” Papalia explains.
“This means that even if the sensors are perfect (which they certainly are not), you only know that you are somewhere on a circle a fixed distance away,” he adds.
“This ambiguity is a major problem for navigation, because it means that estimating where you are depends heavily on already having a good initial guess of where you are in the first place.”
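That ambiguity can be seen in a small sketch. With range-only measurements in 2D, even two beacons leave two mirrored positions that fit the data equally well, which is why a navigation algorithm needs a good initial guess to pick the right one. (The beacon layout and coordinates below are illustrative, not taken from the paper.)

```python
import math

# Hypothetical 2-D setup: two fixed acoustic beacons and a robot.
beacons = [(0.0, 0.0), (10.0, 0.0)]
robot = (4.0, 3.0)

# A range sensor reports only distance to each beacon, not direction.
r0, r1 = (math.dist(robot, b) for b in beacons)

# The two range circles intersect at TWO points: the true position
# (4, 3) and its mirror image (4, -3). With beacons on the x-axis,
# the intersection has a closed form.
d = beacons[1][0] - beacons[0][0]          # baseline between beacons
x = (r0**2 - r1**2 + d**2) / (2 * d)       # shared x-coordinate
y = math.sqrt(max(r0**2 - x**2, 0.0))      # |y|; the sign is ambiguous

# Both candidates match every measurement exactly.
print((x, y), (x, -y))
```

Both printed candidates, (4.0, 3.0) and (4.0, -3.0), reproduce the measured ranges perfectly; nothing in the data distinguishes them.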
This new open-source algorithm improves their reliability significantly, the researchers say, potentially allowing researchers in the future to invest in navigation systems that cost around $10,000 rather than $500,000. The algorithm removes the ambiguity in estimating where a robot is in the water, the researchers say, and provides a mathematical guarantee that its estimates are correct.
“The punchline I usually give people is that we want to use cheaper sensors, but they aren’t as reliable, so we make reliable algorithms so the sensors don’t need to be,” Papalia says.
The target audience for this research is not solely roboticists, Papalia explains, but also researchers in other fields who can harness the data these robots collect.
One major application is in climate change research, he says, noting the work of his postdoc advisor and Northeastern professor Hanu Singh. Singh has traveled to the Arctic many times and used robots to help researchers measure melting glaciers.
“Hanu has done a lot of work trying to put robots under the ice, and that really matters because we don’t fully understand the mechanics of the way sea ice is melting, and about half of sea level rise comes from ice melt,” Papalia explains.
“The reason we don’t fully understand this, and why there is a lot of uncertainty in how we model, for example, how the ice in the world is changing, is that we can’t measure it directly. It’s too hard, and we’d like to put robots there.”
The researchers tested their algorithm using two autonomous surface vehicles on the Charles River, but they note that it can also be used in robotic systems designed for the ground and the air.
“That’s what we’re really trying to address,” Papalia adds. “We want to be able to just hand reliable robots to scientists, so they can use them, be aggressive in their scientific agendas, and really study the things that matter.”
More information:
Alan Papalia et al., "Certifiably Correct Range-Aided SLAM," IEEE Transactions on Robotics (2024). DOI: 10.1109/TRO.2024.3454430
Northeastern University
This story is republished courtesy of Northeastern Global News news.northeastern.edu.
Citation: Algorithm improves acoustic sensor accuracy for cheaper underwater robotics (2025, May 28), retrieved 28 May 2025 from https://techxplore.com/news/2025-05-algorithm-acoustic-sensor-accuracy-cheaper.html