Bats and Dolphins Inspire a New Single-Sensor 3D-Positional Microphone for Human-Robot Interaction

A spinning tube delivers what would normally take a multi-microphone array, and could help improve robot operations in industry and beyond.

Researchers from Seoul National University's College of Engineering have announced the development of what they say is the world's first 3D microphone to be built around a single sensor — yet capable of estimating the position of a sound's source like a multi-sensor array.

"Previously, determining positions using sound required multiple sensors or complex calculations," explains lead author Semin Ahn, a doctoral candidate at the university. "Developing a 3D sensor capable of accurately locating sound sources with just a rotating single microphone opens new avenues in acoustic sensing technology."

Researchers have developed a single-sensor microphone capable of locating sounds in 3D space, thanks to a clever spinning tube. (📷: Ahn et al)

The sensing system is inspired by the way bats and dolphins use echolocation to determine the whereabouts of objects and the sources of sound — effectively "seeing space with their ears." Dubbed "3D acoustic perception technology," the three-dimensional acoustic ranging (3DAR) system uses a single microphone sensor positioned in a hollow tube cut with rectangular slots — serving, the researchers explain, as a hardware-based phase cancellation mechanism. By rotating the microphone and processing the incoming data, it's possible to locate the source of a sound in 3D space.
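The paper's exact signal processing isn't detailed here, but the general principle of rotating-aperture direction finding can be sketched. Assuming (hypothetically) that the slotted tube gives the microphone a direction-dependent gain that peaks when the slot faces the source, spinning the tube modulates the received amplitude once per revolution, and the phase of that modulation's fundamental encodes the source's azimuth:

```python
import numpy as np

# Hypothetical sketch of rotating-aperture direction finding -- not the
# authors' actual algorithm. Assumption: the slotted tube acts as a
# direction-dependent gain g(theta) that is strongest when a slot faces
# the source, so spinning the tube amplitude-modulates the received signal.

def estimate_azimuth(envelope, rotation_angles):
    """Recover source azimuth from the amplitude envelope of a spinning mic.

    envelope        -- received signal envelope, one value per angle sample
    rotation_angles -- tube orientation (radians) at each sample
    """
    # Project the envelope onto the first circular harmonic; the angle of
    # the resulting complex sum points at the direction of maximum gain.
    harmonic = np.sum(envelope * np.exp(1j * rotation_angles))
    return np.angle(harmonic) % (2 * np.pi)

# Synthetic demo: a source at 120 degrees, cosine-shaped gain plus noise.
rng = np.random.default_rng(0)
true_azimuth = np.deg2rad(120)
angles = np.linspace(0, 2 * np.pi, 360, endpoint=False)
envelope = (1.0 + 0.8 * np.cos(angles - true_azimuth)
            + 0.05 * rng.standard_normal(angles.size))

est = estimate_azimuth(envelope, angles)
print(f"estimated azimuth: {np.rad2deg(est):.1f} degrees")
```

Extending this to full 3D — elevation and range as well as azimuth — would require additional structure, which is presumably where the tube's slot geometry and phase cancellation come in.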

The team's work goes beyond just locating a sound, though: the researchers have demonstrated how the 3DAR system can also be used to implement a sound-based human-robot interaction system capable of operating even in noisy environments — which, they say, could be applied to everything from industrial robotics, where it can provide real-time tracking of user position, to search-and-rescue operations.

The researchers are hoping the microphone system can be used to improve safety, even in noisy environments. (📷: Seoul National University College of Engineering)

In real-world testing on a quadrupedal robot platform, the system demonstrated over 90 percent accuracy in human-robot interaction tasks and 99 percent accuracy in robot-robot interaction tasks. For multiple sound sources, tracking accuracy reached 94 percent — even in noisy environments, the researchers say.

The team's work has been published in the journal Robotics and Computer-Integrated Manufacturing under closed-access terms.

Main article image courtesy of the Seoul National University College of Engineering.

ghalfacree

Freelance journalist, technical author, hacker, tinkerer, erstwhile sysadmin. For hire: freelance@halfacree.co.uk.
