Robots that sense the world through sound are more human-like
10-23-2024

The audience gasps as sound effects vibrate the seats of a darkened theater. A moviegoer shakes a cup, listening to the clinking ice to gauge how much of the drink is left.

Lost in thought, the moviegoers tap the armrest, wondering whether it's real wood or a plastic imitation. This knack for identifying objects and their makeup through sound is second nature to humans.

Striving to augment the sensory abilities of robots, researchers are now replicating this human ability.

Sensing the world through sound

Next month, at the Conference on Robot Learning (CoRL 2024) in Munich, Germany, experts from Duke University will introduce the world to SonicSense. The system gives robots a perception capability similar to that of humans.

“We introduce SonicSense, a holistic design of hardware and software to enable rich robot object perception through in-hand acoustic vibration sensing,” wrote the researchers.

“While previous studies have shown promising results with acoustic sensing for object perception, current solutions are constrained to a handful of objects with simple geometries and homogeneous materials, single-finger sensing, and mixing training and testing on the same objects.”

Robots’ perception of sounds

Study lead author Jiaxun Liu is a first-year PhD student in the laboratory of Boyuan Chen, professor of mechanical engineering and materials science at Duke.

“Robots today mostly rely on vision to interpret the world,” explained Liu. “We wanted to create a solution that could work with complex and diverse objects found on a daily basis, giving robots a much richer ability to ‘feel’ and understand the world.”

SonicSense is a robotic hand fitted with four fingers, each with a contact microphone embedded in the fingertip. When the robot interacts with an object – tapping, grasping, or shaking it – the microphones capture the resulting vibrations while tuning out ambient noise.
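The article does not describe how SonicSense separates contact vibrations from background sound. As a purely illustrative sketch (not the authors' actual pipeline), one simple approach is a spectral high-pass filter: low-frequency ambient hum is zeroed out while the higher-frequency tap transient passes through. The signal shapes, sample rate, and cutoff below are all invented for the example.

```python
import numpy as np

def highpass(signal, fs, cutoff_hz):
    """Zero out spectral components below cutoff_hz (simple FFT high-pass)."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum[freqs < cutoff_hz] = 0.0
    return np.fft.irfft(spectrum, n=len(signal))

fs = 8000                                   # sample rate (Hz), assumed
t = np.arange(0, 0.5, 1.0 / fs)
hum = 0.5 * np.sin(2 * np.pi * 60 * t)      # ambient 60 Hz hum
tap = np.exp(-40 * t) * np.sin(2 * np.pi * 900 * t)  # decaying tap transient
recorded = hum + tap                        # what the contact mic picks up

cleaned = highpass(recorded, fs, cutoff_hz=200)
# The hum is removed; the tap transient survives almost unchanged.
```

A real system would likely use more sophisticated filtering, but the principle is the same: the vibrations of interest occupy a different part of the spectrum than most ambient noise.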

Process of sonic tuning

The SonicSense system identifies objects and determines their shape based on the detected signals and frequency features.

The system relies on prior knowledge and leverages advancements in artificial intelligence to discern the object’s material and three-dimensional shape.
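The paper's actual AI models are not detailed in the article; the following is a hypothetical toy sketch of the underlying idea that different materials ring at characteristic frequencies when tapped. It matches an unknown tap's dominant frequency against a small reference set using nearest-neighbor lookup. The material names, resonance values, and decay rates are all made up for illustration.

```python
import numpy as np

def dominant_freq(signal, fs):
    """Return the frequency (Hz) of the strongest spectral peak."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return freqs[np.argmax(spectrum)]

def make_tap(fs, resonance_hz, decay):
    """Synthesize a decaying tap transient at a given resonance (toy data)."""
    t = np.arange(0, 0.25, 1.0 / fs)
    return np.exp(-decay * t) * np.sin(2 * np.pi * resonance_hz * t)

fs = 16000
# Hypothetical reference taps: each material rings at a characteristic pitch.
references = {
    "wood":    dominant_freq(make_tap(fs, 400, 60), fs),
    "plastic": dominant_freq(make_tap(fs, 900, 90), fs),
    "metal":   dominant_freq(make_tap(fs, 2200, 20), fs),
}

def classify(signal, fs):
    """Nearest-neighbor match on dominant frequency."""
    f = dominant_freq(signal, fs)
    return min(references, key=lambda m: abs(references[m] - f))

unknown = make_tap(fs, 880, 85)  # an unlabeled tap recording
print(classify(unknown, fs))     # nearest reference pitch is plastic's
```

SonicSense reportedly goes far beyond a single spectral peak, combining richer frequency features with learned models to recover both material and 3D shape, but the nearest-match intuition is a useful mental model.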

Liu and Chen’s laboratory demonstrations of SonicSense reveal its impressive capabilities. For instance, it estimates how many dice are inside a box or how much water is contained in a bottle by shaking the objects or tapping their surfaces.

Robot sensory perception

While SonicSense isn't the first system of its kind, its design is unique. The combination of four fingers, touch-based microphones, and advanced AI techniques sets it apart from previous attempts.

The integration allows SonicSense to identify objects of varying complexity, including materials that challenge vision-based systems and objects composed of multiple materials with reflective or transparent surfaces.

As SonicSense continues to evolve, the researchers are focused on enhancing its ability to interact with multiple objects. Future plans include an upgraded robotic hand with advanced manipulation skills, paving the way for robots to perform tasks that demand a delicate sense of touch.

SonicSense in various environments

An exciting avenue for SonicSense’s evolution is its use in varied environments outside the laboratory.

Researchers are exploring the potential for SonicSense to function in dynamic and unpredictable environments such as urban landscapes or disaster zones.

By fine-tuning its sensory capabilities, SonicSense could enable robots to navigate through debris, assess stability, or identify materials essential for rescue operations.

Incorporating external data sources and environmental adaptability algorithms is essential for advancement, allowing SonicSense to broaden the range of scenarios for beneficial robotic intervention.

Benefits of robots that use sound

Ethical consideration and societal impacts become paramount as SonicSense moves towards greater integration in autonomous robotic systems.

Robots that discern and interact with the world using sound raise questions about privacy, ownership of the data they generate, and potential biases intrinsic to AI systems.

As robots play larger roles in healthcare, retail, and domestic spaces, it is critical to ensure transparency in how systems like SonicSense interpret sensory data.

The development process must incorporate diverse perspectives, aiming for inclusive technology that respects individual rights and maximizes the societal benefits of SonicSense’s capabilities.

The ability to bring robots closer to human-like adaptability is a testament to advances in robotics and artificial intelligence.

The technology’s potential to bridge the gap between contrived lab settings and real-world complexity could revolutionize how people think about robots and their role in the world.

The study is available as a preprint on arXiv.
