Robots Gain Human-Like Touch Through Acoustic Vibration Sensing

Researchers at Duke University have developed a system called SonicSense that enables robots to identify materials, understand shapes, and recognize objects by “listening” to vibrations, much like humans do with their sense of touch.

This technology allows robots to interact with their surroundings in ways previously limited to humans. Led by Professor Boyuan Chen and PhD student Jiaxun Liu, the team has created a robotic hand with four fingers, each equipped with a contact microphone that detects and records vibrations generated when the robot taps, grasps, or shakes an object.

By analyzing these vibrations, SonicSense can determine what material an object is made of and reconstruct its 3D shape, even if the robot has never encountered the object before. This breakthrough could transform how robots perceive and interact with objects, with significant implications for industries such as manufacturing and healthcare.

Listening to the World: Robots Gain a Sense of Touch through Acoustic Vibrations

Robots are one step closer to mimicking human-like abilities with the development of SonicSense, a system that allows them to interact with their surroundings by “listening” to vibrations. This innovative approach enables robots to identify materials, understand shapes, and recognize objects in ways previously limited to humans.

The Power of Acoustic Vibrations

Humans have an innate ability to interpret the world through acoustic vibrations emanating from objects. For instance, we can determine how much soda is left in a cup by shaking it and listening to the rattling of ice inside. Similarly, we can tap on an armrest to determine if it’s made of real wood or plastic. Researchers at Duke University have successfully replicated this ability in robots using SonicSense.

SonicSense features a robotic hand with four fingers, each equipped with a contact microphone embedded in the fingertip. These sensors detect and record vibrations generated when the robot taps, grasps, or shakes an object. By tuning out ambient noises, the microphones allow the robot to focus on the object’s acoustic signature. This information is then used to extract frequency features, which are paired with advanced AI techniques to determine the material composition and 3D shape of the object.
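The article says SonicSense pairs extracted frequency features with AI models trained on real recordings; the details of those models are not given here. Purely as an illustrative sketch of the general idea, turning a tap into a spectral signature and matching it against known materials might look like the following, where the band layout, the nearest-prototype matcher, and both function names are assumptions of this sketch, not the SonicSense implementation:

```python
import numpy as np

def frequency_features(signal, sample_rate, n_bands=16):
    """Summarize a vibration recording as normalized energy in
    log-spaced frequency bands (20 Hz up to Nyquist) -- a stand-in
    for the frequency features described in the article."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    edges = np.logspace(np.log10(20.0), np.log10(sample_rate / 2.0), n_bands + 1)
    feats = np.array([
        spectrum[(freqs >= lo) & (freqs < hi)].sum()
        for lo, hi in zip(edges[:-1], edges[1:])
    ])
    return feats / (feats.sum() + 1e-12)  # normalize so tap strength cancels out

def classify_material(feats, prototypes):
    """Nearest-prototype guess: compare a tap's spectral signature
    against stored reference signatures, one per known material."""
    return min(prototypes, key=lambda m: np.linalg.norm(feats - prototypes[m]))
```

For example, a "wood" prototype built from a low-frequency, quickly damped tap and a "metal" prototype from a higher-pitched, ringing tap would let `classify_material` label a new recording by spectral similarity. A trained neural network, as the researchers use, plays the role of this crude nearest-prototype matcher.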

Augmenting Robotic Sensing Abilities

SonicSense offers a significant advantage over traditional vision-based systems, which can be limited by factors such as lighting conditions or object transparency. By incorporating acoustic vibrations into their sensing repertoire, robots can gather more comprehensive information about their environment. This ability to “hear” and “feel” objects can transform how current robots perceive and interact with the world.

The researchers have demonstrated SonicSense’s capabilities in various scenarios, including counting dice in a box, determining the liquid level in a bottle, and reconstructing the 3D shape of an object by tapping around its exterior. These abilities make SonicSense a robust foundation for training robots to perceive objects in dynamic, unstructured environments.
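As a toy version of the dice-counting demonstration: if each die strike registers as a short transient in the contact-microphone signal, the strikes can be counted by thresholding the signal envelope with a quiet refractory gap between detections. This is a hypothetical sketch under that assumption, not the researchers' actual method, and the function name, threshold, and gap values are invented for illustration:

```python
import numpy as np

def count_impacts(signal, sample_rate, threshold=0.2, min_gap_s=0.05):
    """Count distinct impact transients (e.g. dice striking the walls
    of a shaken box): a new impact is registered when the envelope
    rises above the threshold after a quiet refractory gap."""
    min_gap = int(min_gap_s * sample_rate)
    count, last_loud = 0, -min_gap
    for i, v in enumerate(np.abs(signal)):
        if v >= threshold:
            if i - last_loud >= min_gap:  # new impact after a quiet spell
                count += 1
            last_loud = i                 # still inside the same impact
    return count
```

The refractory gap matters: without it, every loud sample inside one decaying transient would be counted as a separate strike.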

Bridging the Gap between Controlled and Real-World Data

One of the significant challenges in developing robotic systems is bridging the gap between controlled lab settings and real-world data. SonicSense addresses this issue by enabling robots to interact directly with the diverse, messy realities of the physical world. This approach allows for more realistic and comprehensive training datasets, which are essential for developing robust and adaptable robots.

Future Developments and Applications

The researchers are working to enhance SonicSense’s ability to interact with multiple objects in dynamic environments. By integrating object-tracking algorithms, robots will be able to handle cluttered scenarios, bringing them closer to human-like adaptability in real-world tasks. Additionally, the design of the robot hand itself is expected to evolve, incorporating advanced manipulation skills and integrating multiple sensory modalities, such as pressure and temperature.

The potential applications of SonicSense are vast, ranging from search and rescue operations to healthcare and manufacturing. As robots become more adept at interacting with their environment, they will be able to perform tasks that require a nuanced sense of touch, further blurring the lines between humans and machines.

Quantum News

As the Official Quantum Dog (or hound), my role is to dig out the latest nuggets of quantum goodness. There is so much happening right now in technology, whether AI or the march of the robots, but quantum occupies a special space. Quite literally a special space: a Hilbert space, in fact! Here I try to provide news that might be considered breaking in the quantum computing space.

Latest Posts by Quantum News:

Toyota & ORCA Achieve 80% Compute Time Reduction Using Quantum Reservoir Computing (January 14, 2026)
GlobalFoundries Acquires Synopsys’ Processor IP to Accelerate Physical AI (January 14, 2026)
Fujitsu & Toyota Systems Accelerate Automotive Design 20x with Quantum-Inspired AI (January 14, 2026)