Today’s robots mostly rely on vision and touch to operate in their environments, but this could change with new research coming out of Carnegie Mellon University (CMU).
A group of researchers found that robot perception could improve through hearing.
With hearing, robots could determine what type of action caused a sound, and sounds could help them predict the physical properties of unfamiliar objects.
Lerrel Pinto and his fellow researchers found that sound was a surprisingly reliable cue: robots using sound alone classified objects with 76 percent accuracy.
To conduct the study, the researchers built a large dataset by simultaneously recording video and audio of 60 common objects, including toy blocks, hand tools, apples, shoes, and tennis balls, as the objects rolled and crashed around a tray.
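The core idea, recognizing an object from the sound it makes, can be pictured with a toy sketch. Everything below is invented for illustration (the object names, characteristic frequencies, and hand-picked spectral features are assumptions, not the CMU team's method, which trained learned models on real recordings): synthesize a noisy "impact sound" per object, extract a small spectral fingerprint, and classify new clips by nearest centroid.

```python
# Illustrative sketch only, not the CMU pipeline: classify "objects" by
# sound using a two-band spectral fingerprint and nearest-centroid matching.
import math
import random

random.seed(0)
SAMPLE_RATE = 8000  # Hz

def make_clip(freq, n=800):
    """Synthesize a noisy tone standing in for an object's impact sound."""
    return [math.sin(2 * math.pi * freq * t / SAMPLE_RATE)
            + random.gauss(0, 0.3) for t in range(n)]

def goertzel_power(clip, freq):
    """Signal power of `clip` at `freq`, via the Goertzel algorithm."""
    w = 2 * math.pi * freq / SAMPLE_RATE
    coeff = 2 * math.cos(w)
    s_prev = s_prev2 = 0.0
    for x in clip:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def features(clip):
    # Energy in two frequency bands acts as a tiny "spectral fingerprint".
    return [goertzel_power(clip, f) for f in (440, 880)]

# "Training": record each known object a few times and average its features.
# The pitches below are hypothetical stand-ins for real object sounds.
objects = {"block": 440, "cup": 880}
centroids = {}
for name, freq in objects.items():
    feats = [features(make_clip(freq)) for _ in range(5)]
    centroids[name] = [sum(col) / len(col) for col in zip(*feats)]

def classify(clip):
    """Return the known object whose average fingerprint is closest."""
    f = features(clip)
    return min(centroids,
               key=lambda n: sum((a - b) ** 2
                                 for a, b in zip(f, centroids[n])))

print(classify(make_clip(440)))  # block
print(classify(make_clip(880)))  # cup
```

The sketch also mirrors the failure mode Pinto describes below: a feature based purely on sound carries no color information, so two blocks that sound identical are indistinguishable, while a block and a cup separate cleanly.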
One key finding was that a robot could apply what it had learned about the sounds of one set of objects to predict the physical properties of a second, unseen set.
“I think what was really exciting was that when it failed, it would fail on things you expect it to fail on,” Pinto said.
For instance, a robot couldn’t use sound to tell the difference between a red block and a green block.
“But if it was a different object, such as a block versus a cup, it could figure that out.”