Today’s robots mostly rely on vision and touch to operate in their environments, but this could change with new research coming out of Carnegie Mellon University (CMU).
A group of researchers found that robot perception could improve through hearing.
With hearing, robots could also determine what type of action caused a sound, and use sounds to predict the physical properties of unfamiliar objects.
Lerrel Pinto, one of the researchers, and the team found that performance was high: robots using sound alone classified objects correctly 76 percent of the time.
To conduct the study, the researchers built a large dataset, simultaneously recording video and audio of 60 common objects.
Some of those objects were toy blocks, hand tools, apples, shoes, and tennis balls. The recordings took place as the objects rolled and crashed around a tray.
One of the new findings was that a robot could use its experience and what it learned about the sound of a certain set of objects in order to predict the physical properties of another set of unseen objects.
“I think what was really exciting was that when it failed, it would fail on things you expect it to fail on,” Pinto said.
For instance, a robot couldn’t use sound to tell the difference between a red block and a green block.
“But if it was a different object, such as a block versus a cup, it could figure that out.”
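The kind of sound-based classification Pinto describes can be illustrated with a toy sketch. This is not the CMU pipeline, which trained on real recordings of objects moving in a tray; here the "impact sounds" are synthetic tones, the feature extraction is a simple spectral summary, and the classifier is a nearest-centroid rule. All names and parameters are illustrative assumptions.

```python
# Toy sketch (not the CMU system): classify objects by the sounds they make,
# using a spectral-centroid feature and a nearest-centroid classifier.
# All audio here is synthetic; class names and frequencies are made up.
import numpy as np

SR = 16000  # sample rate in Hz (assumption for this sketch)

def spectral_features(audio, sr=SR):
    """Summarize a clip by its power-weighted spectral centroid and energy."""
    power = np.abs(np.fft.rfft(audio)) ** 2
    freqs = np.fft.rfftfreq(len(audio), d=1.0 / sr)
    centroid = np.sum(freqs * power) / (np.sum(power) + 1e-9)
    energy = np.sum(audio ** 2)
    return np.array([centroid, energy])

def make_clip(freq, sr=SR, dur=0.25, seed=0):
    """Synthesize a noisy tone standing in for an object's impact sound."""
    rng = np.random.default_rng(seed)
    t = np.arange(int(sr * dur)) / sr
    return np.sin(2 * np.pi * freq * t) + 0.05 * rng.standard_normal(t.size)

# "Training": one prototype sound per object class (block vs. cup,
# echoing the example from the article).
prototypes = {
    "block": spectral_features(make_clip(1200.0)),
    "cup": spectral_features(make_clip(400.0)),
}

def classify(audio):
    """Assign the clip to the class with the nearest feature prototype."""
    feats = spectral_features(audio)
    return min(prototypes, key=lambda k: np.linalg.norm(prototypes[k] - feats))

print(classify(make_clip(1150.0, seed=1)))  # a block-like sound
print(classify(make_clip(450.0, seed=2)))   # a cup-like sound
```

The sketch also mirrors the failure mode in the quote: features derived from sound carry no color information, so a "red block" and a "green block" would produce identical inputs and could never be told apart, while a block and a cup differ acoustically and can be.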