Giving robots the power to hear and work more like humans can be a game changer

People rarely use just one sense to understand the world, but robots usually rely only on vision and, increasingly, touch

IANS | Updated: Sunday, August 16, 2020, 08:45 PM IST
Photo: Pixabay

New York: Giving robots, which currently rely on vision and touch to move around, the power to hear sounds and predict the physical properties of objects around them can be a game changer, say researchers, including two of Indian origin.

People rarely use just one sense to understand the world, but robots usually rely only on vision and, increasingly, touch. The researchers from Carnegie Mellon University (CMU) now say that robot perception could improve markedly by adding another sense: hearing.

“A lot of preliminary work in other fields indicated that sound could be useful, but it wasn’t clear how useful it would be in robotics,” said Lerrel Pinto, who recently earned his PhD in robotics at CMU and will join the faculty of New York University this fall.

He and his colleagues found the performance rate was quite high: robots using sound successfully classified objects 76 per cent of the time.

The team at CMU’s Robotics Institute found that sounds could help a robot differentiate between objects, such as a metal screwdriver and a metal wrench.

Hearing also could help robots determine what type of action caused a sound and help them use sounds to predict the physical properties of new objects.

Pinto said that the results were so encouraging that it might prove useful to equip future robots with instrumented canes, enabling them to tap on objects they want to identify.

To perform their study, the researchers created a large dataset, simultaneously recording video and audio of 60 common objects – such as toy blocks, hand tools, shoes, apples and tennis balls – as they slid or rolled around a tray and crashed into its sides. They have since released this dataset, cataloging 15,000 interactions, for use by other researchers.

The team captured these interactions using an experimental apparatus they called Tilt-Bot – a square tray attached to the arm of a Sawyer robot.
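The idea behind sound-based classification can be illustrated with a toy example. The sketch below is not the CMU team's actual pipeline; the object names, resonant frequencies, and spectral features are all invented for illustration. It simulates impact sounds for two hypothetical objects, extracts a crude log-spectrum feature, and classifies new sounds by nearest centroid:

```python
# Hypothetical sketch of classifying objects by their impact sounds.
# All signal parameters and class names are assumptions for illustration,
# not details from the CMU study.
import numpy as np

rng = np.random.default_rng(0)
SR = 16000  # sample rate in Hz

def impact_sound(freq_hz, n=SR // 4):
    """Simulate a decaying 'clink' at a characteristic frequency plus noise."""
    t = np.arange(n) / SR
    tone = np.sin(2 * np.pi * freq_hz * t) * np.exp(-t * 30)
    return tone + 0.05 * rng.standard_normal(n)

def spectral_feature(audio):
    """Log-magnitude spectrum: a simple stand-in for learned audio features."""
    return np.log1p(np.abs(np.fft.rfft(audio)))

# Two hypothetical object classes with different resonant frequencies.
classes = {"metal_screwdriver": 2200.0, "metal_wrench": 900.0}

# "Train": average the feature vectors of a few recordings per class.
centroids = {
    name: np.mean([spectral_feature(impact_sound(f)) for _ in range(10)], axis=0)
    for name, f in classes.items()
}

def classify(audio):
    """Label a new sound by its nearest class centroid in feature space."""
    feat = spectral_feature(audio)
    return min(centroids, key=lambda name: np.linalg.norm(feat - centroids[name]))

print(classify(impact_sound(2200.0)))  # expected: metal_screwdriver
```

In the real study, the labels come from the 15,000 recorded Tilt-Bot interactions rather than synthetic tones, and the learned models are far richer than a nearest-centroid rule, but the principle is the same: different objects produce acoustically distinguishable signatures.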
