Washington: Researchers have found a novel way to listen in on other people’s conversations by watching video of objects such as potato-chip bags, reports ANI.
Researchers at MIT, Microsoft, and Adobe have developed an algorithm that can reconstruct an audio signal by analyzing minute vibrations of objects depicted in video.
In one set of experiments, they were able to recover intelligible speech from the vibrations of a potato-chip bag photographed from 15 feet away through soundproof glass.
In other experiments, they extracted useful audio signals from videos of aluminium foil, the surface of a glass of water, and even the leaves of a potted plant.
Abe Davis, a graduate student in electrical engineering and computer science at MIT, said that when sound hits an object, it causes the object to vibrate, and that motion creates a very subtle visual signal that is usually invisible to the naked eye.
He further explained that recovering sound from an object reveals a lot about the sound occurring around it, but it also reveals a lot about the object itself, because different objects respond to sound in different ways.
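The basic idea Davis describes can be sketched in a toy simulation. This is not MIT's actual algorithm (which analyzes tiny local motions in high-speed video, not overall brightness); it is a minimal, hypothetical illustration of how a sound wave driving an object can leave a recoverable trace in a sequence of video frames:

```python
import numpy as np

# Hypothetical high-speed camera: 2000 frames over one second.
fps = 2000
t = np.arange(fps) / fps

# A 440 Hz "sound" wave that vibrates the filmed object.
tone = np.sin(2 * np.pi * 440.0 * t)

# Simulate frames: a static 32x32 texture whose appearance is very
# slightly modulated by the vibration, plus sensor noise.
rng = np.random.default_rng(0)
texture = rng.random((32, 32))
frames = texture[None, :, :] + 0.01 * tone[:, None, None]
frames += 0.001 * rng.standard_normal(frames.shape)

# "Recover" the audio: average each frame's pixels into one sample,
# then remove the constant (DC) offset of the static texture.
signal = frames.mean(axis=(1, 2))
signal -= signal.mean()

# The recovered signal's dominant frequency matches the driving tone.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
peak_hz = freqs[spectrum.argmax()]
print(peak_hz)
```

In this sketch the vibration shows up as a per-frame brightness change, so a simple spatial average recovers it; the researchers' real system instead measures sub-pixel motion across the image, which is why it works on ordinary objects like chip bags and plant leaves.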