Researchers from Carnegie Mellon University appear to have solved the age-old problem of microphone bleed-through using just a couple of cheap video cameras. Instead of recording sound with microphones, the cameras see the vibrations created by individual instruments, which are then translated back into sound waves.
No matter how good microphones are, there will always be drawbacks when recording multiple instruments. To edit the sound of just one instrument in post-production, the instruments first have to be isolated. That’s why recording studios are usually set up with more than one room, or why instrument tracks are recorded separately and layered one on top of the other.
But recording music that way takes a considerable amount of time, equipment, and money, and it also isn’t the most effective way to get the best musical performance from musicians who are more used to playing live together in a grungy bar than separately in a sterile studio environment.
What if you could record all the musicians performing together while retaining the ability to mix each instrument’s track separately, without bleed-through?
Well, that’s where the researchers at the Robotics Institute within Carnegie Mellon University’s School of Computer Science turned to video cameras instead. If you pluck the string of a guitar or violin, you don’t just create sound waves; you can actually see the vibration. With the right equipment, those vibrations can be visualized and analyzed to recreate the sounds being produced, even if no sound is recorded at all.
Optical microphones are not a new idea, but what the CMU researchers have come up with, and shared in a recently published paper, ‘Dual-Shutter Optical Vibration Sensing,’ is a way to make them work using low-end, more affordable camera gear.
The new system shines a bright laser light source on a vibrating surface, such as a guitar’s body, and captures the movements of the resulting speckle pattern of light. Human hearing extends to sounds oscillating as fast as 20,000 times per second, so optical microphones have typically relied on expensive high-speed cameras to capture physical vibrations oscillating just as quickly.
However, the team at CMU has gotten around that issue by using two types of camera at the same time: one with a global shutter, which captures entire frames of video at once, resulting in distinct speckle patterns, and one with a rolling shutter, which captures frames line by line from the top of the sensor to the bottom, resulting in distorted speckle patterns that actually contain more information about how the surface moves back and forth over time.
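To see why the rolling shutter matters, here is a rough back-of-the-envelope sketch in Python. Everything in it is illustrative rather than taken from the paper: the frame rate, row count, and 440 Hz test tone are made-up assumptions. The point is that because each sensor row is exposed at a slightly different instant, a rolling shutter effectively samples a vibrating speckle pattern at the row rate rather than the frame rate.

```python
import numpy as np

# Illustrative sketch, NOT the CMU implementation. The frame rate, row
# count, and test frequency are made-up assumptions. The idea: a rolling
# shutter exposes each sensor row at a different moment, so it samples a
# vibration at the row rate, not the frame rate.

FRAME_RATE = 60                  # frames per second (typical consumer camera)
ROWS = 1080                      # sensor rows read out per frame
ROW_RATE = FRAME_RATE * ROWS     # effective sample rate: 64,800 rows/s

def speckle_displacement(t, freq=440.0):
    """Toy model: speckle shift driven by a 440 Hz vibration."""
    return np.sin(2 * np.pi * freq * t)

# Global shutter: one timestamp per frame over one second of capture.
t_global = np.arange(FRAME_RATE) / FRAME_RATE      # 60 samples
# Rolling shutter: one timestamp per row over the same second.
t_rolling = np.arange(ROW_RATE) / ROW_RATE         # 64,800 samples

audio = speckle_displacement(t_rolling)            # one audio sample per row

print(f"global-shutter samples/s:  {len(t_global)}")
print(f"rolling-shutter samples/s: {len(audio)}")
```

At 60 samples per second a 440 Hz tone is hopelessly aliased (the Nyquist limit would be 30 Hz), while a row rate of tens of thousands of samples per second is in the same ballpark as the 63,000 comparisons per second mentioned below.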
The frames from the two cameras are then compared against each other by a special algorithm up to 63,000 times a second. That’s as fast as a far more expensive optical microphone, but at a fraction of the cost.
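The comparison step can be pictured with a toy one-dimensional example. The sketch below is not the paper’s algorithm; it only shows the general idea behind speckle tracking: estimate how far a speckle pattern has shifted relative to a reference by finding the peak of a cross-correlation, which yields one displacement value, and hence one audio sample, per measurement.

```python
import numpy as np

# Toy sketch of speckle tracking (NOT the CMU algorithm): estimate how far
# a 1-D speckle pattern has shifted relative to a reference by locating the
# peak of the cross-correlation between the two signals.

rng = np.random.default_rng(0)
reference = rng.standard_normal(256)   # stand-in for a reference speckle row

def estimate_shift(observed, reference):
    """Return the integer pixel shift that best aligns observed with reference."""
    corr = np.correlate(observed, reference, mode="full")
    return int(np.argmax(corr)) - (len(reference) - 1)

# Simulate a row observed after the speckle pattern moved 3 pixels.
shifted = np.roll(reference, 3)
print(estimate_shift(shifted, reference))  # recovers the 3-pixel shift
```

In the real system the tracked quantity is how the laser speckle deforms between the global-shutter and rolling-shutter views, but the principle of turning a pattern shift into a displacement sample is the same.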
The audio quality isn’t yet as high as a traditional microphone’s, but the system could give sound engineers and producers more control when recording live music or very large ensembles, such as symphony orchestras.
The system also has other potentially interesting applications outside of music. A video camera monitoring all the machines on a factory floor, or pointed at the engine of a running car, could determine when individual parts or components are making an abnormal sound, indicating that maintenance may be required before a problem actually becomes a problem.