Your Nvidia GPU could one day make your AirPod calls sound better

Another step forward for Nvidia’s campaign to prove the value of AI.

A small team of University of Washington computer science and engineering students has created a new pair of wireless earbuds that uses artificial intelligence (AI) to improve background-noise reduction during phone calls.

The earbuds, which the students have named “ClearBuds,” are designed to monitor two distinct audio streams from the left and right buds’ microphones to form a spatial audio soundscape. The data is then sent to a phone, which uses two neural networks to remove background noise.
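To see why two synchronized streams matter, here is a minimal, hypothetical Python sketch — not the ClearBuds code — of one classic spatial cue that binaural microphones provide: estimating the inter-microphone time delay of a sound source by cross-correlation. The function name and toy signals are illustrative assumptions.

```python
# Illustrative sketch only: with synchronized left/right streams, the lag
# that best aligns the two signals reveals where a source sits in space,
# one of the cues a two-mic "soundscape" offers that a single mic cannot.

def best_lag(left, right, max_lag):
    """Return the lag (in samples) at which right best correlates with left."""
    best, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        score = sum(
            left[i] * right[i + lag]
            for i in range(len(left))
            if 0 <= i + lag < len(right)
        )
        if score > best_score:
            best, best_score = lag, score
    return best

# A toy impulse that arrives 3 samples later at the right mic:
left  = [0.0, 1.0, 0.5, 0.2, 0.0, 0.0, 0.0, 0.0]
right = [0.0, 0.0, 0.0, 0.0, 1.0, 0.5, 0.2, 0.0]
print(best_lag(left, right, 5))  # → 3
```

A delay near zero suggests a source centered on the wearer's head (like their own voice), while larger lags indicate off-axis noise.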

The first neural network suppresses any sound that isn’t the caller’s voice, while the second amplifies and improves the clarity of that voice. The result is a significant improvement in the removal of background noise, which the students found to be 6.94 dB better than the noise cancellation of the Apple AirPods Pro.
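As a quick sanity check on that figure, decibels are a logarithmic scale, so a 6.94 dB gain corresponds to roughly a fivefold ratio in signal power:

```python
# dB = 10 * log10(power_ratio), so power_ratio = 10 ** (dB / 10).
power_ratio = 10 ** (6.94 / 10)
print(round(power_ratio, 2))  # → 4.94
```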

Both neural networks were trained in about a day on Nvidia Titan desktop GPUs. Nvidia recently established a new program to help businesses integrate AI software at the enterprise level, and the company has been quick to underline AI's potential to improve gaming via DLSS, as well as how we use technology in many other areas of life. Even AMD is participating, despite prior statements to the contrary.

Analysis: Awesome tech that will need an industry shift to become useful

At this time, the majority of wireless earbuds feature active noise cancellation (ANC), which eliminates external noise by monitoring it with a microphone and producing an inverted sound wave signature. Some of them are rather good, but this new technology might make ANC performance significantly better.
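The principle behind that inverted sound wave is simple enough to sketch in a few lines of Python — a toy illustration, not any vendor's implementation:

```python
# Minimal sketch of ANC's core idea: emit an "anti-noise" wave that is the
# measured noise inverted in phase, so the two sum to silence at the ear.

noise    = [0.8, -0.3, 0.5, -0.9, 0.2]    # toy samples from the outward mic
anti     = [-s for s in noise]             # phase-inverted waveform
residual = [n + a for n, a in zip(noise, anti)]
print(residual)  # → [0.0, 0.0, 0.0, 0.0, 0.0]
```

Real ANC is far harder, of course: the anti-noise must be generated fast enough to stay in phase with constantly changing real-world sound.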

But there are a few wrinkles that must be worked out first, and they have nothing to do with the ClearBuds themselves (again, fantastic name). The first is that the neural network needs spatial audio to function properly, which means the user must wear both earbuds for full operation. And we're certainly not the only ones who sometimes pop in just one earbud when we're out and about.

In fact, the UW students found a more pressing issue in their post-project research: because users so often wear just one bud, consumer earbuds currently perform ANC by monitoring audio from a single earbud, even though most have microphones in both.

This means that manufacturers would need to change the way their products function in order to integrate this AI-based ANC, and the technology might also reduce battery life for both the earbuds and the connected phone, which runs the neural networks. Personally, we're hoping these students find a business partner, because with a name like ClearBuds, these sound like the next big thing.
