Scientists invent wireless earbuds that use AI to reduce background noise


Scientists have developed wireless earbuds that use AI for background noise cancellation in real time, an advance that may find a range of applications, including in smart-home speakers and tracking robots.

The earbuds, named “ClearBuds”, were described in a presentation at the Association for Computing Machinery (ACM) International Conference on Mobile Systems, Applications, and Services last month.

Researchers from the University of Washington in the US combined a novel microphone system with one of the first AI systems of its kind to run in real time on a smartphone, enhancing the speaker’s voice while reducing background noise.

“ClearBuds differentiate themselves from other wireless earbuds in two key ways,” study co-lead author Maruchi Kim, a doctoral student in the Paul G Allen School of Computer Science & Engineering, said.

First, he explained that the novel earbuds use a dual microphone array: each earbud streams its own audio, producing two synchronised streams that carry spatial information and allow sounds arriving from different directions to be separated at higher resolution.

Second, researchers say the lightweight AI system further enhances the speaker’s voice.

Although most commercial earbuds have a microphone on each earbud, they say only one earbud actively sends audio to the phone at a time.

On the other hand, in ClearBuds, scientists say, each earbud sends a stream of audio to the phone.

Researchers developed Bluetooth networking protocols for the earbuds that allow the two streams to be synchronised to within 70 microseconds of each other.
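The study’s actual Bluetooth protocol is not described in this article, but the effect of that synchronisation step can be illustrated with a toy sketch (the function name, timestamp format, and tolerance handling here are assumptions for illustration only): given two sample streams tagged with capture times in microseconds, trim whichever stream started earlier so both begin within the stated tolerance.

```python
def align_streams(ts_a, ts_b, samples_a, samples_b, tol_us=70):
    """Toy alignment sketch: trim leading samples so the two streams
    start within `tol_us` microseconds of each other.
    ts_a/ts_b are per-sample capture timestamps in microseconds."""
    i = j = 0
    # Drop samples from stream A captured more than tol_us before B began
    while i < len(ts_a) and ts_a[i] < ts_b[0] - tol_us:
        i += 1
    # And vice versa for stream B
    while j < len(ts_b) and ts_b[j] < ts_a[0] - tol_us:
        j += 1
    return samples_a[i:], samples_b[j:]
```

In the real system the phone would do this continuously as packets arrive; the sketch only shows the trimming idea on two complete buffers.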

The scientists’ AI algorithm runs on the phone to process the audio streams.

They say it suppresses any non-voice sounds, then isolates and enhances the speaker’s voice, which arrives at both earbuds at the same time.

“Because the speaker’s voice is close by and approximately equidistant from the two earbuds, the neural network can be trained to focus on just their speech and eliminate background sounds, including other voices,” Ishan Chatterjee, a co-author of the study, said.
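A crude classical analogue of that cue (not the study’s neural network, purely an illustration) is to average the two channels: a voice that arrives in phase at both earbuds reinforces itself, while noise that differs between the channels partially cancels.

```python
def mid_channel(left, right):
    """Average two earbud channels sample by sample.
    The nearby speaker's voice is roughly in phase in both channels,
    so it is preserved; components that differ between the channels
    (e.g. off-axis background noise) partially cancel."""
    return [(l + r) / 2.0 for l, r in zip(left, right)]
```

A learned network can exploit the same equidistance cue far more aggressively, suppressing even in-phase interference such as other voices.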

“This method is quite similar to how your own ears work. They use the time difference between sounds reaching your left and right ears to determine which direction a sound came from,” Mr Chatterjee explained.
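That binaural time-difference cue can be sketched in code (a hypothetical brute-force illustration, not the ClearBuds implementation): find the lag, in samples, at which the two channels correlate most strongly.

```python
def estimate_lag(left, right, max_lag):
    """Estimate the delay (in samples) of `right` relative to `left`
    by brute-force cross-correlation -- the same time-difference cue
    ears use to judge a sound's direction."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        score = 0.0
        for i, x in enumerate(left):
            j = i + lag
            if 0 <= j < len(right):
                score += x * right[j]
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag
```

A sound arriving with near-zero lag is roughly equidistant from the two earbuds, which is how a nearby speaker’s voice can be singled out from off-axis noise.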

Researchers also tested the earbuds by recording eight people reading from Project Gutenberg in noisy environments, such as a coffee shop or on a busy street.

In the test, scientists had 37 people rate 10- to 60-second clips of these recordings.

Clips processed through ClearBuds’ AI were rated by participants as having the best noise suppression and the best overall listening experience, according to the study.

Citing one of the earbuds’ limitations, the scientists said people would have to wear both earbuds to get the noise-suppression benefit.

They say the real-time communication system developed in the study could find other applications, including smart-home speakers, tracking robot locations, and search-and-rescue missions.
