Cognitive Hearing Aid Can Isolate a Single Voice in a Crowded Room
Ask anyone who uses a modern hearing aid and they'll likely tell you that the technology is just this side of miraculous — except in crowds. In busy multi-speaker environments, even the most advanced hearing aids have trouble helping people hear what others are saying.
New research out of Columbia University, however, aims to remedy that situation by combining traditional hearing aid technology with brain monitoring and artificial intelligence. Using external sensors that track brain activity, the technology determines which speaker the wearer is listening to, isolates that voice from the mixture, and then amplifies it while suppressing the rest of the background noise.
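In broad strokes, that processing chain has three stages: separate the individual speakers in the recorded mixture, decode from the brain signals which speaker is being attended to, and remix the audio with that speaker boosted. The sketch below is only an illustration of that flow under those assumptions, not the Columbia system itself; the separate and decode_attention callables and the gain value are hypothetical placeholders.

```python
import numpy as np

def cognitive_hearing_aid_frame(mixture, eeg, separate, decode_attention, gain_db=9.0):
    """Hypothetical processing loop for one block of audio.

    mixture          -- 1-D array, the multi-speaker audio picked up by the microphones
    eeg              -- 2-D array (channels x samples) of scalp recordings
    separate         -- callable returning a list of single-speaker estimates (placeholder)
    decode_attention -- callable returning the index of the attended speaker (placeholder)
    """
    # 1. Split the mixture into individual speaker estimates.
    sources = separate(mixture)

    # 2. Use the listener's brain activity to pick the attended source.
    attended = decode_attention(eeg, sources)

    # 3. Remix: boost the attended speaker, attenuate the others.
    gain = 10 ** (gain_db / 20)
    output = sum(src * (gain if i == attended else 1 / gain)
                 for i, src in enumerate(sources))

    # Simple peak normalisation so the remixed frame stays in range.
    return output / max(np.abs(output).max(), 1e-9)
```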
“We originally decoded a person's attention using invasive recordings in 2012, and we showed in 2014 that the same can be done using non-invasive scalp recording — a cap with electrodes that touch the scalp,” said Nima Mesgarani, an associate professor in the Department of Electrical Engineering at Columbia University in New York.
Processing massive amounts of audio data and isolating the wearer's object of interest based on their brain waves is a heavy lift for a hearing aid system. Performing that high-tech maneuver in real time requires a lot of computing power, which is difficult to scale down to the size of a standard hearing aid. Luckily, advances in materials science and chip design are making such tiny computers possible.
“Plenty of research is being done these days to make small dedicated chips that can do the computation that is needed in such devices,” Mesgarani said.
The technology, called auditory attention decoding, relies on the deep neural network model of artificial intelligence. Neural networks mimic the workings of the human brain – learning on the fly and coming up with their own solutions to new problems. Using the neural network approach, the hearing aid computer actually teaches itself, over time, the best way to pluck a single voice out of a crowded room.
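One common formulation of auditory attention decoding in the research literature is stimulus reconstruction: a trained model maps the scalp recordings to an estimate of the attended speech envelope, and the separated speaker whose envelope correlates best with that estimate is taken to be the one the listener is focusing on. The sketch below illustrates only that selection step under those assumptions; reconstruct_envelope stands in for a hypothetical trained decoder, and the sampling rates are arbitrary.

```python
import numpy as np

def decode_attention(eeg, sources, reconstruct_envelope, frame_rate=64, sr=16000):
    """Pick the attended speaker by envelope correlation (illustrative only).

    reconstruct_envelope -- hypothetical trained model mapping EEG to an
                            estimate of the attended speech envelope
    sources              -- list of 1-D separated speaker signals, assumed to
                            cover the same time window as the EEG
    """
    # Estimate the envelope of the attended speech directly from the EEG.
    neural_env = reconstruct_envelope(eeg)

    def envelope(audio):
        # Crude envelope: rectify, then average over fixed-length blocks.
        hop = sr // frame_rate
        trimmed = audio[: len(audio) // hop * hop]
        return np.abs(trimmed).reshape(-1, hop).mean(axis=1)

    # The speaker whose envelope best matches the neural estimate wins.
    n = len(neural_env)
    scores = [np.corrcoef(neural_env, envelope(s)[:n])[0, 1] for s in sources]
    return int(np.argmax(scores))
```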
“Our algorithm uses deep neural networks; because they are becoming so widespread, many researchers are developing low-power specialized hardware to implement them in real time,” Mesgarani said. “Also, modern hearing aids are able to do some of their calculation off-board – for example by syncing to your phone – which helps to manage heavy computation in such a small device.”
The technology is in a very early proof-of-concept phase, but Mesgarani said that if all goes well the system could start showing up in commercial hearing aids within five years.
“There is no theoretical reason prohibiting the implementation of this technology in an actual hearing aid,” he said. “In fact, several hearing aid companies have already started researching this idea and expressed interest in our approach.”