My Wishlist for How AI Can Be Incorporated into Hearing Aids: Things We Want but Aren’t There Yet

by Dr. Dan Finnegan | May 23, 2025 | AI, Hearing Aids, Patient Resources

As incredible as hearing aids have become, there’s still much room for growth – especially with the integration of artificial intelligence (AI). We’ve all encountered situations where understanding speech is a challenge, even with the best technology available.

Envision a world where you can travel globally without communication barriers, engaging fully and confidently in conversations no matter what language is spoken, or enjoy daily interactions without feeling exhausted by the effort of listening.

This kind of intuitive technology could alleviate many frustrations by offering a truly customized auditory experience. Like many, you may feel the strain of dealing with the limitations of current technology.

I’ve been thinking about what could be next for hearing aids and AI, and while we’ve seen some impressive advancements, there are some things I’d love to see implemented that haven’t come around just yet.

My Choices for Hearing Aid and AI Combinations

I hate to be a Debbie Downer, but I am not sold on AI or deep neural networks in their current form as being a real game changer for hearing aids.

Yes, it is better to have AI incorporated into hearing aids than not. That’s a no-brainer.

Multiple leading hearing aid manufacturers’ latest technological offerings, including the Oticon Intent, Phonak Infinio Sphere, ReSound Vivia, and Starkey Edge AI, all have integrated AI chips.

All of them tout how many sound/speech samples their chips are trained on, and they all rave about how their AI can separate speech (basically what you want to hear) from noise (basically anything you don’t want to hear).

Sounds great, doesn’t it? But AI hearing aids will only accentuate speech in general; they can’t separate out the speech you don’t want to hear!

I firmly believe that AI can/does/will help hearing aids distinguish speech from non-speech sounds. This is definitely a plus.

However, even before AI came along, hearing aids already did a decent (but not spectacular) job of this. AI will make it better, but this is not the real problem current hearing aid wearers face.

To illustrate the major flaw in the overzealous promotion of AI in hearing aids, I will quote a popular phrase: “One man’s music is another man’s noise.”

How will AI differentiate/separate out the “noise” when what you want to hear is speech and what you don’t want to hear is also speech?

Will AI hearing aids know the speech you want to hear vs. the speech you don’t? Of course they won’t.

This conundrum has been and will continue to be the #1 issue for hearing aid wearers. There is no way for AI to know what speech the user wants versus the speech the user doesn’t want at that moment. The hearing aids cannot read your mind or know your intentions.

One rebuttal I heard was that AI will allow hearing aids to learn your spouse’s voice and then always accentuate it over everyone else. This, in theory, sounds great, but I can instantly think of multiple scenarios where this would be horrible.

For example, imagine you and your spouse go over to your neighbor’s house to chat. You’re talking to one neighbor, and your spouse is talking to another. But instead of hearing your neighbor, all you hear is your spouse!

If I took a few moments to speculate on how this could be done, here’s how I’d frame it. For starters, the hearing aids need to know who or what you want to hear. How can this be done?

I would create a standard passage that can be read/spoken into a smartphone. Something reminiscent of the pangram, “The quick brown fox jumps over the lazy dog.”

The hearing aid user could have up to 3-5 stored voices at a time. At the beginning of the dinner, hand your phone to each of your dinner companions. Have them read the pangram, let the AI build a voiceprint for each speaker, and instruct the hearing aids to amplify only the voices stored in the phone (a rough sketch of this idea follows below). BOOM, problem solved. Of course, when dinner is done, the hearing aid wearer needs to tell the phone to stop favoring those voices.
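To make the idea concrete, here’s a rough sketch of what that enroll-then-match loop could look like. This is purely illustrative and is not how any current manufacturer does it: the “voiceprint” here is just an averaged log-spectrum computed with NumPy, a crude stand-in for the far more capable speaker-embedding models a real hearing aid platform would need, and every name in it (voiceprint, EnrolledVoices, should_amplify, the 0.95 match threshold) is hypothetical.

```python
# Illustrative sketch only -- NOT a real hearing aid or manufacturer API.
# The "voiceprint" is a crude averaged log-spectrum; a production system
# would use a trained speaker-embedding model instead.
import numpy as np

SAMPLE_RATE = 16_000   # assumed smartphone/hearing aid sample rate (Hz)
FRAME_SIZE = 512       # samples per analysis frame
MAX_ENROLLED = 5       # "up to 3-5 stored voices at a time"


def voiceprint(audio: np.ndarray) -> np.ndarray:
    """Average log-magnitude spectrum over all frames (a crude speaker signature)."""
    n_frames = len(audio) // FRAME_SIZE
    frames = audio[: n_frames * FRAME_SIZE].reshape(n_frames, FRAME_SIZE)
    spectra = np.abs(np.fft.rfft(frames * np.hanning(FRAME_SIZE), axis=1))
    signature = np.log1p(spectra).mean(axis=0)
    return signature / (np.linalg.norm(signature) + 1e-9)  # unit length for cosine matching


class EnrolledVoices:
    """Holds the voices the wearer has chosen to hear for this dinner."""

    def __init__(self) -> None:
        self.prints: list[np.ndarray] = []

    def enroll(self, pangram_audio: np.ndarray) -> bool:
        """Store a guest's voiceprint from their reading of the pangram."""
        if len(self.prints) >= MAX_ENROLLED:
            return False
        self.prints.append(voiceprint(pangram_audio))
        return True

    def should_amplify(self, live_audio: np.ndarray, threshold: float = 0.95) -> bool:
        """True if the incoming voice matches any enrolled voiceprint (cosine similarity)."""
        probe = voiceprint(live_audio)
        return any(float(p @ probe) >= threshold for p in self.prints)

    def clear(self) -> None:
        """Dinner is over -- stop favoring the stored voices."""
        self.prints.clear()


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    guest_reading = rng.standard_normal(3 * SAMPLE_RATE)  # stand-in for a recorded pangram
    dinner = EnrolledVoices()
    dinner.enroll(guest_reading)
    print(dinner.should_amplify(guest_reading))  # same "voice" -> True
    dinner.clear()
```

Even in this toy form, the hard parts are obvious: the matching would have to run in a few milliseconds on a hearing aid chip, cope with overlapping talkers, and fail gracefully when the wrong voice sneaks past the threshold.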

This would work, but we’re not there – yet.

A Call for Innovation

At Visalia Hearing Center, we’re excited about these possibilities and committed to being part of the conversation, ensuring that your voice is always heard.

As we look into the future, the potential of AI in hearing aids represents not just an incremental step but a giant leap toward enhancing life quality for millions. While there are challenges and uncertainties, the aspirational journey from what’s possible today to what could be tomorrow is worth pursuing.


Dr. Dan Finnegan

Dr. Dan Finnegan, or Dr. Dan as most of his patients affectionately call him, was born in Modesto, CA and was raised in the small farming community of Hughson, CA. He obtained his Bachelor’s Degree in 2004 from UC Santa Barbara where he graduated cum laude, and his Doctorate in Audiology (Au.D.) from the San Diego State/UC San Diego Joint Doctoral program in 2009. He completed his externship/residency year at Shohet Ear Associates, a prestigious private practice ENT office in Newport Beach, CA. Upon completion of his academic training, he joined as a staff Audiologist at Tustin Hearing Center in Tustin, CA. During his eleven-year tenure at Tustin Hearing Center, Dr. Finnegan was recognized for his commitment to excellence in Audiology and his exceptional patient care. He received three Provider of Distinction Awards and was promoted twice: first in 2014 to Senior Audiologist and then in 2020 to Director of Audiology. It was also during his time at Tustin Hearing Center that he met the love of his life, Cassandra. They were married in 2013 and have three children together, two daughters and a son. In addition to spending time with his family, Dr. Finnegan bleeds the Green & Gold of the Oakland Athletics, enjoys playing cornhole, doing BeachBody on Demand and grilling a nice steak, preferably Tri-Tip.
