April 5, 2002

Research explores how we hear, how to hear better

Technology has leapt forward in the past 10 years, says Wesley Grantham, Ph.D., professor of Hearing and Speech Sciences and director of research for the department. Research projects under his wing run the gamut from basic to applied.

Troy Hackett’s research is at the most basic level of hearing and language development. Hackett, Ph.D., research assistant professor in Hearing and Speech Sciences and Psychology, is studying the auditory processing centers of the brain in animal models, hoping to gain insight into human hearing.

The range of human hearing is between 20 and 20,000 hertz, Hackett says. Dogs, cats and most non-human primates have a range from 20 to 40,000 hertz; bats go up to 100,000. Guinea pigs and chinchillas, Hackett says, have hearing closest to that of humans.

Auditory pathways, carrying stimulation from the cochlea through the brain stem to the auditory zones of the cerebral cortex, seem to be similar among mammals, Hackett says. Sounds are processed in the brain’s auditory zones; the more auditory zones, the more processing capability. Opossums have only one to two areas (hence the high road kill population), cats have seven to eight, and monkeys have about 12.

But little is known about auditory zones in the human brain. Questions Hackett is trying to answer include: What is the normal organization of the auditory zones? What is the circuitry? What is the stimulus? And how does the brain determine that a sound is a friendly call or a warning? Functional MRI and other imaging techniques are helping find the answers.

One thing scientists do know today is that dormant neurons can be stirred. “The brain is capable of reorganizing,” he says.

Hackett and Dr. Robert Labadie, assistant professor of Otolaryngology, are about to begin a study to find out whether human hair cells in the cochlea can be restimulated into working. “Hair cells in reptiles regenerate,” he says. “What’s keeping that from happening in humans?”

Todd Ricketts, Ph.D., assistant professor of Hearing and Speech Sciences, develops new methods of signal processing and tests products both in development and on the market. The latest advances have been digital signal processing and directional microphones on hearing aids.

Older hearing aids simply amplified all sounds, so making soft sounds detectable meant that loud sounds became too loud. Advanced compression and digital signal processing circuitry can reduce this problem, and can also suppress noise at some frequencies while enhancing speech at others.
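
To illustrate the compression idea, here is a minimal Python sketch of level-dependent gain; the 30 dB gain, 45 dB threshold, 3:1 ratio and crude level calibration are arbitrary numbers chosen for the example, not settings taken from any hearing aid Ricketts tests.

    import numpy as np

    def wdrc_gain_db(level_db, gain_db=30.0, threshold_db=45.0, ratio=3.0):
        # Wide-dynamic-range compression: quiet sounds get the full gain;
        # above the threshold, output rises only 1/ratio dB per 1 dB of input,
        # so loud sounds are not over-amplified the way a plain amplifier would.
        if level_db <= threshold_db:
            return gain_db
        return gain_db - (level_db - threshold_db) * (1.0 - 1.0 / ratio)

    def compress(signal, fs, frame_ms=10.0, **kwargs):
        # Apply frame-by-frame gain to a mono signal scaled to the range -1..1.
        hop = max(1, int(fs * frame_ms / 1000.0))
        out = np.array(signal, dtype=float)
        for start in range(0, len(out), hop):
            frame = out[start:start + hop]
            rms = np.sqrt(np.mean(frame ** 2)) + 1e-12
            level_db = 20.0 * np.log10(rms) + 94.0   # crude, made-up calibration to "dB SPL"
            gain = 10.0 ** (wdrc_gain_db(level_db, **kwargs) / 20.0)
            out[start:start + hop] = frame * gain
        return out

Running the numbers shows the point: a 40 dB input falls below the threshold and receives the full 30 dB of gain, while an 85 dB input receives only about 3 dB.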

A main complaint of people who use hearing aids, Ricketts says, is that they have trouble understanding a speaker in a crowd. Directional microphones have been designed to help, and they show significant benefit when the noise comes mainly from behind and to the sides of the listener. The next, James Bond-like step, already seen in a few current products, is a multitude of microphones placed around the face, such as on eyeglass stems or on a necklace.
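
To see why two closely spaced microphones can favor sound arriving from the front, here is a minimal Python sketch of the classic delay-and-subtract arrangement; the 1 cm spacing and 1 kHz test tone are arbitrary example values, not specifications of any product mentioned in the story.

    import numpy as np

    SPEED_OF_SOUND = 343.0   # m/s
    SPACING = 0.01           # 1 cm between the two microphones (example value)
    FREQ = 1000.0            # 1 kHz test tone (example value)

    def response(theta_deg, internal_delay=SPACING / SPEED_OF_SOUND):
        # Delay-and-subtract pair: the rear microphone's signal is delayed a
        # little and subtracted from the front microphone's signal.  Sound from
        # behind reaches the rear mic first, so after the delay the two copies
        # line up and cancel; sound from the front does not cancel.
        theta = np.radians(theta_deg)
        external_delay = SPACING * np.cos(theta) / SPEED_OF_SOUND
        omega = 2.0 * np.pi * FREQ
        return abs(1.0 - np.exp(-1j * omega * (internal_delay + external_delay)))

    front = response(0.0)
    for angle in range(0, 181, 30):
        level = max(response(angle), 1e-9)   # floor avoids log of the exact rear null
        print(f"{angle:3d} deg: {20.0 * np.log10(level / front):6.1f} dB re front")

With these numbers the pair is roughly 6 dB less sensitive to the sides and has a deep null directly behind, the cardioid-like pattern that directional hearing aid microphones approximate.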

Anne Marie Tharpe, Ph.D., associate professor of Hearing and Speech Sciences, teaches a three-semester track of courses specially designed to train graduate students in audiology, speech-language pathology and deaf education to work together for children with cochlear implants. “We study medical, cultural, psychosocial and speech-language aspects, everything,” she says.

Tharpe recently finished a study of the effectiveness of hearing aids for people who are both deaf and blind. The total population is small, about 250 adults and children in Tennessee, but people who are deaf and blind have a unique need to use hearing for spatial orientation and mobility through the environment, she says. Many of them do not use hearing aids, however, because they rely for mobility on low-frequency cues that many hearing aids do not pick up. In a study of 12 adults and three children, she determined they need hearing aids with a uni-directional microphone to better hear speech, but a multi-directional microphone to help with navigation.