In addition to the University of Oldenburg, the Jade University of Applied Sciences, the Fraunhofer Institute for Digital Media Technology IDMT, the Hörzentrum Oldenburg gGmbH, RWTH Aachen University and the Technical University of Munich – all leading institutions in the field of hearing research – are involved in this large-scale project which is scheduled to run for a total of twelve years.
"In our ageing society it is becoming increasingly urgent to develop hearing aids and other communication aids that work effectively in difficult acoustic environments and really help people in daily life. Oldenburg's hearing research is doing excellent work and is highly recognized both nationally and internationally. The German Research Foundation's renewed funding commitment underlines this in an impressive way," said University President Prof. Dr. Ralph Bruder.
The Collaborative Research Centre Hearing Acoustics brings together various disciplines, in particular acoustics, psychoacoustics, audiology, engineering sciences and physical modelling. In the first funding period the focus was on the interactions between people with impaired hearing and their acoustic environment. "In real life, the hearing situation changes constantly because people react to voices and sounds. For example, they turn their head towards the sound source, or shift their gaze in that direction. We call this the 'acoustic communication loop'," says Hohmann. This dynamic loop had received little attention in hearing acoustics in the past, he notes.
In the last few years the team has succeeded in incorporating the hearing aid into this acoustic communication loop. "We have developed a first prototype of the so-called 'immersive hearing aid', which constantly assesses the acoustic situation and identifies which sound source a test person is directing their attention towards at a given moment," Hohmann explains. The device tracks the direction of the wearer's gaze and head movements and then adjusts its signal processing so that the targeted sound source is heard optimally. The current prototype can be used in field experiments as well as in the lab.
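The basic idea of steering signal processing toward the attended direction can be illustrated with a classic delay-and-sum beamformer. This is only a minimal sketch, not the CRC's actual algorithm: the microphone spacing, sample rate and two-channel setup are illustrative assumptions.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s
MIC_SPACING = 0.15       # m, hypothetical spacing between the two microphones
SAMPLE_RATE = 16000      # Hz, hypothetical audio sample rate

def steering_delay_samples(gaze_azimuth_deg: float) -> int:
    """Inter-microphone delay (in samples) that aligns a sound source
    at the given azimuth (0 degrees = straight ahead of the wearer)."""
    tau = MIC_SPACING * math.sin(math.radians(gaze_azimuth_deg)) / SPEED_OF_SOUND
    return round(tau * SAMPLE_RATE)

def delay_and_sum(left: list[float], right: list[float],
                  gaze_azimuth_deg: float) -> list[float]:
    """Delay-and-sum beamformer steered toward the gaze direction:
    the lagging channel is shifted so that both channels add
    coherently for the attended source, while sound arriving from
    other directions adds incoherently and is attenuated."""
    d = steering_delay_samples(gaze_azimuth_deg)
    out = []
    for n in range(len(left)):
        l = left[n]
        r = right[n - d] if 0 <= n - d < len(right) else 0.0
        out.append(0.5 * (l + r))
    return out
```

In a real immersive hearing aid the steering direction would be updated continuously from the gaze and head-movement estimates, and far more sophisticated multi-microphone processing would be used; the sketch only shows the coupling between an attention estimate and the signal path.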
Among other factors, new perception models developed by the research team for use in different hearing situations have paved the way for this success. "These models predict how a test person will perceive a sound signal in a given situation – whether or not they will be able to follow a conversation in a noisy environment, for instance," Hohmann explains. Simulating hearing with and without hearing impairment in different hearing situations involving background noise and reverberation is essential for the development and evaluation of innovative methods for signal processing in hearing aids, he stresses.
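The kind of prediction such perception models make can be sketched with a simple psychometric function: the proportion of speech understood rises with the signal-to-noise ratio, and a hearing impairment effectively shifts the curve toward higher (worse) SNRs. The logistic form, the threshold values and the slope below are illustrative placeholders, not the CRC's published models.

```python
import math

def intelligibility(snr_db: float, srt_db: float,
                    slope_per_db: float = 0.3) -> float:
    """Toy psychometric function: predicted proportion of speech
    understood at a given signal-to-noise ratio. srt_db is the speech
    reception threshold, i.e. the SNR at which 50% is understood."""
    return 1.0 / (1.0 + math.exp(-slope_per_db * (snr_db - srt_db)))

# Hypothetical thresholds: a listener with normal hearing versus one
# with a mild impairment, modelled here simply as a shifted SRT.
normal = intelligibility(0.0, srt_db=-6.0)
impaired = intelligibility(0.0, srt_db=2.0)
```

Comparing `normal` and `impaired` at the same SNR shows the impaired listener understanding less, which is the kind of situation-dependent prediction such models make available for evaluating hearing-aid signal processing.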
Another important result from the first funding period is the "hearpiece" – a special, particularly high-quality earpiece for research purposes. Inserted in the ear and featuring several integrated microphones and small loudspeakers, the device can boost sound in exactly the same way as a hearing aid. The researchers can use it to test new algorithms for signal processing directly in the ear, for example. The special feature here is that the hearpiece is acoustically transparent – which means that hearing with this device corresponds to normal hearing with an open ear. "Thanks to the interdisciplinary collaboration within the CRC we were able to combine acoustics and signal processing methods and have made considerable progress as a result," says Hohmann.
The team has also developed an interactive, audiovisual virtual reality set-up in the lab for conducting hearing experiments with test subjects under controlled conditions. With this technology, real-life situations can be simulated more realistically than was previously possible. To this end, the team created several complex audiovisual scenarios in which test persons can "immerse" themselves, including a virtual restaurant, an underground station and a living room. These scenarios, together with the related data, have been made freely available to research laboratories across the world so that they can conduct their own hearing experiments.
In the second funding period, which will now commence, the CRC team plans to refine and merge its perception models, algorithms and applications. One goal is to develop algorithms for the hearpiece and the immersive hearing aid that can actively control noise depending on the acoustic scenario. To do this, the researchers are using cutting-edge AI methods which they developed themselves. The long-term goal is for each hearing aid to learn continuously and get better at predicting which setting is optimal for the respective user in a specific situation. Users with impaired hearing will be able to enter the necessary feedback themselves via their smartphone. "However, we still have a lot of work to do before we reach this goal," notes Hohmann.
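The learning loop described above, in which the device gradually discovers which setting a user prefers in each situation, can be sketched as a simple epsilon-greedy choice between presets. The scenario names, preset names and rating scale are illustrative assumptions; the CRC's actual AI methods are not specified in the article.

```python
import random
from collections import defaultdict

class SettingLearner:
    """Minimal epsilon-greedy sketch: for each acoustic scenario
    ('restaurant', 'street', ...), learn which hearing-aid preset the
    user rates highest, based on feedback entered on a smartphone.
    All names and values here are hypothetical."""

    def __init__(self, presets, epsilon=0.1):
        self.presets = list(presets)
        self.epsilon = epsilon
        # running mean rating and feedback count per (scenario, preset)
        self.mean = defaultdict(float)
        self.count = defaultdict(int)

    def choose(self, scenario: str) -> str:
        """Mostly pick the best-rated preset; occasionally explore."""
        if random.random() < self.epsilon:
            return random.choice(self.presets)
        return max(self.presets, key=lambda p: self.mean[(scenario, p)])

    def feedback(self, scenario: str, preset: str, rating: float) -> None:
        """Incorporate one user rating into the running mean."""
        key = (scenario, preset)
        self.count[key] += 1
        self.mean[key] += (rating - self.mean[key]) / self.count[key]
```

After a few ratings per scenario, `choose` converges on the preset the user rated highest there, while the exploration rate `epsilon` keeps occasionally testing alternatives as the user's environment or preferences change.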
MEDICA-tradefair.com; Source: Carl von Ossietzky University of Oldenburg