11th Speech in Noise Workshop, 10-11 January 2019, Ghent, BE

Eye gaze steering works miracles for hearing aid users in noisy environments

Renskje K. Hietkamp(a), Sergi Rotger-Griful, Sedsel Brøndum Lange, Carina Graversen, Tanveer Bhuyian
Eriksholm Research Centre, Denmark

Antoine Favre-Félix
Eriksholm Research Centre, Denmark and Hearing Systems, Department of Electrical Engineering, Technical University of Denmark, Kgs. Lyngby, Denmark

Thomas Lunner
Eriksholm Research Centre, Denmark and Hearing Systems, Department of Electrical Engineering, Technical University of Denmark, Kgs. Lyngby, Denmark and Swedish Institute for Disability Research, Linnaeus Centre HEAD, Linköping University, Linköping, Sweden

(a) Presenting

People with hearing loss experience particular difficulty in noisy environments such as restaurants or family dinners. Technical solutions for these problems include external microphones, beamforming features and noise reduction schemes. These solutions provide some benefit, but they do not offer the listener the possibility to steer the sound by attention. One way to determine where the listener's attention lies is to estimate eye gaze position, which can then serve as input to a sound-enhancing system.
In a previous study with hearing aid users, Favre-Félix et al. (2018) identified eye gaze through electrooculography (EOG) in off-line recordings and used it to enhance the attended speaker in a three-speaker spatial setup. The results showed that listening performance increased, even though the algorithm's target estimation accuracy was only 65%. But even if eye gaze steering worked perfectly all the time, would it be desirable for the hearing aid user in daily life?
This question was addressed in the study presented here: How do hearing aid users perceive the extra benefit from devices that enhance attended speakers by means of eye gaze steering, compared to conventional hearing aids? Nine hearing aid users with mild to moderate hearing loss, but otherwise diverse with respect to age, gender, hearing aid experience and aetiology, were equipped with a gold-standard device (Vicon motion tracker and Dikablis eye-tracker systems) that identifies eye gaze position in a virtual environment. The eye gaze position was then used to enhance an attended speaker in a competing-talker situation (two speakers in babble background noise), presented as life-size video recordings on an 88'' 4K screen with 3 loudspeakers. The users' reactions to the effect of the equipment were recorded on camera. Furthermore, the end-user benefit was measured by means of speech comprehension scores and subjective evaluation of speech intelligibility and listening effort.
The poster presents the test setup, its ecological validity and the main findings.

Research supported by European Community’s EU Horizon 2020 Programme under grant agreement no. 644732 (Cognitive Control of a Hearing Aid, COCOHA).

Favre-Felix, A., Hietkamp, R., Graversen, C., Dau, T., & Lunner, T. (2018). Steering of audio input in hearing aids by eye gaze through electrooculography. Proceedings of the International Symposium on Auditory and Audiological Research, 6, 135-142. Retrieved from https://proceedings.isaar.eu/index.php/isaarproc/article/view/2017-16
