Background
Communicating in noisy environments such as restaurants, festivals, and exhibitions remains difficult for people with impaired hearing. Traditional hearing aids amplify all sounds around the user, which leads many users to avoid such loud environments altogether. This project seeks to improve the experience of hearing aid users by understanding the acoustic environment and extracting the sounds that are useful to the user. This was achieved with a pair of smart glasses combined with two hearing aid devices. The project is a collaboration with the hearing aid company Oticon and Linköping University and has been running for several years.
Presentation Video
Achievements
- Noise reduction using AI-assisted speech recognition.
- Estimating the direction and range of a human voice relative to the user.
- Selecting the direction of sound amplification using the user's gaze.
- Detecting and tracking a face in the user's field of vision, and estimating the 3D position of the tracked face.
- Estimating the orientation of the user's head using the IMU in the glasses.
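One of the steps above, estimating the 3D position of a tracked face, can be sketched with a simple pinhole-camera model: the range follows from the face's apparent size, and the direction from its pixel offset from the image centre. This is an illustrative sketch, not the project's actual code; the function name, parameters, and the assumed average face width are all hypothetical.

```python
import math

def face_position_3d(bbox_px, image_size, fov_deg, face_width_m=0.16):
    """Estimate a face's position (x, y, z) in the camera frame, in metres.

    bbox_px:      (x, y, w, h) of the tracked face in pixels.
    image_size:   (width, height) of the image in pixels.
    fov_deg:      horizontal field of view of the camera.
    face_width_m: assumed average real-world face width (illustrative value).
    """
    x, y, w, h = bbox_px
    img_w, img_h = image_size
    # Focal length in pixels, derived from the horizontal field of view.
    f_px = (img_w / 2) / math.tan(math.radians(fov_deg) / 2)
    # Range by similar triangles: real width vs. pixel width.
    z = face_width_m * f_px / w
    # Offset of the bounding-box centre from the image centre,
    # back-projected to metres at range z.
    cx = x + w / 2 - img_w / 2
    cy = y + h / 2 - img_h / 2
    return (cx * z / f_px, cy * z / f_px, z)
```

A face whose bounding box is centred in the image yields x = y = 0, and halving the pixel width of the box doubles the estimated range.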
System Overview
Our Team
Malva Eveborn
Documents
Elias William
Project Leader
Oskar Ramsberg
Design
Mohammad Alasmi
Hardware
Adam Roos
Test
Rasmus Olofsson
Software