Research Projects

Our goal is to understand and improve hearing in people with hearing loss.


See our Publications for further details.


Our recent review, Cochlear implant research in the 21st century, can be found here.

Our introductory article on Smart Hearing Devices can be found here.

Sensory input

Speech processing

Neural stimulation

Perception & cognition

Sensory input

This research investigates different input modalities of sound transmission (acoustic, tactile, electric) and their combinations. We use acoustic simulations of sound perception with cochlear implants (CIs) and assess whether integrating tactile and auditory input helps CI listening. We also investigate whether hearable devices live up to their claims of helping people with hearing loss.
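Acoustic simulations of CI listening are commonly implemented as channel vocoders. The sketch below is a minimal, generic noise vocoder in Python (the band edges, filter orders and envelope cut-off are illustrative choices, not the exact simulation used in our studies): the signal is split into a small number of frequency bands, each band's envelope is extracted and used to modulate band-limited noise, and the channels are summed.

```python
# Minimal noise-vocoder sketch of a cochlear-implant acoustic simulation.
# Illustrative only: all parameters below are generic placeholder choices.
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def noise_vocoder(x, fs, n_channels=8, f_lo=100.0, f_hi=7000.0):
    """Vocode a float waveform x (sampled at fs, with fs > 2 * f_hi):
    split into n_channels bands, extract each band's envelope and use it
    to modulate band-limited noise, then sum the channels."""
    edges = np.geomspace(f_lo, f_hi, n_channels + 1)   # log-spaced band edges
    out = np.zeros_like(x)
    rng = np.random.default_rng(0)
    for lo, hi in zip(edges[:-1], edges[1:]):
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        band = sosfiltfilt(sos, x)
        env = np.abs(hilbert(band))                    # envelope of the band
        env_sos = butter(2, 300.0, btype="lowpass", fs=fs, output="sos")
        env = sosfiltfilt(env_sos, env)                # smooth the envelope
        carrier = sosfiltfilt(sos, rng.standard_normal(len(x)))
        out += env * carrier                           # envelope-modulated noise
    return out / np.max(np.abs(out))                   # normalise
```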


Electrohaptics

This multi-centre project led by Dr Mark Fletcher at the University of Southampton investigates the combination of tactile and auditory stimulation to enhance speech perception with cochlear implants. A prototype device is being developed.

www.electrohaptics.co.uk

Hearables

This collaboration with Dr Saima Rajasingam at Anglia Ruskin University is a series of investigations into listening performance with, and attitudes towards, hearables among people with mild-to-moderate hearing loss.

Related publications

Speech processing

We focus on improving speech signals before they are presented to the listener (pre-processing), for example by removing background noise or other acoustic interference. We use powerful methods from deep learning (deep neural networks) and digital signal processing (adaptive filters) to facilitate speech perception with hearing devices.

Speech enhancement based on deep neural networks

We develop noise-reduction algorithms based on deep neural networks (DNNs) to enhance speech perception in noisy situations. The DNNs are optimised on many thousands of examples of noisy speech and then evaluated in listening studies with cochlear implant and hearing aid users.
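As a rough illustration of the general approach (not our specific architecture, features or training data), a mask-based denoiser can be sketched as a small network that maps noisy spectral frames to a time-frequency gain mask:

```python
# Sketch of a mask-based DNN denoiser (illustrative only: the model,
# features and training signals below are placeholder choices).
import torch
import torch.nn as nn

N_FFT, HOP = 512, 128
N_BINS = N_FFT // 2 + 1

class MaskNet(nn.Module):
    """Small MLP that maps a noisy log-magnitude frame to a gain mask."""
    def __init__(self, n_bins=N_BINS, hidden=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_bins, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_bins), nn.Sigmoid(),   # mask in [0, 1]
        )
    def forward(self, x):
        return self.net(x)

def stft_mag(x):
    spec = torch.stft(x, N_FFT, HOP, window=torch.hann_window(N_FFT),
                      return_complex=True)
    return spec.abs()                                  # (batch, bins, frames)

# Toy training loop on synthetic signals; real training would use many
# thousands of recorded noisy-speech examples.
model = MaskNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(100):
    clean = torch.randn(4, 16000)                      # stand-in for speech
    noise = torch.randn(4, 16000)                      # stand-in for noise
    noisy = clean + noise
    C, N, X = stft_mag(clean), stft_mag(noise), stft_mag(noisy)
    irm = C / (C + N + 1e-8)                           # ideal ratio mask target
    feats = torch.log(X + 1e-8).transpose(1, 2)        # one frame per row
    mask = model(feats)
    loss = nn.functional.mse_loss(mask, irm.transpose(1, 2))
    opt.zero_grad()
    loss.backward()
    opt.step()

# At inference the predicted mask is applied to the noisy spectrogram and
# the enhanced signal is resynthesised with the inverse STFT.
```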

Ongoing work includes:

Speech enhancement with adaptive filtering

In this collaboration with Dr Alan Archer-Boyd and Dr Charlotte Garcia we developed an adaptive algorithm to filter out interfering background sounds in realistic situations (e.g. in a coffee shop). In practice, a reference signal of the interfering sound obtained via streaming could be used to facilitate speech perception.
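The principle can be illustrated with a standard normalised LMS (NLMS) adaptive filter, shown below as a generic sketch rather than the project's actual algorithm: given a streamed reference of the interfering sound, the filter learns how that reference appears at the microphone and subtracts its estimate.

```python
# Generic NLMS interference canceller (illustrative sketch, not the
# project's algorithm): a streamed reference of the background sound is
# adaptively filtered and subtracted from the microphone signal.
import numpy as np

def nlms_cancel(mic, ref, n_taps=64, mu=0.1, eps=1e-8):
    """Return mic with the component correlated with ref removed."""
    w = np.zeros(n_taps)                        # adaptive filter coefficients
    buf = np.zeros(n_taps)                      # most recent reference samples
    out = np.zeros_like(mic)
    for n in range(len(mic)):
        buf = np.roll(buf, 1)
        buf[0] = ref[n]
        y = w @ buf                             # estimate of interference at mic
        e = mic[n] - y                          # error = enhanced sample
        w += mu * e * buf / (buf @ buf + eps)   # NLMS weight update
        out[n] = e
    return out

# Toy usage: a target signal plus a filtered copy of a known noise source.
rng = np.random.default_rng(0)
fs = 16000
target = rng.standard_normal(fs)                     # stand-in for speech
noise_ref = rng.standard_normal(fs)                  # streamed reference
room = np.array([0.6, 0.3, 0.1])                     # toy acoustic path
mic = target + np.convolve(noise_ref, room)[:fs]     # what the processor hears
enhanced = nlms_cancel(mic, noise_ref)
```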

Related publications

Neural stimulation

These projects investigate the electro-neural interface and stimulation patterns with cochlear implants and their impact on speech perception. We develop new coding strategies, assess channel interaction effects, and build computational models of electrical stimulation and sound transmission with cochlear implants.

Speech coding strategies for cochlear implants

We develop novel speech coding strategies to improve speech perception with cochlear implants:

One novel strategy aims to improve speech perception in noise and to reduce power consumption. We are improving its robustness to various acoustic scenarios and developing a real-time implementation.
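For background (and explicitly not a description of the novel strategy itself), many established CI coding strategies reduce the number of stimulation pulses, and hence power consumption, by stimulating only the n largest of m channel envelopes in each analysis frame. A minimal sketch of this generic n-of-m selection:

```python
# Generic n-of-m channel selection, as used in many CI coding strategies
# (illustrative background only, not the novel strategy described above).
import numpy as np

def select_n_of_m(envelopes, n=8):
    """Keep the n largest channel envelopes in each frame, zero the rest.
    envelopes: array of shape (m_channels, n_frames)."""
    selected = np.zeros_like(envelopes)
    idx = np.argsort(envelopes, axis=0)[-n:, :]        # n largest per frame
    np.put_along_axis(selected, idx,
                      np.take_along_axis(envelopes, idx, axis=0), axis=0)
    return selected

# Toy usage: 22 channels, 100 frames; only 8 channels are stimulated per frame.
env = np.abs(np.random.default_rng(0).standard_normal((22, 100)))
sparse_env = select_n_of_m(env, n=8)
```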

Spectral blurring in cochlear implants

In this project with Dr Bob Carlyon (CBU) and Prof Julie Arenberg (Harvard) we manipulate channel interaction with cochlear implants to assess its effects on speech perception. Our findings guide future development of speech processing strategies and clinical assessment.

End-to-end models of cochlear implant stimulation

Project led by Dr Tim Brochier with Dr Josef Schlittenlacher, Dr Iwan Roberts, Dr Chen Jiang, Dr Debi Vickers and Prof Manohar Bance.

We have built high-resolution computational models in combination with automatic speech recognition to assess information transmission with cochlear implants.

Patient-specific cochlear implant stimulation patterns

Project led by Dr Charlotte Garcia to develop a model-based algorithm (PECAP) for estimating patient-specific excitation profiles to characterise stimulation and neural health patterns.

Related publications

Perception & cognition

We assess different aspects of speech perception. This includes signal qualities, such as spectral and temporal resolution, and speech perception in terms of intelligibility, quality and listening effort, as well as through electrophysiological (EEG) measures.

Spectro-temporal resolution

Project led by Dr Alan Archer-Boyd and Dr Bob Carlyon. 

Investigations of spectro-temporal resolution with cochlear implants using the STRIPES test. 

Speech perception: Listening experiments

We use a range of measures to assess speech perception, such as speech intelligibility, speech quality, and tolerance thresholds for distortions and artefacts.

Speech transmission index: Neural entrainment 

In this project, led by Dr Alexis Deighton MacIntyre, we use electrophysiological markers obtained from EEG measurements to assess speech entrainment and to develop objective indices of perception.
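Such entrainment analyses are often based on relating the speech amplitude envelope to the EEG over a range of time lags. The sketch below is a generic envelope-EEG cross-correlation (illustrative only, not the project's analysis pipeline):

```python
# Generic envelope-EEG cross-correlation sketch (illustrative only; the
# project's analysis pipeline and indices differ).
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert, correlate

def envelope(speech, fs, cutoff=8.0):
    """Low-passed amplitude envelope of the speech signal."""
    env = np.abs(hilbert(speech))
    sos = butter(2, cutoff, btype="lowpass", fs=fs, output="sos")
    return sosfiltfilt(sos, env)

def lagged_correlation(env, eeg, fs, max_lag_s=0.4):
    """Normalised cross-correlation between the speech envelope and one
    EEG channel for lags up to max_lag_s (EEG lagging the speech)."""
    env = (env - env.mean()) / env.std()
    eeg = (eeg - eeg.mean()) / eeg.std()
    full = correlate(eeg, env, mode="full") / len(env)
    centre = len(env) - 1                       # zero-lag index
    max_lag = int(max_lag_s * fs)
    lags = np.arange(0, max_lag + 1) / fs
    return lags, full[centre:centre + max_lag + 1]

# Toy usage: EEG simulated as a delayed, noisy copy of the envelope.
fs = 128
rng = np.random.default_rng(0)
speech = rng.standard_normal(60 * fs)
env = envelope(speech, fs)
eeg = np.roll(env, int(0.15 * fs)) + 0.5 * rng.standard_normal(len(env))
lags, corr = lagged_correlation(env, eeg, fs)
print("peak correlation at lag %.3f s" % lags[np.argmax(corr)])
```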

Related publications