SPRING (EU-funded)

Robots have been introduced to public spaces like museums, airports, shopping centres, and hospitals. These are complex environments for social robots to move, see, and converse in.

Eight research institutes are working on the SPRING project to tackle these challenges in a hospital memory clinic.

I am part of the Heriot-Watt team, focusing on the conversations that people will have with the robot.

My particular focus is on multi-party interaction. Today's systems (e.g. Alexa) expect to chat with one person at a time, but patients will likely bring a carer or family member along.

This work was recently featured in TIME Magazine.

Designing Conversational AI for People with Dementia

Speech production changes as dementia progresses, but today's voice assistants are trained on huge datasets. They therefore work very well for the 'average' user, but not for groups whose speech differs from the norm.

I am collecting interactions between people with dementia and Amazon Alexa devices to find out exactly what speech changes occur, and which changes cause the voice assistant to misunderstand. We know, for example, that mid-sentence pauses become more common and more pronounced.
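To make that concrete, below is a minimal sketch of how mid-utterance pauses could be measured from recorded audio using the open-source webrtcvad voice activity detector. The frame length, aggressiveness setting, and the helper name pause_lengths are illustrative assumptions, not the project's actual pipeline.

```python
import webrtcvad  # pip install webrtcvad

SAMPLE_RATE = 16000  # webrtcvad accepts 8/16/32/48 kHz, 16-bit mono PCM
FRAME_MS = 30        # frames must be 10, 20, or 30 ms long
FRAME_BYTES = SAMPLE_RATE * FRAME_MS // 1000 * 2  # 16-bit = 2 bytes/sample

def pause_lengths(pcm: bytes, aggressiveness: int = 2) -> list[float]:
    """Return the length in seconds of every silent gap *inside* speech.

    Leading and trailing silence is ignored; only mid-utterance pauses,
    the kind that can trip a voice assistant's end-of-speech detector,
    are reported. The aggressiveness level is an illustrative default.
    """
    vad = webrtcvad.Vad(aggressiveness)
    frames = [pcm[i:i + FRAME_BYTES]
              for i in range(0, len(pcm) - FRAME_BYTES + 1, FRAME_BYTES)]
    flags = [vad.is_speech(frame, SAMPLE_RATE) for frame in frames]

    pauses, silent_run, seen_speech = [], 0, False
    for is_speech in flags:
        if is_speech:
            if seen_speech and silent_run:  # gap between two speech regions
                pauses.append(silent_run * FRAME_MS / 1000)
            seen_speech, silent_run = True, 0
        else:
            silent_run += 1
    return pauses
```

Comparing these pause distributions against an assistant's end-of-speech timeout shows exactly when the device would cut a speaker off mid-sentence.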

My focus is predominantly on how to tweak current voice assistants to make them more accessible for people with dementia, and more naturally interactive in general. I am a big advocate for voice assistant accessibility.

Past Projects

Voice Assistants for Visually Impaired People in the Kitchen

Malnutrition is commonly associated with sight impairment because shopping for food, preparing it, and eating a meal are all far harder without sight. I had the pleasure of supervising 30 MSc students working in this setting. We published two papers and created an assistant with three focus areas, described below:

Textual information is found all over food labels. Without it, a blind or partially sighted person cannot tell whether their food has expired, follow the cooking instructions, find nutritional information, or check the ingredients for allergens. We developed our system to answer questions like "Is this safe to eat?" or "Is the soup vegetarian?".
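As a toy illustration of the expiry question, the sketch below assumes an upstream OCR step has already produced the label text; the single "use by" regex and the helper name is_past_use_by are invented for the example, and real labels use far more date formats.

```python
import re
from datetime import date

# Hypothetical "Is this safe to eat?" check over OCR'd label text.
USE_BY = re.compile(
    r"use\s*by[:\s]*(\d{1,2})[/.\-](\d{1,2})[/.\-](\d{2,4})",
    re.IGNORECASE,
)

def is_past_use_by(label_text, today=None):
    """True if the use-by date has passed, False if not, None if none found."""
    today = today or date.today()
    match = USE_BY.search(label_text)
    if match is None:
        return None  # the assistant should admit it couldn't find a date
    day, month, year = (int(g) for g in match.groups())
    if year < 100:
        year += 2000  # expand two-digit years like "use by 12/08/25"
    return date(year, month, day) < today

print(is_past_use_by("Chicken soup. USE BY 12/08/2023.", today=date(2024, 1, 1)))  # True
```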

Unlike the fridge, sink, or oven, utensils and ingredients move around the kitchen and can be lost. Using the stationary objects as 'anchor points', we could give the user more specific location information ("just to the left of the microwave") than other VQA systems.
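A minimal sketch of the anchor-point idea follows, assuming an object detector has already produced image coordinates for the target and the stationary landmarks; the coordinates, distance thresholds, and phrasing are invented for illustration rather than taken from the published system.

```python
# Describe a detected object's position relative to the nearest landmark.
ANCHORS = {"microwave": (520, 300), "sink": (120, 310), "oven": (860, 420)}

def describe_location(target_xy, anchors=ANCHORS):
    """Return a phrase like 'just to the left of the microwave'."""
    x, y = target_xy
    # Pick the landmark with the smallest Manhattan distance to the target.
    name, (ax, ay) = min(
        anchors.items(),
        key=lambda kv: abs(kv[1][0] - x) + abs(kv[1][1] - y),
    )
    dx = x - ax
    side = "right" if dx > 0 else "left"
    distance = "just" if abs(dx) < 150 else "further"  # illustrative threshold
    return f"{distance} to the {side} of the {name}"

print(describe_location((430, 305)))  # "just to the left of the microwave"
```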

Trust and explainability are critical in this domain. We therefore designed our system to be transparent and to answer follow-up questions like "how sure are you about that?".
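For that follow-up question, even a mapping as simple as the one below can turn a raw model confidence score into a spoken hedge; the thresholds, wording, and the function name verbalise_confidence are assumptions for the sketch, not the deployed system's behaviour.

```python
def verbalise_confidence(score: float) -> str:
    """Map a model confidence in [0, 1] to a hedged spoken answer."""
    if score >= 0.9:
        return "I'm very confident about that."
    if score >= 0.7:
        return "I'm fairly sure, but you may want to double-check."
    if score >= 0.4:
        return "I'm not certain; please verify before relying on it."
    return "I really can't tell from what I can see."

print(verbalise_confidence(0.82))  # "I'm fairly sure, but you may want to double-check."
```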

Misc Conversational AI

In addition to the large projects above, I have worked on many smaller-scale projects, including:

Detecting when people are saying inappropriate things. Swearing does not always indicate offence (especially in Scotland), and seemingly innocuous terms like "sleep with" can appear in inappropriate sentences. We therefore trained various models to detect genuinely inappropriate utterances in voice assistant interactions.
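A toy sketch of one such model is shown below, using a TF-IDF and logistic regression pipeline from scikit-learn; the tiny inline dataset merely stands in for the annotated voice assistant logs used in the real work, and the specific model choice is an assumption.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Illustrative examples: swearing that is benign, innocuous words misused.
train_texts = [
    "that's bloody brilliant",       # swearing, but not offensive
    "do you want to sleep with me",  # innocuous words, inappropriate use
    "set a timer for ten minutes",
    "you are a stupid machine",
]
train_labels = [0, 1, 0, 1]  # 1 = inappropriate

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(train_texts, train_labels)
print(model.predict(["can you sleep with me tonight"]))
```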

People phrase questions more conversationally when speaking to a voice assistant than when typing them. I analysed how these spoken questions differ, and which differences cause problems for today's systems.

Helping Doctors Manage Patients

Before stepping into the world of conversational agents, I worked as a machine learning engineer, focusing on extracting information from unstructured data into graphs.

In one project with the NHS in Scotland, I used NLP to ensure that patients requiring critical care after hospital discharge were flagged to their doctor.
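The shape of such a pipeline is roughly sketched below: scan the free text of a discharge letter for phrases that signal required care, and record what was found as edges in a graph so the doctor can be alerted. The trigger phrases, the example letter, the function name flag_letter, and the graph schema are all hypothetical.

```python
import re
import networkx as nx

# Illustrative trigger phrases; a real system would use far richer NLP.
TRIGGERS = [
    r"urgent review",
    r"critical care",
    r"refer(?:ral)? to",
    r"follow[- ]up within \d+ days",
]

def flag_letter(graph: nx.DiGraph, patient_id: str, letter: str) -> bool:
    """Add a (patient)-[requires]->(finding) edge per trigger found."""
    found = False
    for pattern in TRIGGERS:
        match = re.search(pattern, letter, re.IGNORECASE)
        if match:
            graph.add_edge(patient_id, match.group(0).lower(), relation="requires")
            found = True
    return found

g = nx.DiGraph()
letter = "Discharged home. Urgent review by GP and follow-up within 7 days."
if flag_letter(g, "patient-042", letter):
    print("Alert doctor:", list(g.successors("patient-042")))
```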

I worked on similar projects with the Scottish Government.