Accessible Human-Robot Interaction and Collaboration

To maintain human dignity and autonomy, it is important that the person or people interacting with cyber-physical health and assistive robotics technologies can provide guidance and instructions in mutually supportive tasks. Ensuring safe, effective, accessible and even enjoyable interaction is particularly crucial for people with different levels of physical or sensory impairment. For example, a disabled person using a physically assistive robot to eat a meal might want to direct it through their gaze, or verbally guide it as to which food items to pick up and how, such as dipping a bread roll in the soup before offering it to them, or how much gravy to mix in with the mashed potato. Where elements of the task might be challenging for a robot to decide for itself, or relate to personal preferences, determining effective modalities and strategies for collaborative Human-Robot Interaction (HRI) is particularly important. At the same time, the system needs to be able to learn preferences and skills that can make future interactions more efficient.
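As a toy illustration of the kind of multimodal guidance described above, the following Python sketch fuses a gaze point with a spoken keyword to choose the food item the user most plausibly intends. It is a minimal sketch under stated assumptions: the FoodItem class, the weighting scheme and the example tray data are all hypothetical and do not come from any deployed system.

    # Minimal sketch: fuse gaze and speech to select a food item. All names
    # and values here are illustrative assumptions, not a real system's API.
    from dataclasses import dataclass

    @dataclass
    class FoodItem:
        name: str
        position: tuple  # (x, y) location of the item in the camera frame

    def select_item(items, gaze_point, spoken_word, gaze_weight=0.5):
        """Score each item by gaze proximity and spoken-keyword match,
        then return the item the user most plausibly intends."""
        def gaze_score(item):
            dx = item.position[0] - gaze_point[0]
            dy = item.position[1] - gaze_point[1]
            return 1.0 / (1.0 + (dx * dx + dy * dy) ** 0.5)

        def speech_score(item):
            return 1.0 if spoken_word and spoken_word in item.name else 0.0

        return max(items, key=lambda it: gaze_weight * gaze_score(it)
                   + (1.0 - gaze_weight) * speech_score(it))

    tray = [FoodItem("bread roll", (0.1, 0.2)),
            FoodItem("soup", (0.4, 0.2)),
            FoodItem("mashed potato", (0.7, 0.3))]
    chosen = select_item(tray, gaze_point=(0.38, 0.22), spoken_word="soup")
    print(f"Picking up the {chosen.name}")  # -> Picking up the soup

Weighting the two modalities, rather than relying on either alone, is one simple way to stay robust when gaze is noisy or speech recognition misses a word.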

There are several research areas within HRI that can be explored for activities such as dressing, mobility, medication management, physiotherapy and rehabilitation.

Research Topics


Accessible interfaces for people with sensory, cognitive and physical impairments using a range of sensory modalities

These range from brain-computer interfaces and gaze tracking to haptic feedback, with the aim of making interaction with these technologies easier, more cost-effective and more efficient.

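As one concrete example from this space, the sketch below implements dwell-time selection, a widely used gaze-interaction technique in which looking at an on-screen target for long enough triggers it, so no physical click is required. The class, the one-second threshold and the simulated gaze frames are illustrative assumptions.

    # Minimal sketch of dwell-time gaze selection. The threshold value and
    # the simulated gaze frames are illustrative assumptions.
    class DwellSelector:
        def __init__(self, dwell_seconds=1.0):
            self.dwell_seconds = dwell_seconds
            self._current = None      # target currently under the gaze
            self._entered_at = None   # time the gaze first landed on it

        def update(self, target_under_gaze, now):
            """Call once per frame with the target under the user's gaze;
            returns that target once the gaze has dwelt on it long enough."""
            if target_under_gaze != self._current:
                self._current = target_under_gaze
                self._entered_at = now
                return None
            if (self._current is not None
                    and now - self._entered_at >= self.dwell_seconds):
                self._entered_at = now  # re-arm so it fires once per dwell
                return self._current
            return None

    # Simulated frames: the gaze settles on "help" for over a second.
    selector = DwellSelector(dwell_seconds=1.0)
    for t, target in [(0.0, "menu"), (0.3, "help"), (0.9, "help"), (1.5, "help")]:
        fired = selector.update(target, now=t)
        if fired:
            print(f"Selected: {fired}")  # -> Selected: help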

Collaborative Human-Robot Interaction

When considering variable autonomy (including adaptive control), we need to determine who should have the initiative in different interaction scenarios and contexts.
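One common formulation of variable autonomy, sketched below, is linear command arbitration: the executed motion is a weighted blend of the human's input and the robot's own plan, and the blending weight (the autonomy level) shifts with context. The weights and the toy context policy here are illustrative assumptions, not recommendations from any particular project.

    # Minimal sketch of shared control via linear command arbitration.
    def blend_commands(human_cmd, robot_cmd, autonomy):
        """Blend two (vx, vy) velocity commands; autonomy is in [0, 1],
        where 0 gives the human full initiative and 1 gives it to the robot."""
        return tuple((1.0 - autonomy) * h + autonomy * r
                     for h, r in zip(human_cmd, robot_cmd))

    def autonomy_for_context(near_obstacle, user_confident):
        """Toy policy: give the robot more initiative near obstacles,
        and the user more initiative when their input looks confident."""
        if near_obstacle:
            return 0.8
        return 0.2 if user_confident else 0.5

    human = (0.5, 0.0)   # the user pushes the joystick straight ahead
    robot = (0.3, 0.2)   # the planner wants to veer around an obstacle
    alpha = autonomy_for_context(near_obstacle=True, user_confident=True)
    print(blend_commands(human, robot, alpha))  # -> approximately (0.34, 0.16)

Choosing such a policy, i.e. when the robot should take more initiative and when it should defer, is precisely the open question described above.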

Decision-support systems with easy-to-use dashboards and tools for healthcare professionals

Designing and evaluating interfaces for healthcare professionals to support safe, effective remote monitoring and management of people with long-term conditions, or of those who have been discharged early from hospital and require remote clinical monitoring, is crucial for the successful integration of intelligent assistive technologies into care pathways.
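Behind such dashboards there is typically some form of alerting logic that decides which readings deserve a clinician's attention. The sketch below shows a minimal rule-based version with per-vital thresholds; the vital signs, threshold values and patient data are illustrative assumptions.

    # Minimal sketch of rule-based alerting for a remote-monitoring dashboard.
    # Vitals, thresholds and readings are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class Threshold:
        low: float
        high: float

    THRESHOLDS = {
        "heart_rate_bpm": Threshold(50, 110),
        "spo2_percent": Threshold(92, 100),
    }

    def flag_readings(readings):
        """Return the (vital, value) pairs that fall outside their safe range."""
        alerts = []
        for vital, value in readings.items():
            t = THRESHOLDS.get(vital)
            if t is not None and not (t.low <= value <= t.high):
                alerts.append((vital, value))
        return alerts

    today = {"heart_rate_bpm": 118, "spo2_percent": 96}
    for vital, value in flag_readings(today):
        print(f"ALERT: {vital} = {value}")  # -> ALERT: heart_rate_bpm = 118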

Related Research Projects

TARICS aims to create an interactive cultural experience for museum visits by providing a social robot to make the experience more accessible and inclusive for people with learning disabilities and/or autism. The proposed learning and interaction techniques will be deployed on Lindsey, a robot tour guide that has been operating autonomously in The Collection Museum since 2018.

Contact: Ayse Kucukyilmaz


Open All Senses will explore people’s feelings of safety and comfort when controlling and interacting with telepresence robots and will provide opportunities for transforming their design and utility.

Project Lead: Horia Maior

Related Past Research Projects with CHART team member involvement


FLOURISH, led by Atkins, was a multi-sector collaboration that helped to advance the successful implementation of Connected and Autonomous Vehicles (CAVs) in the UK. While at UWE, Praminda Caleb-Solly worked on designing and evaluating accessible interfaces for people with ageing-related impairments who could be early adopters of autonomous vehicles.


The main objective of the I-DRESS project, which was based at UWE, was to develop a system providing proactive assistance with dressing to disabled users, or to users such as high-risk healthcare workers whose physical contact with garments must be limited to avoid contamination. The robotic system consisted of two highly dexterous robotic arms, sensors for multi-modal human-robot interaction, and safety features.

The system comprised three major components: (a) intelligent algorithms for user and garment recognition, specifically designed for close, physical human-robot interaction; (b) cognitive functions, based on multi-modal user input, environment modelling and safety, allowing the robot to decide when and how to assist the user; and (c) an advanced user interface that facilitates intuitive and safe physical and cognitive interaction for support in dressing.
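A hypothetical sketch of how three such components might be wired together in a single assistance loop is given below. None of these class or method names come from the I-DRESS project; they only illustrate the described architecture, with the safety check in the cognitive layer overriding assistance.

    # Hypothetical sketch of the three-component architecture described above.
    class Perception:                      # (a) user and garment recognition
        def observe(self):
            return {"user_pose": "arm_raised", "garment": "sleeve_aligned"}

    class CognitiveLayer:                  # (b) decides when and how to assist
        def decide(self, state, user_input):
            if user_input == "stop" or state["user_pose"] == "unsafe":
                return "hold_still"        # safety always overrides assistance
            if state["garment"] == "sleeve_aligned":
                return "guide_sleeve_over_arm"
            return "wait_for_user"

    class Interface:                       # (c) intuitive, safe user interaction
        def get_user_input(self):
            return "continue"              # e.g. from speech or a button press
        def announce(self, action):
            print(f"Robot: about to {action.replace('_', ' ')}")

    perception, cognition, interface = Perception(), CognitiveLayer(), Interface()
    state = perception.observe()
    action = cognition.decide(state, interface.get_user_input())
    interface.announce(action)  # -> Robot: about to guide sleeve over arm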

Praminda Caleb-Solly led the work package on Human Factors and Interface Design while she worked at UWE, and is continuing aspects of the research on safety in close-proximity human-robot interaction.

Related Research Publications