Ethics and Responsible Research

Image: user concerns regarding an assistive robot (artwork created by Sam Church for the University of Nottingham; all rights reserved).

Responsible Research and Innovation is ‘a moral imperative. Research and Innovation should be environmentally protective, answering needs, ethical values, and expectations of society and beneficial to the widest range of actors’ (de Saille, 2015 [i]).

In our research on service robotics technologies, we take a Responsible Innovation approach at every stage of developing cyber-physical health and assistive robotics technologies.

According to the EPSRC Framework for Responsible Innovation [ii], a Responsible Innovation approach is one that continuously seeks to anticipate, reflect, engage and act (the AREA framework).

[i] De Saille, S., 2015. Innovating innovation policy: the emergence of ‘Responsible Research and Innovation’. Journal of Responsible Innovation, 2(2), pp.152-168.

[ii] EPSRC Framework for Responsible Innovation: https://www.ukri.org/about-us/epsrc/our-policies-and-standards/framework-for-responsible-innovation/ 


Our research objectives in this area include:

1. Developing contextually relevant and actionable ethical frameworks for cyber-physical health and assistive robotics technologies

2. Reviewing existing ethical guidelines and ensuring their appropriateness and relevance to emerging technologies

3. Understanding the ethical issues arising from real-world use of emerging technologies

4. Ensuring equitable use of, and access to, cyber-physical health and assistive robotics technologies


Related Research Projects

Empowering Future Care Workforces aims to understand how health and social care professionals can benefit from using assistive robotics on their own terms. Empowering health and social care professionals through digital technologies has long been a goal in health and care policy. As governments invest in post-pandemic digital transformation, ensuring workers are empowered and not excluded by technology is more urgent than ever.

UoN Project Lead: Praminda Caleb-Solly


The aim of Accessible AI@Nottingham and our network activities is to build public trust by promoting transparency in AI decision-making. We have designed a series of activities for proactive engagement, aiming to empower people to be confident in accessing, understanding and exploiting data.

UoN Project Lead: Praminda Caleb-Solly


BS 8611 gives guidance on the identification of potential ethical harm and provides guidelines on safe design, protective measures, and information for the design and application of robots. It builds on existing safety requirements for different types of robots: industrial, personal care, and medical. The standard describes ethical hazards associated with the use of robots and gives guidance on eliminating or reducing the risks associated with them. Significant ethical hazards are presented, together with guidance on how to address them in various robot applications. The AMT/10/1 Technical Committee is currently reviewing and updating the standard.

UoN Project Member: Praminda Caleb-Solly 
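BS 8611 does not prescribe any data format, but the identify-assess-mitigate process described above can be pictured as a simple ethical-hazard register. The sketch below is a minimal, hypothetical Python illustration of that idea; the class names, field names, scoring scale and example hazard are our own assumptions for illustration, not terminology taken from the standard.

```python
# Illustrative sketch only: a minimal ethical-hazard register in the spirit of
# BS 8611's identify/assess/mitigate structure. All names and scales here are
# assumptions made for this example, not content from the standard itself.
from dataclasses import dataclass, field
from typing import List


@dataclass
class EthicalHazard:
    description: str               # e.g. "loss of user privacy via onboard cameras"
    affected_group: str            # e.g. "older adults receiving care at home"
    severity: int                  # assumed scale: 1 (minor) to 5 (severe)
    likelihood: int                # assumed scale: 1 (rare) to 5 (frequent)
    mitigations: List[str] = field(default_factory=list)

    @property
    def risk_score(self) -> int:
        # Simple severity x likelihood scoring, a common risk-matrix convention.
        return self.severity * self.likelihood


@dataclass
class HazardRegister:
    robot_application: str
    hazards: List[EthicalHazard] = field(default_factory=list)

    def prioritised(self) -> List[EthicalHazard]:
        # Highest-risk hazards first, so mitigation effort is focused there.
        return sorted(self.hazards, key=lambda h: h.risk_score, reverse=True)


# Usage: register a hypothetical hazard for an assistive robot and list by priority.
register = HazardRegister(robot_application="home assistive robot")
register.hazards.append(EthicalHazard(
    description="over-reliance on the robot reducing human contact",
    affected_group="people living alone",
    severity=4,
    likelihood=3,
    mitigations=["schedule regular human check-ins", "limit autonomous social prompts"],
))
for hazard in register.prioritised():
    print(hazard.risk_score, hazard.description)
```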

Related Publications