Why teaching robots and humans to trust each other is essential – Professor Helen Hastie

How do you know if a dog trusts you? Dog owners will tell you it’s in the eyes.
Robots and AI systems for people who need help to live independently are going to be tested at the National Robotarium's Assisted Living Lab

It’s said that dogs have a sixth sense, an ability to read their human companions, which seems to be instinctive. Can we teach robots to do the same?

Creating robots with this level of emotional intelligence and emulating the complex interactions that humans have with one another is very challenging. While robots are already deployed in the workplace – usually performing repetitive tasks with human oversight, like assembling cars on production lines – we hesitate when considering them for more complex roles.


To build trust, robots need to understand what the human needs, their intent, and their emotional state. A lot of information can be gleaned from what the human says, how they say it, and through body language. Humans pick up cues from one another and adapt them to different people, tasks, and situations. For robots, this is obviously very difficult.

Our research, which is delivered in collaboration with teams at Imperial College London and the University of Manchester, aims to teach robotic and autonomous systems how to recognise when and why trust is lost with their human counterparts.


Building trust – a highly subjective concept – brings multiple challenges. Our £3 million research project, funded by UK Research and Innovation as part of their Trustworthy Autonomous Systems (TAS) programme, brings together expertise in robotics, cognitive science, and psychology to tackle this conundrum.

Through a range of experiments, our aim is to explore how to establish, maintain and also repair trust between humans and robots.

For example, one of the fundamental tests we’re conducting as part of our research is maze navigation. If a robot unintentionally gives a human the wrong advice on which way to turn, how does the robot gauge whether it has lost the human’s trust, and how does it then rebuild it?

To answer this, we need to develop a cognitive model that can help robots to better understand human behaviour. Once this model, and the appropriate level of trust, are established, robots can make a huge contribution to society beyond the factory floor.

In our new Assisted Living Lab at the National Robotarium, we will test robots and AI systems to see how they can help people with assisted living needs to live independently for longer, and explore how robots can support and complement carers.

For industry, technology created by the Orca Hub is helping to develop autonomous systems to revolutionise the way renewable energy assets are inspected and maintained. These assets are often difficult to reach and hazardous. We need to create methods to reassure operators that robots are competent in these conditions, especially when working underwater.


This research is essential as we design and build robotic and autonomous systems. Only by embedding the appropriate level of trust will we ensure greater acceptance, usability and, ultimately, adoption of robotics in our daily lives.

Helen Hastie is professor of computer science at Heriot-Watt University, joint academic lead of the National Robotarium and a fellow of the Royal Society of Edinburgh. This article expresses her own views. The RSE is Scotland's national academy, bringing great minds together to contribute to the social, cultural and economic well-being of Scotland. Find out more at rse.org.uk and @RoyalSocEd
