Would you trust a robot with personal information? UEA scientists trying to make chatbots ‘trustworthy’

From banking and insurance to shopping and dating, chatbots are becoming increasingly ingrained in our ever more digital society.

But organisations using these human-like computer programs are coming up against a serious barrier: talking to a robot does not inspire the same trust as talking to a person, meaning users are reluctant to disclose sensitive information to them.

Now, a team of researchers at the University of East Anglia (UEA) is launching a project to make chatbots more trustworthy, examining how their personality and even appearance can affect how users perceive them.

Lead researcher Dr Oliver Buckley, from UEA's school of computing sciences, said businesses and governments were increasingly turning to chatbots to feed user demand for fast, reliable and accurate information.

"Chatbots are all around us, particularly in customer support roles. They're speaking to us on the phone, emailing us, and responding to text messages - with answers to queries and even providing advice and guidance," he said.

"They are very convincing but a big problem is that people don't trust them with sensitive or private information, for example to do with their health, or banking.

"We want to know how chatbots can become even more personable to encourage people to disclose sensitive or confidential information."

The PRoCEED (A Platform for Responsive Chatbot to Enhance Engagement and Disclosure) project, which has received £500,000 of funding from the Engineering and Physical Sciences Research Council, will involve researchers from the UEA's school of computing sciences and school of psychology, the University of Kent, Oxford Brookes and Cranfield University.

Looking at three key sectors for chatbot use and sensitive information sharing - healthcare, defence and security, and technology - the team will investigate the implicit trust a user places in a chatbot, and how the context in which information is provided can affect its perceived sensitivity.

Dr Buckley said: "In order to fully understand the use of chatbots, it is essential to properly understand the nature of personal, sensitive information, and also chatbots' perceived trustworthiness."
