A new study published in the journal Science Robotics has found that socially interactive robots built to assist seniors with daily living should be designed as collaborative peers rather than dominant authority figures, an approach that maximizes people's acceptance of their advice and assistance.
“When robots present themselves as human-like social agents, we tend to play along with that sense of humanity and treat them much like we would a person,” explained study lead author Shane Saunderson, a specialist in human-robot interaction at the University of Toronto.
“But even simple tasks, like asking someone to take their medication, have a lot of social depth to them. If we want to put robots in those situations, we need to better understand the psychology of robot-human interactions.”
According to Saunderson and his team, in order to build better robots, scientists must first understand the concept of "authority" and how to incorporate it into their machines. The researchers note that this concept can be divided into two types: formal authority and real authority.
“Formal authority comes from your role: if someone is your boss, your teacher or your parent, they have a certain amount of formal authority,” Saunderson explained. “Real authority has to do with the control of decisions, often for incentives such as financial rewards or punishments.”
Saunderson and his colleagues used a humanoid robot named Pepper to help 32 volunteer participants complete a series of tests. For some subjects, Saunderson was presented as a formal authority figure, with Pepper acting as a simple helper, while for others Pepper was introduced as the authoritative leader of the experiment.
The researchers discovered that Pepper was less persuasive when presented as a strong authority figure than when introduced as a peer helper. “Social robots are not commonplace today, and in North America at least, people lack both relationships and a sense of shared identity with robots. It might be hard for them to come to see a robot as a legitimate authority,” explained Saunderson.
The scientists thus concluded that to create positive experiences in human-robot interaction, robots should be designed as peer-oriented and collaborative rather than dominant and authoritative.
“This ground-breaking research provides an understanding of how persuasive robots should be developed and deployed in everyday life, and how they should behave to help different demographics, including our vulnerable populations such as older adults,” concluded study co-author Goldie Nejat.