04-26-2020

Alexa privacy controls do not eliminate privacy concerns

A new study from Penn State has revealed that offering new privacy controls for Alexa is not necessarily the best way to gain consumer trust. In fact, the introduction of new privacy and content settings can make some Alexa users feel less secure. 

The investigation focused on 90 participants who interacted with Alexa through an Amazon Echo device, asking it several health-related questions. Some of the users were allowed to customize their privacy controls or content settings, while others were not.

Many participants who were given the option to adjust their privacy and content settings gained more trust in Amazon Alexa. Tech-savvy users, on the other hand, became more skeptical when they were offered privacy controls.

“That’s kind of counterintuitive,” said study co-author Professor S. Shyam Sundar. “The mere presence of privacy settings seems to trigger thoughts of potential privacy problems among those who are aware of such loopholes in communication technologies.”

“Once you give power users these options and they realize that privacy settings are actually controllable, they tend to panic and see the between-the-lines message rather than see customization for what it is, which is really a benevolent effort to provide more user control.”

The study also revealed that users who were sensitive about their privacy found content less credible when they were given control of their privacy settings. However, when the same users were also allowed to customize their content, they found it to be more trustworthy.

“It is really interesting to see that content customization, which is unrelated to privacy, alleviated the negative priming effects of adjusting privacy settings,” said study lead author Eugene Cho. “The empowering effect of customization noticed in our other studies extends to smart speaker interactions and to the context of privacy.”

However, the quality of content customization services could be impacted by privacy customization settings, said study co-author Professor Saeed Abdullah. This dynamic mirrors that of other artificial-intelligence algorithms that draw on user history to generate personalized content on well-known platforms, such as Amazon's product suggestions.

“For example, if you delete your user history or your audio recordings from Alexa, it might mean that the platform cannot personalize its offerings very well for you,” said Professor Abdullah. “Some people might like them, as some people like to have the best recommendations from the systems. And in that case, they might not take advantage of the privacy options.”

“So in other words, the differences between individuals and their perceived expectations of these systems mean that people will use privacy settings in a different way. That’s why providing control is so important.”

As smart speaker use becomes increasingly widespread, so do concerns about privacy infringement. The experts hope that this research will encourage designers and service providers to incorporate various content customization options to relieve privacy concerns.

“If users want the devices to function the way they’re supposed to function, they are supposed to always be on,” said Professor Sundar. “I feel like we’ve reached a point in our cultural conversation about the acceptability of having these kinds of devices in our homes, and to what extent we are comfortable.”

“Our findings can help us to better design smarter, more privacy-sensitive and more trustworthy smart speakers in the future,” added Abdullah.

The research was funded by the National Science Foundation.

By Chrissy Sexton, Earth.com Staff Writer
