New article: Deepfakes and Disinformation: Exploring the Impact of Synthetic Political Video on Deception, Uncertainty, and Trust in News

Update May 28, 2020: We were honoured to write a summary piece about this study for the Washington Post. (Note: The Post’s editorial staff changed our headline in the moments before publication, without clearing the change with us. The headline somewhat misrepresents our findings. The main body of the piece is accurate. My advice: read the body of the piece in the Post, but ignore the Post’s headline. Then read the full journal article.)

***

Cristian Vaccari and I have a new article out in Social Media and Society. The piece is entitled Deepfakes and Disinformation: Exploring the Impact of Synthetic Political Video on Deception, Uncertainty, and Trust in News.

For this study, we designed a survey experiment to assess the impact of exposure to three different cuts we made of Buzzfeed’s famous 2018 Barack Obama “educational” deepfake video, featuring Obama and the actor and director Jordan Peele.

We embedded the experimental videos as realistic YouTube clips at the end of a survey of an online sample representative of the UK adult population based on age, gender, and region of residence (N = 2,005). (Special thanks to O3C data partners Opinium Research, who donated access to their survey sample pro bono.)

Our main foci were deception and trust. Our survey measured socio-demographic characteristics, political attitudes, and news sharing on social media. We also had pre-treatment measures of behaviour and attitudes, including levels of trust in news on social media.

The Buzzfeed Barack Obama/Jordan Peele educational deepfake


Key Findings

In our experiment, the deceptive deepfakes did not directly mislead respondents. Similar percentages of participants in each of the three groups reported being deceived, regardless of which video they saw, and the differences between groups were not statistically significant.

But the deceptive deepfakes elicited uncertainty. Significantly more participants who watched one of the deceptive videos expressed uncertainty about the truthfulness of the videos. And participants who expressed uncertainty about the video showed significantly lower levels of trust in news on social media, even after controlling for pre-treatment levels of trust. Thus we find that uncertainty mediates the relationship between the deepfake and trust in news.
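For readers unfamiliar with mediation analysis, the logic of the finding can be sketched with simulated data. This is a minimal illustration, not the study's actual dataset or code: it assumes a binary treatment (deceptive video vs. not), an uncertainty measure, and a trust outcome, and estimates the classic a-path (treatment → uncertainty) and b-path (uncertainty → trust, controlling for treatment) by ordinary least squares.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Hypothetical simulated data mirroring the design (NOT the real dataset):
# T = 1 if the participant saw a deceptive deepfake, 0 otherwise
T = rng.integers(0, 2, n).astype(float)
# Uncertainty rises with treatment (path a)
M = 0.5 * T + rng.normal(0, 1, n)
# Trust falls with uncertainty (path b); little direct effect of treatment
Y = -0.4 * M + rng.normal(0, 1, n)

def ols(predictors, y):
    """OLS coefficients for y regressed on predictors (intercept first)."""
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

a = ols([T], M)[1]           # treatment -> uncertainty
b = ols([M, T], Y)[1]        # uncertainty -> trust, controlling for treatment
c_prime = ols([M, T], Y)[2]  # direct effect of treatment on trust

indirect = a * b             # mediated (indirect) effect
print(f"a={a:.2f}, b={b:.2f}, direct={c_prime:.2f}, indirect={indirect:.2f}")
```

In this simulated pattern, the direct effect of the video on trust is near zero while the indirect effect through uncertainty is negative, which is the shape of the mediation result reported in the article (the actual study also used pre-treatment trust as a control).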

Why Are Deepfake Videos So Troubling?

In our discussion and conclusion we reflect on some of the reasons why deepfake videos are so troubling for political communication research. Here is a brief summary.

First, images and videos are more likely than text to be shared on social media. Second, video has a cognitive directness due to what researchers have termed the “picture superiority effect” and the “realism heuristic.” Third, familiarity through exposure to visual media can elicit “fluency” and therefore credulity, irrespective of the truthfulness of the content.

More broadly, deepfake videos challenge conventional scholarly wisdom about how “active audiences” decode problematic information. They are nonfictional, highly proficient forms of deception, based on existing, publicly available, audiovisual online representations. Can audiences effectively mobilize pre-existing cognitive and informational resources—political knowledge, awareness of news and current events, lived experiences, cultural reference points, or even their basic familiarity with the facial appearance of public figures—to actively interrogate deepfakes?

The article is open access and free to download.

Here is the full abstract…

Artificial Intelligence (AI) now enables the mass creation of what have become known as “deepfakes”: synthetic videos that closely resemble real videos. Integrating theories about the power of visual communication and the role played by uncertainty in undermining trust in public discourse, we explain the likely contribution of deepfakes to online disinformation. Administering novel experimental treatments to a large representative sample of the United Kingdom population allowed us to compare people’s evaluations of deepfakes. We find that people are more likely to feel uncertain than to be misled by deepfakes, but this resulting uncertainty, in turn, reduces trust in news on social media. We conclude that deepfakes may contribute toward generalized indeterminacy and cynicism, further intensifying recent challenges to online civic culture in democratic societies.

Keywords: misinformation, disinformation, uncertainty, political deepfakes, online civic culture

Download.