UCI Podcast: Slowing the spread of election misinformation
Cailin O’Connor discusses the false narratives shaping this election, and how to overcome them
The clouds of misinformation swirling on the internet have only thickened this election season, with unfounded allegations of rampant voter fraud and online conspiracy theories that severely strain credulity. To learn more, the UCI Podcast spoke with Cailin O’Connor, an associate professor of logic and philosophy of science at UCI, an expert on misinformation and, along with fellow UCI professor James Weatherall, co-author of the book The Misinformation Age: How False Beliefs Spread (2019). In this interview, O’Connor discusses the most consequential pieces of misinformation this election season and the ways people can slow their spread.
To get the latest episodes of the UCI Podcast delivered automatically, subscribe at:
Apple Podcasts – Google Podcasts – Stitcher – Spotify
Transcript
AARON ORLOWSKI, HOST
True? False? Or in between? This election season, the two sides aren’t just presenting different policy prescriptions; in many ways, they’re peddling different facts. In recent years, the spread of misinformation has accelerated, with the rise of conspiracy theories like QAnon and false information about the coronavirus. What misinformation narratives are shaping this election? And how can we stem the tide of falsehoods?
From the University of California, Irvine, I’m Aaron Orlowski. And you’re listening to the UCI Podcast.
Today, I’m speaking with Cailin O’Connor, who is an associate professor in the Department of Logic and Philosophy of Science at UCI. She, along with fellow UCI professor James Weatherall, is the co-author of the book The Misinformation Age: How False Beliefs Spread.
Professor O’Connor, thank you for joining me today on the UCI Podcast.
CAILIN O’CONNOR
Well thank you for having me, Aaron.
ORLOWSKI
So misinformation is abundant this election season. In your view, what are the most consequential pieces of misinformation that you see circulating as Nov. 3 draws closer?
O’CONNOR
Well, there are some things that are just entirely typical of election cycles, related to, for example, smears on different candidates. That’s something that has always been the case: different political candidates trying to smear each other and hurt each other’s reputations. So we’re seeing a lot of misinformation related to that. The claim that Kamala Harris isn’t really Black is one; there’s also stuff about connections between Joe Biden and Ukraine. Something that’s a little less usual is an enormous disinformation campaign around voter fraud and the prevalence of voter fraud in the U.S. The main narratives are that voter fraud is rampant and that it’s being perpetrated by those on the left. There are sub-narratives too: that a lot of undocumented immigrants are voting illegally is one. Another is that in 2016, Donald Trump would have won the popular vote if not for voter fraud. And then there are little stories about mail-in ballots especially, that they’re being thrown out or falsified. So that is really a huge one right now.
ORLOWSKI
Of course, we can’t forget the issues surrounding COVID-19 and the outright misinformation, or at least dubious information, that has been circulating about the coronavirus. So as you look at that issue and the polarization that has arisen around coronavirus misinformation, is that something you would have even predicted just a couple of years ago?
O’CONNOR
No, actually. So this was something that really took me by surprise. And the reason is that in our book, we write about the differences between different types of false beliefs. Some of them don’t actually matter that much to your day-to-day functioning in your life. So for example, if you don’t believe in evolutionary theory, it just never really comes back to bite you. There aren’t real changes in your behavior that are going to occur as a result of that belief. And we point out in the book that beliefs like that tend to be really heavily influenced by social factors. Denying evolutionary theory is a good way to express that you are part of an in-group, because it doesn’t hurt you in any way to do so, and then there can be these real social benefits. And we predict that beliefs that have real consequences are going to be less likely to be subject to these social influences. So obviously, thinking that COVID-19 is not dangerous or that wearing a mask is a bad idea is the kind of belief that can have real consequences for your life and the lives of others around you. And those consequences can bear down on you pretty quickly. You know, within a couple of weeks, you could be very sick because of bad decisions you make. So we were actually quite surprised to see this level of polarization over such a consequential belief.
ORLOWSKI
Well, you and Professor Weatherall decided to write the book The Misinformation Age shortly after two really consequential instances of misinformation influencing elections: the 2016 presidential election in the U.S. and the Brexit vote in the U.K. So do you think that the nature of misinformation has changed in even that short amount of time? Has it worsened at all since then?
O’CONNOR
I would say it probably has worsened. What we were seeing in 2016 was all of these attempts by the Russian state to influence big political events abroad, in the U.K. and in the U.S. At that time, that took everyone a little bit by surprise. I think most members of the public and a lot of researchers weren’t expecting these disinformation campaigns. Since then, a lot of research has been done to understand them and how they work. But part of the result is that a lot more people have jumped into the fray. A lot of domestic political groups have taken up the Russian handbook: well, let’s get online ourselves and start trying to influence people. And everyday people are taking up these handbooks too. So a lot of people will spend time making political memes that support, say, a candidate of their choice or an idea that they are interested in, and then trying to spread them. What’s gotten worse, I think, is that there are more people involved. Now, at the same time, some of the major social media platforms are doing a lot more work to try to suppress or control really damaging misinformation and disinformation.
ORLOWSKI
So where do you see a storyline like QAnon fitting into this new misinformation landscape?
O’CONNOR
Oh, my word, QAnon. Just to give listeners a little background, since I don’t think everyone is particularly familiar with the QAnon phenomenon: this is a really widespread cluster of conspiracy theories, plus a social movement surrounding those conspiracy theories. It started on websites like 4chan and 8chan. Some of the central ideas are that there is this deep state, a shadowy cabal trying to control the government, composed possibly of Satan worshipers and pedophiles who drink the blood of children, and things like this, and that Donald Trump is leading an attempt to fight and end this deep state. And then a whole lot of other conspiracy theories get lumped into QAnon.
Okay, so how does that fit into this landscape? Well, we were already seeing precursors to this sort of conspiracy theory popping up in 2016. Pizzagate was actually very similar. We had these right-wing, kind of far-right, websites where people were coming up with conspiracy theories. And there was a lot of similar character to it, like, “We’re hunting through emails, looking for coded words.” Now there’s a lot of poring over the posts of this one person, the Q person, who’s supposed to be a government insider and who’s posting maybe coded messages on 4chan, and we have to figure out what they are. So all of that is a little similar; it’s a sort of continuation. I think the thing that’s changing is that this conspiracy group is probably influencing mainstream beliefs more than most previous conspiracy theories and campaigns have. And the reason is that QAnon is doing what seems to be a pretty good job of connecting up their wilder beliefs with very palatable messages.
So they’ve taken over this hashtag #savethechildren and promoted a lot of content around the idea that there is rampant child sex trafficking in the U.S. and that we need to act to protect children. Now, of course, literally every single person is like, “Yes, I am against child sex trafficking. I want to get on board with that.” And so this has brought a lot of people, especially women, and especially younger women and suburban women, to share messages and content that’s coming from QAnon. And it seems like part of the goal might be to then get some of these people to start trusting and looking into more of the content related to the QAnon conspiracy, and ultimately to get people on board. So we see this almost like a little piece of bait being dangled out into the mainstream. And then when people bite, they try to pull them deeper into these conspiracy theories. And polls suggest that a lot of people, at least in surveys, say that they believe in QAnon. Something like half of people who say they support Donald Trump also say they believe various aspects of the QAnon conspiracy. A recent survey found that something like a quarter of the British public said they believed in various QAnon conspiracies. With surveys it’s a little hard to know how accurate that is, but there are at least indications that belief is quite widespread.
ORLOWSKI
Well, and as you mentioned, it’s a whole network of conspiracy theories. So potentially someone could still be in that bait stage. And they haven’t really eaten the whole hook yet, so to speak.
O’CONNOR
I think a lot of people will never swallow the whole hook, but might play with the idea of believing in it, or believe it a little bit, or maybe even express belief to signal their support, say, of President Trump. So as I said, it’s a little hard to know when someone says, “Oh, I believe, you know, that Joe Biden is drinking the blood of children.” Like, do they really? But we see influence by the group and by these conspiracy theories, for sure.
ORLOWSKI
Well, let’s bring the conversation back to the findings in your book. Across all these different instances, what are the key ways that misinformation spreads?
O’CONNOR
Well, there’s a lot. What we really focus on in the book is the role of social ties and social connections in the spread of misinformation. So a lot of previous authors have looked at human cognition and reasoning to try to understand why we believe false things like the ways that we have biases in our reasoning, or are bad at dealing with probabilities. All of that’s very important to understand. But we really want to focus on social ties for the reason that humans are just deeply social learners. We learn most of the things we believe from those in our social networks, from friends and peers and teachers and parents. So we focus in the book on things like: Who do we trust and why do we trust them? Why do we pick up beliefs from others? What social pressures are we under? Are we trying to conform with people in our networks? Does that lead us to avow or adopt certain beliefs? How do we decide what sorts of reputations people have? Do we think this person is the kind of person who can be trusted to share good information? And then we also look at: How do propagandists take advantage of all these social aspects of our beliefs? So how do they try to get us to conform with certain people so that we keep smoking? Or how do they try to weaponize the reputations of people who are well-trusted, to spread certain beliefs?
ORLOWSKI
So it sounds like one of the kind of foundational lessons is that there are social benefits to believing certain pieces of information, regardless of whether they’re true or not.
O’CONNOR
That’s right. So a typical model for thinking about our beliefs is that they guide our action, and in that way they benefit or hurt us. We want to have true beliefs because they allow us to act appropriately in the world. You know, if we think there’s mercury in swordfish, then maybe we eat less swordfish and we’re less likely to be poisoned by mercury. But there’s this other aspect of belief, which is that it also plays a role in our social world. Sometimes when you are expressing a belief, it’s not just that you’re trying to say what things you really think are true about the world. You’re trying to show another person that you believe the same things as them, that you’re similar to them, that you’re trustworthy. Or you’re trying to impress others with how much you know. And those aspects are really important too.
ORLOWSKI
So what role do recent technologies, especially social media, play in how these pieces of misinformation can spread?
O’CONNOR
I mean, it’s good to recognize that “fake news” is by no means a new phenomenon. It’s literally as old as humans communicating. You can think about human communication as opening a door where we can pass ideas and beliefs from person to person. We opened this door, and that’s great, because we can spread information throughout our groups without everyone having to figure it out brand new themselves. But at the same time, some false beliefs are going to spread too. That’s just going to happen; it’s a necessary part of human communication systems. Of course, what’s happened with social media is almost ironic, because I think a lot of people thought, well, we’ll have the internet now and no one will ever have to be wrong again, since you can look everything up. In fact, what’s happened is that social media has changed the ways we communicate, and in some ways has allowed false beliefs, and the social spread of false beliefs, to completely proliferate.
Now, there are a few different things that matter here. One is that it’s very, very easy for rumors, especially false rumors, to get going on social media. I mean, a falsehood can literally be across the world in a matter of seconds, which was not true given our previous media structures. Or, you know, at least it would take a little longer. Another thing is that people can now choose and prune their social networks. Suppose you are a flat-earther. Well, before the internet and social media, it would probably have been hard for you to find others who shared your beliefs, and everywhere you went, people would be telling you you’re wrong and pushing back against that. But now you can find people online who will share your beliefs about anything, who will support you in really wild beliefs. So that is another aspect of how our communication structures have changed.
One last one I’ll mention is that, of course, not everyone on social media is who they appear to be. So in person, but even on the telephone or on TV, it’s kind of hard to pose as someone you’re not. It’s not that no one has ever done that; it’s just difficult. But you can get on Twitter and put up a fake picture and a fake tagline and pretend that you’re anything. And using these kinds of identities, we see propagandists attempting to build trust with people and then use that trust to influence them. So, for example, in 2016, we saw a lot of Russian agents who were posing as, say, members of LGBTQ groups, or even animal lovers or gun lovers, and then building trust with people online and then trying to influence them in different ways, often to drive polarization between the left and the right in the U.S.
ORLOWSKI
And it’s pretty hard for a bot to impersonate a human on TV or the phone, at least so far. AI is not quite that advanced.
O’CONNOR
Right, it’s easier to set up these bot networks, and then you don’t even have to be there sock-puppeting; you just let them do the work. They’re probably not as effective, often, as human sock puppets, but if they can tweet something 50,000 times and get a hashtag trending, they can obviously influence what people are looking at, and when.
ORLOWSKI
Well, misinformation isn’t always black and white. I mean, there are so many shades of gray. Something might be partially true and mostly false, or it may be totally true but just not really important. So how do misinformation propagandists and others use this gray zone of truth to manipulate the public conversation?
O’CONNOR
I think this is such an important point, Aaron. When a lot of people analyze misinformation, they’ll say things like: it’s false information, or false statements. And it’s really important to realize that a lot of misinformation, or maybe we just want to say misleading content or propaganda, does not actually consist of false statements. A lot of the time, it’s true statements that just set the agenda, that turn our attention one way. I see memes all the time where there’s no false statement, but maybe there’s a picture, and it’s as if someone’s talking, and there are all these implications that would be misleading. These gray places, as you call them, are often very hard to detect algorithmically. Social media sites use algorithms to try to identify misinformation or misleading content, because it’s really hard to fact-check everything with real people when there’s so much of this content. But there are a lot of ways to get around those algorithms, and one way is to use gray space where you don’t make literally false statements. It also makes it super hard to legislate against misinformation and disinformation, because some governing body can say, well, we can’t allow people to share false content. But can they share an embarrassing picture of a candidate with some mocking text on it? That wouldn’t fall under such legislation. And so there are all these slippery ways to get stuff past these algorithms or past different rules.
ORLOWSKI
So we’ve talked a lot about how dire this problem is. But are there any solutions to misinformation? Is there anything that we, both as individuals and as a society, can start to do to rein it in?
O’CONNOR
So first let me lay something out, which I think is important in thinking about solutions. Misinformation and misleading content does not stay still; it keeps changing. And part of the reason it keeps changing is that we try to respond to it. Remember fake news in 2016? You don’t really see very much of that anymore, because people figured out how it worked and stopped paying attention to fake news. So now you see more funny memes or conspiracy videos. But in a few years, it’s probably going to be something else that we’re seeing. It’s always changing: as we try to adapt to it, new forms of misinformation become dominant, because now they’re the ones that are working. What this means is that there aren’t really set-it-and-forget-it solutions. You know, you can’t just set up an algorithm to detect misinformation, leave it be, and say, well, we’re done with that. Instead, we need to be thinking about solutions that are dynamic and ongoing.
And so what I advocate for is a kind of shift in perspective, where we recognize this as a real threat to our democracy and public health, and something that we should keep responding to at all levels of organization. At an individual level, we should be holding ourselves and our friends accountable for sharing misleading and propagandistic content, and recognizing that it’s no longer acceptable to do that. It’s harming all of us; we all need to stop. At the platform level, we need to be pressuring platforms to constantly investigate misleading content on their sites and work to suppress it and clean up their informational environments. And academic and industry researchers should continue to work on misinformation and keep asking: what is the new threat? What’s the new way this is being done?
Members of the traditional media really need to think about how their practices can promote bad content. To give an example of people doing this well: in 2016, I think every major news source was writing all the time about Hillary Clinton’s emails, which was really a relatively minor affair but was promoted again and again online, often by bad actors. Far fewer people are publishing about Hunter Biden and Ukraine, right? So that’s an example of traditional media doing a better job with misinformation. And then, of course, our government should be taking this very seriously. If you look at the EU, they’ve actually done a pretty good job of forming working groups and research groups to try to improve their informational environments. The U.S. government has not taken this very seriously and has not done much to try to protect the beliefs of the U.S. public. So that’s another level on which we should be trying to improve the environments we’re all in.
ORLOWSKI
Well, the election is just days away at this point. So in these final days, what misinformation should we be aware of, or wary of, and how can we avoid confusion on Election Day and afterwards?
O’CONNOR
Well, of course, anything that seems aimed at telling you that you can’t vote, or sketchy last-minute changes to where your vote center is. Everyone should be keeping an eye out for that, for any attempts at voter suppression. We also really need to be aware of this voter fraud narrative, because part of its goal is to invalidate or de-legitimize our election results, to decrease people’s confidence in the electoral process, and possibly to set things up for a challenge to the election, should it not go the way some people want. So everyone should be aware that that is where that narrative is coming from. And people should also be aware that there’s very little documented voter fraud in the U.S. It really just isn’t a threat to our democracy, in the way that the misinformation about it is.
ORLOWSKI
Professor O’Connor, thank you for joining me today on the UCI Podcast.
O’CONNOR
It was my pleasure, Aaron, and thanks for doing this.