Revealed! Exposed! Unbelievable! The shocking hypothesis why misinformation is out of control
A plea for treating misinformation as a demand problem
I really hope you will share this post.
But, if I present it in an ordinary way, with a straightforward opening thesis paragraph, will that be exciting enough?
Well, here goes the normal approach: This post will argue that we should think of misinformation as a demand problem first, and as a supply problem second. Partisans are reading and sharing this stuff because it makes their political opponents look bad, and by extension, it makes their side look better and righter and stronger. And because American politics is dominated these days by hatred of the opposite party, too many Americans are eager to read and share partisan misinformation. They need it for their sense of self-worth, and to feed their addiction to the warm glow of partisan righteousness.
This thesis comes from a new New America report, “Why Americans Crave Fake News: How Our Electoral System Drives Demand for Misinformation,” by Aaron Tiedemann. You should read it. It’s a terrific overview of lots of current scholarship on polarization and misinformation. It explains why we should think of misinformation as a demand-side problem, driven by escalating partisan animus.
The rest of this post will muse on the report, and the general problems of misinformation and partisan animus.
Ready to share yet?
Ok, I get it — that intro probably wasn’t exciting enough for you. Let’s try something different.
Again, but more viral
If I really wanted to get something viral, I might need to play on your emotions, engage your outrage, and get smart about your partisan prejudices. So how about this?: “Exclusive: Leaked Tapes Expose Russian Operatives Blackmailing Congressman Gaetz into Proposing Anti-Ukraine Resolution.”
Ooooo… Now that is some juicy juice. And it’s totally fake. I made the story up, with the help of a chatbot. Oh, and I just published it online, in a made-up newspaper, The Faux Worth Star-Telegram.
If I had a small team, I’d guess we could pump thousands of these out a day, perfecting the realism, tracking which stories got the most engagement, and making more like them. I produce it here only to note how easily one can create this content.
Could I fall for something like this, under a more credible presentation? Probably. I already hold a very low opinion of the Republican congressman from Florida, no doubt helped along by much negative coverage of the man. My mental model of Gaetz is that he is a narcissistic nihilist with no moral center. Gaetz has already introduced a resolution cutting off funding for military support to Ukraine. Is it that hard to believe he’s under Russian sway?
This could be unfair to him. Others see him as a principled fighter against big government. These people would easily see this fake news as fake news, for a simple reason — it doesn’t fit their mental model of him. And it certainly doesn’t help their side to share it. But they might believe and share a similar allegation about, say, a Democrat who opposed support for Taiwan acting under the sway of Chinese operatives.
Is sharing really caring? Or is it just self-presentation and reputation management?
Why do we engage with and share stories, posts, and other content, both on social media and through e-mail and conversation?
Broadly, cognitive psychology suggests two motivations: accuracy and usefulness (to your preexisting views).
First motivation: accuracy.
Sometimes we are just trying to make sense of the mysterious wide world. We want to understand it better. And so we read to learn. And when we find something useful or interesting, we share with others who are curious, too.
So, for example, you might have been trying to understand whether open primaries could moderate politics. And just for example (totally random): you might have stumbled on my big report about primary reform (gold!). And maybe you know others who share your interest and have no agenda — just curiosity. Well, here we go. Share away!
Second motivation: usefulness to your preexisting views
You may, however, be a supporter of open primaries. In which case, you would not want to share my report, because it casts tremendous doubt on the hypothesis that primary reform changes much. You’d instead want to find some other report out there that shows why open primaries are awesome. You’d want to show me what my report missed, or why I’m just plain wrong.
We all love to be right and hate to be wrong. Being wrong hurts. It undermines our sense of self. Being right feels good. This is why “confirmation bias” and “motivated reasoning” drive so much of our information consumption. We seek out the information that confirms what we already think. We are very motivated to reason against anything that challenges our beliefs. We avoid and dismiss evidence contrary to our beliefs.
Importantly, we don’t do this in isolation. We are social creatures. We depend on our reputations. And so: we really like others to know that we are right. And a great way to show others that you are right? Share information that proves you are right!
Some beliefs and positions are more core to our identities. These are the beliefs we protect the most. Some beliefs bind us closely with others. We also protect these beliefs closely. And a great way to show you are a trusted member of any community is to reiterate the catechisms of the community. On social media, this often involves either “virtue signaling” or “piling on.” Sometimes these are one and the same.
It might be better if all information-sharing and debate were just pure pursuit of accurate truth, driven by curiosity. But it isn’t. We didn’t evolve what we call reason to pursue the truth. We evolved it to get along with each other. When we give reasons, we are justifying ourselves to each other, and attempting to coordinate our collective activity. We argue and reason not to seek some objective truth, but to convince others that we are right.
If we collectively achieve something close to truth, it is most likely a byproduct of diverse participation. I’m here reminded of a wonderful line from the political scientist Jenna Bender: “In a democratic system, diversity substitutes for neutrality.” What she means is that the way we agree in democracy is not by one side imposing its truth on the other side, but by lots of perspectives coming together and finding where they agree.
This line comes from her article, “Polarization, diversity, and democratic robustness.” It is a really, really insightful piece of work. And it also points directly to the key danger we are facing right now in American democracy. When all of politics collapses into a single us-versus-them dimension, this is where we get into big trouble. We lose the diversity of perspective necessary for compromise. And we become much more susceptible to misinformation.
Does partisan animus fuel demand for misinformation? It sure looks that way
Here is a conclusion from a recent article (“Partisan Polarization Is the Primary Psychological Motivation behind Political Fake News Sharing on Twitter”) in the flagship American political science journal:
“Individuals who report hating their political opponents are the most likely to share political fake news and selectively share content that is useful for derogating these opponents. Overall, our findings show that fake news sharing is fueled by the same psychological motivations that drive other forms of partisan behavior, including sharing partisan news from traditional and credible news sources.”
In other words, people who are sharing this stuff are basically sports hooligans hurling insults at the opposing team. They’ll take whatever rumors they can find. They know exactly what they are doing. Nobody is being fooled.
But here is the good news: most people are not reckless hooligans. Again, I quote the authors: “fake news sharing is a relatively rare phenomenon that flourishes only among small segments of the population.” The authors find that, during the study period, only 11 percent of their participants shared any fake news.
But if my Democratic partisans out there want something partisan-boosting to share, this paper also has a neat tidbit. Wait for it… “Republicans were more likely than Democrats to share fake news sources.” Go ahead, Democratic readers, post away with a link to my substack.
And you’ll probably like the reason behind it, too: “the supply of useful real news stories is lower for Republicans, propelling them to turn to fake news sites for material.” That is, to find negative stories about Democrats, Republicans had to turn to misinformation. Democrats can find plenty of real news stories with bad stuff about Republicans to share. Ah, confirmation bias. How sweet that dopamine hit of having been right all along.
But wait, what’s that I hear from my Republican readers? Are those the gears of motivated reasoning I detect grinding?
I can sense the doubts: How can you trust these academic studies? Academics are so biased against Republicans! I bet their methodology is flawed. How do they define “real” news? And even so, isn’t the problem that the mainstream media won’t publish anything bad about Democrats because all the reporters are Democrats, and they constantly attack Republicans? So if this study proves anything, it just proves academics and the media hate Republicans!
But, I hear you say: what if we could just educate people more about misinformation? Again, this misses the core driver. People who share this crapola do not care about being factually correct. They care about sticking it to their opponents. They care about proving themselves to their community. It’s status and self-presentation, all the way down.
Thus, as another recent academic journal article (“Affective Polarization and Misinformation Belief”) concludes, misinformation spreaders are not dupes. They are engaged in information warfare and propaganda for a cause. They are not on a fact-finding mission.
Or, for those of you who prefer the cold authority of academic-ese:
“the association of affective polarization with misinformation belief does not depend on a set of the electorate that is less politically sophisticated and more prone to believing misinformation. Instead, the typically preventative effects of political sophistication on misinformation belief are close to entirely inapplicable to the highly affectively polarized when it comes to in-party-congruent misinformation belief.”
What, me worry? (About misinformation)
There are obviously many reasons to worry about misinformation.
Top of mind, probably: election outcomes. In a closely balanced two-party system, even a small amount of misinformation (or more realistically, misinformation-adjacent innuendo and rumor) could have decisive impacts on elections. I’m not convinced that misinformation has tipped any elections yet. But it certainly could. And in a winner-take-all system, even a tiny vote plurality can yield titanic shifts in power.
But I also worry about two particular “doom loop” consequences.
First, what I’ll call the cynicism and mistrust doom loop.
Misinformation (and probably worse, rampant misinformation-adjacent rumor-mongering) erodes trust in authority. If you can’t believe anything, then anything becomes believable. And if so, when somebody tells you, “you are being lied to,” you might feel vindicated and trust this person. It also, dangerously, allows politicians (like, say, Donald Trump) to dismiss any allegations against them as lies and fake news. Muddling authority thus undermines authority.
This distrust fuels cynicism. Cynicism drives both demand for and consumption of misinformation. And Americans are feeling pretty dismal about our political institutions these days. Not a good place.
Second, the “subversion dilemma” doom loop.
The phrase “subversion dilemma” comes from a working paper from a Berkeley-MIT team of researchers. Here is the dilemma: If you believe the opposing party is likely to subvert democracy, you are more likely to support your party subverting democratic procedures first.
As the authors write: “Under these conditions, even democracy-loving opponents of an aspiring-autocrat are placed in the difficult situation: how to save democracy from an aspiring autocrat without escalating this cycle of mutual fear and democratic subversion?”
Misinformation can obviously make the subversion dilemma much, much worse.
Many supporters of Donald Trump sincerely believe that Democrats lie, cheat, and steal to win elections. They believe that by supporting Trump and Republicans, they are saving democracy. They are obviously misinformed. But because these partisans dislike Democrats so much, they are eager and willing perpetrators of the “Democrats are committing fraud” narrative.
But as a Democrat, I see Republicans doing wildly anti-democratic things. Am I just seeing everything through my partisan-based confirmation bias? I don’t think so. And so the logical conclusion is that Democrats can’t stand by idly. But hearing me say that — does that cause Republicans to get further radicalized? Are we all trapped in a doom loop?
I demand more focus on the demand side of the misinformation equation
I often get frustrated in conversations about media and truth, because so much of the baseline is rooted in a particularly rosy view of the post-war American landscape, in which a politics of consensus was undergirded by three major news networks and most cities having a widely read major newspaper or two. This was a historical aberration, and also one that left out many, many voices. And it also had plenty of misinformation. One only had to scan the AM dial to find John Birch Society rants, or subscribe to their newsletters, for example.
So, misinformation has always been with us. But is it worse now? For sure it is.
So why is it so bad now?
One obvious reason is that it is easier and cheaper to produce and distribute than ever before. My fake Matt Gaetz post was a breeze, thanks to new AI technology. The internet is everywhere. That’s the supply problem. And it’s a doozy of a problem.
But demand can drive supply. And what if we could make progress on the demand side of the equation? That’s my hope.
I admit, my suspicion that our binary zero-sum politics is driving demand for misinformation is just a strong hypothesis. I’d want to see more comparative research here. Almost all the research I’ve seen on misinformation and fake news sharing is based on patterns in the United States. Maybe there’s research I don’t know about. If there is, please tell me.
So here is my simple plea: Can we all spend more time investigating the demand side of the misinformation equation? And can we all think harder and deeper about the ways in which our electoral and party system may be both driving the demand for misinformation, and making our political system more vulnerable to its awful impacts?
I’ll leave you here with the conclusion of the New America report, “Why Americans Crave Fake News.”
“To understand misinformation, and ultimately counter it, we must better understand its drivers. Failure to do so has left us with a variety of well-intentioned but ultimately localized responses that may serve as useful bandaids, but will not tackle the root of the misinformation problem. Central to the failure of these reactions to misinformation is an over-emphasis on the supply side of the equation, rather than the factors that ensure politicians, pundits, and voters all demand misinformation. We should worry much more about these sociological drivers than misinformation’s power to fool unsuspecting voters or spread via inherently dubious new technology. Looking at these factors, particularly the power of social identity and affective polarization, leads to a more challenging, but ultimately more productive, understanding that the desire for misinformation is deeply embedded in our political culture and our democracy. Affective polarization, one of the strongest factors for misinformation sharing, is fed—at least in part—by the incentive structure of oppositional politics.”
Now, please help me make this analysis go viral by sharing it on social media. Remember, sharing wonky and tl;dr-length substack posts makes you seem smart and in-the-know. Am I right? I sure hope so. I hate being wrong.