Misinformation is a pressing global threat, and it has sparked heated debate between Democrats and Republicans over how to tackle it. One area where the two parties sharply diverge is the removal of misinformation from social media platforms.
In April 2022, the Biden administration introduced the Disinformation Governance Board, aiming to establish best practices for countering disinformation. Republican concerns about its mission, however, led the administration to pause the effort just three weeks later. This raises the question: why do Democrats and Republicans hold such contrasting attitudes towards content moderation?
A study that my colleagues Jennifer Pan and Margaret E. Roberts and I published in the journal Science Advances sheds light on this question. We found that Democrats and Republicans not only disagree about what is true or false; they also hold different internalized preferences about content moderation. These preferences may be shaped by moral values, identities, psychological factors, or the preferences of party elites.
While some individuals strategically desire the removal of misinformation that contradicts their political views, internalized preferences play a much larger role in shaping attitudes towards content moderation. In our study, we found that Democrats are approximately twice as likely as Republicans to support the removal of misinformation, while Republicans are about twice as likely as Democrats to view such removal as censorship.
Democrats’ attitudes do depend on whether the content aligns with their own political views, but this can be attributed, at least in part, to differing perceptions of accuracy. Previous research has documented a “fact gap” between Democrats and Republicans: they hold different beliefs about what is true or false. For instance, members of both parties tend to believe news headlines that align with their own political views. The fact gap alone, however, cannot fully explain the substantial differences in content moderation attitudes.
To dig deeper into these disparities, we examined two further explanations: the preference gap and party promotion. The preference gap refers to divergence in internalized preferences over whether and what content should be removed. Party promotion, by contrast, involves making content moderation decisions based on whether the content aligns with one’s partisan views.
In our study, we surveyed 1,120 U.S. respondents who identified as either Democrats or Republicans. We presented them with a set of political headlines that were identified as misinformation based on a bipartisan fact check. Each respondent saw one headline that aligned with their own political views and one that did not. After each headline, they were asked whether they would want the social media company to remove it, whether they considered its removal censorship, whether they would report it as harmful, and how accurate they believed it to be.
Comparing Democrats’ and Republicans’ overall responses to these headlines, we found compelling evidence of a preference gap. A majority of Democrats (69%) believed the misinformation headlines should be removed, while only 34% of Republicans shared this view. Similarly, 49% of Democrats considered the headlines harmful, compared with just 27% of Republicans. Conversely, 65% of Republicans viewed removing the headlines as censorship, whereas only 29% of Democrats did.
Even when Democrats and Republicans agreed that the same headlines were inaccurate, Democrats were nearly twice as likely as Republicans to advocate for their removal, while Republicans were nearly twice as likely as Democrats to perceive such removal as censorship. Although we did not explicitly investigate the sources of these divergent internalized preferences, two possibilities stand out. First, Democrats and Republicans may differ in their moral values or identities. Second, they may internalize the preferences signaled by their respective party elites. Republican elites, for instance, have framed content moderation as a matter of free speech and censorship, which could shape Republicans’ own preferences.
Zooming in on whether each headline aligned or misaligned with a respondent’s views, we observed a party promotion effect among Democrats. Democrats were less inclined towards content moderation when the misinformation aligned with their own political views: they were 11% less likely to support the removal of aligned headlines and 13% less likely to report them as harmful. We found no comparable effect among Republicans. Our results suggest that party promotion may be partly driven by differing perceptions of headline accuracy: when we restricted the analysis to Democrats who agreed that the headlines were false, the party promotion effect shrank to 7%.
Considering the implications for social media platforms, we find it encouraging that the effect of party promotion is relatively small compared to the impact of internalized preferences, especially when accounting for accuracy perceptions. However, given the substantial partisan differences in content moderation preferences, we believe that social media companies should go beyond the fact gap when designing content moderation policies that aim for bipartisan support. Future research could explore whether reaching consensus on moderation processes, rather than individual content moderation, could help reduce disagreement. Additionally, alternative forms of content moderation, such as downweighting, which involves reducing the visibility of certain content, might prove to be less contentious.
Finally, if the preference gap is indeed rooted in deep-seated value differences between Democrats and Republicans, platforms could try appealing to each side of the partisan divide with different moral framings. For now, Democrats and Republicans are likely to keep disagreeing over whether removing misinformation from social media improves public discourse or amounts to censorship.