Americans Think They Know a Lot About Politics – and It’s Bad for Democracy That They’re So Often Confidently Wrong
Many Americans think they know much more about politics than they really do. That overconfidence can thwart democratic politics.
As statewide primaries continue through the summer, many Americans are beginning to think about which candidates they will support in the 2022 general election.
This decision-making process is fraught with difficulties, especially for inexperienced voters.
Voters must navigate angry, emotion-laden conversations about politics when trying to sort out whom to vote for. Americans are more likely than ever to view politics in moral terms, meaning their political conversations sometimes feel like epic battles between good and evil.
But political conversations are also shaped by, obviously, what Americans know – and, less obviously, what they think they know – about politics.
In recent research, I studied how Americans’ perceptions of their own political knowledge shape their political attitudes. My results show that many Americans think they know much more about politics than they really do.
Knowledge deficit, confidence surplus
Over the past five years, I have studied the phenomenon of what I call “political overconfidence.” My work, in tandem with other researchers’ studies, reveals the ways it thwarts democratic politics.
Political overconfidence can make people more defensive of factually wrong beliefs about politics. It also causes Americans to underestimate the political skill of their peers. And those who believe themselves to be political experts often dismiss the guidance of real experts.
Political overconfidence also interacts with political partisanship, making partisans less willing to listen to peers across the aisle.
The result is a breakdown in the ability to learn from one another about political issues and events.
A ‘reality check’ experiment
In my most recent study on the subject, I tried to find out what would happen when politically overconfident people found out they were mistaken about political facts.
To do this, I recruited a sample of Americans to participate in a survey experiment via the Lucid recruitment platform. In the experiment, some respondents were shown a series of statements that taught them to avoid common political falsehoods. For instance, one statement explained that while many people believe that Social Security will soon run out of money, the reality is less dire than it seems.
My hypothesis was that most people would learn from the statements, and become more wary of repeating common political falsehoods. However, as I have found in my previous studies, a problem quickly emerged.
The problem
First, I asked respondents a series of basic questions about American politics. This quiz included topics like which party controls the House of Representatives – the Democrats – and who the current Secretary of Energy is – Jennifer Granholm. Then, I asked them how well they thought they did on the quiz.
Many respondents who believed they were top performers were actually among those who scored the worst. Much like the participants in David Dunning and Justin Kruger’s famous study, the poorest performers generally did not realize that they lagged behind their peers.
Of the 1,209 people who participated, around 70% were overconfident about their knowledge of politics. But this basic pattern was not the most worrying part of the results.
The overconfident respondents failed to change their attitudes in response to my warnings about political falsehoods. My investigation showed that they did read the statements, and could report details about what they said. But their attitudes toward falsehoods remained inflexible, likely because they – wrongly – considered themselves political experts.
But if I could make overconfident respondents more humble, would they actually take my warnings about political falsehoods to heart?
Poor self-assessment
My experiment sought to examine what happens when overconfident people are told their political knowledge is lacking. To do this, I randomly assigned respondents to receive one of three experimental treatments after taking the political knowledge quiz. These were as follows:
- Respondents received statements teaching them to avoid political falsehoods.
- Respondents did not receive the statements.
- Respondents received both the statements and a “reality check” treatment. The reality check showed how respondents fared on the political quiz they took at the beginning of the survey. Along with their raw score, the report showed how respondents ranked among 1,000 of their peers.
For example, respondents who thought they had aced the quiz might have learned that they got one out of five questions right, and that they scored worse than 82% of their peers. For many overconfident respondents, this “reality check” treatment brought them down to earth. They reported much less overconfidence on average when I followed up with them.
Finally, I asked all the respondents in the study to report their levels of skepticism toward five statements. These statements are all common political falsehoods. One statement, for example, asserted that violent crime had risen over the prior decade – it hadn’t. Another claimed the U.S. spent 18% of the federal budget on foreign aid – the real number was less than 1%.
I expected most respondents who had received my cautionary statements to become more skeptical of these misinformed statements. On average, they did. But did overconfident respondents learn this lesson too?
Reality check: Mission accomplished
The results of the study showed that overconfident respondents began to take political falsehoods seriously only if they had experienced my “reality check” treatment first.
While overconfident respondents in the other conditions showed no reaction, the humbling experience of the “reality check” – realizing how wrong they had been – led overconfident participants in that condition to revise their beliefs. They increased their skepticism of political falsehoods by a statistically significant margin.
Overall, this “reality check” experiment was a success. But it reveals that outside of the experiment, political overconfidence stands in the way of many Americans’ ability to accurately perceive political reality.
The problem of political overconfidence
What, if anything, can be done about the widespread phenomenon of political overconfidence?
While my research cannot determine whether political overconfidence is increasing over time, it makes intuitive sense that this problem would be growing in importance in an era of online political discourse. In the online realm, it is often difficult to appraise the credibility of anonymous users. This means that false claims are easily spread by uninformed people who merely sound confident.
To combat this problem, social media companies and opinion leaders could seek ways to promote discourse that emphasizes humility and self-correction. Because confident, mistaken self-expression can easily drown out more credible voices in the online realm, social media apps could consider promoting humility by reminding posters to reconsider the “stance,” or assertiveness, of their posts.
While this may seem far-fetched, recent developments show that small nudges can lead to powerful shifts in social media users’ online behavior.
For example, Twitter’s recent inclusion of a pop-up message that asks would-be posters of news articles to “read before tweeting” caused users to rethink their willingness to share potentially misleading content.
A gentle reminder to avoid posting bold claims without evidence is just one possible way that social media companies could encourage good online behavior. With another election season soon upon us, such a corrective is urgently needed.
This article is republished from The Conversation under a Creative Commons license. Read the original article.