As political polarization deepens in the United States, the language people use to discuss politics online increasingly reflects exaggerated, black-and-white thinking. A recent analysis of millions of social media posts reveals that markers of cognitive distortions rose alongside political extremism between the 2016 and 2020 presidential elections. The research, published in Communications Psychology, highlights a growing overlap between extreme ideological views and the rigid thought patterns often addressed in psychological therapy.
Psychologists use the term cognitive distortions to describe thought patterns “wherein individuals think about themselves, the future, and the world in inaccurate and overly negative ways.” These habits include overgeneralizing, catastrophizing, and viewing situations in absolute terms. For example, if a high school student fails a single test and immediately decides their entire academic future is ruined, they are catastrophizing. If a person assumes a peer ignored them in the hallway out of malice rather than distraction, they are engaging in mindreading.
In clinical settings, mental health professionals target these distortions through treatments like Cognitive Behavioral Therapy. Recognizing and adjusting these rigid beliefs helps patients break negative mental habits. By teaching individuals to replace absolutist thoughts with objective facts, therapists help patients manage emotional disorders like depression and anxiety.
The ways people express extreme political views often mirror these same psychological habits. Partisans might attach sweeping, negative labels to their political opponents. They might also make dire, baseless predictions about the future of the country if a specific candidate wins an election. To explore whether these two phenomena are connected, a team of researchers analyzed the digital language habits of voters.
Andy Edinger, a researcher at Indiana University, led the investigation alongside colleagues there and at the City College of New York. They wanted to understand what it actually means to think in polarized terms. The team sought to determine if the psychological concepts used in clinical therapy could help explain the rising ideological divisions observed in recent public discourse.
To conduct the analysis, Edinger and his colleagues examined massive collections of posts from the social media platform Twitter, now known as X. The data included messages discussing the presidential candidates in the weeks leading up to the 2016 and 2020 elections. The researchers specifically focused on a core group of nearly 100,000 users who actively posted during both election cycles. This shared group allowed the team to track how the behavior of individual people changed over a four-year period.
To measure distorted thinking, the researchers applied a specialized language analysis tool originally developed by mental health experts. The tool scans text for a recognized dictionary of 241 specific word sequences. These phrases act as markers for different types of cognitive distortions. By tallying how frequently these phrases appeared in a user’s posts, the team calculated an overall prevalence score for distorted language.
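The scoring approach described above can be sketched in a few lines of Python. The phrase list here is purely illustrative, not the study's actual 241-entry lexicon, and the matching logic is a simplification of the published tool:

```python
# Hypothetical markers; the study's real dictionary has 241 phrases.
DISTORTION_MARKERS = [
    "will never",
    "everyone knows",
    "the worst",
    "no one ever",
]

def prevalence(posts):
    """Fraction of posts containing at least one distortion marker."""
    if not posts:
        return 0.0
    flagged = sum(
        1 for post in posts
        if any(marker in post.lower() for marker in DISTORTION_MARKERS)
    )
    return flagged / len(posts)

posts = [
    "They will never admit they were wrong.",
    "Nice weather for a walk today.",
]
print(prevalence(posts))  # 0.5
```

Tallying matches per user and dividing by post count, as above, yields the kind of per-user prevalence score the researchers then tracked across election cycles.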
The team also needed a mathematical way to measure political ideology and polarization. They accomplished this by analyzing the social networks of the users and the content they chose to share. By mapping which political influencers an individual consistently echoed through retweets, the researchers estimated both the user’s political leaning and their degree of ideological extremism.
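One simple way to operationalize the retweet-based estimate is to average the ideology scores of the influencers a user echoes, treating distance from the center as extremism. This is a minimal sketch under assumed influencer scores on a -1 (left) to +1 (right) scale; the study's actual network method is more sophisticated:

```python
# Assumed ideology scores for hypothetical influencers,
# on a -1 (left) to +1 (right) scale.
INFLUENCER_IDEOLOGY = {
    "influencer_a": -0.8,
    "influencer_b": 0.6,
    "influencer_c": 0.9,
}

def estimate_ideology(retweeted):
    """Return (leaning, extremism) from a user's retweet history."""
    scores = [INFLUENCER_IDEOLOGY[name] for name in retweeted
              if name in INFLUENCER_IDEOLOGY]
    if not scores:
        return 0.0, 0.0
    leaning = sum(scores) / len(scores)
    extremism = abs(leaning)  # distance from the political center
    return leaning, extremism

leaning, extremism = estimate_ideology(["influencer_b", "influencer_c"])
print(round(leaning, 2), round(extremism, 2))  # 0.75 0.75
```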
The results pointed to a broad and marked change in the way social media users communicated. Between 2016 and 2020, the average prevalence of distorted language across the entire group of users increased by more than 43 percent. When looking at the changes within individual accounts, the average user exhibited a 76 percent increase in the prevalence of posts containing at least one cognitive distortion marker.
This escalation was not limited to just one or two bad mental habits. The increases persisted across every category of cognitive distortion measured by the study. The categories that experienced the greatest surges included emotional reasoning, overgeneralizing, catastrophizing, and mindreading. In the context of political debate, mindreading often takes the form of assuming an opposing voter holds secret, malicious intentions.
The researchers then looked at how these changes related to political ideology. They found that users who became more politically isolated and extreme between 2016 and 2020 also demonstrated higher rates of distorted language. As political polarization intensified, the reliance on rigid, exaggerated language grew in tandem.
The data revealed somewhat different patterns for users on different ends of the political spectrum. Among left-leaning individuals, there was a steady and visible relationship between their level of ideological extremism and their use of distorted language. As their views grew more polarized over the four years, their use of cognitive distortions increased at a matching pace.
Right-leaning users exhibited a different trajectory. In 2016, users on the political right already displayed a higher baseline rate of distorted language compared to users on the left. Because they started at a higher point, their continued slide into extreme polarization had a less pronounced effect on their likelihood of using cognitive distortions in 2020. Based on the researchers' models, this might represent a saturation effect, where the initial language was already highly rigid.
The team also explored the timeline of these changes to see which behavior usually manifested first. They found that individuals who heavily used distorted language in 2016 were highly likely to become more politically polarized by 2020. In contrast, being highly polarized in 2016 was not a strong predictor of adopting newly distorted language in 2020. This dynamic indicates that rigid, black-and-white thought patterns might actively fuel ideological divides over time rather than just reflecting them.
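The cross-lagged logic above can be illustrated with a toy comparison: correlate early distortion with later polarization, and vice versa, then see which link is stronger. The per-user scores here are invented for illustration, not the study's data:

```python
def pearson(xs, ys):
    """Pearson correlation between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-user scores across the two election years.
distortion_2016   = [0.1, 0.4, 0.2, 0.7, 0.3]
polarization_2016 = [0.2, 0.3, 0.2, 0.5, 0.4]
distortion_2020   = [0.2, 0.5, 0.3, 0.8, 0.3]
polarization_2020 = [0.3, 0.6, 0.3, 0.9, 0.4]

# Forward path: early distortion -> later polarization.
forward = pearson(distortion_2016, polarization_2020)
# Reverse path: early polarization -> later distortion.
reverse = pearson(polarization_2016, distortion_2020)
print(forward > reverse)  # True
```

A stronger forward correlation than reverse correlation, as in this toy data, is the kind of asymmetry the researchers interpreted as distorted language preceding, rather than merely following, polarization.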
While the timeline suggests a directional relationship, the authors note that the findings are strictly correlational. Observational data from a single social media platform cannot definitively confirm that cognitive distortions cause political polarization. Additionally, changes in the moderation policies and algorithms of the website itself during those four years might have influenced what kind of content was promoted or suppressed.
The researchers also clarify that their analysis does not mean politically vocal people are experiencing clinical mental health disorders. The study measures a specific style of communicating and thinking rather than diagnosing depression or anxiety. The relationship between actual mental illness and political behavior remains a separate area of ongoing research.
Despite these limitations, the research offers a new way of looking at societal fractures. The authors point to existing theories suggesting that modern digital environments might be unintentionally teaching people to internalize the very thought patterns therapists try to cure. If society is adopting a backward version of Cognitive Behavioral Therapy, the consequences could extend beyond individual stress to threaten broader democratic institutions.
Moving forward, the team hopes to incorporate data from a wider variety of digital platforms. Exploring these thought patterns in different online environments could help confirm the long-term trends seen in this study. Recognizing these psychological habits in public discourse might eventually aid in the development of targeted interventions to reduce hostility in online spaces.
The study, “Cognitive distortions are associated with increasing political polarization,” was authored by Andy Edinger, Johan Bollen, Hernán A. Makse, and Matteo Serafino.