A new analysis of scientific practices suggests that a researcher’s personal political views may influence the results they obtain when analyzing complex data. The study provides evidence that when experts act independently to answer the same question using the same dataset, their conclusions tend to align with their pre-existing ideological beliefs. These findings were published in the journal Science Advances.
The investigation was conducted by George J. Borjas and Nate Breznau. Borjas is a Cuban-American economist who serves as the Robert W. Scrivner Professor of Economics and Social Policy at the Harvard Kennedy School. Breznau serves as the principal investigator at the German Institute for Adult Education – Leibniz Institute for Lifelong Learning.
The motivation for the study arose from a previous large-scale experiment known as the Crowdsourced Replication Initiative. In that original project, independent research teams were given identical data to answer a specific sociological question. Borjas examined the publicly available data from that initiative and noticed a correlation between the researchers’ stated opinions on immigration and the statistical results they produced.
“It is clear that there are many reasons that scholars may be influenced by different forms of bias (confirmation bias, publication bias, status seeking, etc.). It is only logical that scholars might want to arrive at certain results that align with their own ideological preferences – how they want the world to be or how they want the world to appear,” Breznau told PsyPost.
“I personally am interested in the reproducibility crisis and in seeking ways to address a serious lack of reproducibility in science. Consider for example when Daryl Bem found ‘evidence’ of extrasensory perception (ESP). He personally believed in ESP. This does not appear to be a coincidence. All efforts to replicate his experiments have so far failed.”
However, Breznau was initially skeptical of Borjas’s observation. He suspected that the statistical association was likely a coincidence or a fluke that would disappear under more rigorous testing. He assumed that the connection would not hold up if different statistical models were applied. To test this, the authors decided to conduct a comprehensive analysis to determine if political ideology truly played a role in how the scientists designed their research and interpreted their findings.
The study utilized data from 158 researchers organized into 71 separate teams. These teams had participated in an experiment where they were asked to determine whether immigration affects public support for social welfare programs. The researchers were provided with data from the International Social Survey Program, covering various countries and spanning the years 1985 to 2016.
Before the teams began their analysis, they completed a survey. One of the questions asked for their stance on immigration policy. Specifically, they were asked if laws on immigration should be relaxed or made tougher. Their responses were recorded on a scale ranging from zero to six.
The teams then proceeded to analyze the data. They were tasked with replicating a well-known previous study that found no link between immigration and welfare support. After replicating that study, the teams were instructed to extend the research using the new data provided. They had the freedom to choose their own statistical methods and variables to test the hypothesis.
Collectively, the 71 teams estimated 1,253 distinct statistical models. The results varied widely. Some teams concluded that immigration strongly decreased public support for social programs. Other teams found that immigration strongly increased such support. Many others found no significant effect at all.
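For a concrete picture of what that spread looks like, here is a minimal sketch of how such a collection of estimates might be summarized. The numbers and column names are hypothetical illustrations, not the study's data:

```python
import pandas as pd

# Hypothetical results table: one row per estimated model, with each
# model's estimated effect of immigration on welfare support.
models = pd.DataFrame({
    "estimate": [-0.42, -0.03, 0.18, 0.31, -0.11],
    "std_err":  [0.10, 0.05, 0.20, 0.09, 0.12],
})

# Classify each model by the sign and significance of its estimate,
# using |z| > 1.96 as a rough two-sided 5% threshold.
z = models["estimate"] / models["std_err"]
models["conclusion"] = pd.cut(
    z,
    bins=[-float("inf"), -1.96, 1.96, float("inf")],
    labels=["significant negative", "no significant effect", "significant positive"],
)
print(models["conclusion"].value_counts())
```

Binning 1,253 such estimates this way is what reveals that ostensibly identical analyses landed in all three categories.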
Borjas and Breznau found a systematic pattern in this variation. Teams composed of researchers who favored more relaxed immigration laws tended to produce results suggesting that immigration had a positive effect on support for social welfare programs. Teams composed of researchers who favored tougher immigration laws tended to produce results showing a negative effect.
The authors sought to understand the mechanism behind this divergence. They found that the difference was not due to errors in calculation. Instead, it stemmed from the specific choices the teams made when designing their statistical models. In the social sciences, researchers often have to make many decisions about how to organize data.
For example, researchers must decide how to measure immigration levels. They can measure it as the total percentage of foreign-born residents, or they can measure it as the rate of new arrivals per year. They must also decide which countries to include in the comparison and which specific years to analyze. They also have to decide how to mathematically group different types of social welfare programs.
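To see how quickly these choices multiply, consider a minimal sketch of the specification space created by just five decisions. All of the option names below are illustrative, not the teams' actual choices:

```python
from itertools import product

# Each list is one defensible analytic choice; names are hypothetical.
immigration_measures = ["pct_foreign_born", "net_migration_rate"]
country_samples      = ["all_countries", "oecd_only"]
year_windows         = [(1985, 2016), (1996, 2016)]
welfare_outcomes     = ["single_index", "separate_programs"]
estimators           = ["ols", "multilevel"]

# The cross-product of choices defines the space of possible models.
specs = list(product(immigration_measures, country_samples,
                     year_windows, welfare_outcomes, estimators))
print(f"{len(specs)} distinct specifications from just five choices")  # 32
```

Every cell of this grid is a defensible analysis, which is why independent teams working from identical data can arrive at different answers.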
The study identified five specific research design decisions that heavily influenced the final results. These decisions accounted for approximately 68 percent of the difference in findings between the pro-immigration and anti-immigration teams. The analysis showed that teams tended to select the specific combination of data points and measurement tools that produced results consistent with their ideological preferences.
To ensure their findings were robust, Breznau and Borjas conducted a “multiverse analysis.” This involved running 883 different statistical models to test the link between ideology and research outcomes. They found that in nearly 88 percent of these models, the effect of ideology was statistically significant. This convinced Breznau that the correlation he originally doubted was indeed real.
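As a rough illustration of what a multiverse analysis involves (a sketch, not the authors' actual procedure), the code below fits the same regression of research outcomes on ideology under every combination of a few analytic choices, using simulated data and hypothetical variable names, and counts how often the ideology coefficient is statistically significant:

```python
from itertools import product

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 158  # one row per researcher, mirroring the sample size above

# Simulated data: an ideology score (0-6), each researcher's estimated
# effect of immigration on welfare support, and one illustrative control.
ideology = rng.integers(0, 7, size=n).astype(float)
effect = 0.1 * ideology + rng.normal(0.0, 1.0, size=n)
experience = rng.normal(0.0, 1.0, size=n)

# Each "universe" is one combination of analytic choices.
control_options = [False, True]  # include the control variable or not
trim_options = [False, True]     # drop extreme effect estimates or not

universes = list(product(control_options, trim_options))
significant = 0
for add_control, trim in universes:
    y, x1, x2 = effect, ideology, experience
    if trim:
        keep = np.abs(y) < np.quantile(np.abs(y), 0.95)
        y, x1, x2 = y[keep], x1[keep], x2[keep]
    X = np.column_stack([x1, x2]) if add_control else x1.reshape(-1, 1)
    X = sm.add_constant(X)
    fit = sm.OLS(y, X).fit()
    if fit.pvalues[1] < 0.05:  # p-value on the ideology coefficient
        significant += 1

print(f"{significant} of {len(universes)} universes show a significant ideology effect")
```

The study's equivalent space contained 883 models rather than the four shown here, with the ideology effect significant in nearly 88 percent of them.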
The study also examined the quality of the research produced by the different teams. In the original experiment, each team’s research design was reviewed by other participants in a double-blind process. The reviewers did not know who authored the studies or what their political views were.
The analysis revealed that teams with strong ideological views, whether pro-immigration or anti-immigration, received lower scores from their peers. Teams that held moderate views on immigration tended to design models that received higher ratings for quality. This suggests that widely accepted research standards were more often met by researchers who did not hold extreme political views on the topic.
“It is important to remember that scientists are also human beings,” Breznau said. “Their complex brains weigh simultaneously all kinds of factors in determining how to think and behave. They are not infallible and are not perfectly objective in their work.”
“That is why it is important to build safeguards into the scientific process, like having others check each other’s work and using more methods to ensure robustness – something we have taken extra care to do here by running a multiverse of models, nearly all of which show the correlation that George initially found and therefore provide robust evidence of an effect.”
The authors caution that their study, like all research, has certain limitations. The original experiment was not specifically designed to test for ideological bias, so the evidence is exploratory rather than confirmatory. The number of researchers who openly admitted to anti-immigration views was small compared to those with pro-immigration views. This imbalance makes it difficult to draw definitive conclusions about the magnitude of bias on the anti-immigration side.
There is also the possibility of social desirability bias in the survey responses. Researchers might have been hesitant to express anti-immigration sentiments in an academic environment. This could mean some teams classified as moderate or pro-immigration actually contained members with different private views.
The authors also note that they cannot observe the internal thought processes of the researchers. It is unclear if the teams consciously chose models that fit their biases or if the process was unconscious. Researchers might simply stop looking for errors or alternative models once they find a result that makes sense to them.
Future research could benefit from observing the scientific workflow in real-time. Tracking every decision a researcher makes could illuminate exactly when and how ideology enters the process. This would require an experimental setting where every step of data analysis is recorded.
Anticipating that their conclusions might be scrutinized, Breznau emphasized that the authors employed exhaustive checks to ensure they were not guilty of the very selection bias they were investigating.
“There are some politically motivated responses to our work,” he said. “This is dangerous. We have gone to great lengths to ensure the robustness of our findings. We, being aware that ideology could bias any researcher including ourselves, have modeled nearly every possible alternative model specification. Therefore, we have proven beyond a high bar that we have not selected (p-hacked) our results from many other plausible models.”
“I personally am a strong supporter of open science and metascience,” Breznau added. “I consider myself a member of the Open Science Movement. I have learned much from psychology in this regard. I will continue to support open, transparent, inclusive and reproducible scientific methods.”
The study, “Ideological bias in the production of research findings,” was authored by George J. Borjas and Nate Breznau.