In an era of deep polarization over everything from public health measures to climate policy, a new study offers a compelling explanation for why people struggle to change their minds—even when they turn to search engines seeking information.

Published in the Proceedings of the National Academy of Sciences, research from Tulane University demonstrates that the very act of searching online may inadvertently reinforce existing beliefs rather than challenge them. The phenomenon, which the researchers call the “narrow search effect,” occurs because people tend to phrase their queries in ways that align with what they already believe—and search algorithms, designed to maximize relevance, dutifully serve up results that confirm those preconceptions.

“When people look up information online—whether on Google, ChatGPT or new AI-powered search engines—they often pick search terms that reflect what they already believe (sometimes without even realizing it).” — Eugina Leung, Tulane University

The implications extend far beyond individual internet habits. As Americans find themselves increasingly divided not merely over policy choices but over fundamental perceptions of factual reality, the study suggests that search engines—once heralded as democratizing tools for accessing information—may actually be contributing to belief polarization across political, health, economic, and environmental domains.

Lead author Eugina Leung, an assistant professor at Tulane’s A. B. Freeman School of Business, and co-author Oleg Urminsky of the University of Chicago’s Booth School of Business conducted 21 experiments involving nearly 10,000 participants. They tested how people search for information on topics ranging from caffeine’s health effects to nuclear energy, crime rates, cryptocurrency, and COVID-19—using platforms including Google, ChatGPT, and AI-powered Bing.

The results were consistent across domains and platforms. Someone who believes caffeine is healthy tends to search “benefits of caffeine,” while a skeptic might type “caffeine health risks.” These subtle differences in query phrasing steered users toward vastly different search results, ultimately reinforcing their original positions.

Critically, this effect persisted even when participants had no conscious intention of confirming their biases. In several studies, fewer than ten percent of participants reported deliberately crafting searches to validate their existing beliefs, yet their queries still aligned closely with those beliefs.

The study builds on decades of psychological research documenting confirmation bias—the human tendency to favor information that reaffirms one’s existing views. Classic studies have shown that people tend to formulate questions designed to elicit affirming responses and pay more attention to evidence that aligns with their beliefs while dismissing contradictory information.

What makes the current research particularly significant is its documentation of how modern search technology amplifies this ancient cognitive tendency. Previous research on filter bubbles and echo chambers has focused primarily on how algorithms target users with personalized content. This study demonstrates that echo chambers persist even when search providers do not differentially target users—simply because of how people phrase their queries.

“Because today’s search algorithms are designed to give you ‘the most relevant’ answers for whatever term you type, those answers can then reinforce what you thought in the first place. This makes it harder for people to discover broader perspectives.” — Eugina Leung

The researchers found that user-focused interventions—such as prompting people to conduct additional searches or consider alternative perspectives—had limited effectiveness in countering the narrow search effect. What did work was changing the algorithm itself.

When search tools were programmed to return a broader range of results—regardless of how narrowly the query was phrased—participants were more likely to update their beliefs. In experiments using custom-designed search interfaces, people who saw balanced results about caffeine’s health effects developed more moderate views and showed greater openness to behavioral change. Notably, participants rated the broader results as no less useful and relevant than narrowly tailored ones.
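The intervention described above can be pictured as a query-broadening layer sitting in front of an ordinary retrieval backend: rather than answering only the user’s one-sided query, the system also retrieves results for counter-framed variants and interleaves them. The sketch below is purely illustrative—it is not the researchers’ implementation, and the `fetch_results` backend, the flip templates, and the tiny corpus are all stand-in assumptions.

```python
# Illustrative sketch of a "broad search" layer (not the study's actual system).

def fetch_results(query):
    """Hypothetical retrieval backend: return ranked result titles for a query."""
    corpus = {
        "benefits of caffeine": ["Caffeine boosts alertness", "Coffee linked to longevity"],
        "caffeine health risks": ["Caffeine and anxiety", "Sleep disruption from coffee"],
    }
    return corpus.get(query, [])

def broaden(query):
    """Generate counter-framed variants of a one-sided query (toy templates)."""
    flips = {
        "benefits of caffeine": "caffeine health risks",
        "caffeine health risks": "benefits of caffeine",
    }
    return [query] + ([flips[query]] if query in flips else [])

def broad_search(query, per_query=2):
    """Interleave results from the original query and its counter-framed variants."""
    pools = [fetch_results(q)[:per_query] for q in broaden(query)]
    merged = []
    for group in zip(*pools):  # round-robin across perspectives
        merged.extend(group)
    return merged

print(broad_search("benefits of caffeine"))
```

Run with a one-sided query such as `"benefits of caffeine"`, the merged list alternates between confirming and disconfirming results, which is the behavioral property the study’s custom interfaces tested.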

The findings suggest a promising path forward for technology design. The researchers found that 84 percent of survey participants expressed interest in using a “Search Broadly” button—a feature that would intentionally deliver diverse perspectives on a topic, functioning as the conceptual opposite of Google’s “I’m Feeling Lucky” button.

“Because AI and large-scale search are embedded in our daily lives, integrating a broader-search approach could reduce echo chambers for millions (if not billions) of users.” — Eugina Leung

As artificial intelligence increasingly mediates how people access information, the stakes continue to rise. The study’s authors emphasize that when information technologies optimize not only for relevance but also for breadth, users encounter more comprehensive information, fostering better-informed beliefs and decisions. By creating environments with more shared factual understanding, broader search algorithms could play a significant role in mitigating belief polarization and contributing to a more cohesive society.

Endnotes

1. Leung, E., & Urminsky, O. (2025). The narrow search effect and how broadening search promotes belief updating. Proceedings of the National Academy of Sciences, 122(13), e2408175122. https://doi.org/10.1073/pnas.2408175122

2. Tulane University. (2025, May 27). The silent force behind online echo chambers? Your Google search [Press release]. EurekAlert! https://www.eurekalert.org/news-releases/1085358

3. Pariser, E. (2011). The Filter Bubble: What the Internet Is Hiding from You. Penguin Books.

4. Lord, C. G., Ross, L., & Lepper, M. R. (1979). Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence. Journal of Personality and Social Psychology, 37(11), 2098–2109.

5. Smith, E. K., Bognar, M. J., & Mayer, A. P. (2024). Polarisation of climate and environmental attitudes in the United States, 1973–2022. npj Climate Action, 3, 1–14.
