Frances A. Pogacar, Amira Ghenai, et al.

The Positive and Negative Influence of Search Results on People’s Decisions about the Efficacy of Medical Treatments


People regularly use web search engines to investigate the efficacy of medical treatments. Search results can contain documents presenting incorrect information that contradicts the current established medical understanding of whether a treatment helps with a health issue. If people are influenced by the incorrect information found in search results, they can make harmful decisions about appropriate treatment. To determine the extent to which people can be influenced by search engine results, we conducted a controlled laboratory study that biased search results towards correct or incorrect information for 10 different medical treatments.

We found that search engine results can significantly influence people both positively and negatively. Importantly, study participants made more incorrect decisions when they interacted with search results biased towards incorrect information than when they had no interaction with search results at all. For search domains such as health information, search engine designers and researchers must recognize that not all non-relevant information is the same. Some non-relevant information is incorrect and potentially harmful when people use it to make decisions that may negatively impact their lives.

The Study

Our experiment had two independent variables, each with two levels. The first independent variable was the search results bias, with levels correct and incorrect. The second independent variable was the rank of the topmost correct search result, with levels 1 and 3, indicating the position of the first correct result. The experiment also had a control condition, in which no search results were presented to the participant. The two independent variables with two levels each, plus the control, produced five experimental conditions. Participants had to determine the efficacy of medical treatments, and a treatment could be either helpful or unhelpful. So that each of the five experimental conditions would be measured on both helpful and unhelpful treatments, we selected five of each for a total of ten treatments. The experiment had two dependent variables: 1) the fraction of correct decisions and 2) the fraction of harmful decisions made by the participant. In addition, we collected data from a questionnaire and feedback on each decision made. We also logged computer interactions for the entire study.
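The design above can be sketched in code. This is purely illustrative (the variable names and the decision encoding are ours, not the study's actual materials), and it assumes that a "harmful" decision is one whose answer actively contradicts the truth, rather than one that is merely undecided:

```python
from itertools import product

# 2x2 factorial design (bias x topmost correct rank) plus a control condition.
BIASES = ["correct", "incorrect"]
TOP_CORRECT_RANKS = [1, 3]

conditions = [{"bias": b, "top_rank": r} for b, r in product(BIASES, TOP_CORRECT_RANKS)]
conditions.append({"bias": None, "top_rank": None})  # control: no search results shown
assert len(conditions) == 5  # five experimental conditions in total


def decision_fractions(decisions):
    """Compute the two dependent variables from one participant's decisions.

    Each decision records the participant's answer and the ground truth
    ("helpful" or "unhelpful"). Here a *harmful* decision is taken to be
    one whose definite answer is the opposite of the truth.
    """
    n = len(decisions)
    correct = sum(d["answer"] == d["truth"] for d in decisions) / n
    harmful = sum(
        d["answer"] in ("helpful", "unhelpful") and d["answer"] != d["truth"]
        for d in decisions
    ) / n
    return correct, harmful
```

With ten treatments per participant, `decision_fractions` would be applied to each participant's ten decisions to obtain the two per-participant measures.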

When people search for health information online, as 72% of U.S. internet users do, the majority are seeking information about a health issue or medical treatment [7]. While the majority of U.S. internet users are confident searchers who believe they are finding accurate information [12], it is likely that there are many like Wei Zexi, the Chinese student whose death followed an ineffective cancer treatment found through a search engine [10], who have used a search engine for health information and ended up making incorrect decisions that either wasted their money or negatively impacted their health. Indeed, White and Hassan [16] have shown that search engines can be biased towards incorrectly indicating that medical treatments help when they do not, and that these errors may be amplified by people’s bias towards positive information [14]. If people find and believe incorrect information regarding medical treatments, there is the potential for these people to be harmed. To measure the actual effect of search bias on people’s ability to correctly determine the efficacy of medical treatments, we conducted a controlled laboratory study with 60 participants. In our study, we biased search results towards being correct or towards being incorrect. We also controlled the topmost rank of a correct result to investigate the effect of rank.

Our study’s participants had to determine the efficacy of ten different medical treatments. We asked participants to pretend that they had a question about the effectiveness of a medical treatment and that they had decided to use a search engine to help them answer the question. For each of the ten treatments, we either presented the participants with a search results page or a control condition where they had to directly answer the question without any search results at all.

We found that:

Search results have a statistically significant, strong effect on people’s ability to make correct decisions. Results biased towards incorrect information reduced people’s accuracy from 43% to 23%. Results biased towards correct information increased accuracy from 43% to 65%.

The topmost rank of a correct result appears to have some effect on people’s accuracy. While the difference was not statistically significant, when shown results biased towards correct information, participants’ accuracy was only 59% when the top two results were incorrect, compared to 70% when the result at rank 1 was correct.

Knowledge of a medical treatment can perhaps inoculate people against incorrect information. We found that greater self-reported knowledge reduced the effect of incorrect information on accuracy (p = 0.04).
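The paper's references suggest the significance testing was done in R with mixed-effects models [2, 13]. As a much simpler, purely illustrative screen of the kind of accuracy gap reported above (not the study's actual analysis), a pooled two-proportion z-test can be computed by hand; the per-condition decision counts below are an assumption (60 participants × 10 treatments split evenly across five conditions):

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """Pooled two-proportion z-statistic for H0: p1 == p2."""
    x1, x2 = p1 * n1, p2 * n2            # implied success counts
    p_pool = (x1 + x2) / (n1 + n2)       # pooled proportion under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Assumed: 600 total decisions split evenly, i.e. 120 per condition.
# Control accuracy 43% vs. incorrect-bias accuracy 23%.
z = two_proportion_z(0.43, 120, 0.23, 120)
```

Even under these rough assumptions the gap is well past the conventional 1.96 threshold, consistent with the paper's report of a statistically significant effect; the mixed-effects analysis additionally accounts for repeated measures per participant and per treatment, which this sketch ignores.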

Like White and Hassan [16], we found that participants were biased towards saying treatments were helpful. In addition, we collected information about search behaviour via a questionnaire and report on participants’ confidence in their answers and their click behaviour. Our results demonstrate that search engines have great potential to both help and harm people. Indeed, when searchers decide that ineffective treatments will help them, they open themselves up, at best, to being swindled out of money and, at worst, to being harmed by these ineffective treatments, either directly or through the lack of proper treatment.


When people use search engines to answer health questions, their interaction with the system has the potential for both positive and negative outcomes. When people find medical treatments or information that will prolong or improve their life, or that of a loved-one, search engines demonstrate an ability to make people’s lives better. When search engines intermix correct and incorrect information, we have shown that there is the potential for harm. In this paper, we showed that search results can significantly affect people’s decisions about the efficacy of medical treatments. Compared to not using a search engine, when people interacted with search results biased toward incorrect information, their accuracy dropped from 43% to 23%.

Thankfully, when people interacted with search results biased towards correct information, their accuracy climbed to 65% (Table 3). There have long been people who prey on the hopes of others seeking cures for terrible diseases, and now their webpages can become intermingled with those of reputable medical organizations. For example, a search for Hoxsey Therapy, an ineffective cancer treatment [3], on today’s popular web search engines returns a mix of results that either explain that it is ineffective or explain how it can help a patient with cancer. We found that people are biased towards wanting treatments to be helpful, and this bias, combined with incorrect information, has the potential to cause people harm. The implications of these results extend beyond health search. Information retrieval researchers typically use curated collections.

These curated collections contain high quality and trustworthy documents. On the open web, we already know that there is spam, and we actively filter it out of web results. We now can see that web search needs more than spam filtering. Web search also needs a form of automated curation to be available to searchers so that they can have confidence in the quality of the information being provided to them. It is not enough to rely on searchers’ own media literacy to protect them from incorrect information. Likewise, information retrieval evaluation needs to expand its understanding of the effects of documents beyond graded relevance.

Non-relevant does not always mean innocuous. A document that leads a searcher to form a harmful belief about a medical treatment is damaging. In today’s effectiveness measures, a non-relevant document only costs the searcher time or effort and is represented as having zero gain. An incorrect document, by contrast, can increase the likelihood of a searcher forming a harmful belief and undo the value of relevant documents; i.e., an incorrect document could be perceived as having a negative gain, which, to our knowledge, is a new concept in information retrieval.
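To make the negative-gain idea concrete, here is a minimal sketch (our own illustration, not an established measure, with an assumed gain scheme) of a DCG-style computation in which an incorrect document contributes negative gain instead of the conventional zero:

```python
import math

def dcg(gains):
    """DCG over per-rank gains; gains may be negative for incorrect documents."""
    return sum(g / math.log2(rank + 1) for rank, g in enumerate(gains, start=1))

# Assumed gain scheme: +1 relevant and correct, 0 merely non-relevant,
# -1 incorrect and potentially harmful.
conventional = [1, 0, 0, 1, 0]   # incorrect doc at rank 2 scored as ordinary non-relevant
with_penalty = [1, -1, 0, 1, 0]  # same ranking, but the incorrect doc now carries -1

# Under zero gain the incorrect document is invisible to the measure;
# under negative gain it partly undoes the value of the relevant documents.
```

Under the conventional scheme the two gain vectors score identically, so an evaluation cannot distinguish a ranking containing harmful misinformation from one that merely wastes a rank; under the negative-gain scheme the second vector scores strictly lower.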

[1] Alyssa Abkowitz. 2016. China Issues New Internet Search Rules Following Baidu Probe; Regulator mandates ’objective, fair and authoritative results’. Wall Street Journal (Online) (Jun 26 2016).

[2] Douglas Bates, Martin Mächler, Ben Bolker, and Steve Walker. 2015. Fitting Linear Mixed-Effects Models Using lme4. Journal of Statistical Software 67, 1 (2015), 1–48.

[3] Barrie R. Cassileth and Helene Brown. 1988. Unorthodox cancer medicine. CA: A Cancer Journal for Clinicians 38, 3 (1988), 176–186.

[4] A. Cipriani, T. A. Furukawa, and C. Barbui. 2011. What is a Cochrane review? Epidemiology and Psychiatric Sciences 20, 3 (2011), 231–233.

[5] J. P. T. Higgins (Ed.). 2008. Cochrane Handbook for Systematic Reviews of Interventions. Vol. 5. The Cochrane Collaboration.

[6] Robert Epstein and Ronald E Robertson. 2015. The search engine manipulation effect (SEME) and its possible impact on the outcomes of elections. Proceedings of the National Academy of Sciences 112, 33 (2015), E4512–E4521.

[7] Susannah Fox and Maeve Duggan. 2013. Health Online 2013. Pew Research Center. (2013).

[8] Yvonne Kammerer, Ivar Bråten, Peter Gerjets, and Helge I. Strømsø. 2013. The role of Internet-specific epistemic beliefs in laypersons’ source evaluations and decisions during Web search on a medical issue. Computers in Human Behavior 29, 3 (2013), 1193–1203.

[9] Juhi Kulshrestha, Motahhare Eslami, Johnnatan Messias, Muhammad Bilal Zafar, Saptarshi Ghosh, Krishna P. Gummadi, and Karrie Karahalios. 2017. Quantifying Search Bias: Investigating Sources of Bias for Political Searches in Social Media. In Proc. of CSCW.

[10] Yadan Ouyang. 2016. Student’s death highlights gaps in China’s health regulations. Lancet Oncology 17, 6 (2016), 709.

[11] Bing Pan, Helene Hembrooke, Thorsten Joachims, Lori Lorigo, Geri Gay, and Laura Granka. 2007. In Google We Trust: Users’ Decisions on Rank, Position, and Relevance. Journal of Computer-Mediated Communication 12, 3 (2007), 801–823.

[12] Kristen Purcell, Joanna Brenner, and Lee Rainie. 2012. Search Engine Use 2012. Pew Research Center. (2012).

[13] R Core Team. 2014. R: A Language and Environment for Statistical Computing. R Foundation for Statistical Computing, Vienna, Austria. http://www.R-project.org/

[14] Ryen White. 2013. Beliefs and biases in web search. In SIGIR. ACM, 3–12.

[15] Ryen W White. 2014. Belief dynamics in Web search. Journal of the Association for Information Science and Technology 65, 11 (2014), 2165–2178.

[16] Ryen W White and Ahmed Hassan. 2014. Content bias in online health search. ACM Transactions on the Web (TWEB) 8, 4 (2014), 25.

[17] Ryen W White and Eric Horvitz. 2015. Belief dynamics and biases in web search.
