ChatGPT retrieves far more webpages than it cites. A new AirOps analysis found that 85% of discovered sources never appear in the final answer.

Why we care. If you want your content cited in AI-generated answers, discovery isn’t enough. Most retrieved pages never become visible to users.

Key finding. In AI answers, retrieval doesn’t equal citation. Your page can rank and be retrieved yet still lose the citation to a source that better matches the prompt or supporting context.

  • Per the report, this shifts optimization toward earning selection inside the AI synthesis process, not just appearing in search results.

By the numbers:

  • 82,108 citations appeared in final responses.
  • Only 15% of retrieved pages were cited.
  • 85% of pages surfaced during research never appeared in answers.
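The headline split follows directly from the dataset totals reported elsewhere in the piece (548,534 retrieved pages, 82,108 citations). A quick arithmetic sketch, assuming each citation maps to a distinct retrieved page (the report may count repeat citations differently):

```python
# Sanity-check the reported citation rate from the study's totals.
retrieved_pages = 548_534  # pages ChatGPT surfaced during research
cited = 82_108             # citations appearing in final responses

citation_rate = cited / retrieved_pages
print(f"{citation_rate:.1%} cited, {1 - citation_rate:.1%} never cited")
# → 15.0% cited, 85.0% never cited
```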

Citation rates also varied by query type:

  • 18.3% for product discovery queries
  • 16.9% for how-to queries
  • 11.3% for validation searches

Fan-out queries. ChatGPT often expands prompts with additional internal searches while generating an answer, creating what the report calls a “second citation surface.” Across the dataset:

  • 89.6% of prompts triggered two or more follow-up searches.
  • Fan-out searches expanded 15,000 prompts into 43,233 queries.
  • 32.9% of cited pages appeared only in fan-out results—not the original prompt.
  • 95% of fan-out queries had zero traditional search volume.
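The fan-out totals imply an average expansion factor per prompt. A rough back-of-the-envelope, assuming the 43,233 figure counts original prompts plus their follow-up searches:

```python
# Average number of searches ChatGPT ran per prompt, per the report's totals.
prompts = 15_000
total_queries = 43_233  # original prompts plus fan-out searches

avg_per_prompt = total_queries / prompts
print(f"~{avg_per_prompt:.2f} searches per prompt on average")
# → ~2.88 searches per prompt on average
```

That average is consistent with the finding that 89.6% of prompts triggered two or more follow-up searches.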

Google ranking correlation. High Google rankings strongly correlated with citations:

  • 55.8% of cited pages ranked in Google’s top 20.
  • Pages ranking in position 1 were cited 3.5 times more often than pages outside the top 20.

About the data. AirOps analyzed 548,534 pages retrieved across 15,000 prompts to examine how ChatGPT expands queries and selects citations.

The study. The Influence of Retrieval, Fan-out, and Google SERPs on ChatGPT Citations


Search Engine Land is owned by Semrush. We remain committed to providing high-quality coverage of marketing topics. Unless otherwise noted, this page’s content was written by either an employee or a paid contractor of Semrush Inc.


Danny Goodwin

Danny Goodwin is Editorial Director of Search Engine Land & Search Marketing Expo – SMX. He joined Search Engine Land in 2022 as Senior Editor. In addition to reporting on the latest search marketing news, he manages Search Engine Land’s SME (Subject Matter Expert) program. He also helps program U.S. SMX events.

Goodwin has been editing and writing about the latest developments and trends in search and digital marketing since 2007. He previously was Executive Editor of Search Engine Journal (from 2017 to 2022), managing editor of Momentology (from 2014 to 2016) and editor of Search Engine Watch (from 2007 to 2014). He has spoken at many major search conferences and virtual events, and has been sourced for his expertise by a wide range of publications and podcasts.