Trust in AI Search: What Drives Confidence?

First large-scale experiment measuring how design influences human trust in GenAI search results

This large-scale study ran ~12,000 search queries across seven countries to identify what makes humans trust AI-generated search results.

Key findings:

  • Adding citations to external sources increased trust by 11.3%, while adding confidence indicators boosted trust by 5.5%
  • Design elements influenced trust more than the actual factual accuracy of the information
  • Users with lower digital literacy were more likely to trust potentially inaccurate AI-generated content
  • Trust varied significantly across different search domains and countries

These insights matter for security professionals because they show how design choices in AI systems can create misinformation vulnerabilities and influence user decision-making at scale.

Human Trust in AI Search: A Large-Scale Experiment