Consumers turn to ChatGPT for health advice, but trust remains nascent

The data: More than half (55%) of US consumers turn to ChatGPT to better understand a diagnosis after a doctor’s appointment, according to a February Tebra survey of 803 Americans, highlighting the general-purpose AI tool’s role in health queries.

  • 41% seek a second opinion based on ChatGPT’s advice.
  • 20% say they follow ChatGPT’s advice over their own doctor’s recommendations.

Why it matters: As the most widely used AI chatbot, ChatGPT is gaining traction for personal health information and advice, though trust is still evolving.

  • More than one-third (36%) say they would be more comfortable asking ChatGPT than their doctor about sexual health questions.
  • 32% would prefer ChatGPT over their doctor to explain confusing test results.
  • 29% are more comfortable talking about their mental health with ChatGPT than with their doctor.

Still, overall trust is nascent. While 45% of consumers say they trust ChatGPT at least somewhat for health advice, just 2% completely trust it, and 21% mostly trust it. When asked which they trust more, 74% say they trust their doctor’s advice over ChatGPT’s.

Implications for AI platforms: Consumers are becoming more comfortable sharing personal health data with general AI chatbots, even though those platforms fall outside privacy regulations such as HIPAA.

  • 17% of consumers have uploaded lab results or bloodwork to ChatGPT, while 11% have uploaded prescription information, medical histories, or X-ray and skin images, per Tebra.

Despite the rise of specialized tools from healthcare systems, like Hartford HealthCare’s PatientGPT, and from health tech platforms, like Amazon One Medical, consumers are unlikely to stop using general-purpose chatbots for health questions.

As trust and reliability of AI outputs grow, platforms can safely assume that people will keep uploading personal questions, lab results, and sensitive medical information. To avoid patient harm—and potential legal risk if adverse outcomes emerge—general-purpose AI tools will need stronger guardrails, testing, and escalation paths around health use, not just disclaimers saying they’re not substitutes for professional advice.

This content is part of EMARKETER’s subscription Briefings, where daily updates are paired with data and analysis from forecasts and research reports.
