The data: More than half (55%) of US consumers turn to ChatGPT to better understand a diagnosis after a doctor’s appointment, according to a February Tebra survey of 803 Americans, highlighting the general-purpose AI tool’s role in health queries.
Why it matters: As the most widely used AI chatbot, ChatGPT is gaining traction for personal health information and advice, though trust is still evolving.
Still, overall trust is nascent. While 45% of consumers say they trust ChatGPT at least somewhat for health advice, just 2% trust it completely and only 21% mostly trust it. Asked which they trust more, 74% favor their doctor's advice over ChatGPT's.
Implications for AI platforms: Consumers are becoming more comfortable sharing personal health data with general AI chatbots, even though those platforms fall outside privacy regulations such as HIPAA.
Despite the rise of specialized tools within healthcare systems like Hartford Health’s PatientGPT and from health tech platforms like Amazon One Medical, people are unlikely to stop using general chatbots for health questions.
As trust in AI outputs grows and their reliability improves, platforms can safely assume that people will keep uploading personal questions, lab results, and sensitive medical information. To avoid patient harm—and potential legal risk if adverse outcomes emerge—general-purpose AI tools will need stronger guardrails, testing, and escalation paths around health use, not just disclaimers saying they’re not substitutes for professional advice.
This content is part of EMARKETER’s subscription Briefings, where we pair daily updates with data and analysis from forecasts and research reports.