New mental health safeguards for ChatGPT are working, OpenAI says

The news: On Monday, OpenAI detailed results from ChatGPT’s new mental health safety measures, alongside an internal analysis showing that potentially millions of users’ conversations indicate emotional reliance on the chatbot.

Digging into the data: OpenAI’s analysis estimates 0.15% of users have conversations with ChatGPT that indicate heightened emotional attachment, while another 0.15% of users have conversations that include “explicit indicators of potential suicidal planning or intent” in a given week.

  • An estimated 0.07% of users show possible signs of mental health issues related to psychosis or mania, per OpenAI, though the company cautioned that such conversations are rare and difficult to detect and measure.
  • More than 800 million people use ChatGPT every week, per Wired. By Wired’s tracking, about 2.4 million people are possibly expressing suicidal thinking on ChatGPT or prioritizing the chatbot over real-life loved ones, school, or work.
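A quick back-of-the-envelope check, assuming Wired’s 800 million weekly-user figure and OpenAI’s reported 0.15% shares, shows how the roughly 2.4 million estimate follows (a minimal Python sketch, not OpenAI’s methodology):

  # Rough check of Wired's ~2.4 million figure, using the shares cited above.
  weekly_users = 800_000_000            # ChatGPT weekly users, per Wired
  share_emotional_attachment = 0.0015   # 0.15% show heightened emotional attachment
  share_suicidal_indicators = 0.0015    # 0.15% show explicit indicators of suicidal planning or intent

  emotional = weekly_users * share_emotional_attachment  # ~1.2 million
  suicidal = weekly_users * share_suicidal_indicators    # ~1.2 million
  print(f"Combined: ~{emotional + suicidal:,.0f} users per week")  # ~2,400,000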

Zooming out: OpenAI previously announced new safeguards for people showing signs of mental health crises, and the results posted this week show improved responses and fewer “undesired” answers from ChatGPT-5. The responses were evaluated by more than 170 participating psychiatrists, psychologists, and primary care physicians.

  • Across mental health conversations, ChatGPT-5 reduced undesired answers by 39% to 52% compared with GPT-4o, per the experts’ assessment.
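To make that relative-reduction range concrete, here’s a minimal sketch with a hypothetical baseline (OpenAI did not publish the raw evaluation counts here):

  # Hypothetical illustration of a 39%-52% relative reduction in undesired answers.
  gpt4o_undesired = 100  # assume GPT-4o produced 100 undesired answers on a test set
  for reduction in (0.39, 0.52):
      gpt5_undesired = gpt4o_undesired * (1 - reduction)
      print(f"A {reduction:.0%} reduction leaves ~{gpt5_undesired:.0f} undesired answers per 100")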

Why it matters: With ongoing shortages of mental healthcare providers and rising healthcare costs, budget-conscious consumers are turning to AI chatbots for informal therapy.

  • Nearly 1 in 4 (23.4%) US adults experienced a mental illness in the past year, per a September report from Mental Health America.
  • Almost half (49%) of large language model (LLM) users who self-reported a mental health condition use chatbots including ChatGPT, Claude, and Gemini for mental health support, per a February 2025 Sentio University survey of 499 US adults with ongoing mental health conditions who have used LLMs.
  • Among those same respondents, accessibility (cited by 90%) and affordability (cited by 70%) were the leading reasons for using chatbots for mental health support.

Our take: Retroactive mental health and wellness safeguards are necessary in AI chatbots, and OpenAI’s new safety updates show progress. But they’re only one part of the ecosystem. Healthcare and tech companies also need to adopt more monitoring and clinician oversight for people already using these tools. Marketers should educate parents of teens and young adults about safe AI use, and emphasize best practices like clinician collaboration, backup safety measures, and transparent data policies.

