OpenAI defends ChatGPT amid lawsuits over mental health harms

The news: OpenAI has rejected legal claims that ChatGPT is at fault for a teenager’s recent suicide.

In August, the parents of a 16-year-old boy sued OpenAI and its CEO Sam Altman after their son died by suicide, alleging ChatGPT helped him explore suicide methods. The teen expressed suicidal thoughts to the chatbot, and according to the lawsuit, chat logs show ChatGPT neither ended the session nor initiated emergency protocols, and that its safeguards could be easily bypassed by simple workarounds like role-playing as a fictional character.

OpenAI’s response: In a recent court filing, OpenAI alleged that the teen’s “misuse” of ChatGPT contributed to his harm.

The company noted that chatbot users under age 18 need parental consent, and that users are prohibited from using ChatGPT for suicide or “self-harm,” and aren’t allowed to bypass ChatGPT’s protective measures, per NBC News. ChatGPT’s terms of use also state that users should not rely on outputs “as a sole source of truth or factual information, or as a substitute for professional advice.” In a November 25 blog post, OpenAI expressed sympathy for the family, while noting that the “complaint included selective portions of his [the teenage boy’s] chats that require more context.”

OpenAI pointed to new safeguards it has put in place for ChatGPT, including connections to emergency services, restrictions for users under 18, and parental controls that let parents link their accounts to their teens’ and receive notifications if the system detects a teen is in “acute distress.”

OpenAI is also facing a wave of new lawsuits alleging that ChatGPT contributed to suicides or caused psychological harm. OpenAI’s own analysis shows that 0.15% of users have conversations that include “explicit indicators of potential suicidal planning or intent” in a given week. Applied to ChatGPT’s 800 million weekly users, that amounts to roughly 1.2 million people potentially expressing suicidal thinking on ChatGPT each week, per Wired.
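As a back-of-envelope check of the figure cited from Wired, the 1.2 million number follows directly from OpenAI’s stated 0.15% weekly share and the 800 million weekly user base (the variable names below are illustrative, not from either source):

```python
# Back-of-envelope check: 0.15% of ~800 million weekly ChatGPT users.
weekly_users = 800_000_000
share_with_indicators = 0.0015  # 0.15%, per OpenAI's analysis

affected_users = weekly_users * share_with_indicators
print(f"{affected_users:,.0f}")  # prints 1,200,000
```
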

Why it matters: States are tightening rules on AI usage in mental healthcare, but general-purpose chatbots like ChatGPT remain far tougher to regulate.

  • Earlier this year, Illinois and Nevada enacted laws prohibiting anyone (such as a tech/AI company) from using AI to deliver direct therapy or make therapeutic decisions.
  • Other states have passed laws limiting the use of AI in mental health services without going as far as an outright ban.
  • A range of health tech companies, including Lyra Health, Slingshot AI, and Character AI, offer some form of AI therapy services.

Implications for AI companies: Regardless of how the lawsuits against OpenAI end, scrutiny of AI tools used for emotional and therapeutic support will only intensify. Both general-purpose platforms and specialized healthcare AI tools should proactively impose age restrictions, automatically end sessions at the first sign of emotional distress, and clearly direct users to mental health resources when appropriate.

This content is part of EMARKETER’s subscription Briefings, where we pair daily updates with data and analysis from forecasts and research reports.