Character AI cuts off chatbot access for minors amid safety push

The news: Amid pressure to establish better child safety guardrails in the AI industry, Character AI will block users under 18 from chatting with bots on its platform starting November 25.

  • For now, Character AI will evaluate users’ age based on the type of character they choose to chat with and impose a two-hour time limit for those flagged as under 18.
  • After the ban, minors will still be able to generate photos and images—with safety limits in place—and review prior chats.

Why it matters: Rather than limiting engagement time or restricting what users can see—methods used by digital platforms like YouTube and Roblox—Character AI is taking a blanket approach. This change could also alter ad targeting on the platform.

Character AI’s decision could affect advertisers’ ability to reach Gen Alpha, a demographic with high potential for spending and engagement.

“We’re making a very bold step to say for teen users, chatbots are not the way for entertainment, but there are much better ways to serve them,” Character AI CEO Karandeep Anand said, per The New York Times.

Zooming out: The change may protect users' well-being, but it also shields Character AI's brand reputation and gets ahead of impending regulatory demands.

  • Regulators are tightening child safety standards, and Character AI is currently facing a lawsuit over a young user who died by suicide after engaging heavily with the platform.
  • California Gov. Gavin Newsom signed a bill this month requiring safety guardrails on AI chatbots; the law takes effect in January 2026.

What it means for advertisers: As regulatory and safety concerns rise, advertisers face greater scrutiny when looking to reach younger audiences. CMOs should assess brand alignment with AI platforms and, on platforms that could pose safety risks to children, shift targeting efforts toward adult users.
