OpenAI walks back feature that shared private user chats in Google Search

The news: OpenAI abruptly discontinued a ChatGPT “share” feature after its opt-in functionality surfaced thousands of unintended private chats in Google Search results, drawing widespread criticism.

If a user checked a box to “make this chat discoverable”—sometimes accidentally, or without fully understanding the warning—Google and other search engines could “see” these chat links and add them to public search results. The issue could have affected users of both free and paid ChatGPT accounts, including those with Plus, Pro, and Team (business) subscriptions.

“Ultimately this feature introduced too many opportunities for folks to accidentally share things they didn’t intend to, so we’re removing the option [and] working to remove indexed content from the relevant search engines,” OpenAI CISO Dane Stuckey posted on X.

Why this matters: Google indexed around 100,000 private ChatGPT chats containing personal information, health questions, and professional content, per 404 Media.

This marks the third major user privacy leak in recent years: Google Bard conversations appeared in search results in 2023, and Meta AI users inadvertently posted private chats to public feeds, per the BBC.

Data privacy and security are the top concerns of 73% of C-level executives worldwide, per BearingPoint, and they’re right to be worried. Although sharing was opt-in, the leak could have affected businesses whose employees share ChatGPT accounts internally.

Actionable insights for businesses: For marketing decision-makers, this pattern demands urgent vendor due diligence. If consumer AI products fail basic privacy controls, corporate applications handling sensitive business data face exponentially higher risks.

Where to start: 

  • Externally: Demand transparency from AI vendors. Clarify data governance, retention policies, and third-party access controls. 
  • Internally: Strengthen AI governance by auditing tools for privacy risks and restricting sensitive data sharing. Ensure all teams are fully trained on AI use.

Yes, but: The frenetic pace of AI innovation and the glut of new tools like chatbots, AI agents, and APIs make locking down AI’s security difficult. 

Our take: The AI industry’s “move fast and break things” ethos is clashing with the non-negotiable demands of data protection. For marketers reliant on AI for strategic planning and analysis, security and data privacy are paramount. Companies demonstrating a strong security focus could stand out from competitors.
