The news: A CBS investigation discovered hundreds of deepfake ads on Meta platforms promoting “nudify” apps that create sexually explicit content based on images of real people.
- The ads appeared in Instagram’s Stories feature and promoted AI tools claiming to let users “upload a photo” and “see anyone naked,” per the investigation. Other Stories ads showed users how to manipulate videos of real people to create nude deepfakes.
- An analysis of Meta’s ad library found at least hundreds of such ads across Facebook, Instagram, Threads, Facebook Messenger, and Meta Audience Network.
- A Meta spokesperson said the company is working to strengthen its measures against deepfake ads as the people promoting them use increasingly sophisticated tactics to evade detection. Even after Meta removed the flagged ads, however, ads for deepfake tools remained live on Instagram.
The deepfake scare: The rise of sophisticated AI technologies like deepfakes is a major concern among consumers, and it requires brands to proactively manage where and how their content appears to avoid unintended associations.