Apple’s child exploitation dilemma isn’t going away anytime soon

The news: A global coalition of more than 90 policy and civil rights organizations is calling on Apple CEO Tim Cook to put the brakes on Apple’s child sexual abuse material (CSAM) scanning tool. The groups—which include the ACLU, the Electronic Frontier Foundation, and the Center for Democracy and Technology—urged the CEO to “abandon” the company’s plans, citing fears that the tools could be used to “censor protected speech, [and] threaten the privacy and security of people around the world.”

The letter also claims that algorithms designed to detect sexually explicit material are “notoriously unreliable,” often mistakenly flagging art, educational resources, and other imagery as sexual content.

The clock is ticking: The tools are part of the iOS 15 update, which is expected to roll out next month.

How we got here: Apple’s CSAM prevention efforts drew swift and fervent criticism from privacy advocates almost as soon as they were announced. Critics worry the features could weaken Apple’s encryption or be used by governments to quash political dissent.

  • In response, Apple released multiple documents detailing safeguards it claims it’s putting in place.
  • Apple is also reportedly facing growing criticism from its own employees.
  • Meanwhile, last week, a group of researchers claimed they had reverse-engineered an early version of the CSAM filter code and were able to trick it into mislabeling images. Apple has disputed the findings, per Vice.
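
For readers curious about what “tricking” an image-matching system looks like, the toy sketch below illustrates the general idea with a simple average hash. It is an illustration only, not Apple’s NeuralHash or the researchers’ actual attack: it crafts a second image that reproduces every hash bit of an original despite looking nothing like it.

```python
import numpy as np

def average_hash(img: np.ndarray) -> np.ndarray:
    """Toy 64-bit perceptual hash: bit is 1 where a pixel exceeds the image mean."""
    return (img > img.mean()).astype(np.uint8).flatten()

rng = np.random.default_rng(42)
original = rng.random((8, 8))   # stand-in for a real photo downscaled to 8x8 grayscale
bits = average_hash(original)

# Craft a second image that shares every hash bit while looking nothing like
# the original: bright pixels wherever the bit is 1, dark pixels wherever it is 0.
collision = np.where(bits.reshape(8, 8) == 1, 0.9, 0.1)

assert not np.allclose(original, collision)      # pixel content is very different
assert (average_hash(collision) == bits).all()   # yet the hashes match exactly
print("Hamming distance between hashes:", int(np.sum(bits != average_hash(collision))))
```

Apple’s actual system relies on a neural-network-derived hash plus server-side thresholds and human review, so a real attack is harder than this toy example suggests; the sketch only shows why a hash match alone is not proof that two images are the same.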

Why this matters: Prolonged scrutiny of Apple’s privacy commitments could damage one of its key competitive advantages: its reputation for protecting consumer privacy.

  • 24% of users who switched from Android to an iPhone in the past five years said they thought Apple was safer.
  • 18% said they switched because they wanted devices with stronger privacy protections, per Consumer Reports.

The bigger picture: Continued pressure from advocacy groups deflates any hope Apple may have had of the issue fizzling out.

Instead, Apple’s decision to stand by the tool—or even withdraw it—could be a watershed moment for the company, and an inflection point for consumer privacy writ large.
