Apple employees worry misuse of child abuse scanning tools could hurt the company’s privacy reputation

The news: Apple is facing growing criticism from its own employees over the rollout of its controversial new child sexual abuse material (CSAM) scanning tools, according to Reuters.

  • Reuters found the dissenters speaking out in an internal Apple Slack channel, where they have sent more than 800 messages. Some expressed concern that authoritarian governments could use the company’s new hashing feature to scan for images beyond CSAM, which could lead to censorship.
  • Other dissenters expressed fear that rolling out the features could damage Apple’s hard-won reputation as a bastion of user privacy.

Apple claims safeguards will prevent misuse: Apple’s recently released paper details the safeguards it is putting in place around the scanning feature, claiming it will flag only images that appear in child safety databases from multiple governments. Apple says this overlap requirement will prevent any single government from manipulating the system by covertly adding non-CSAM material to its own database.
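
The mechanism Apple describes amounts to requiring corroboration across independent databases before a hash can trigger a flag. Below is a minimal, hypothetical sketch of that intersection logic in Python; the function name, data shapes, and two-source threshold are illustrative assumptions, not details from Apple’s paper.

```python
# Hypothetical sketch of the "multiple databases" safeguard as described:
# an image hash becomes flaggable only if it appears in child safety
# databases from at least two independent jurisdictions. All names and
# thresholds here are illustrative, not Apple's actual implementation.

def build_flaggable_set(databases: dict[str, set[str]], min_sources: int = 2) -> set[str]:
    """Return hashes present in at least `min_sources` independent databases."""
    counts: dict[str, int] = {}
    for hashes in databases.values():
        for h in hashes:
            counts[h] = counts.get(h, 0) + 1
    return {h for h, n in counts.items() if n >= min_sources}

# A hash covertly added by only one government never reaches the flaggable
# set, because it lacks corroboration from a second jurisdiction.
databases = {
    "jurisdiction_a": {"hash1", "hash2", "hash3"},
    "jurisdiction_b": {"hash2", "hash3", "hash4"},
}
assert build_flaggable_set(databases) == {"hash2", "hash3"}
```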

But the fallout continues: News of Apple’s CSAM prevention efforts drew swift and fervent criticism from privacy advocates, who worry the features could weaken Apple’s encryption or be used by governments to quash political dissent.

  • In response, Apple published a six-page FAQ providing more details on how the new suite of CSAM prevention features works, and said it would refuse government demands to add non-CSAM images to its scanning tool.
  • Over the weekend, Apple’s senior vice president of software engineering, Craig Federighi, defended the new features in an interview with The Wall Street Journal, but acknowledged that Apple had struggled with its messaging around the tools.

The takeaway: Fallout from the CSAM features threatens to tarnish Apple’s privacy-first reputation and weaken consumer trust. Continued internal pushback against the features could also dampen morale among employees who share Apple’s traditionally firm pro-privacy philosophy.
