
US adults fear AI will erode news quality and jobs, survey finds

The news: A majority of US adults believe artificial intelligence will harm journalism and the quality of news, according to the Pew Research Center. Its 2024 study found widespread skepticism about AI’s role in the media.

Half of respondents expect the quality of news to decline over the next 20 years due to AI’s influence, reflecting deep concern over how automation may shape the future of information, trust, and editorial integrity.

Only 10% of respondents foresee any potential benefits from AI's application in news production.

  • 59% of US adults predict that AI will lead to job losses within the journalism sector, while just 5% anticipate job growth.
  • Two out of three people (66%) are worried about AI’s potential to disseminate misinformation and generate hallucinations.
  • College-educated adults are more skeptical of AI in news (56%) than those with less education (44%).

AI-written content and news summaries under scrutiny: 41% say AI-written articles are worse than human-authored ones. Only 19% believe they’re better, and 20% see no difference.

That’s just for content creation. AI’s use in summarizing news articles has been problematic. A 2025 BBC study of OpenAI’s GPT-4o, Microsoft Copilot Pro, Google Gemini Standard, and Perplexity found that 19% of AI-generated news summaries referencing BBC content included serious errors, including misquotes, editorial bias, and outdated information.

Our take: The concerns voiced in Pew’s research don’t signal the end of human-led journalism. Instead, they exemplify the growing need to balance AI’s efficiency with human expertise and oversight.

Audiences are wary of automation—especially when it touches trusted news sources. Media companies that disclose AI use and verify content can build trust and strengthen brand loyalty.
