A new kind of AI-generated attack could shift banking strategies

The news: Deepfakes pose a new threat to the financial sector as AI technology becomes more advanced, enabling bad actors to gain entry to banking systems with realistic audio and images, per the Wall Street Journal.

How we got here: Many banks have enhanced the customer experience on their apps and websites by streamlining digital account access. That has included enabling voice and facial recognition for a simpler, quicker login.

  • Meanwhile, AI has developed to the point where it enables bad actors to mimic users’ voices or images to gain entry into accounts, as the sketch after this list illustrates.
  • Such incidents increased by 700% between 2022 and 2023, per the Journal’s reporting. When one of its reporters experimented with an AI-generated version of herself, she was able to trick Chase’s system.
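
To make the vulnerability concrete, here is a minimal Python sketch of a voice login that relies only on an embedding-similarity threshold. The function names, embedding comparison, and threshold value are illustrative assumptions, not any bank’s actual implementation; the point is that a cloned voice closely matching the enrolled voiceprint would clear this check, because nothing in it verifies a live speaker.

```python
import numpy as np

# Illustrative threshold -- not a real production setting.
SIMILARITY_THRESHOLD = 0.85

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two voice embeddings."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def naive_voice_login(enrolled: np.ndarray, attempt: np.ndarray) -> bool:
    """Hypothetical login check that compares voiceprints and nothing else.

    A deepfake built from recordings of the real customer can yield an
    embedding very close to the enrolled one, so this check passes even
    though the speaker is not live -- there is no liveness or
    anti-spoofing step.
    """
    return cosine_similarity(enrolled, attempt) >= SIMILARITY_THRESHOLD
```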

Preventing big losses: Beyond alienating customers who become fraud victims, banks that don’t prioritize safeguards also risk substantial financial losses.

  • American Banker recommends that banks update their know-your-customer (KYC) processes with deepfake detection mechanisms and advanced technology (see the sketch after this list). They must also raise customer and employee awareness of this risk.
  • Crowe recommends banks collaborate with financial industry trade associations, regulators, and law enforcement to develop standards, regulations, and strategies that help to combat this trend.
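
One way to read the American Banker recommendation is as an extra gate in the verification pipeline: alongside the usual biometric and document checks, submitted media is scored for signs of synthesis, and high-risk submissions are routed to manual review. The sketch below is a hypothetical illustration of that idea; the thresholds, score names, and routing labels are assumptions, not a reference to any vendor’s API.

```python
# Illustrative thresholds -- real values would come from model evaluation and risk policy.
DEEPFAKE_RISK_THRESHOLD = 0.5
BIOMETRIC_MATCH_THRESHOLD = 0.9

def kyc_decision(deepfake_risk: float, biometric_match: float) -> str:
    """Route a KYC submission based on two model scores.

    deepfake_risk: 0.0-1.0 score from a (hypothetical) deepfake-detection model.
    biometric_match: 0.0-1.0 similarity between the live selfie and the ID photo.
    """
    if deepfake_risk >= DEEPFAKE_RISK_THRESHOLD:
        return "manual_review"  # media looks synthetic -- escalate to a human analyst
    if biometric_match < BIOMETRIC_MATCH_THRESHOLD:
        return "reject"         # media looks genuine but does not match the ID document
    return "approve"

# A convincing deepfake with a strong biometric match is still escalated, not approved.
print(kyc_decision(deepfake_risk=0.8, biometric_match=0.97))  # -> manual_review
```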

Key takeaways: Though bigger banks with larger cybersecurity budgets may be able to dedicate more resources toward deepfake incident prevention, financial institutions (FIs) of all sizes remain at risk.

  • Until FIs can ensure robust protections are in place, they should minimize potential entry points for bad actors and require stronger verification for customer logins, as sketched below.
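
That takeaway can be pictured as step-up authentication: treat a biometric factor as sufficient only for low-risk sessions, and demand an additional factor (a one-time passcode or out-of-band confirmation) whenever the session looks risky or the action is sensitive. The sketch below is a hypothetical illustration under those assumptions, not a description of any particular bank’s policy.

```python
from dataclasses import dataclass

@dataclass
class LoginAttempt:
    """Minimal, illustrative view of a login or transaction request."""
    biometric_passed: bool    # voice or face check cleared
    new_device: bool          # device not previously seen for this customer
    high_value_action: bool   # e.g., adding a payee or initiating a large transfer
    spoof_risk_score: float   # 0.0-1.0 output of a (hypothetical) anti-spoofing model

def required_factors(attempt: LoginAttempt) -> list[str]:
    """Return the extra verification steps to demand before proceeding."""
    extra: list[str] = []
    if not attempt.biometric_passed or attempt.spoof_risk_score >= 0.5:
        extra.append("one_time_passcode")         # biometrics alone are not trusted here
    if attempt.new_device or attempt.high_value_action:
        extra.append("out_of_band_confirmation")  # e.g., approval in the mobile app
    return extra

# A passing biometric on a new device still triggers step-up verification.
print(required_factors(LoginAttempt(True, True, False, 0.1)))  # -> ['out_of_band_confirmation']
```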
