AI-driven underwriting is a vexing compliance problem for FIs

The news: GHS Federal Credit Union, a $230 million institution based in Binghamton, NY, selected Scienaptic AI’s credit decisioning platform to modernize its underwriting. The credit union plans to use AI models to pull in more data, automating more of the decisioning process.

How it works: Scienaptic provides loan decisioning software that plugs into financial institutions’ (FIs’) existing loan origination systems. The platform captures applicant data and tailors credit decisions to a lender’s risk parameters.

It also aggregates multiple data sources and uses proprietary machine-learning models trained on a large historical dataset to segment applicants and assign risk scores. It is designed to underwrite thin-file and new-to-credit borrowers more effectively than bureau-only models.
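
Scienaptic's models and features are proprietary, but the general pattern described above, aggregating bureau and alternative data into one applicant feature set, scoring with a model trained on historical outcomes, and mapping scores onto a lender's risk parameters, can be sketched briefly. Everything below (file names, columns, the gradient-boosting choice, the cutoffs) is a hypothetical illustration, not Scienaptic's implementation.

# Minimal sketch of ML-based credit decisioning on aggregated data.
# All field names, files, and thresholds are hypothetical; the vendor's
# actual models, features, and risk parameters are proprietary.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier

# Aggregate multiple data sources into one applicant feature set.
bureau = pd.read_csv("bureau_data.csv")        # traditional credit bureau attributes
alt = pd.read_csv("alternative_data.csv")      # e.g., cash-flow or utility payment history
features = bureau.merge(alt, on="applicant_id")

# Train on historical outcomes (1 = defaulted, 0 = repaid).
history = pd.read_csv("historical_outcomes.csv")
train = features.merge(history, on="applicant_id")
X, y = train.drop(columns=["applicant_id", "defaulted"]), train["defaulted"]
model = GradientBoostingClassifier().fit(X, y)

def decide(applicant_row, approve_below=0.05, refer_below=0.15):
    """Map a predicted default probability onto the lender's risk parameters."""
    p = model.predict_proba(applicant_row)[0, 1]
    if p < approve_below:
        return "approve", p
    if p < refer_below:
        return "manual review", p
    return "decline", p

The value for thin-file borrowers comes from the merge step: applicants with little bureau history can still be segmented on alternative data rather than declined by default.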

Zoom out: Biden-era Consumer Financial Protection Bureau commentary on AI underwriting suggests banks should be wary of model bias. Banks must also be mindful of the disparate impact caused by these biases, in which models trained on historical data replicate or amplify discrimination against protected groups.
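
One common way compliance teams screen for disparate impact is the adverse impact ratio: each group's approval rate divided by the approval rate of the most-favored group, with values below roughly 0.8 (the "four-fifths rule") treated as a red flag. The sketch below assumes a simple decisions table with group labels and approve/decline outcomes; the 0.8 threshold is a screening heuristic, not a legal safe harbor.

# Sketch of a disparate impact screen using the adverse impact ratio.
# Column names and the sample data are hypothetical.
import pandas as pd

def adverse_impact_ratios(decisions: pd.DataFrame) -> pd.Series:
    """Approval rate of each group relative to the highest-approving group."""
    rates = decisions.groupby("group")["approved"].mean()
    return rates / rates.max()

decisions = pd.DataFrame({
    "group":    ["A", "A", "A", "B", "B", "B", "B"],
    "approved": [1,   1,   0,   1,   0,   0,   0],
})
ratios = adverse_impact_ratios(decisions)
flagged = ratios[ratios < 0.8]   # groups falling below the four-fifths threshold
print(ratios, flagged, sep="\n")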

FIs also face a black box problem: the opacity of AI models' decision-making can make it hard to trace and report on how lending decisions are made. The Equal Credit Opportunity Act requires lenders to give clear reasons for credit denials, but the law wasn't designed for modern underwriting technology.
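
For adverse action notices, lenders typically need to surface the top factors behind a decline. With an interpretable model such as logistic regression, those factors can be read from per-applicant feature contributions. The sketch below uses synthetic data and hypothetical feature names and reason-code text; it illustrates the idea rather than a compliant ECOA workflow, and more complex models generally require dedicated explainability tooling.

# Illustrative derivation of adverse action reasons from a linear model.
# Feature names, reason-code text, and training data are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

feature_names = ["utilization", "recent_delinquencies", "income", "account_age"]
reason_codes = {
    "utilization": "Proportion of balances to credit limits is too high",
    "recent_delinquencies": "Delinquency on accounts",
    "income": "Income insufficient for amount of credit requested",
    "account_age": "Length of credit history",
}

# Fit on synthetic standardized features with a toy default outcome.
rng = np.random.default_rng(0)
X_train = rng.standard_normal((500, 4))
y_train = (X_train[:, 0] + X_train[:, 1] - X_train[:, 2] > 0).astype(int)
model = LogisticRegression().fit(X_train, y_train)

def adverse_action_reasons(x, top_n=2):
    """Return the top factors pushing this applicant toward a decline."""
    contributions = model.coef_[0] * x                 # per-feature push on the log-odds of default
    worst = np.argsort(contributions)[::-1][:top_n]    # largest positive pushes toward default
    return [reason_codes[feature_names[i]] for i in worst]

applicant = np.array([2.1, 1.4, -0.3, 0.2])            # hypothetical standardized features
print(adverse_action_reasons(applicant))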

Our take: FIs face a modernization dilemma: They’re subject to underwriting compliance and reporting requirements that were designed for technology dating back to the 1970s. Many banks’ business processes are not far ahead. In the meantime, AI is dramatically changing how some banks operate, often in ways bankers and regulators may not understand.

FIs are stuck trying to satisfy legal obligations written for older technology while upgrading to newer systems. Their technologists and their risk and compliance teams need to collaborate on adopting and managing AI models, because the technology is advancing faster than banks' understanding of it and without modern safeguards in place.
