(Bloomberg) -- Banks that use complex models for lending had better be able to explain why some borrowers are granted credit and others aren’t, regulators said Thursday.
Otherwise, one regulator at the Consumer Financial Protection Bureau warned, they’ll face an uphill climb when battling accusations of discriminatory lending. The subject has overshadowed the introduction of Goldman Sachs Group Inc.’s credit card for Apple Inc.
“If you’re facing those allegations, the first step is to understand why the algorithm in the decision-making process led to two different outcomes for apparently similarly situated applicants,” Albert Chang, counsel in the bureau’s innovation office, said Thursday at an industry conference in Manhattan. “And if you don’t have that ability to explain the decisions, it’s awfully hard to rebut that allegation of discrimination.”
Prominent Silicon Valley executives publicly complained this month about receiving significantly higher credit limits on their Goldman-backed Apple credit cards than their wives, despite having similar incomes and credit scores. That revived long-held concerns about algorithms illegally treating borrowers differently because of race, gender or other characteristics.
The controversy, which continues to play out on Twitter as more borrowers allege bias by Goldman, has dogged the storied Wall Street institution and marred its foray into banking for the masses. The New York State Department of Financial Services launched an investigation, and for the first time in years the bank has become the subject of nightly news broadcasts.
“There’s no gender bias in our process for extending credit,” Goldman Sachs Chief Executive Officer David Solomon told Bloomberg TV in an interview on Thursday from the New Economy Forum in Beijing.
Goldman spokesman Andrew Williams said the firm welcomes a discussion with policy makers and regulators. “For credit decisions we make, we can identify which factors from an individual’s credit bureau-issued credit report or stated income contribute to the outcome,” he said.
Firms have promoted the use of algorithms in lending decisions as an antidote to biased human underwriters. What’s more, lenders say, using complex models to process loan applications is quicker, cheaper and more efficient when pricing credit, and enables more people to borrow more money at better terms.
But regulators who enforce fair-lending laws continue to push banks to address the risk that complex models may cement the kind of bias they are meant to stamp out.
Carol Evans, an associate director in the Federal Reserve’s consumer and community affairs division, urged lenders to be more forthcoming with their overseers.
Bankers shouldn’t have the attitude of, “‘I know you don’t understand our models, but trust us,’” Evans said. “I don’t think that’s worked well for financial services in the past.”
To contact the reporter on this story: Shahien Nasiripour in New York
To contact the editors responsible for this story: Michael J. Moore, Dan Reichl, Steve Dickson
©2019 Bloomberg L.P.