
Apple Card’s Gender-Bias Claims Look Familiar to Old-School Banks

Shahien Nasiripour, Jenny Surane and Sridhar Natarajan

(Bloomberg) -- Apple Inc. pitches its new card as a model of simplicity and transparency, upending everything consumers think about credit cards.

But for the card’s overseers at Goldman Sachs Group Inc., it’s creating the same headaches that have bedeviled an industry the companies had hoped to disrupt.

Social media postings in recent days by a tech entrepreneur and Apple co-founder Steve Wozniak, complaining about unequal treatment of their wives, ignited a firestorm that has engulfed the two giants of Silicon Valley and Wall Street. The episode cast a pall over what the companies had claimed was the most successful launch of a credit card ever.

Goldman has said it’s done nothing wrong. There’s been no evidence that the bank, which decides who gets an Apple Card and how much they can borrow, intentionally discriminated against women. But that may be the point, according to critics. The complex models that guide its lending decisions may inadvertently produce results that disadvantage certain groups.

The problem -- known in Washington as “disparate impact” -- is one the financial industry has spent years trying to address. The increasing use of algorithms in lending decisions has sharpened the debate, as consumer advocates, armed with what they say is supporting research, push regulators and companies to consider whether the models are entrenching the very discrimination that algorithm-driven lending is meant to stamp out.
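Regulators typically screen for disparate impact by comparing outcome rates across groups rather than looking for intent. One common heuristic, the “four-fifths rule” from U.S. employment-discrimination guidance, flags a group whose selection rate falls below 80% of the most-favored group’s rate. A minimal sketch of that comparison, using entirely hypothetical applicant counts:

```python
# Minimal disparate-impact screen using the "four-fifths" heuristic:
# flag any group whose approval rate is below 80% of the highest
# group's rate. All applicant counts below are made up for illustration.

def approval_rate(approved, total):
    """Fraction of applicants in a group who were approved."""
    return approved / total

def disparate_impact_ratio(rate_group, rate_reference):
    """Ratio of a group's approval rate to the reference (highest) rate."""
    return rate_group / rate_reference

# Hypothetical outcomes from a credit model
rates = {
    "group_a": approval_rate(420, 600),  # 70% approved
    "group_b": approval_rate(300, 600),  # 50% approved
}

reference = max(rates.values())
for group, rate in rates.items():
    ratio = disparate_impact_ratio(rate, reference)
    flagged = ratio < 0.8  # below four-fifths of the top group's rate
    print(f"{group}: rate={rate:.2f}, ratio={ratio:.2f}, flagged={flagged}")
```

Note that this screen looks only at outcomes: it can flag a model that never saw a protected attribute at all, which is exactly the scenario critics raise about algorithmic underwriting.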

“Because machines can treat similarly-situated people and objects differently, research is starting to reveal some troubling examples in which the reality of algorithmic decision-making falls short of our expectations, or is simply wrong,” Nicol Turner Lee, a fellow at the Center for Technology Innovation at the Brookings Institution, recently told Congress.

Wozniak and David Heinemeier Hansson said on Twitter that their wives were given significantly lower limits on their Apple Cards, despite sharing finances and filing joint tax returns. Wozniak said he and his wife report the same income and have a joint bank account, which should mean that lenders view them as equals.

One reason Goldman has become a poster child for the issue is that the Apple Card, unlike much of the industry, doesn’t let households share accounts. That could lead to family members getting significantly different credit limits. Goldman says it’s considering offering the option.

The bank said in a tweet it would also re-evaluate credit decisions if the borrowing limit is lower than the customer expected.

“We have not and never will make decisions based on factors like gender,” the company said. “In fact, we do not know your gender or marital status during the Apple Card application process.”
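Not collecting gender does not by itself settle the disparate-impact question, because inputs that are correlated with gender can act as proxies for it. A toy sketch of that mechanism, using synthetic data and a hypothetical “neutral” feature (nothing here reflects Goldman’s actual model):

```python
# Toy illustration of proxy discrimination: the "model" never sees the
# protected attribute, but an input correlated with it carries the
# signal anyway. All data is synthetic, for illustration only.
import random

random.seed(0)

def make_applicant():
    group = random.choice(["x", "y"])
    # A seemingly neutral feature (imagine an income-like score) whose
    # distribution differs by group membership.
    feature = random.gauss(1.0 if group == "x" else 0.8, 0.1)
    return group, feature

def approve(feature, cutoff=0.9):
    # The decision rule uses only the neutral feature, never the group.
    return feature >= cutoff

applicants = [make_applicant() for _ in range(10_000)]

rates = {}
for g in ("x", "y"):
    decisions = [approve(f) for grp, f in applicants if grp == g]
    rates[g] = sum(decisions) / len(decisions)
    print(f"group {g}: approval rate = {rates[g]:.2f}")
```

Even though the group label is never an input, the two groups end up with very different approval rates, which is the pattern a disparate-impact review is designed to catch.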

With this month’s snafu, Goldman has found itself in the middle of one of the thorniest laws in finance: the Equal Credit Opportunity Act. The 1974 law prohibits lenders from considering sex or marital status and was later expanded to prohibit discrimination based on other factors including race, color, religion, national origin and whether a borrower receives public assistance.

The issue gained national prominence in the 1970s when Jorie Lueloff Friedman, a prominent Chicago television anchor, began reporting on her own experience with losing access to some of her credit card accounts at local retailers after she married her husband, who was unemployed at the time. She ultimately testified before Congress, saying “in the eyes of a credit department, it seems, women cease to exist and become non-persons when they get married.”

FTC Warning

A 2016 study by credit reporting agency Experian found that women had higher credit scores, less debt, and a lower rate of late mortgage payments than men. Still, the Federal Trade Commission has warned that women may continue to face difficulties in getting credit.

Freddy Kelly, chief executive officer of Credit Kudos, a London-based credit scoring startup, pointed to the gender pay gap, where women are typically paid less than men for performing the same job, as one reason lenders may be stingy with how much they let women borrow.

Using complex algorithms that take into account hundreds of variables should lead to more just outcomes than relying on error-prone loan officers who may harbor biases against certain groups, proponents say.

“It’s hard for humans to manually identify these characteristics that would make someone more creditworthy,” said Paul Gu, co-founder of Upstart Network Inc., a tech firm that uses artificial intelligence to help banks make loans.

Upstart uses borrowers’ educational backgrounds to make lending decisions, a practice that could run afoul of federal law. In 2017, the Consumer Financial Protection Bureau issued the company a no-action letter, a pledge not to penalize it, as part of an ongoing push to understand how lenders use non-traditional data for credit decisions.

AI Push

Consumer advocates reckon that outsourcing decision-making to computers could ultimately result in unfair lending practices, according to a June memorandum prepared by Democratic congressional aides working for the House Financial Services Committee. The memo cited studies that suggest algorithmic underwriting can result in discrimination, such as one that found black and Latino borrowers were charged more for home mortgages.

Linda Lacewell, the superintendent of the New York Department of Financial Services, which launched an investigation into Goldman’s credit card practices, described algorithms in a Bloomberg Television interview as a “black box.” Wozniak and Hansson said they struggled to get someone on the phone to explain the decision.

“Algorithms are not only nonpublic, they are actually treated as proprietary trade secrets by many companies,” Rohit Chopra, an FTC commissioner, said last month. “To make matters worse, machine learning means that algorithms can evolve in real time with no paper trail on the data, inputs, or equations used to develop a prediction.

“Victims of discriminatory algorithms seldom if ever know they have been victimized,” Chopra said.



©2019 Bloomberg L.P.