How a small design tweak reduced racial profiling on Nextdoor by 75%

Nextdoor, the wildly popular neighbourhood social network, recently raised a funding round valuing the company at more than $2bn. But four years ago, the company had a big problem.

The platform acts as an online bulletin board, letting members post appeals to find lost pets, search for a local plumber, or alert their neighbours to suspected crimes in the area.

Therein lay an issue. Users were calling out “suspicious” people hanging around their neighbourhoods, where the core reason for suspicion was the colour of their skin. “Nextdoor, the social network for neighbors, is becoming a home for racial profiling,” Splinter wrote in 2015.

Nextdoor consulted social psychology researchers for help.

Speaking on Yahoo Finance UK’s Global Change Agents with Lianna Brinded show, one of those researchers, Stanford University’s award-winning social psychology professor Jennifer Eberhardt, said people are more likely to act on bias when they feel threatened and believe they need to act quickly.


“[Nextdoor] realised they needed to slow people down to reduce the bias,” Eberhardt said.

Nextdoor outlined a clear definition of racial profiling in its community guidelines and prohibited it from the platform. The company also built a more comprehensive checklist for people describing suspicious activity.

“People have seen signs at the airport, ‘If you see something, say something,’” Eberhardt said. “They were trying to modify that, so it was, ‘If you see something suspicious, say something specific.’”

“It couldn’t be simply that they were a black man — it had to be a behaviour that they were reporting that was suspicious,” she said. “They also had to describe that person with enough detail that people would be able to identify them.”
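The checklist Eberhardt describes amounts to a simple validation rule: a report cannot go through unless it names a behaviour and includes enough identifying detail. A minimal sketch of that kind of check, with field names and the word-count threshold invented purely for illustration (Nextdoor's actual form logic is not public), might look like this:

```python
from dataclasses import dataclass

@dataclass
class SuspiciousActivityReport:
    behaviour: str    # what the person was actually doing
    description: str  # clothing, vehicle, etc. — enough to identify them

# Assumed threshold for illustration only.
MIN_DESCRIPTION_WORDS = 3

def validate(report: SuspiciousActivityReport) -> list[str]:
    """Return a list of problems; an empty list means the report can be posted."""
    problems = []
    if not report.behaviour.strip():
        problems.append("Describe a suspicious behaviour, not just a person.")
    if len(report.description.split()) < MIN_DESCRIPTION_WORDS:
        problems.append("Add enough detail (clothing, vehicle) to identify the person.")
    return problems
```

The point of such friction is not the specific rules but that the form refuses to accept a report built on racial category alone, slowing the poster down at exactly the moment bias is most likely to act.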

The prompt that appears when Nextdoor users report suspicious activity in their neighbourhoods:

Nextdoor's crime and safety reporting form. Photo: Nextdoor

Much of Eberhardt’s research has also focused on how people find it easier to recognise faces of their own race than faces of other races, a phenomenon known as the “other-race effect.”

“If all you’re doing is recognising people by their racial category, it’s kind of hard to distinguish one face from another once you are all placed in that one category — so they were trying to disrupt that,” Eberhardt said.

Nextdoor’s move to slow people down when reporting suspicious activity curbed instances of racial profiling on the platform by 75%, Eberhardt said.

A Nextdoor spokeswoman said fewer than 0.01% of posts contain racial profiling, but added, “as a community building platform, we do not tolerate racial profiling and feel strongly that even one incident is too many.”

The company has also recently begun testing a new “kindness reminders” feature, another way to encourage members to slow down and think about the content they are posting.

Nextdoor’s “kindness reminders” feature:

Nextdoor's "kindness reminders" feature. Photo: Nextdoor

Global Change Agents with Lianna Brinded explores the stories of some of the most inspirational women across business, tech, and academia. Catch up on all the latest episodes here.