Exclusive: Facebook exec says content moderation is 'never going to be perfect'

In an exclusive interview, top Facebook (FB) executives who oversee content said policing of posts on the platform will be impossible to get exactly right.

“It's never going to be perfect but at scale it is a remarkable achievement,” says John DeVine, vice president of global operations, who oversees the more than 15,000 content moderators who examine posts from the site’s 2.4 billion monthly active users worldwide.

Concerns about misinformation on Facebook reached a fever pitch after the 2016 presidential election, the outcome of which some have attributed to a Russian disinformation campaign on the platform. When revelations surfaced last year that Cambridge Analytica, a consulting firm hired by the Trump campaign, had harvested the data of 50 million Facebook users, distrust of the company worsened.

In an exclusive interview at Facebook’s Menlo Park headquarters, Yahoo Finance Editor-in-Chief Andy Serwer spoke with the three executives who oversee content at Facebook — DeVine; Monika Bickert, the head of global policy management; and Guy Rosen, the vice president of integrity.

The execs emphasized the strides the company has made in addressing misinformation and abusive content since the 2016 election.

“If you think about how we approach elections broadly, not just 2016, how we think about the work we do going forward, it's evolved a lot in the past few years,” Rosen says.

A man works at his desk in the war room, where Facebook monitors election related content on the platform, in Menlo Park, Calif., Wednesday, Oct. 17, 2018. (AP Photo/Jeff Chiu)

“It's going to be impossible to probably get it perfect,” DeVine adds. “But we know every day we're making progress against problems, against the adversarial behavior, and so we feel really good.”

Last month, Facebook announced that political ads in the U.S. will be required to display additional disclosures about who paid for the messages, after misleading "paid for by" labels appeared on some ads. Addressing false information about vaccines on the platform, Facebook launched a partnership last week with the World Health Organization (WHO) that will direct users searching for information on vaccines to the WHO's website.

“There's always going to be continued challenges,” Rosen says. “And it is our responsibility to make sure that we are ahead of them and that we are anticipating what are the next kind of challenges that bad actors are going to try to spring on us.”


Andy Serwer is editor-in-chief of Yahoo Finance. Follow him on Twitter: @serwer.

Read more:

Facebook's Zuckerberg and Sandberg are this involved with the company's content issues

Exclusive: An in-depth look at Facebook's content police

Negative interest rates are coming and they are downright terrifying
