Facebook will ban QAnon groups and pages from platform

Facebook has announced it will ban QAnon groups and pages from its platform. Sinan Aral, author of "The Hype Machine" and the David Austin Professor of Management at MIT, joins the On the Move panel to discuss.

Video Transcript

JULIE HYMAN: There are, of course, some other things that can go wrong with these social networks that they themselves are trying to address. Most recently, Facebook said it's now going to be banning QAnon groups and pages. We have also seen some curbs in recent days put on President Trump's tweets, for example, when he said that the seasonal flu was more deadly than the coronavirus.

As we have seen this year progress-- or regress, depending on how you look at it-- have we seen the social networks get any better at this kind of stuff, this QAnon action, for example?

SINAN ARAL: Yes, but. And the "but" is it's small and it's late to the game, OK? I am a big advocate of free speech. But you can't yell "fire" in a crowded theater. You can't defame. You can't violate the 14th amendment with speech. And there's clearly speech-- incitements to violence and so on-- that cause harm. This needs to be moderated by the platforms.

They began moderating QAnon piecemeal, this specific speech or that specific speech. But QAnon adapted very quickly. They adopted palatable hashtags that got around the moderation. And so I think it's legitimate to take this much broader and wider action, although Facebook reports that it will take them some time to really clean it up. I do think it's important.

I do believe that there are legitimate ways to circumscribe free speech where it causes harm that are clear. And this is one that is clear. It's incitements to violence. QAnon is very much associated with those. In addition, it's clear defamation against known political candidates and/or others that are false information, clearly false information.

I think this is an easy one. This is one of the easier content moderations for Facebook. They should have done it a long time ago.

JULIA LA ROCHE: Sinan, it's Julia La Roche. And to build on that, I'd be interested to have you walk us through, in plain human speak, what it might look like for the platforms to regulate and moderate this kind of behavior. It seems like a pretty daunting challenge. How exactly can they do that?

SINAN ARAL: There are really three ways. That's machine intelligence or machine learning; employees-- they've hired 35,000 at Facebook-- that's a human in the loop of the machine process; and crowdsourcing, getting individuals to flag content as harmful, as fake, as false, and so on. When you combine these three together, they can be effective as a scalable solution to content moderation.

Any one of them by themselves will face difficulty. Obviously, machine learning isn't perfect at solving this problem. Obviously, employees and moderators face a lot of stress in making difficult decisions. And there's not enough of them. And the crowd needs guidance and guardrails. The three of them together could work well to scale a content-moderation system that's much better than the one we have today.

JULIE HYMAN: Sinan, always good to get your perspective. Thank you. Sinan Aral is "The Hype Machine" author. And he's also the David Austin Professor of Management at MIT.