Matt Perault, Duke University Director, Center on Science & Technology Policy and Former Facebook Policy Director joins the Yahoo Finance Live panel to discuss social media companies silencing Pres. Trump.
AKIKO FUJITA: And let's talk about some of those moves we've seen from social media companies. Matt Perault is the Director at Duke University's Center on Science and Technology Policy. He's also the former policy director at Facebook.
Matt, it's great to have you on to discuss a number of moves that we saw develop over the weekend. You have been wary in the past of any move by lawmakers to restrict speech as you saw it. When you look at the moves that played out over the weekend, is that what this essentially amounts to if you're looking at something like Twitter and Facebook de-platforming the president?
MATT PERAULT: Mm-hmm. Well, this is distinct from legislators taking action against the platforms. This is platforms making their own decisions to remove the president's content from their platforms. And platforms have the legal right to do that.
They're protected by the First Amendment. They're protected against government intervention in terms of the government being able to determine the speech rules for their platforms. And they can make the decisions that they want to make about what speech to carry and what speech not to carry.
AKIKO FUJITA: Of course, there have been a lot of calls over the last four years for these platforms to at least step in when President Trump has made false claims or, you know, tweeted something that potentially could have incited violence. Now that we're seeing these platforms take action nine days out from the end of the presidency, what do you think prompted this? And I'm curious if you think this is ultimately an admission by these companies that yes, in fact, they did play a role in the riots that played out last week and in growing the conversation that led up to it?
MATT PERAULT: I'm not sure I would frame it as an admission. I think from the platforms' standpoint, this was a clear example of their platforms being used to create real-world harm. And from their perspective, that was always the dividing line: speech would be permissible; real-world harm would be problematic.
From my perspective, I am concerned about overbroad speech restrictions. I think it's valuable when these platforms host a range of different types of speech, including speech that we really disagree with. And then rather than trying to restrict speech, I'd like to see the focus be on conduct, prosecution against people who have violated the law.
ZACK GUZMAN: I mean, on that front, like you said, in the case of Twitter and their explanation there, they acted and permanently removed President Trump's account because of real-world harm and the idea that that could happen yet again. So beyond that action, what do you see next? Is it just massive removal of accounts on these platforms?
Obviously, you can't play whack-a-mole en masse, or maybe you can. I mean, Facebook's talked about algorithms that have been built to maybe remove hateful speech and content like that. But it seems like it'd be rather difficult to do that, and I'm not sure regulators would be satisfied with that approach either.
MATT PERAULT: I think that's right. I think the question now is whether this is an anomaly and a one-off event based on a particular thing that happened in Washington with particular speech by one particular president, or whether this is a sign of what we have yet to come in terms of more aggressive speech restrictions by platforms. If there are more aggressive speech restrictions, I would see that as being quite problematic.
I disagree strongly with the president's speech, but a substantial portion of the country voted for him in the last election. These are views that are widely held by many people. And it's helpful for people to be able to see those views represented, even if we strongly disagree with them.
AKIKO FUJITA: As you know, Matt, when you look at the debate that's been playing out about regulation in Congress, it's largely been divided into two sections here. One is about antitrust. The other is about Section 230.
When you look at the actions that were taken over the weekend, particularly by Amazon, that seems to point to a significant amount of power. Amazon, by saying that they're no longer going to host Parler, essentially shut down the platform. Does that reaffirm the case that lawmakers have made that these companies are just too powerful and need to be placed in check?
MATT PERAULT: I think there are many people who will raise those concerns. I think there are many people who will raise concerns about Section 230 related to this. I do think, as I said before, that it's fully within the platforms' current legal rights to take the action that they took. My understanding is that after the AWS decision to not offer services to Parler, Parler went and looked to get those services elsewhere.
And so that shows that AWS is not the only provider out there. There are other providers who could provide services. I do think there are questions, though, about what happens when platforms decide that widely-held political views are not entitled to be heard on their platforms. No matter how much we disagree with the particular views of individuals, we have strong First Amendment traditions in the country that suggest that we should punish conduct, as opposed to restricting speech.
AKIKO FUJITA: And a conversation that we've been having on the show here-- well, Facebook and Twitter, and now Amazon, as well as Apple, these big tech names have really been in the crosshairs of the discussions. You know, the larger question here: how far should the discussion about responsibility extend?
Should it include some of these telecommunications companies, those who provide, essentially, the infrastructure that allows for this to be carried out? If you look at, like, an AT&T or, of course, our parent company Verizon, that carry cable networks like a Newsmax or an OAN that have also pushed forward these false claims that have been elevated on social media, should they be held responsible too?
MATT PERAULT: I think that's exactly the right question. If you take the view that these views are so pernicious that they should not be heard on any communication platform, then you can't just look at social networks. You have to look at other components of the tech sector, such as cloud providers, for instance. And then you would also have to look at telecommunications providers, like AT&T and Verizon.
I thought it was very interesting that Senator Warner sent letters to a range of different communication providers, not just tech companies but also telecommunications providers, asking them to preserve content on their platforms that might help law enforcement look for the perpetrators of the events of last week. That suggests that he, at least, is considering the responsibility of not just the social media companies but also telecom networks.
I do think, as I've said before, that we should advocate for a regulatory landscape that supports a range of different perspectives and speech perspectives. And that would mean those speech perspectives on social networks, but also on telecommunications providers. But if we are going to be more stringent in regulating them and censoring speech, then we should do it consistently across the providers.
ZACK GUZMAN: Yeah, that becomes a bit harder here, too, as we see all of these platforms grappling with a lot of the same issues, including the hosting issues that got Parler into the position it's now in. But Matt Perault, Duke University Director, Center on Science and Technology Policy, as well as a former Facebook policy director, appreciate you coming on here to chat today.