Social media moderation, censorship under review by SCOTUS

The US Supreme Court will be hearing two major cases that may decide social media sites' ability to moderate and censor user content. Yahoo Finance Tech Editor Dan Howley joins Akiko Fujita to discuss the implications these decisions could have for advertisers and free speech on Big Tech platforms.

For more expert insight and the latest market action, click here to watch this full episode of Yahoo Finance Live.

Editor's note: This article was written by Luke Carberry Mogan.

Video Transcript

Akiko Fujita: Big Tech is in the hot seat with the US Supreme Court hearing two cases this week weighing whether state laws that seek to regulate Big Tech are a violation of the First Amendment. Let's bring in Yahoo Finance's Tech Editor, Dan Howley, who's tracking this story for us today. Dan, you know, obviously, we've got to hear out the case here. But what are the implications potentially for the way these platforms are run?

Dan Howley: Yeah. Akiko, this could have massive implications not just for the way social media platforms are run but for the entire internet. So let's just break it down real quick. The court is hearing arguments over two laws, one in Florida, one in Texas. They vary in certain ways. But what they really come down to is whether or not social media platforms are able to kind of editorialize or remove speech from their platforms, whether, if someone posts something that's quote, "lawful but awful," they're able to take it down or if they have to leave it up.

So the platforms have argued in the past that they should be able to take that kind of content down, and others argue that they should allow it to remain up. Now, these laws kind of came about after former President Trump was banned from certain social media platforms following the attack on the Capitol on January 6th. And so this is kind of the continuation of those conversations.

And really, what we're looking at is that the ultimate resolution could come down to one of two outcomes: companies have to allow all speech on their platforms, which could push away users and advertisers, or companies will have to severely moderate what's on their platforms. So they wouldn't allow almost any kind of speech related to anything political. And so users may not be able to talk about elections or something like that.

They would take those down just to ensure they don't have to deal with any potential issues related to these cases. And so this is something that won't be decided until sometime around June, perhaps. But it wouldn't just apply to Facebook, Twitter, the like. It would apply to Google. It would apply to Wikipedia and Reddit. Anything that's out there really would potentially be impacted by this.

So it comes down to: is this kind of content going to remain online, will it not, or will the justices just decide that the status quo can continue as we've seen it?

Akiko Fujita: Dan, what have we heard from some of these social media companies ahead of these hearings?

Dan Howley: Yeah. A lot of them have pushed back on these laws. Anything that would require them to keep speech that they don't want online would, they say, go against their First Amendment rights. It would basically be forcing them to speak in a way that they don't want to or to host content that they don't want to.

Part of that has to do, again, with the users. Do the users want to see content that they find otherwise reprehensible or misleading? If they don't and the platforms are forced to carry it, then they may dip out to another platform entirely. The same goes for advertisers. So, you know, we've seen advertisers leave platforms, X, formerly known as Twitter, being one of the main ones, when objectionable content shows up near advertisers' advertisements.

And so if these laws are allowed to go forward and companies do have to allow anything on their platforms, well, then advertisers will think twice about advertising there and about where their content appears. If platforms are able to remove content, it's basically the status quo. But under one of the laws, if they do remove content in some instances, they'll have to provide an explanation as to why they removed it. That could be automated, perhaps, but the platforms are saying it would be arduous for them to come up with ways of doing that.

Akiko Fujita: We'll be watching closely. Dan Howley, breaking that down for us. Thank you so much.
