Supreme Court set to hear case on tech companies' handling of sensitive content

Senior columnist Rick Newman outlines the case the Supreme Court is set to hear on how tech companies and social media platform algorithms handle and promote sensitive content.

Video Transcript

RACHELLE AKUFFO: The Supreme Court agreeing to hear a case surrounding big tech's responsibility to moderate content. Now the case centers around an allegation that YouTube aided and abetted the killing of an American woman during the 2015 terrorist attacks in Paris. Now Yahoo Finance's Rick Newman has been following the story. So Rick, what do you make of this, and what could it mean for big tech companies like Google?

RICK NEWMAN: Well, it's an interesting twist on this-- the 1996 law called the Communications Decency Act, which has been upheld many times. I mean, what that law does is it basically makes internet platforms exempt from any damage that harmful content hosted on their platforms may cause. So if terrorists put something on Facebook, for example, Facebook is not responsible for what those terrorists put there.

But what the plaintiffs in this suit are saying is that in this case, YouTube did not just host this content in a neutral way, not knowing it was there. Its algorithms actually push this content higher into feeds for people who express interests that might be related to these videos, things like the recruitment videos this terrorist group was posting on YouTube.

The argument is that even if YouTube doesn't know it's doing it, if it has algorithms that are, in effect, promoting this content-- in other words, it's not just sitting there where nobody can find it; they're actually helping people find it-- then maybe it does bear some liability for the murder of this woman in France in 2015.

So the Supreme Court is going to hear this. Like many cases at the Supreme Court, it will probably be months before we know what the justices are going to say about it. But it's an interesting new angle. And if they were to find that there is some liability here for YouTube, there certainly would be other instances where Facebook, Twitter, and other social media networks might face the same problem, because they use the same types of algorithms.

DAVE BRIGGS: I assume these social media giants are speaking with one voice. Largely speaking, what's been their response?

RICK NEWMAN: Well, I mean, I think they're sticking with the 1996 law and saying, we had no idea this content was there. If we find it and we find it to be harmful or criminal in some way, we take it down.

But what makes this interesting is that it doesn't change the fact that the content is there, no person knows it's there, and the algorithm is promoting it. So who is responsible for the algorithm? Are these platforms going to take responsibility for the robots, if you will-- for the software that is actually promoting this stuff, much of the time without any human being aware of it?

SEANA SMITH: And Rick, we know members of Congress in recent years have tried to either reform or repeal Section 230, coming from both sides of the aisle. If the Supreme Court does, in fact, I guess, erode some of this immunity, talk about the potential fallout and what that could really look like here within the sector.

RICK NEWMAN: In order for anything to really change about Section 230 of the Communications Decency Act-- that's the part of the law that protects these companies when people put this unwanted content up there-- it's going to take a change in the law to actually do anything about that. That's putting aside this twist about the algorithms that might be promoting harmful content.

There's nothing the federal-- the executive branch can do. I mean, there have been tons of hearings on this. At the same time, big tech has really ramped up its lobbying presence in Washington, DC, and so far, to my mind, done a pretty good job of torpedoing any legislation that would change this part of the law.

If Congress ever did pass a change that would make social media companies liable for harmful content posted by their users, it would completely change the business model for a lot of what are now pretty big companies. I mean, Facebook is now a huge company. And many companies have some kind of social media presence, Google, obviously, being another one.

So even though we hear people from both the Democratic Party and the Republican Party say there are certain things they don't like about all the garbage that some of these sites post and host, it's not obvious to me that there is any real momentum to change that by passing a new law.

RACHELLE AKUFFO: It certainly seems to be the case, despite all the scrutiny that we keep seeing. A big thank you to our very own Rick Newman. Have a good one, Rick.
