Is Big Tech About to Be Regulated?


Sen. Josh Hawley, R-Mo., has proposed repealing the rules that shield big technology companies from liability for user-posted content. That would create major problems for Facebook (NASDAQ: FB), Google parent Alphabet -- which owns YouTube -- and other similar players. The change would only apply to companies with more than 30 million U.S. users, 300 million global users, or $500 million in annual revenue, but that covers giants like Facebook, Alphabet, and Twitter. There would, however, be an exception if a company can prove that its content-moderation algorithms do not have a political bias.

To catch full episodes of all The Motley Fool's free podcasts, check out our podcast center. A full transcript follows the video.


This video was recorded on June 21, 2019.

Dylan Lewis: All right, Dan, this is a first segment for me, but I think one that we could bring back in the future. I'm going to call it "One Big Story for the Week." I think that this was something that I saw in my news feeds and was like, "Wow, this is actually something that may dramatically change the landscape of big tech." Do you want to introduce it?

Dan Kline: Right now, there's a lot of talk about bias in social media. We won't get into the politics of that. You all know what we're talking about. Maybe one side or the other is being shut out more, not shown in algorithms, being banned. And because of that, a Republican senator, Josh Hawley from Missouri, has introduced a piece of legislation that would force the very largest tech companies -- really Facebook, Twitter, not too many others -- to vet everything that gets posted on their platforms. Right now, they have an exception to the liability laws. If I go on Facebook, and I post something terrible about Dylan that's criminally libelous, it is not Facebook's fault. Now, they can police it after and take it down. But they don't have to police it beforehand. This proposed legislation would get rid of that exception to the rule and force them to police everything before it gets posted. I'm not even sure what that would look like or how it would work. What they would have to do is apply for an exception to the rule. And to get an exception, they would have to show they don't have a political bias. Again, without getting into the politics of this, I'm not sure how you show that you don't have something. It's very difficult to show that you don't not do something. Dylan, jump in and help me here!

Lewis: [laughs] Yeah, I think that sums it up, the fact that you don't not do something is hard. Basically, there would be this vetting process, where you are making your algorithms available to be audited, which is a very different approach from what we have seen with these companies in the past. You keyed up the fact that they have this kind of immunity right now, thanks to the way that a lot of laws are written. The carve-out in particular comes courtesy of Section 230 of the Communications Decency Act. Basically, this is something, like you said, saying that if third-party users are making comments on a platform, the company that provides that platform is not liable for whatever those comments might be. The logic there was basically, if you are trying to police all of this stuff in real time before things go live, it would be such an onerous process for a very large platform that it would effectively ruin the utility of the platform, because there'd be no way to keep up with all of the comments, posts, whatever, being put up there. You think about companies like Facebook, like Alphabet's YouTube, Twitter, this is really how they've made money for such a long time. It's all user-generated content. There are a lot of other applications beyond just social media sites that serve up ads that would probably be impacted by that. I'm thinking specifically here about Craigslist, the reviews that you see on Yelp and Amazon, etc. This is a far-reaching possibility.

Kline: And let's be clear, this is only for the largest companies: 30 million U.S. users, 300 million global users, or $500 million in annual revenue. So this will not affect the comment section on your local weekly. This is really targeting big tech.

There are a few problems with this. Are you comfortable with the government being able to figure out if a Facebook or a YouTube algorithm is biased? Now, think of every government website you've ever been on. It's basically like going to AOL in 1994. And we also have a government that's become very divided and maybe isn't capable of determining what is or isn't a bias. I do think there's some very dangerous language in what's being talked about here.

John Mackey, CEO of Whole Foods Market, an Amazon subsidiary, is a member of The Motley Fool's board of directors. Suzanne Frey, an executive at Alphabet, is a member of The Motley Fool's board of directors. Randi Zuckerberg, a former director of market development and spokeswoman for Facebook and sister to its CEO, Mark Zuckerberg, is a member of The Motley Fool's board of directors. Daniel B. Kline has no position in any of the stocks mentioned. Dylan Lewis has no position in any of the stocks mentioned. The Motley Fool owns shares of and recommends Alphabet (A shares), Alphabet (C shares), Amazon, Facebook, and Twitter. The Motley Fool recommends Yelp. The Motley Fool has a disclosure policy.
