This week Facebook announced a new policy that will ban ads that discourage the use of vaccines, but the CEO of the Center for Countering Digital Hate Imran Ahmed says that new policy came a little too late. He joins Yahoo Finance's Alexis Christoforous and Brian Sozzi on The First Trade to discuss.
BRIAN SOZZI: This week Facebook announced a new policy that will ban ads that discourage the use of vaccines. But our next guest says that new policy came a little too late. The CEO of the Center for Countering Digital Hate joins us now, Imran Ahmed. Imran, good to see you on this Friday morning. So why do you think it's too late?
IMRAN AHMED: Well, good morning. We have been studying the spread of anti-vaccine propaganda on Facebook and Instagram for some time now, and found that there are tens of millions of followers of individuals, groups, and pages in which anti-vax misinformation is spread. We found 38 million just on Facebook across the UK and the US, and that number has been growing rapidly. So the organic reach of posts by concerted misinformation actors has been growing rapidly.
In fact, the ads problem they dealt with was relatively small in terms of numbers, but it was extraordinary that there was any point at which Facebook had a business based on taking adverts from misinformation actors and spreading misinformation about vaccines into millions of newsfeeds. Let's leave that aside for a second. That is a tiny amount of the volume of mass misinformation out there. So what they've done, essentially, is look at a very, very dirty apartment, plump up the cushions, and clean nothing else.
ALEXIS CHRISTOFOROUS: OK, that analogy definitely paints a picture for us. When you look at other platforms though, like Twitter and YouTube, are they doing a better job than Facebook when it comes to these kinds of ads and getting them off the site?
IMRAN AHMED: Well, Facebook is really the 800-pound gorilla in the misinformation market. That's the truth. Twitter, in technical terms, we would say is more about the tactical adjustment of what's on the agenda. Facebook is the company that can change the lens through which we see the world. And through repeated misinformation being spread to people, it can actually persuade people the world is a different way.
And we see this fracturing of realities being a problem not just when it comes to vaccines, but we saw it with coronavirus and people believing that masks were dangerous or that masks weren't necessary. We see it with identity-based hate, which is what my organization typically looks at. And in fact, it's Facebook that has the biggest problem and has done the least, ironically, to deal with that problem.
BRIAN SOZZI: Imran, would a breakup of Facebook, which as you know is under attack now by the government, help or hurt misinformation on the platform?
IMRAN AHMED: Well, competition always helps. Having people competing over not just the economics, but also the moral values of organizations, is something that we've seen throughout different types of industries. And what you'll see is people trying to gain relative advantage by showing that their platform is safe for people to go on.
I mean, fundamentally, Facebook has no competition as a way of mass distributing information and misinformation to people in a sustained way. YouTube kind of, but not really. Twitter is very immediate and transactional. Facebook is really the only player. And to have competition would be a good thing. However, that alone is not going to be enough. We are also going to need regulation, and we're going to need folks who turn around and say, well actually, you have standards that we expect you to keep to.
ALEXIS CHRISTOFOROUS: How can users tell if they are seeing false information? What can they do to protect themselves?
IMRAN AHMED: Well, very little, in reality. I mean, we talk a lot about giving people resilience, but most of the books and articles out there about misinformation simply describe the situation, the problem, which is now universally accepted. And it's a problem that will affect not just our politics, but our economics and everything else over the next few years and decades, unless we deal with it.
So there's very little that users individually can do, because they are being placed into a platform in which information and misinformation flow unabated, intermingling without an easy way to discern between the two. And worse still, misinformation, because it's controversial, it's chewy, it's what people get engaged in, is actually advantaged on the platform because it's more engaging.
And what we see on platforms now isn't a literal timeline. It is an algorithmically generated, artificial list of the most engaging content. And that's great for keeping people on the platform, it's great for winning in the attention economy. In an evening, when you think to yourself, what should I do? Should I spend half an hour trying to work out what to watch on Netflix and then choosing nothing, which is what most of my evenings feel like? Or should I go on Facebook and see what my chums are up to? That's what you end up doing. And of course, in that space, misinformation is being fed to you.
Even today, you can find anti-scientific information being dispersed on those platforms as adverts. So someone can pay, for example, to dissuade you about the science of climate change, and even now can pay to dissuade you about vaccines, certain types of vaccines, and coronavirus information. So these are very dangerous platforms. And we've known this for some time, because back in 2016, in the last presidential election cycle, we know the Russians were using that advert system to try and dissuade Americans from voting the way that they otherwise would have.