Activists descended on Facebook’s Washington headquarters on Wednesday to demand the company take stronger action against vaccine falsehoods spreading on its platform, covering the area in front of Facebook’s office with body bags that read “disinfo kills”.
The day of protest, which comes as Covid cases surge in the US, has been organized by a group of scholars, advocates and activists calling themselves the “Real” Oversight Board. The group is urging Facebook’s shareholders to ban so-called misinformation “superspreaders” – the small number of accounts responsible for the majority of false and misleading content about the Covid-19 vaccines.
“People are making decisions based on the disinformation that’s being spread on Facebook,” said Shireen Mitchell, a member of the Real Facebook Oversight Board and founder of Stop Online Violence Against Women. “If Facebook is not going to take that down, or if all they’re going to do is put out disclaimers, then fundamentally Facebook is participating in these deaths as well.”
In coordination with the protest, the Real Oversight Board has released a new report analyzing the spread of anti-vaccine misinformation on Facebook during the company’s most recent financial quarter. The report and protest also come as Facebook announced its financial earnings for that same quarter, logging its fastest growth since 2016.
The report references a March study from the Center for Countering Digital Hate (CCDH) that found a small group of accounts – dubbed the “disinformation dozen” – is responsible for up to 65% of anti-vaccine content across social media platforms, including Facebook. That report recently drew attention from the White House, and Joe Biden has condemned Facebook and other tech companies for failing to take action.
Facebook banned misinformation about vaccines from the platform in February 2021, but critics say many posts slip through the platform’s filters and reach audiences of millions without being removed.
The company has also introduced a number of rules relating to Covid-19 specifically, banning posts that question the severity of the disease, deny its existence, or argue that the vaccine carries more risks than the virus. Still, the Real Oversight Board found that such content has often been able to remain on the platform and even make its way into the most-shared posts.
According to the Real Oversight Board’s report, a large share of the misinformation about the Covid vaccines comes from a few prolific accounts, and continues to be among the platform’s best-performing and most widely shared content. The board analyzed the top 10 posts on each weekday over the last quarter and found the majority of those originated from just five identified “superspreaders” of misinformation.
“When it comes to Covid disinformation, the vast majority of content comes from an extremely small group of highly visible users, making it far easier to combat it than Facebook admits,” the board said, concluding that Facebook is “continuing to profit from hate and deadly disinformation”.
The group has called on Facebook to remove the users from the platform or alter its algorithm to disable engagement with the offending accounts. A Facebook spokesman said the company disagrees with the statistic that 65% of vaccine misinformation comes from just 12 people.
“We permanently ban pages, groups, and accounts that repeatedly break our rules on Covid misinformation, and this includes more than a dozen pages, groups, and accounts from these individuals,” he said.
The spokesman added that Facebook has removed more than 18m pieces of Covid misinformation and flagged more than 167m pieces of Covid content, connecting users to its Covid-19 information center.
“We remain the only company to partner with more than 80 fact-checking organizations covering over 60 languages, using AI to scale those factchecks against duplicate posts across our platform,” he said.
Congress has also taken note of the spread of vaccine misinformation on Facebook and other platforms, with the Democratic senator Amy Klobuchar introducing a bill that would target platforms whose algorithms promote health misinformation related to an “existing public health emergency”.
The bill, called the Health Misinformation Act, would strip platforms of the protections provided by the internet law Section 230 – which shields them from being sued over content posted by their users – in such cases.
“For far too long, online platforms have not done enough to protect the health of Americans,” Klobuchar said in a statement on the bill. “These are some of the biggest, richest companies in the world, and they must do more to prevent the spread of deadly vaccine misinformation.”