Social media companies that are already deleting propaganda from foreign actors also need to erase false or misleading content that originates in the United States, rather than just pushing it to the margins of their platforms, a new report released Tuesday said.
“We urge them to add provably false information to the removal list,” said a report from New York University’s Stern Center for Business and Human Rights.
When it comes to politics, a majority of bad information shared on social media during the 2018 election did not come from Russian bots or trolls, but from U.S.-based websites such as 4Chan, Reddit, Gab, InfoWars, and Gateway Pundit, the Stern report said. The flotilla of memes and conspiracies then moves to the major platforms: Facebook, YouTube (which is owned by Google) and Twitter.
More broadly, social media posts spreading conspiracy theories that vaccines cause diseases in children, for example, would be a “prime example” of the kind of content that Facebook, YouTube and Twitter should consider removing from their platforms, said the report’s author, Paul M. Barrett, deputy director of the center.
The report’s release came the same day as a Senate committee hearing about the growing crisis of measles outbreaks in 11 states, caused by parents who refuse to vaccinate their children because of conspiracy theories about vaccines that spread on social media.
The increasing focus on domestic disinformation comes two years after the 2016 election, in which Russia’s government was found to have waged a misinformation war on the U.S. political system through social media. These media companies “are removing disinformation from Russia and other foreign countries,” Barrett wrote.
The report’s author acknowledges the concerns about applying similar standards to U.S. content. “Some commentators have argued that misleading content produced by U.S. citizens is difficult to distinguish from ordinary political communication protected by the First Amendment,” the report says.
But the social media giants are already putting their hands on the scale by virtue of the way their algorithms promote certain content, demote other content, and suspend accounts whose users are judged to be engaging in harassment or hate speech.
For example, InfoWars founder Alex Jones was removed from all three major social media websites in the summer of 2018 following public pressure. Jones had long promoted patently untrue conspiracies, such as the idea that the 9/11 attacks and the Sandy Hook massacre in 2012 — where 20 children and six adults were murdered — were staged by the U.S. government.
But Barrett’s report notes that “significantly, the companies didn’t say they were punishing Jones for the hoaxes and lies central to his repertoire.”
Rather, Twitter suspended Jones for “abusive behavior,” and Facebook explicitly said their action against Jones did not have anything to do with “false news.” Facebook said they removed four of Jones’s pages that were “glorifying violence” and “using dehumanizing language to describe people who are transgender, Muslims and immigrants.” YouTube removed Jones’s pages for “hate speech,” as Apple did in removing Jones’s app from its store.
Beyond suspending InfoWars, the major tech companies have taken steps that are aimed at reducing disinformation.
YouTube recently said it will “begin reducing recommendations of borderline content and content that could misinform users in harmful ways—such as videos promoting a phony miracle cure for a serious illness, claiming the Earth is flat, or making blatantly false claims about historic events like 9/11.”
Twitter has a ban on fake accounts, and has removed material that was intended to mislead voters during elections for the purposes of voter suppression. And Google released a white paper earlier this month on “fighting disinformation” that centered around, among other measures, working to “counteract malicious actors seeking to spread disinformation.”
Barrett’s report for NYU’s Stern Center said this is not a sufficient standard, however.
“We contend that the platforms ought to take a harder line on domestic disinformation, which pollutes the marketplace of ideas,” Barrett wrote. “Content that’s provably untrue should be removed from social media sites, not merely demoted or annotated.”
Even with current measures in place, Barrett’s report states, “Jones’ influence persists on Facebook by means of yet another channel: Infowars-themed ‘groups’ … where Jones’ fans gather to exchange news and conspiracy ideas.”
Barrett told Yahoo News that even groups like these might need to be shut down, or their content removed, if they spread provably false information.
The loudest complaints about de-platforming and suspensions thus far have come from conservatives, in part because some of the most notable public figures who have been removed from social media have been on the right, such as Milo Yiannopoulos.
But Twitter this week suspended the account of Rachael Swindon, an influential online supporter of Jeremy Corbyn, the leader of the opposition Labour party in Britain. Twitter said simply that Swindon’s account had violated its “abusive behavior policy.” The account was restored Tuesday.
Spiked, a libertarian website, tweeted that it was hypocritical for those on the left to now “suddenly care about free speech.” Corbyn supporters, the website’s account said, “have called for right-wing tweeters to be banned time and again.”
Barrett’s report also recommends that the social media sites “retool algorithms to reduce the outrage factor,” and “establish more robust appeals processes.”
In an interview, Barrett acknowledged that if social media companies did remove more content, a more transparent and efficient appeals process would also be crucial.
“Could the social media platforms make mistakes? Could they go down a wrong path and begin taking off a type of material that arguably is valuable and at least is not provably false?” Barrett said. “For that reason … people need to have an opportunity to say that was taken down inappropriately.”
Barrett said that demonstrably false slanders against national politicians, suggesting without evidence that they are involved in child prostitution, are one of the best examples of content that still exists on sites like YouTube, and should be removed. That conspiracy, known as Pizzagate, almost ended in tragedy in 2016, when a North Carolina man entered a Washington, D.C., restaurant and fired rifle shots based on his belief in these wild rumors. No one was injured, and he was sentenced to four years in federal prison.
If the social media giants do not act more aggressively to remove false content, Barrett said, they are playing a “dangerous game” that ultimately could lead to these changes being mandated by government regulation.
“In a perverse way, all the controversy that swirls around Donald Trump may be insulating the social media companies just because Congress has a limited attention span,” Barrett said. “But at a certain point, you might well see lawmakers basically forced to move forward along these lines.”