Facebook researchers were warning about its recommendations fueling QAnon in 2019


Facebook officials have long known about how the platform’s recommendations can lead users into conspiracy theory-addled “rabbit holes.” Now, we know just how clear that picture was thanks to documents provided by Facebook whistleblower Frances Haugen.

During the summer of 2019, a Facebook researcher found that it took just five days for the company to begin recommending QAnon groups and other disturbing content to a fictional account, according to an internal report whose findings were reported by NBC News, The Wall Street Journal and others Friday. The document, titled “Carol's Journey to QAnon,” was also in a cache of records provided by Haugen to the Securities and Exchange Commission as part of her whistleblower complaint.

It reportedly describes how a Facebook researcher set up a brand-new account for “Carol,” who was described as a “conservative mom.” After she liked a few conservative but “mainstream” pages, Facebook’s algorithms began suggesting more fringe and conspiracy content. Within five days of joining Facebook, “Carol” was seeing “groups with overt QAnon affiliations,” conspiracy theories about “white genocide” and other material the researcher described as “extreme, conspiratorial, and graphic content.”

The fact that Facebook’s recommendations were fueling QAnon conspiracy theories and other concerning movements has been well known outside the company for some time. Researchers and journalists have also documented the rise of the once-fringe conspiracy theory during the coronavirus pandemic in 2020. But the documents show that Facebook’s researchers were raising the alarm about the conspiracy theory prior to the pandemic. The Wall Street Journal notes that researchers suggested measures like preventing or slowing down re-shared content, but Facebook officials largely opted not to take those steps.

Facebook didn’t immediately respond to questions about the document. “We worked since 2016 to invest in people, technologies, policies and processes to ensure that we were ready, and began our planning for the 2020 election itself two years in advance,” Facebook VP of Integrity Guy Rosen wrote in a lengthy statement Friday evening. In the statement, Rosen recapped the numerous measures he said Facebook took in the weeks and months leading up to the 2020 election — including banning QAnon and militia groups — but didn’t directly address the company’s recommendations prior to QAnon’s ban in October 2020.

The documents come at a precarious moment for Facebook. There have now been two whistleblowers who have turned over documents to the SEC saying the company has misled investors and prioritized growth and profits over users’ safety. Scrutiny is likely to further intensify as more than a dozen media organizations now have access to some of those documents.