Facebook internal investigation finds millions of members in QAnon groups
Facebook officially has a QAnon problem.
An internal investigation carried out by the social networking giant found that Facebook groups related to the far-right conspiracy theory QAnon are racking up millions of members on the platform.
According to documents provided to NBC News by a Facebook employee, the company's investigation discovered thousands of QAnon pages and groups on the site. Combined, these groups and pages have an audience of more than 3 million members and followers. Ten of Facebook's most popular QAnon groups alone account for more than 1 million of those members.
Facebook's investigation also uncovered 185 ads "praising, supporting, or representing" the QAnon conspiracy that ran on the platform, NBC News reports. The company made around $12,000 from those ads, which generated 4 million impressions over the last 30 days.
The internal findings will be used to guide any policy decisions related to QAnon that Facebook may be working on, according to two company employees who anonymously spoke to NBC News.
Facebook could decide to treat QAnon as it treats other extremist content. The company outright banned white supremacist and white nationalist content from its platform in early 2019.
That same year, it also rolled out rules for conspiratorial pages, specifically anti-vaccination content. Facebook excludes anti-vaxxer accounts from its search results and recommendation engine, making that content harder to find. It also rejects advertising that promotes anti-vaccination messages.
Last week, Facebook removed one of its biggest QAnon groups, "Official Q/QAnon," for repeatedly breaking its rules on misinformation, harassment, and hate speech. The group had nearly 200,000 members at the time of its removal.
Facebook took its first major action against QAnon content when it removed a network of groups, pages, and accounts about the conspiracy theory from its platform. However, that removal was due to the pages breaking Facebook's policies on inauthentic behavior: the network had been setting up fake accounts to promote its content.
While Facebook figures out what to do about QAnon, competing social media platforms have already taken action. Twitter announced in late July that it would block QAnon from appearing in its trends and recommendations sections and would remove links related to the conspiracy theory. TikTok followed shortly after by blocking QAnon terms and content from its search feature.
QAnon has been especially popular among older generations, making Facebook, with its older user demographics, the perfect place for the conspiracy to spread and grow to where it is today.