Facebook won’t share the data needed to solve its far-right misinfo problem
It’s not exactly breaking news that far-right misinformation — better known to most as “lies” — tends to do well on Facebook. But it’s telling that the biggest takeaway from a new study that attempts to understand the phenomenon is that Facebook itself is our chief obstacle to understanding more.
New York University’s Cybersecurity for Democracy team released a paper on Wednesday bearing the title “Far-right sources on Facebook [are] more engaging.” The data isn’t terribly surprising if you’ve been paying any attention to the news of the past half-decade (and longer) and to the role social media has played in it.
The report notes that content flowing out from sources rated by independent news rating services as far-right “consistently received the highest engagement per follower of any partisan group.” Repeat offenders are also rewarded: “frequent purveyors of far-right misinformation” drew more than half again as much engagement as other far-right sources.
Misinformation also exists on the far left and in the political center — for the latter, mostly on health-focused websites that aren’t openly partisan — but it isn’t received in the same way. In fact, the study found that these sources face a “misinformation penalty” for misleading their users, unlike right-leaning sources.
Again, none of this is terribly surprising. Facebook’s misinformation problem is well-documented and spans multiple areas of interest. The problem, as the study explicitly notes, is Facebook itself: the company that sets the rules, not the platform it built. Any attempt to better understand how information flows on the social network is going to suffer as long as Facebook doesn’t play ball.
The study spells out the issue explicitly:
Our findings are limited by the lack of data provided by Facebook, which makes public information about engagement — reactions, shares, and comments — but not impressions — how many people actually saw a piece of content, spent time reading it, and so on. Such information would help researchers better analyze why far-right content is more engaging. Further research is needed to determine to what extent Facebook algorithms feed into this trend, for example, and to conduct analysis across other popular platforms, such as YouTube, Twitter, and TikTok. Without greater transparency and access to data, such research questions are out of reach.
That chunk of text in particular makes the rest of the study a frustrating read. There are all of these data points signaling that something is deeply wrong on Facebook, with lies not only flourishing but being rewarded. But the company’s lack of transparency means we’re stuck with having to trust Facebook to do the right thing.
That’s not an easy ask, given the company’s history. In fact, Facebook has already demonstrated — recently! — that it would prefer to keep third parties away from full-featured data analysis of user behavior on the social network.
In late October, just before Election Day, a report surfaced on the struggles faced by another NYU program in dealing with Facebook. The NYU Ad Observatory research project set out to look at how politicians were spending money and which voters they were targeting on the social network in the run-up to the election.
The project depended on a small army of volunteers — 6,500 of them — and a browser extension built to scrape certain kinds of data from the site. Facebook sent a letter threatening “additional enforcement action” unless the project shut down and deleted the data it had collected. But that was before the news went public — Facebook ultimately relented and promised to take no action until “well after the election.”
The Ad Observatory incident doesn’t tie directly to this new misinformation study, but the parallels are clear enough. Facebook is fiercely protective of its hold on usage data — which, let’s be clear, is not the same thing as user data — and doesn’t seem to want any help fixing its own problems.
Whatever the reason for that may be internally, from the outside it looks an awful lot like Facebook is more focused on preserving its own interests, not public interests. Given the impact social media has had and continues to have on socio-political shifts in public sentiment, that possibility should alarm everyone.