Facebook will create a new ‘independent body’ to moderate content
Facebook spent much of Thursday attempting to quell the fire caused by the New York Times’ report about executive inaction and politicking on Facebook’s many crises over the past three years.
It also chose Thursday to publish its latest note from Mark Zuckerberg about some of the “toughest issues” Facebook is addressing. Thursday’s note, the second in this series, addressed “Content Governance and Enforcement.”
“Just as a free society will always have crime and our expectation of government is not to eliminate all crime but to effectively manage and reduce it, our community will also always face its share of abuse,” Zuckerberg wrote. “Our job is to keep the misuse low, consistently improve over time, and stay ahead of new threats.”
Zuckerberg’s first note, published in September, covered how Facebook was combating election interference. The second outlines the challenges of content governance: how Facebook will continuously define the content that is and is not allowed on the platform, how it will police that content with both human reviewers and artificial intelligence, how it will mitigate the spread of what Zuckerberg calls inflammatory “borderline content,” how it will approach government regulation, and how it will build systems for “independent oversight and transparency.”
To that end, one of the most significant announcements was the creation of an “independent body” to review content decisions. Zuckerberg wrote:
In the next year, we’re planning to create a new way for people to appeal content decisions to an independent body, whose decisions would be transparent and binding. The purpose of this body would be to uphold the principle of giving people a voice while also recognizing the reality of keeping people safe.
The body will take some important content monitoring decisions out of the hands of Facebook. Zuckerberg wrote that Facebook is mulling over how to select members of the body, what their scope and processes will be, and more.
Zuckerberg told reporters on a call Thursday that “The basic approach is if you’re not happy after your appeal, then you can appeal to this board or higher body.”
The independent body is very much in the preliminary stages at this point, though Facebook hopes to have it established by the end of 2019.
Zuckerberg also shared some statistics, challenges, and other priorities of content reviewing. He wrote that content “nuances” account for a surprisingly high amount of what Facebook considers violating content.
“Today, depending on the type of content, our review teams make the wrong call in more than 1 out of every 10 cases,” Zuckerberg wrote. “It’s important to remember though that given the size of our community, even if we were able to reduce errors to 1 in 100, that would still be a very large number of mistakes.”
The note also addresses how Facebook will approach challenges such as algorithmic bias, the tendency of sensationalist news to draw the most engagement, proactive content removal, regulation, and more.
Facebook is clearly looking at this prismatic problem from many sides, and attempting to improve in earnest; it will even issue quarterly “transparency reports” now to show its progress on content reviewing.
Facebook likes to say it was “too slow” to address the problems it’s now grappling with, including how Russia set out to manipulate conversation on the social network in the lead-up to the 2016 election. The question is whether the initiatives Facebook is undertaking now, staffed by tens of thousands of employees, are enough to eradicate the rot it allowed to creep in in the first place.