Facebook acknowledges its error: “We do things wrong with content moderation”
The company’s Chief Policy Officer explains in The Guardian that finding the balance on the platform is complicated
This week the British newspaper The Guardian revealed how Facebook handles sensitive content and how its moderators act according to guidelines set by the Palo Alto company.
Following the impact of the article, Monika Bickert, Facebook's Head of Global Policy Management, responded in the same newspaper, admitting the company's mistakes: "but we take our role in security seriously," she explains.
"On a typical day, more than a billion people use Facebook, sharing content in dozens of languages - all kinds of content, from photos and status updates to live videos," says Bickert.
Faced with that volume of content, moderators have only 10 seconds to decide whether a piece of content is objectionable or not, The Guardian reported last Sunday. The leaked documents revealed that last summer, moderators faced more than 4,500 reports of self-harm in two weeks, while this year's statistics recorded 5,400 in another two-week period.
The company, led by Mark Zuckerberg, announced in April that it would hire 3,000 new reviewers to tackle such content, amid incidents such as murders, torture, and sexual assaults that were broadcast on the social network and seen by hundreds of people.
Bickert acknowledges: "We get things wrong, and we are constantly working to make sure those cases happen less and less. We put a lot of effort into trying to find the right answers, even when there are none."