Friday, April 26, 2024

Moderating explicit and illegal content on Facebook isn’t getting any easier


Why it matters to you

Finding a way to deal with explicit or illegal content on Facebook remains an ongoing and constantly evolving issue.

Facebook has come a long way since its days of connecting college students to those in other dorms. More than a decade and a billion users later, the social network has become a powerful hub of content, but with that power comes great responsibility. And those who must bear the brunt of that responsibility are tasked with the rather onerous duty of evaluating potential cases of revenge pornography and “sextortion” — more than 50,000 times a month.

Per a leaked document first obtained by The Guardian, the social media platform ultimately disabled more than 14,000 accounts related to sexual abuse, with 33 of the cases involving children. While these may seem like gargantuan numbers, they could represent just the tip of the iceberg. The Guardian reports that because abusive content must be reported (and is not proactively sought out), the true extent of abuse on the platform could be far larger than even Facebook realizes.

Scale is not the only issue; scope presents a problem as well. Moderators often have trouble following Facebook’s complex and sometimes ambiguous policies, with a source telling The Guardian, “Sexual policy is the one where moderators make most mistakes. It is very complex.” But Facebook says that it is actively working to improve these processes. “We constantly review and improve our policies,” said Monika Bickert, head of global policy management at Facebook. “These are complex areas but we are determined to get it right.”

Facebook has come under fire in recent months for how it handles some of these “complex areas,” particularly with regard to child pornography. In March, the company faced criticism after it failed to remove “dozens of images and pages devoted to apparent child pornography” flagged by the BBC. At the time, Facebook said that it reviewed the material in question and “removed all items that were illegal or against our standards.” The company added, “We take this matter extremely seriously and we continue to improve our reporting and take-down measures.”

But it’s still a dicey issue. Facebook’s manual on how to address various sexual abuse cases runs no fewer than 65 slides, and it simply cannot cover the full breadth of potentially problematic content that may appear online.

“Not all disagreeable or disturbing content violates our community standards,” Facebook said. “For this reason we offer people who use Facebook the ability to customize and control what they see by unfollowing, blocking or hiding posts, people, pages and applications they don’t want to see.”

All the same, the social media platform says it is committed to “building better tools to keep our community safe,” noting, “We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help.”



