In the third quarter of 2021, Facebook said it had removed 13.6 million pieces of violent content from its platform. Thanks to monitoring algorithms and moderator review, almost all of this content was taken down before users reported it. Meta also said that 3.3 million posts were removed from Instagram.
Millions of pieces of violent content removed from Facebook
In its quarterly Community Standards Enforcement Report, Meta says it was able to handle a large amount of violent content in a timely manner: “We removed 13.6 million pieces of content on Facebook that violated our violence and incitement policy, and we proactively detected 96.7% of this content before anyone reported it to us.”
Proactive detection also worked on Instagram, where the company says: “We removed 3.3 million pieces of content with a proactive detection rate of 96.4%. In the fourth quarter of 2021, there were 3 views of hate speech for every 10,000 views of content. On Instagram, the prevalence of hate speech was 0.02%.”
In addition to explicitly violent content, as many as 9.2 million pieces of content involved bullying and harassment. In this case, the proactive detection rate was 59.4%. On Instagram, the same type of content reached 7.8 million pieces, with an 83.7% proactive detection rate. The situation nevertheless remains complex to control: “The misuses of our products are not always the same, and neither is the way we approach our ever-changing integrity work.”
Finally, the company, which recently changed its name to Meta, announced that this kind of monitoring will play an important role in the metaverse in the years to come.