Because what can and cannot be posted on Facebook is often a gray area, the social networking leader has spelled everything out in a new update to its community standards page.
The update, made earlier today, offers more insight into how and why Facebook pulls certain posts, such as those involving hate speech, bullying, criminal activity, and sexually offensive content. The company also explained why it doesn't just ban posts pertaining to terrorism and organized crime, but also pulls the pages of groups that support such activities.
That said, Facebook clarified that it isn't changing how it polices user posts; rather, it is clarifying the guidance it gives users in a way that is “consistent with how we’ve applied our standards in the past.” Users can still click the “report” link to flag offensive posts, after which the company deliberates on whether to remove the offending content. Still, there will be instances where content is removed in some countries but not in others.
“People from different backgrounds may have different ideas about what’s appropriate to share — a video posted as a joke by one person might be upsetting to someone else, but it may not violate our standard,” Facebook said, explaining why it may remove content for residents of one country but not for those located elsewhere.