Facebook is publishing the internal guidelines its moderators use to police the social network.
The rules have been shrouded in confusion and secrecy for years, though they have leaked before.
The new “Community Standards” are 8,500 words long and go into great detail about exactly what is and isn’t allowed, from sexual and violent content to hate speech.
Facebook is also adding new procedures to file an appeal when its moderators remove a post.
Facebook is finally publishing the full internal guidelines its content moderators use to police the social network.
The move, announced Tuesday, offers significant new transparency around how the company manages its 2 billion users. It comes paired with the announcement that Facebook will shake up the process for handling cases where users report potentially objectionable content.