Since the 2016 U.S. presidential election, Facebook has made a number of changes to restore user trust in the types of posts they come across on the site. That includes deploying fact-checkers more widely to vet popular article links for accuracy, and giving users more tools to report content they feel goes against Facebook’s community standards. Now, Facebook is giving Page managers more tools to help them keep track of which of their posts have run afoul of moderators.
Starting tomorrow, Page managers will see a new “Page Quality” tab, where they can see which content Facebook removed for violating its rules against “hate speech, graphic violence, harassment and bullying, violence, regulated goods, nudity or sexual activity, and support or praise of people and events that are not allowed to be on Facebook.” They will have the ability to appeal content take-down decisions for 30 days. For now, the Quality tab won’t surface removals for spam, clickbait, or intellectual property violations; it’s unclear why, but Facebook says it’s working on adding them.
Page managers will also be able to see which of their posts were rated as “false” or “having a false headline” by Facebook’s fact-checkers.
Facebook’s also making a change to its recidivism policy. Previously, if a number of Pages or Groups were managed by the same people or group of people, but only one of them was removed for violating Facebook’s Community Standards, all of the other Pages or Groups would stay up. After the offending Page was removed, Facebook would only take down additional Pages that had been created for the express purpose of getting around the ban.
Now, Facebook says that it may also start removing other Pages and Groups “even if that specific Page or Group has not met the threshold to be unpublished on its own.”
“To enforce this updated policy, we’ll look at a broad set of information, including whether the Page has the same people administering it, or has a similar name, to one we’re removing,” according to a Facebook blog post.
That policy change could have major implications as Facebook has proven more receptive to removing Pages that repeatedly violate its community standards, Pages that have often been set up as part of a larger network. For example, Facebook was criticized for keeping the Infowars store page up on the site after banning Alex Jones and Infowars over the summer. At the time, a Facebook spokesperson told the Washington Post that the Infowars store page hadn’t violated its community standards yet.