Facebook today announced that it’s beginning to implement a slew of features aimed at helping users keep bullying and harassment out of their News Feed.

Most notably, Facebook will extend the option to ask for a second review of content that has been reported for bullying or harassment. Previously, this option was only available for content that had been reported for violating Facebook’s nudity, sexual activity, hate speech, or graphic violence policies.

This means that if a user thinks their content was erroneously taken down for violating Facebook’s bullying and harassment policies, they can ask the company’s content moderators to reconsider. Users will also soon be able to request a second review if content they reported for bullying was allowed to stay up.

Facebook’s community standards policy states that the company will remove content “that purposefully targets private individuals with the intention of degrading or shaming them” and specifies that it gives more attention to bullying directed at minors, given that they are “more vulnerable and susceptible” to it.

Given that Facebook’s content moderators review an estimated 10 million pieces of content per week, mistakes are going to be made, and a second-look policy is needed for more types of community standards violations. But such reviews add to the already-Herculean task content moderators face in deciding whether a post should stay up or be quickly taken down.

The company also announced that, starting today, users can report someone for bullying a family member or friend. Previously, there were instances in which Facebook would only act on a piece of content if the person the bullying was directed at reported it themselves.

Additionally, Facebook is rolling out a menu option on desktop and Android so that people can hide or delete multiple comments on a post at once. The option will also arrive on iOS in the coming months.

“We know our job is never done when it comes to keeping people safe, and we’ll continue listening to feedback on how we can build better tools and improve our policies,” wrote Facebook’s global head of safety, Antigone Davis.