Instagram is shoring up its defenses against abuse with the launch of tools people can use to control who reacts to their photos and videos. The photo-sharing company announced on Tuesday that it now supports the ability to restrict who comments, to remove people from private accounts, and to anonymously report posts from people you think may be harming themselves.
The release of these new features is said to be part of Instagram’s effort to make the service a “positive place for self-expression,” rather than a troll-infested environment like Reddit or Twitter.
With the new commenting control tool, if you find that someone has posted a mean-spirited message on a photo or video you’ve shared, you don’t have to just sit there and take it. On top of the existing ability to filter out comments by keyword, you can now disable comments on entire posts. Until now, this option was available only to “a small number of accounts,” but it will roll out to everyone “in a few weeks.” Before posting, tap the “advanced settings” option, choose “turn off commenting,” and that’s it. You can re-enable comments at any time by flipping the same toggle.
In addition, Instagram will soon allow its more than 500 million monthly active users to heart individual comments, not just the photo or video itself. It hopes that bringing likes to the comment level “shows support and encourages positivity throughout the community.”
For those of you with a private account, Instagram now lets you remove followers without blocking them. Why this took so long is unclear: until now, once you let someone see your posts, the only way to get rid of them was to block the account. Now you can remove a follower by going into your list of followers and tapping the … menu next to their name. No notification is sent to alert the person that you don’t want to be friends.
Last is a feature every social network needs: a way for people to flag someone who appears to be threatening self-injury. Instagram chief executive Kevin Systrom explained, “From time to time, you may see friends struggling and in need of support.” So if you see a post that suggests self-harm, the service now lets you report it anonymously. A team working 24 hours a day, 7 days a week around the world reviews the reports and connects that person with organizations that can help.
For many, Instagram has been a place to share not only what they’re seeing, but also the art they’re creating. It has grown into a community, and with more people on it, the danger of harassment and abuse increases. This may be especially true if you post content about politics, national and world issues, or social progress. We’ve seen such bad behavior on Twitter, Facebook, Reddit, and many other sites. Today’s release is Instagram’s attempt to head off potential abuse. Unlike its counterparts, Instagram hasn’t been prominently in the news over harassment, but with such a large audience, that could soon change.
I, for one, would appreciate a crackdown on trolls and spammers on Instagram. Nary a day goes by without my hashtags spurring “commenters” to pitch ways to get new followers, or random accounts tagging me in comments. That leaves me either ignoring them or spending extra seconds blocking and reporting each offender.
Systrom said these features are only the beginning, with more to come in the future.