Instagram is taking a hard line on bullying.
The social media network announced two new features on Monday designed to combat negative interactions on the popular app.
“Online bullying is a complex issue,” Instagram said in a press release. “For years now, we have used artificial intelligence to detect bullying and other types of harmful content in comments, photos and videos. As our community grows, so does our investment in technology. This is especially crucial for teens since they are less likely to report online bullying even when they are the ones who experience it the most.”
The first new feature, which Instagram said is “powered by AI,” notifies users when their comment may be considered offensive — before it’s even posted.
“This intervention gives people a chance to reflect and undo their comment and prevents the recipient from receiving the harmful comment notification,” Instagram said. “From early tests of this feature, we have found that it encourages some people to undo their comment and share something less hurtful once they have had a chance to reflect.”
The other feature, which has yet to be rolled out, is geared toward empowering Instagram users to “stand up to this kind of behavior.”
“We’ve heard from young people in our community that they’re reluctant to block, unfollow, or report their bully because it could escalate the situation, especially if they interact with their bully in real life,” Instagram said. “Some of these actions also make it difficult for a target to keep track of their bully’s behavior.”
Testing will soon begin on the feature, called Restrict. It allows people to “restrict” a user, making that user’s comments on their posts visible only to the commenter.
“You can choose to make a restricted person’s comments visible to others by approving their comments,” Instagram explained. “Restricted people won’t be able to see when you’re active on Instagram or when you’ve read their direct messages.”