As it celebrates its 10th anniversary, Instagram has begun rolling out a number of visual updates, including the ability to change the app icon. Alongside these, the Facebook-owned company is introducing new measures to reduce abusive comments on the platform.
The first feature automatically hides offensive comments using AI. According to Instagram, the system detects these by comparing them against comments that were previously reported as offensive. Hidden comments are not deleted outright: users can tap “View Hidden Comments” to see them. Comments that violate the platform’s community guidelines, however, will still be removed automatically.
This feature, currently in beta, will arrive alongside a “nudge” function that notifies someone when other users are typing an abusive comment under their posts.
Previously, Instagram’s nudges prompted only the would-be offender, in hopes that they would reconsider. Now, the app will also show warnings to the post’s owner.
Instagram has not revealed exact figures, but says that since the nudge feature rolled out in 2019 there has been a “meaningful decrease in negative interactions” in comment sections.
The new tools will be available globally for comments written in several languages, starting with English, Portuguese, Spanish, French, Russian, Chinese, and Arabic (Android only). Instagram says it will expand these features to other languages in the future.
Instagram joins rivals such as Twitter, which has also been introducing features and updating policies to reduce cyberbullying. Hopefully, these changes will work for the greater good.